If the market has a favourite word right now – besides ‘unprecedented’, of course – it’s probably the word ‘normal’. Either we’re adjusting to the ‘new normal’, attempting to continue ‘normal operations’ or, most often, planning what we’ll do when all this is over and things go ‘back to normal’. 

For many, Monday’s news that the UK could be up, running and free of all legal social restrictions by 21 June is a sign that this long-awaited swing back to pre-COVID normality is on its way. Reunions are being planned, shopping trips plotted, and holiday bookings have surged 500% overnight; seemingly every consumer in the country is currently deciding where they’ll take their lockdown savings once restrictions ease. 

This is definitely good news. But moving from economic stagnation to a boom in economic activity over such a short timespan is anything but normal. No brand has ever managed a transition like this before. 

At this stage, it’s not even clear which sectors of the economy will see the bulk of the benefit as the economy opens up. Will shoppers flock to high streets, eager to finally pick up a new product that doesn’t come wrapped in Amazon-branded cardboard? Or will they head to the local pubs to meet friends, or hop on the first train in search of new scenery? 

The threat of new variants and the disparate impact of the pandemic across socioeconomic groups also mean that the spectre of COVID and re-lockdowns will be with us long after summer 2021.

Predicting the impact of all these uncertainties so far in advance is almost impossible. But if brands want to navigate the uncertainty successfully, it’s vital that they set themselves up properly when it comes to the way they use data. 

At Faculty, we’ve spent much of the last year helping organisations deal with data-related uncertainty – either predicting how data will change, or responding to issues caused by shifts in the underlying data that brands use for their AI models. 

In that time, we’ve identified three things brands need in order to handle data-related uncertainty. 

1. Alternative approaches to demand forecasting 

Most demand forecasting approaches are heavily reliant on large panels of historical data, using past trends and cycles in purchase behaviour to predict future trends. 

So it’s not great when your 2020 data looks wildly different from your 2019 data – and will probably look equally different from Q2 2021 data. 

As Thomas Davenport, a professor at Babson College, told MIT Sloan in January: ‘It’s hard to get good data about the future, so we have to use data from the past. And if the past is no longer a guide to the future, we’re going to have a tough time doing any sort of predictive analytics.’

Companies that are overly reliant on off-the-shelf demand forecasting tools will struggle to react to the changes that the recovery will bring – ultimately risking either stock-outs or excess inventory.

Brands should focus on understanding the underlying factors that drive demand for their products. We help our customers incorporate external sources of data (such as high-street footfall, consumer search activity or web interactions) into their demand forecasts. Hierarchical modelling techniques, which can learn shared trends across products and geographies, are also highly valuable for making more accurate, more granular forecasts during times of uncertainty.
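
As a rough sketch of what incorporating external signals can look like in practice, the example below trains a gradient-boosted model on weekly sales using lagged demand plus hypothetical footfall and search-volume features. The file name and column names are illustrative assumptions, not a description of any specific setup.

```python
# A minimal sketch (not a production pipeline): demand forecasting with
# external regressors. "weekly_sales.csv", "footfall_index" and
# "search_volume" are hypothetical placeholders for your own data sources.
import pandas as pd
from sklearn.ensemble import HistGradientBoostingRegressor

df = pd.read_csv("weekly_sales.csv", parse_dates=["week"])

# Lagged sales capture recent momentum; the external regressors capture
# demand drivers that historical sales alone no longer explain.
df["sales_lag_1"] = df.groupby("sku")["units_sold"].shift(1)
df["sales_lag_4"] = df.groupby("sku")["units_sold"].shift(4)
features = ["sales_lag_1", "sales_lag_4", "footfall_index", "search_volume"]

train = df[df["week"] < "2021-04-01"].dropna(subset=features)
model = HistGradientBoostingRegressor(max_iter=200)
model.fit(train[features], train["units_sold"])

# Score the most recent week to produce a short-horizon forecast per SKU.
latest = df[df["week"] == df["week"].max()].dropna(subset=features).copy()
latest["forecast"] = model.predict(latest[features])
print(latest[["sku", "forecast"]])
```

In a fuller hierarchical setup, information would also be pooled across related products and regions (for example with mixed-effects models or categorical embeddings), so that sparse series can borrow strength from their neighbours.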

With that knowledge, brands aren’t restricted to long-term forecasting based on a high-level understanding of customer behaviour. Instead, they can monitor these more granular metrics in real time, forecast short-term demand fluctuations, and react accordingly. That real-time insight will be vital in the most uncertain period, after restrictions ease but before we understand how the market as a whole is performing.
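
One simple way to picture that monitoring loop is sketched below: compare recent actuals against the live forecast at SKU level and flag anything that has drifted far enough to warrant a re-forecast. The file, columns and the 30% threshold are all illustrative assumptions.

```python
# Minimal sketch of a re-forecast trigger: flag SKUs whose recent actuals
# diverge persistently from the current forecast. Columns and the 30%
# threshold are illustrative assumptions.
import pandas as pd

df = pd.read_csv("daily_actuals_vs_forecast.csv", parse_dates=["date"])
df["abs_pct_error"] = (df["actual"] - df["forecast"]).abs() / df["forecast"].clip(lower=1)

# 7-day rolling error per SKU; anything persistently above 30% gets flagged.
rolling_error = (
    df.sort_values("date")
      .groupby("sku")["abs_pct_error"]
      .rolling(7)
      .mean()
      .reset_index(level=0)
)
flagged = rolling_error.loc[rolling_error["abs_pct_error"] > 0.30, "sku"].unique()
print("SKUs to re-forecast:", list(flagged))
```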

2. Taking an incrementality mindset to marketing

How do you avoid marketing to a whole bunch of customers who are likely to come back to your business anyway? How do you know who’s already decided they’ll be spending their lockdown savings on pints at the pub and definitely isn’t going to be buying from your latest loungewear range?

How many travel companies were funnelling money into huge marketing campaigns, only to find that their sales jumped 500% overnight without them having to lift a finger? 

It’s likely that some brands will waste a significant amount of money in the next few months, either by marketing to ‘sure things’ who are already hammering down the doors to buy, or by spending money on consumers who have already decided to spend their money elsewhere. 

Old favourites like RFM (recency, frequency, monetary value) segmentation are much less useful in the wake of widespread shifts in consumer behaviour, since they only tell you who’s already spending, not who might spend. Instead, optimising for the predicted incrementality of your marketing – how each penny you spend is likely to influence each customer’s behaviour – becomes vital. 

Brands should look to AI approaches like uplift modelling, which can better model human behaviour and separate correlation from causation to pinpoint the customers they can actually influence with marketing. 
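
To make the idea concrete, here is a minimal sketch of one common uplift approach, the two-model or ‘T-learner’ setup: estimate each customer’s purchase probability with and without marketing contact, and prioritise the customers with the largest predicted difference. The file and column names are assumptions for illustration; dedicated libraries such as CausalML or scikit-uplift offer more sophisticated estimators.

```python
# Minimal two-model ("T-learner") uplift sketch. "campaign_history.csv" and
# all column names are hypothetical; in practice you'd want a properly
# randomised treatment/control split behind this data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("campaign_history.csv")  # one row per customer
features = ["recency_days", "frequency", "monetary_value", "web_visits_30d"]

treated = df[df["was_contacted"] == 1]
control = df[df["was_contacted"] == 0]

model_treated = RandomForestClassifier(n_estimators=200, random_state=0)
model_treated.fit(treated[features], treated["purchased"])
model_control = RandomForestClassifier(n_estimators=200, random_state=0)
model_control.fit(control[features], control["purchased"])

# Uplift = P(purchase | contacted) - P(purchase | not contacted).
# Near-zero uplift identifies the "sure things" and the lost causes;
# budget goes to the persuadable middle.
df["uplift"] = (
    model_treated.predict_proba(df[features])[:, 1]
    - model_control.predict_proba(df[features])[:, 1]
)
target_list = df.sort_values("uplift", ascending=False).head(10_000)["customer_id"]
```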

3. Robust AI models 

As McKinsey noted in September of 2020, global lockdowns exposed a fatal flaw in many machine learning models: their brittleness in the face of changing environments. Machine learning predicts future behaviour from past patterns. But past patterns aren’t much use in unprecedented situations.

By the middle of last year, it was clear that major shifts in consumer behaviour had sent seemingly solid machine learning models into a downward spiral of inaccurate predictions and out-of-touch recommendations. One energy company realised that its hyper-powerful trading model fell apart when confronted with crude oil prices that had dropped below zero; the data scientists who built it had assumed prices would never go negative, so they hadn’t trained the model to respond to such drops. 

Most brands have, by now, caught on to this flaw. Many will have created a ‘pandemic edition’ of their models, trained on pandemic-era data. 

But now the dataset is about to change again. And, when it does, we’ll see which companies have built their models back stronger and more robust to change – and which have just slapped some duct tape over their newly exposed weaknesses. 

As we have written before, measuring and monitoring the robustness of models is something every business deploying machine learning in the real world should be doing.

Robustness tools can validate that models are still working even when the dynamics of the underlying data change, and flag when outputs might not be reliable. Brands that have invested here will gain a clear advantage over competitors in the adaptability and responsiveness of their use of data. 
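
As an illustration of the kind of check such tooling performs, the sketch below compares the live distribution of each input feature against the training distribution with a two-sample Kolmogorov–Smirnov test and flags features that have drifted. The files, feature names and threshold are assumptions; production tooling would add output monitoring, alerting and retraining hooks on top.

```python
# Minimal sketch of input-drift monitoring: compare live feature
# distributions against the training data with a two-sample KS test.
# Files, feature names and the threshold are illustrative assumptions.
import pandas as pd
from scipy.stats import ks_2samp

train = pd.read_csv("training_features.csv")    # data the model was trained on
live = pd.read_csv("last_7_days_features.csv")  # data the model is scoring now

drifted = []
for col in ["basket_size", "visit_frequency", "avg_price_paid"]:
    stat, p_value = ks_2samp(train[col].dropna(), live[col].dropna())
    if p_value < 0.01:  # the distribution has shifted materially
        drifted.append((col, round(stat, 3)))

if drifted:
    print("Input drift detected -- treat model outputs with caution:", drifted)
```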

Embracing the ‘new normal’ opportunity 

In almost every walk of life, the COVID-19 pandemic has exposed weaknesses in our systems. The way that brands use technology, market to customers and predict demand is no exception.

But it’s also given brands an opportunity to build back stronger: not just to wait for ‘normal’ to resume, but to actively work towards a ‘new normal’ that’s more robust, more flexible, and built on a solid foundation of data analysis. 


At Faculty, we work with brands across the Retail, eCommerce, Consumer Goods, and Travel, Leisure and Hospitality industries to help them understand, predict and influence consumer behaviour – even in times of economic and data uncertainty. If you’d like to find out more about our demand forecasting, marketing optimisation or AI robustness tools, do get in touch. 

