Despite significant investments in data infrastructure, many businesses haven’t realised its measurable value. It’s time to shift focus from mere cost savings to using AI for enhanced business performance, with ‘value’ as the guiding star.
A smart and experienced Chief Data and Analytics Officer (CDAO) recently told me that he is “putting value back as the North Star” of the data organisation he leads at a large financial services company.
At first glance, it may seem extraordinary that a serious part of a large organisation is not already focused on how it delivers value. After all, without value, what else is there?
But his perceived need to refocus chimes with a wider note of frustration among many executives I talk to about where they currently are in the cycle of data investment.
They’ve taken a lot of time and spent a lot of money building data infrastructure; wading through all manner of migration, transformation and replatforming. But in many cases, they haven’t yet got anything to show for it that creates measurable value for the business, its customers or its employees.
“Where are the returns on our data infrastructure?”
The cost savings that usually comprise the primary business case for migrations are real, but take time to realise. A half-complete migration often involves running both new and legacy infrastructure at once, and savings only arrive once the latter can be turned off.
It is at this moment, when investments have been made but returns are yet to flow, that patience is stretched the most. Those who are making the investments, Chief Information Officers (CIOs) and Chief Data Officers (CDOs), are acutely aware of the delicate situation this leaves them in.
The good news for them is that, just because cost savings have taken a long time to achieve, the same need not be true of the wider improvements to business performance that data infrastructure enables. Whether or not they were deliberate about it at the time, the data foundations they laid will allow their businesses to capitalise quickly on the AI era.
Capitalising on your data with AI
As a result, it is now time to move from data infrastructure as a route to cost saving, to AI as a pathway to improved business performance. The bulk of change energy from here should be focused on how to deliver this return. This means building applications on top of the data that can optimise current business processes today, and reinvent them into new forms tomorrow. Here you can view countless case studies of how we’ve made this work in practice.
But to ensure that technology organisations feel ready to start down this path, it is worth taking a moment to reflect on what has driven the focus on cost rather than improvement, and the resulting prolonged cycles of investment without a clear path to ROI.
In our experience, a pair of myths and misconceptions sits at the root: the myth of perfect data, and the misconception that data infrastructure needs to be in some way ‘complete’ before it can be useful for anything.
Yes, your data is ‘good enough’ to get started
The truth is, data is never perfect. And data infrastructure is never finished. Both exist in permanent states of imperfection and constant evolution. Resisting this reality and pursuing some end state is a futile endeavour.
To illustrate this point, Marc, our CEO at Faculty, likes to play a game when giving a talk: he asks the audience to raise their hand if they think their data is “ready”. So far, out of a cumulative audience comfortably in the thousands, we have seen one hand (and I suspect that guy was just being contrarian…).
But while imperfections in the data stack exist everywhere, they are no reason to delay the pursuit of business value from data. Because (and this may seem heretical from someone working in AI) in many respects the data should come last, not first.
The right approach is as follows:
• Start by identifying where, among your strategic goals, there are opportunities to deploy AI to solve an important problem that improves performance.
• Then figure out what data you need to power each of these possibilities, and whether you have it.
• In those cases where you do have the data, determine what infrastructure is necessary to store and serve it to the application that will use it.
• Then prioritise the infrastructure that you build accordingly.
This sequence creates the tightest possible feedback loop between the data, its infrastructure and the application that generates the value. By building in increments, it avoids the never-ending task of trying to build the whole thing end-to-end at once, and avoids wasting effort on infrastructure whose ultimate utility is not clearly understood and specified.
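To make the sequence concrete, here is a purely illustrative sketch of the prioritisation logic: opportunities are filtered by whether the data they need already exists, then ordered by expected value, and infrastructure work follows that backlog rather than preceding it. All names and numbers below are invented for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class Opportunity:
    """A candidate AI application tied to a strategic goal."""
    name: str
    expected_value: float                      # illustrative estimate of annual benefit (£m)
    data_needed: set = field(default_factory=set)


def prioritise(opportunities, available_data):
    """Keep only opportunities whose required data already exists,
    ordered by expected value. Infrastructure is then built to serve
    this backlog, increment by increment."""
    feasible = [o for o in opportunities if o.data_needed <= available_data]
    return sorted(feasible, key=lambda o: o.expected_value, reverse=True)


# Invented example: two datasets exist today, one does not.
available = {"transactions", "customer_profiles"}

backlog = prioritise(
    [
        Opportunity("churn prediction", 2.0, {"transactions", "customer_profiles"}),
        Opportunity("fraud detection", 5.0, {"transactions"}),
        Opportunity("contract summarisation", 1.0, {"contracts"}),  # data missing, so deferred
    ],
    available,
)
# Build infrastructure for "fraud detection" first, then "churn prediction";
# "contract summarisation" waits until its data exists.
```

The point of the sketch is the ordering of decisions, not the code itself: value and feasibility come first, and only then does infrastructure get specified and built.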
In short… it prioritises ‘value’ as the North Star of the data organisation.
Interested in finding out how Faculty can help you begin the journey towards re-establishing ‘value’ as your North Star? Get in touch today.