Lesson 09

NCA

Business strategy trumps AI strategy.

Good strategy is built on identifying what’s most important. When a huge trove of confidential intelligence reached the National Crime Agency, it threatened to swamp their ability to analyse it using traditional methods. A team of dedicated professionals had to find a way to process the information faster than ever before, in order to track down the worst offenders and stop them.

Cerys Evans looks the opposite of dangerous.

Bright, self-deprecating, and ever so slightly geeky, she exudes warmth and positive energy. She lights up when talking about her dog. If you sat opposite her on a train, you might guess she worked in publishing, or maybe a trendy branch of academia (she is, in fact, doing a PhD). Only - if you were paranoid - you might notice her spectacles, outsize gold-rimmed lenses like a pair of magnifying glasses. Almost as if she was watching you. If you’re a certain type of criminal, she is watching you.

Cerys works for the National Crime Agency (NCA), the UK police organisation charged with leading the fight against serious and organised crime, and tackling the UK’s most dangerous criminals. Established in 2013 and quickly dubbed ‘Britain’s FBI’, the NCA works at the leading edge of law enforcement to build the best possible intelligence picture of criminal threats, and develop innovative capabilities for other partners to use.

Although often underestimated, serious and organised crime is one of the most acute threats facing the UK today. It blights communities, ruins lives, and is estimated to cost the country at least £37 billion each year. It affects more citizens, more frequently, than any other national security threat; and leads to more deaths in the UK than terrorism, war and natural disasters combined.

NCA officers are in the frontline against that threat. In recent years they’ve broken open networks that smuggle guns, drugs, money and people. They’ve tracked down fugitive criminals to their hiding places overseas, and also disrupted the gangs that supplied the runaways with their fraudulent passports. At home, their specialist officers support local police forces with complex investigations by providing niche expertise. 

The NCA is also the lead agency dealing with the worst cases of child sexual abuse. The team is made up of a range of experienced, diligent professionals dedicated to protecting children. Cerys is one of those people.

Online exploitation is a growing threat


Child sexual abuse and exploitation means forcing or inciting a person under the age of 18 to engage in sexual activity. It includes physical sexual abuse, as well as online offences such as grooming, incitement, sexual communication, and creating or sharing child sex abuse imagery. Most people will naturally recoil from the subject, but vulnerable children rely on adults like Cerys not looking away.

It would be reassuring to think that these are fringe crimes, the work of a tiny, depraved minority, but the numbers tell a depressingly different story. According to the NCA’s 2021 National Strategic Assessment, there are estimated to be between 550,000 and 850,000 people who pose a sexual risk to children in the UK alone.

According to the National Center for Missing and Exploited Children’s 2023 annual report, there was a 300% increase in online enticement between 2021 and 2023, a figure that Cerys succinctly describes as ‘insane’. And as horrifying as these figures are, worse still is the fact that many more cases of child sexual abuse go unidentified and under-reported.

‘When I first started working in this threat area,’ says Cerys, ‘there was a belief that you could arrest your way out of the issue.’ If you put enough of the bad guys behind bars, you’d solve the problem and keep children safe. Years of hard experience have exposed that hope as painfully over-optimistic. Not only has the issue grown, but new technology has helped it develop in dynamic and disturbing ways. Children, who are often among the most enthusiastic early adopters of new technology, are particularly vulnerable to online exploitation by offenders, who use the internet to locate and groom potential victims for abuse.

‘Historically, there was always this focus on, “We must look for the contact offenders. We must look for the offenders that are going to sexually assault a child in person,”’ Cerys recalls. ‘And what we've learned, as technology and offending behaviour have developed, is that people can do this by proxy. They can direct somebody else to do it on the other side of the world, and they are causing harm to that child, even though they're never going to be in the same room as them. And in the same way, people can engage one on one with a child virtually and cause physical and emotional harm to that child, without ever touching them. And the volume and the scale and the complexity of the offending is ever growing.’

In this formidably bleak landscape, the NCA faces off against its targets with the limitations that are common to almost every public body: a finite set of resources with which to take on an overwhelming demand for their services. As Cerys arrived at the NCA, that demand was about to go to a whole new level.

Building AI into the NCA’s strategy

As early as 2018, the NCA had recognised that AI had the potential to help them achieve their mission. Within the UK public sector, this was impressively early to be thinking about AI: back then plenty of public bodies were running experiments, but relatively few of them were trying to build AI into their core operations.

But for the NCA, it wasn’t just about integrating it into their workflow. They wanted to build it into their strategy. ‘It starts with the business problem,’ says Claire Smith, the NCA’s Chief Operating Officer and a 25-year veteran of the policing and security sector. ‘You always want to have a really strong business voice involved, and when I have seen innovation really work, that has been one of the key ingredients.’

Testimonials

“When we all got together we were one team with a shared goal. And that made it a really targeted development process.”

Cerys Evans
G3 Intelligence Manager, NCA

As AI rises up the corporate agenda, most organisations will consider (if they haven’t already) what their AI Strategy should be. Many will spin up AI Strategy programmes, ranging across the organisation, looking for all of the things that AI could possibly do. Consultants’ two-by-two matrices abound, ranking lists of possible use cases according to ‘technical feasibility’ and ‘impact’. Leadership teams are presented with the outcomes like a menu in a restaurant, with some recommended dishes: ‘quick wins’ for starter, ‘low hanging fruit’ for main, and if there’s room left at the end then maybe ‘longer term bets’ for dessert.

The challenge with this bottom-up approach is that, in the final analysis, when an AI strategy comes up against the actual business strategy, there is only one winner.

Every good organisation already has a business strategy. The people there know what is most important to them - and to the boss. They can tell you the three or four priorities that the CEO cares about, and they fastidiously track the KPIs against which everything and everyone will ultimately be judged.

Being strategic about AI means using it to accelerate the things you already know to be most important. The instruction CEOs should give their teams is not ‘design our AI strategy’, but ‘test whether AI can help us meet our top three priorities’. If AI won’t do that, then ignore it and focus on technologies that will. But if AI can help with those core goals, then you already have the answer to where and how to prioritise it.

The NCA is a highly complicated organisation, but its business strategy is clear and its priorities cleanly stated. The first is the relentless disruption of serious and organised crime through targeted action against the highest-harm offenders and networks, together with a statutory obligation to safeguard children from harm. The second is to minimise the number of victims and the level of harm caused.

By definition, their organisation only deals with cases that are really, really important. But with thousands of case referrals each day, they still don’t have the resource to tackle all of them. So making their strategy work boils down to finding the most urgent needles in a haystack where every single straw matters, and then throwing their resource at those cases. It’s painstaking work for the officers who do it, and also deeply stressful, knowing that somewhere in the pile there might be victims of serious crime they could help, if only they can find them in time.

Is that the sort of task AI can help with? Absolutely. So, with a minimum of fuss, this became one of the first priority areas for the NCA to focus their AI programme. And because time is of the essence in everything they do, they wanted to get to work quickly. For Cerys and her team, it couldn’t come quickly enough.

A tool to make a material difference from day one

In 2020, the team was handed an unprecedented trove of intelligence material from a confidential source. There were thousands of referrals, each one pointing to a case where children might have been harmed, and possibly still be at risk. And the only way to prioritise them was for human experts to methodically go through them one by one, painstakingly noting the key information and cross-referencing them with other sources of intelligence. With the number of cases they’d just been given, it would have taken them literally years to process. They needed it done much, much faster. 

As it happened, Faculty were already in the building. ‘We’d actually been engaged on a different project, to develop a different type of tool,’ says Nijma Khan, who runs Faculty’s Government and Public Sector practice. Then Paul Aspinall, the NCA’s Intelligence Operations Manager - universally known as Asp - came calling.

‘At the time I was responsible for developing innovation,’ Asp explains. ‘I looked at what we had, and what I knew we could do from my experience in data exploitation and intelligence, and I basically presented that to Faculty to say, “This is the challenge. This is what we start off with, and this is what we need to do.”’

‘We spent a few days with their different teams around the country,’ says Nijma, ‘and sat with them and tried to shadow them as much as we could, and walk through their day-to-day processes. We kept asking them the question, “What needs to be true for this to be an easy tool for you to use every day?” And through that process, we created a tool that was easily deployed, and made a material difference from day one.’

Claire backs that up. ‘The magic happens when you put technical people together with people who understand the threat and the business, and you could see that with this group.’ ‘It didn't feel like we were working across multiple departments and agencies,’ Cerys adds. ‘When we all got together we were one team with a shared goal. And that made it a really targeted development process.’

It speaks to the team’s ethos that when it comes to taking the credit, everyone involved is keen to point the finger elsewhere. ‘Asp poured his heart and soul into the project,’ says Cerys. ‘He was constantly keeping things ticking along, and drawing us back to what the NCA tech infrastructure could handle, which of my big dreams were feasible or not. The successes and wins wouldn't have been possible without his constant drive and passion for the project.’

When pushed, Asp admits that it was originally his idea, but is quick to credit Cerys and her colleagues for how it turned out. ‘I didn’t need to be dealing with the intelligence development, the prioritisation side. I could leave that to Cerys, because she’s kind of Champions League level on that. So she did that with Faculty, while I worked with them on the infrastructure and the commercial and legal stuff, all the horrible project stuff that nobody loves.’

And both Cerys and Asp are quick to heap praise on the wider team, including Faculty (Cerys in fact starts listing names that would fill the rest of this chapter). ‘Faculty understood the value of what we were trying to achieve, and that it was clearly a challenge,’ says Asp. ‘And I’ve come to understand that data scientists love the challenge, almost over anything else. They were all about how to solve the problem, rather than selling a product.’


For good measure, Asp made sure the Faculty team fully understood how their work fitted into the NCA’s broader purpose, briefing them as if he was onboarding new officers. ‘They almost had to feel the pain of what the officer is trying to do, before they even got to the subject matter,’ he says. Though thankfully, the Faculty personnel were kept away from having to see any of the actual content. ‘The team did struggle with the work,’ admits Nijma, ‘because it was hard to have those conversations. But then that became almost the motivation to do the work.’

Project 52 becomes VIPER

The tool that Faculty built, originally codenamed Project 52, eventually became known as VIPER: the Volume Intelligence Prioritisation and EnRichment tool. Though the name was suggested by one of Cerys’ advisors, it was apt in more ways than one: the ancient Greek word for viper is ‘aspis’, or asp. And Asp’s brainchild was about to start biting.

The technology consists of a suite of utilities that work together, enriching bulk data to triage cases and provide actionable intelligence. ‘It’s all about building up that investigation picture, building up the evidence base, and identifying where the harm sits,’ explains Nijma. ‘Work out if there’s a potential for harm, extract insights from the data, and use the power of AI to work out where there's potential for harm far quicker than humans can alone.’

‘What's important is that we can garner sufficient information to help us make that assessment,’ Cerys adds. ‘But we need to do that rapidly, because we want to respond quickly. We want to make the correct judgement, and put our resources where they're most needed, so we’re safeguarding the most children from the most egregious harm that we can. And so the goal with VIPER was to obtain critical information rapidly and use that information to inform our assessment of priority.’

‘The mantra for me was always deliver while you develop,’ says Asp. ‘Instead of using dummy data, we’re using real data, and working with real risk. And the benefit of that is that you get real results very, very quickly.’

‘And within a matter of weeks,’ concludes Nijma, ‘we had something that they could use that made a material difference to their day-to-day jobs, but also a material difference to the safety of children around the country.’

Keeping humans in the loop

VIPER works by extracting key data from the referrals, which come in all sorts of different formats. ‘There were a lot of complexities around those different data sources, and some really innovative methods Faculty had to use to solve that,’ says Asp.

Once the data has been extracted, VIPER identifies suspects and then links those people to information from additional data sources which might add context to help the NCA assess the risk they pose. That, too, goes faster with VIPER. Before it was automated, officers would have to make requests to data providers for each case that they were working on, and it might take weeks to come back. With VIPER, those same checks can be compiled at scale, across hundreds of investigations simultaneously, and take a couple of days.

That means officers can do more with less. They can tap these additional data sources for 500 times more referrals than previously, building a richer picture of the risk and providing more timely intelligence. And all those different bits of information give Cerys’ team a sense of how risky the person might be, and how quickly the NCA needs to either investigate them, or pass them on to a local force to go and knock on front doors.

‘You can’t arrest an identity; you have to find a real person behind that identity,’ says Asp, his professorial demeanour belying the steel in his voice. ‘Conversely, you can’t protect an identity, you can only protect people, children. You have to find the real people behind those identities.’

‘And it’s not just about building up that picture and automating the task. It’s also about increasing the accuracy,’ says Nijma. Before VIPER, a lot of information would be manually transcribed and entered into the system, introducing more scope for error, more ways in which connections might be missed - or innocent people incorrectly drawn into the net. The NCA have always had safeguards built in to prevent that, but the automation gave them additional peace of mind, and again made it quicker. VIPER uses fuzzy matching to help spot duplicate ‘entities’, to reduce wasted effort and make sure all the right information is being linked to the right person.
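The chapter doesn’t specify which matching algorithm VIPER uses, but the general idea of fuzzy matching to catch duplicate entities can be sketched with Python’s standard library. The names, threshold and data below are invented purely for illustration:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Score in [0, 1] for how closely two identity strings match,
    ignoring case and surrounding whitespace."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(entities, threshold=0.85):
    """Flag pairs of records that likely refer to the same person,
    so effort isn't duplicated and intelligence is linked correctly."""
    duplicates = []
    for i in range(len(entities)):
        for j in range(i + 1, len(entities)):
            if similarity(entities[i], entities[j]) >= threshold:
                duplicates.append((entities[i], entities[j]))
    return duplicates

names = ["John A. Smith", "john a smith", "Jane Doe"]
print(find_duplicates(names))  # → [('John A. Smith', 'john a smith')]
```

In a real deployment, matching would compare many fields at once (names, online identifiers, addresses), and the threshold would be tuned carefully: set too loose, it risks wrongly merging an innocent person with a suspect; set too tight, it misses connections.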

But there’s still a human in the loop, at critical points throughout the process, and making the final judgement. Given that the original problem was an overwhelming amount of data, simply adding more of it isn’t going to help them.

So as a final step, when all the analysis and enrichment is complete, the software completes a prioritisation assessment (using an academically accredited framework) and highlights the key intelligence in each case to the reviewing officer. This allows them to check whether they agree with the prioritisation a whole lot faster.
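The prioritisation step can be pictured as a simple scoring pass over each case. To be clear, the NCA’s actual framework is academically accredited and not described here; the risk factors, weights and structure below are entirely invented for the sketch:

```python
from dataclasses import dataclass, field

# Illustrative only: these factor names and weights are invented,
# not the accredited framework the NCA actually uses.
WEIGHTS = {"contact_risk": 5, "victim_access": 4, "prior_offences": 3}

@dataclass
class Referral:
    case_id: str
    factors: dict = field(default_factory=dict)  # factor name -> present?

    @property
    def priority_score(self) -> int:
        return sum(WEIGHTS[k] for k, present in self.factors.items() if present)

def triage(referrals):
    """Order referrals highest-priority first for the reviewing officer,
    who makes the final judgement on each case."""
    return sorted(referrals, key=lambda r: r.priority_score, reverse=True)

queue = triage([
    Referral("A1", {"prior_offences": True}),
    Referral("B2", {"contact_risk": True, "victim_access": True}),
])
print([r.case_id for r in queue])  # → ['B2', 'A1']
```

The point of the design is that the software only ranks and surfaces evidence; the officer, not the algorithm, decides what happens next.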

What used to take 45 minutes is now done in four. This reduction in time on a case by case basis means that entire operations can be processed in weeks or months, rather than the years large scale operations or data dumps would take previously. And there’s still more to do.

‘We’re still in the foothills of using AI as an agency,’ says Claire. ‘As we look across the organisation, you can see so many use cases. We can’t just keep throwing people at the problem, the threat is too big. The data is too big and too partial. So we can just be so much more efficient if we use these technologies.’

In addition to her day job (and having recently organised her wedding), Cerys is now pursuing a PhD looking at how child sexual abuse cases are risk-assessed and prioritised. She aims to establish an even more rigorous evidence base for the process, one that addresses limitations in the current research and can be built into later iterations of the VIPER algorithm to support more meaningful prioritisation at scale.

Because it’s an arms race. AI is affecting this dark part of the world just as much as everywhere else. In July 2024, the Internet Watch Foundation revealed that it was encountering so much child sexual abuse imagery generated by AI tools that it had reached a ‘tipping point’ where authorities could no longer tell if an image involved a real child needing help. Ironically, the solution might also be AI.

‘The nature of the challenge is already changing because of generative AI,’ says Nijma. ‘The thing you need to identify at the end of the day is: is there a real child in this picture?’ Her team have already started work on a classifier that can analyse online imagery to identify illegal content relating to child sex abuse. If adopted, it would reduce the human workload needed to take it down by a third. ‘That’s my dream,’ she confides.

But VIPER’s capabilities aren’t limited to tackling child sexual abuse. ‘The tool is basically threat agnostic,’ says Claire. ‘The capability that they built for taking online identifiers, then processing that against other data and knowledge that we have, that is absolutely going to be repeatable across other threat areas. Tools like this will massively help in terms of surfacing risk quickly, and enabling that to be acted on.’

‘Criminality is criminality,’ says Asp. ‘Criminals all generate data. They all leave footprints. The VIPER process would readily apply to absolutely any threat where you want to effectively identify targets from within data that you've been supplied with, or that you hold. And,’ he concludes, ‘we know that people have been targeted, in part or possibly even wholly, because of the work that we’ve done.’ 

‘You do it because you want to safeguard children’

The NCA’s clarity upfront around the priorities that they wanted AI to support - putting AI at the service of their business strategy - meant that Faculty were able to get to work quickly, and have an impact almost immediately.

‘The first time we put something through, start to end, it was just really exciting, because it meant we’d hit that minimum viable product,’ Cerys recalls. ‘We've deployed this operationally through development, and that’s a real strength of the product, to trial these processes on live investigations while we were doing them.’

But although the technology is transformative, ultimately it’s about the people. ‘You don't work in this space on a whim,’ says Cerys. ‘You do it because you love it. You do it because you want to safeguard children. People really care so much about what they do, and are so dedicated, but because we have so much volume, we have to make sure that we have the most impact we can in terms of safeguarding children from harm. This technology helps us to do that a little bit better and make the right decisions.’

Nijma’s been struck by it as well. ‘The high point of the whole engagement has been the ability to work with such passionate people who do such an important job, and make sure that what we’re building delivers for them,’ she enthuses. ‘Knowing the individuals who are using the tool, knowing that you've actually made a difference to their day-to-day, is really valuable. You rarely get that when you're building these kinds of tools for large organisations, but here we were able to get to know the teams, and so you can put a name and a face to the person whose job you're improving.’

Cerys is clear-eyed about the challenges that remain. ‘The sad reality is that we get too many referrals to action every single one of them. So if we can't action every single referral, we damn sure need to make sure that we start from the highest harm and work our way down.’

As for the impact, Nijma points to a plaque that hangs in the front entrance to Faculty’s Old Street offices. On the front is the NCA shield, which features a griffin and a leopard flanking a gold portcullis. The griffin symbolises courage and vigilance; the leopard fierceness and bravery. But if you turn it over, on the back of the plaque where visitors will never see it, is a handwritten message from Asp and Cerys.

‘Thank you for all your hard work. You have helped to safeguard hundreds of children.’

The lesson in summary
Business strategy trumps AI strategy.
  • For a small share of businesses, advances in AI will render their current strategy obsolete. If you are one of those businesses, then it may well be worth reconsidering from first principles how to succeed in a new operating environment.
  • For the vast majority, the challenge is how AI can be used to accelerate you down the path you have already laid out. Rather than trying to come up with a separate AI strategy, you should test how AI can help deliver your existing priorities. If it can’t, then ignore it and focus on the things that can. AI is not a worthwhile investment for every business.
  • The opposite approach, a bottom up exercise, is common. It tends to involve lots of interviews with people from around the business, culminating in a menu of all the possible ways AI could be used, stack ranked against each other. More often than not, this results in AI projects that operate at the margins. They rarely gather enough energy or interest to make a difference.
  • The outcomes of AI programmes and teams should be judged primarily against commercial metrics. You need to make sure that you understand the top level business priority you are targeting, and the cause and effect pathway that allows you to influence it. At no point should the number of use cases delivered ever be mistaken for something important.
  • AI programmes should always invest in properly baselining ex-ante performance, and measuring impact against that. This is often overlooked, making it impossible to build the feedback loops that allow performance to improve over time.