Why we’ve chosen to work in defence

Below, our CEO and co-founder, Dr Marc Warner, explains how our defence business unit started, why we chose to work in the sector, and how we ensure all our work is safe and ethical.

2025-01-21 | Defence
Dr Marc Warner
Chief Executive Officer & Co-Founder

Like many who come from academia, I once viewed defence work with deep scepticism. The industry seemed distant from my values, perhaps even at odds with them. This perspective wasn’t unique – it reflected a broader sentiment in both academic and technology circles, where defence work often carries a stigma.

A turning point came in 2018, when I was invited to a conference held aboard a British aircraft carrier in New York Harbour. The deck was filled with people who knew a thing or two about where the defence industry was heading.

Onboard, in a safe space, the defence personnel opened up. They explained how the West had been able to protect itself against invasion and war thanks to a substantial technological edge. However, they made it clear this edge was shrinking, largely because too few innovative companies were willing to work in defence.

Straight away the realisation set in. If every company turned its back, the defence industry wouldn’t have the support it needed to provide the best protection. So when I stepped off the deck, I knew I needed to learn more. I immersed myself in reading and discussions to see whether human-centric AI could help transform the industry.

Recognising the need for action

Through books and conversations, I came to believe the situation outlined to me on the aircraft carrier was true. The period since the end of World War II in 1945 has been the most peaceful in the history of the world. If that sounds surprising, read The Better Angels of Our Nature by Steven Pinker. The West, particularly America, enforced a Pax Americana – also called the “Long Peace” – via technological superiority. And while the execution was very imperfect, the aspiration was to spread ‘universal human rights’ widely.

The phrase “universal human rights” obviously gives the sense of boundless application: all people, at all times. But the reality is very different. I learned that human rights only really emerged as a concept in the 17th century, and were only codified in their modern form after 1945. Their full adoption was limited to a select group of countries – primarily Western democracies – and it was this same group that took on the responsibility of defending them. So while universal human rights was great branding, and the right aspiration, a more accurate description would be “Western democratic rights”, with all the associated connotations.

Once I learned this, I was in a tricky situation. In my mind, human rights were important, yet they were defended by a set of countries whose ability to do so was diminishing – in part because companies like ours were choosing not to help.

Wrestling with the complexity of defence and a duty to safeguard

The shortage of technology companies wanting to step in isn’t unreasonable. Defence is morally complex, since the world isn’t neatly divided into goodies and baddies. And at the time, the strong anti-defence sentiment in tech and academia was a challenge, because these were exactly the people defence companies wanted to work with (although, without being unfair to the people I spoke to, I think much of that sentiment was quite reactionary: when I asked people why, they would readily admit they had read or thought about it very little).

I also knew that the procurement systems were famously difficult, and there were many horror stories of companies trying and failing to help. Overall, there was little expectation that this would ever form a significant part of Faculty’s business.

So, what should we do? On the one hand, it felt like a moral obligation. How could we live in a society clearly protected by the defence establishment, enjoying rights that rested on its efforts and lives, while actively avoiding helping? On the other, it felt bad for business and worse for recruitment.

After wrestling with it, we took the issue to a leadership meeting. In our old office on Welbeck Street, where the basement room was dominated by a massive chalkboard that stretched across an entire wall, we debated the pros and cons. As the hours ticked by, we came to a vote. I expected it to be close. But, in fact, it was unanimous. Everyone agreed we had a moral duty to step up and do the work. So we started reaching out to the UK government to see whether we could help. The answer was yes. 

Ethics: the backbone of our decision-making 

Before we even considered working in the defence industry, we had already laid the groundwork to ensure it would be a thoughtful and responsible fit.

To help us ensure the deployment of safe, ethical and explainable AI, we had already built an ethics panel and a framework around it. Whenever we consider taking on a project, we verify its ethical basis. This framework helped us navigate the complex, high-stakes challenges of the defence industry, and it ensures Faculty, as a whole, has ownership of its ethical choices. After many years of application, it has blocked multiple projects that didn’t meet our standards – some in defence, some outside it.

We also prioritised creating an environment where individuals have the freedom to choose which projects they contribute to, so no one on our team ever has to work on something that makes them uncomfortable.

These foundations allowed us to approach defence work with confidence, knowing we were prepared to navigate its complexities responsibly.

Navigating the risks and criticisms

We knew that this work was going to be done in the real world. We also knew that no institution, anywhere, ever, has an unblemished record. Defence organisations are no different. They will make mistakes, or employ people who make mistakes, and those mistakes have much bigger consequences. We don’t take this lightly. Whenever mistakes happen, we seek to understand them and to ensure processes are in place that stop them happening again.

Given all this, we knew we’d get criticised (we did). We thought we might lose team members (we didn’t – at least, we were never told that this was the primary reason). We knew that we would face hard choices, ambiguous information, and organisational complexity (we have). But we also feel we’re doing the right thing, and that makes the difficulties surmountable.

Seeing the bigger picture: protecting the water we swim in

A couple of years later, there was a key shift in the wider world. Putin invaded Ukraine, and with tanks rolling across Europe, many more people understood that this work was important. They could see that the ‘end of history’ consensus of the early 2000s was breaking: the idea that every country was progressing nicely towards peaceful democracy was much less obvious. Defence became much more fashionable, although the realities of working in it remained very similar.

Since then, Faculty has done some great work in the defence space, as well as in policing and national security. We’re proud of it, while genuinely recognising the obligations that it brings and the fact that others might disagree. 

We completely recognise that our experience isn’t everyone’s experience, and this doesn’t represent everyone’s beliefs. Mostly, this is just my story of the way my thinking developed. And hopefully, the focus on individual freedom at Faculty means that everyone can make their own decisions.

But if I were trying to convince you, the late David Foster Wallace’s commencement speech This Is Water springs to mind. An old fish swims past two younger fish and says, “How’s the water?” The two young fish swim on for a bit, and eventually one of them looks over at the other and asks, “What the hell is water?” Our most important rights and values are so embedded in our lives that we scarcely notice them – they are the water we swim in. To me, that seems worth defending.

At Faculty, we give our customers the confidence to deploy AI in the most high-impact operational environments. If you’d like to learn more, get in touch with our team.