ON THE SCREEN in front of me is a mountain range. Moving toward my troops from the top-right corner is an ominous yellow dot. I suspect it’s an enemy drone, but it could be a bird or a civilian aircraft, so I ask my long-range camera to home in on it. Within seconds, it returns a snapshot of a wide-winged military drone. The incoming dot turns from yellow to red, signifying a threat.
This might sound like a video game, but it’s not. This is a technology designed to be used by real militaries. And it is the first time defense-tech company Helsing AI has shown a journalist what the software it is selling actually looks like. Helsing’s flagship system absorbs huge amounts of data generated by the sensors (electro-optical, infrared, sonar) and weapons systems (fighter jets, drones, helicopters) used in modern warfare. Algorithms then distill that information into a video-game-style visualization to show how events are unfolding in real time on the battlefield. What I’m looking at is a simulation of what I would see if I worked for a military that used Helsing’s system.
Torsten Reil, 49, is one of the company’s two CEOs. With a background in gaming—he previously founded development studio NaturalMotion—Reil is preoccupied with the user experience and making the platform intuitive for its military clientele. Then there’s co-CEO Gundbert Scherf, 41, a former special adviser to Germany’s ministry of defense, who talks fluently about how European militaries work and what he feels they need to do to modernize. And finally there’s the in-house AI expert and chief product officer, 31-year-old Niklas Köhler, the youngest of the three. Köhler was using machine learning to solve medical problems when he started to be approached by figures in the defense sector—prompting him to change direction. “Applications like detecting drones are, in terms of method, not so dissimilar from how you would find cancer in large CT scans,” he says.
In modern warfare, every second counts. And the Helsing founders say their software can give Western militaries an information edge. Its system, they claim, will help soldiers make faster, better-informed decisions and will be accessible on a variety of devices, so soldiers in frontline trenches can see the same information as commanders in control centers. “Now, all of this is done manually: phone calls, reading things, drawing stuff on maps,” says Köhler. “Understanding how many systems are there, what they are doing, what is their intent—this is an AI problem.”
Helsing is not the first company to try to build an operating system for war. Military types have been advocating for the idea since the 1990s. But traditional defense firms have struggled to deliver, creating an opportunity for tech companies to step in.
California-based Anduril, the company launched by Oculus cofounder Palmer Luckey, has developed software that connects multiple military systems. And Palantir, headquartered in Colorado, has been using the war in Ukraine to release details about its own military AI. But Helsing is the most visible European startup making this type of software. Experts say what’s notable about the company is the way it maps the electromagnetic spectrum, the invisible space where different machines send electronic signals to one another to communicate.
“Using artificial intelligence applications to help analyze what part of the electromagnetic spectrum frequency to use, and what part to jam, is an extremely powerful technique,” says Ben Jensen, a senior fellow for future war at the Center for Strategic and International Studies, a think tank.
That Helsing was willing to show me its technology at all reflects how much attitudes around the defense industry in Europe have changed over the company’s two-year existence. When Spotify founder Daniel Ek invested €100 million ($109 million) into Helsing in 2021, the same year the company launched, many Spotify users expressed outrage that their subscription dollars were being used to fund the arms industry. But that was before Russia’s full-scale invasion of Ukraine in February 2022, which has prompted a recalibration of public attitudes toward defense.
Today, billions of dollars in military aid is flowing from European countries into Ukraine. Military spending on the continent jumped to “Cold War levels” in 2022, marking the sharpest yearly increase in 30 years, according to the Stockholm International Peace Research Institute. Venture capital is following suit, with funding for UK defense tech doubling from 2021 to 2022—from €500 million ($550 million) to €1 billion—according to PitchBook data shared with WIRED.
“It would have been easy to be discouraged at the beginning,” says Reil. Before the war, people told him it would be a struggle to raise money for defense tech and that he definitely wouldn’t be able to hire the talent he needed—the best engineers won’t work in defense. “That didn't turn out to be the case at all.”
Helsing now has 220 employees spread across four offices: in London, Paris, Berlin, and Munich. That doesn’t mean it has been easy to hire engineers. Köhler describes an interview process that sounds more like a debate about the motivations of the defense industry: “Most people are slightly biased against the sector, particularly before Ukraine.” Köhler gets it. He used to be skeptical himself, he says, an attitude he attributes partly to living in Germany, where the military prompts uncomfortable associations with the country’s Nazi past. “In Germany, public opinion is even worse than in the UK.” If attitudes to the defense industry have changed in Europe, in Germany they have transformed. For years, the country resisted US pressure to increase defense spending. Four months after the Russian invasion, the German parliament approved a €100 billion military revamp.
Germany is just one of the countries where Helsing says it holds military contracts, along with France and the UK. And since the Russian invasion of Ukraine, the company has also signed partnership deals with some of Europe’s most established defense contractors, such as Rheinmetall and Saab, to integrate AI into existing weapons systems. Reil describes Helsing as “quite involved” in the war in Ukraine, although he declines to share details about what the company is doing there, saying only that staff regularly travel back and forth. When WIRED asked a Ukrainian government official—who asked not to be named because they are not authorized to speak publicly—to confirm Helsing is operating in Ukraine, they said they had been made aware of the company’s plans to “engage” in the country.
Helsing’s corporate slogan is “AI to serve democracies.” This is meant to illustrate the company’s pledge to never sell to autocratic governments such as those of Russia and North Korea, says Reil. But what counts as a democracy? When I ask whether the founders would sell to countries like Poland or Hungary—countries within the EU where governments have stripped judges of their independence and cracked down on LGBTQ rights—I do not get an answer.
Instead, that slogan says less about what the company does and more about why it’s doing it. Helsing’s job adverts brim with idealism, calling for people with a conviction that “democratic values are worth protecting.”
Helsing’s three founders speak about Russia’s invasion of Crimea in 2014 as a wake-up call that the whole of Europe needed to be ready to respond to Russian aggression. “I became increasingly concerned that we are falling behind the key technologies in our open societies,” Reil says. That feeling grew in 2018, as he watched Google employees protest against a deal with the Pentagon under which Google would have helped the military use AI to analyze drone footage. More than 4,000 staff signed a letter arguing it was morally and ethically irresponsible for Google to aid military surveillance and its potentially lethal outcomes. In response, Google said it wouldn’t renew the contract.
“I just didn't understand the logic of it,” Reil says. “If we want to live in open and free societies, be who we want to be and say what we want to say, we need to be able to protect them. We can't take them for granted.” He worried that if Big Tech, with all its resources, were dissuaded from working with the defense industry, then the West would inevitably fall behind. “I felt like if they're not doing it, if the best Google engineers are not prepared to work on this, who is?”
It’s usually hard to tell if defense products work the way their creators say they do. Companies selling them, Helsing included, claim it would compromise their tools’ effectiveness to be transparent about the details. But as we talk, the founders try to project an image of what makes Helsing’s AI compatible with the democratic regimes it wants to sell to. “We really, really value privacy and freedom a lot, and we would never do things like face recognition,” says Scherf, claiming that the company wants to help militaries recognize objects, not people. “There's certain things that are not necessary for the defense mission.”
But creeping automation in a deadly industry like defense still raises thorny issues. If all Helsing’s systems offer is increased battlefield awareness that helps militaries understand where targets are, that doesn’t pose any problems, says Herbert Lin, a senior research scholar at Stanford University’s Center for International Security and Cooperation. But once this system is in place, he believes, decisionmakers will come under pressure to connect it with autonomous weapons. “Policymakers have to resist the idea of doing that,” Lin says, adding that humans, not machines, need to be accountable when mistakes happen. If AI “kills a tractor rather than a truck or a tank, that's bad. Who's going to be held responsible for that?”
Reil insists that Helsing does not make autonomous weapons. “We make the opposite,” he says. “We make AI systems that help humans better understand the situation.”
Although operators can use Helsing’s platform to take down a drone, right now it is a human that makes that decision, not the AI. But there are questions about how much autonomy humans really have when they work closely with machines. “The less you make users understand the tools they're working with, they treat them like magic,” says Jensen of the Center for Strategic and International Studies, claiming this means military users can either trust AI too much or too little.
Helsing has been testing ways to address such concerns. “Everybody talks about the human in the loop,” Reil says, referring to the term used to describe systems that ensure the final decision is made by people, not machines. But Helsing tries to make sure those humans are not just “in the loop” but also engaged—not distracted or overwhelmed by information, he adds. The company does that, he says, by building pauses into the system so the operator doesn’t just respond to prompts as if on autopilot. Instead, Helsing’s system creates a moment for operators to reflect on the information they have been given and asks them: Are you sure you want to follow the AI’s recommendation?
Helsing’s success so far—measured in its ability to secure contracts and hire staff—is entwined with a change in mood across Europe. And for now, the Silicon Valley idealism it is introducing into the continent’s defense sector seems to be working. When the company launched, Reil says, he was braced for the fact that a lot of people wouldn’t agree with what he was trying to do. But even he has been surprised by how much attitudes have changed. “The pendulum has swung almost entirely the other way,” he says.
This article has been updated to clarify that both Torsten Reil and Gundbert Scherf are now co-CEOs.