17 September 2015
In the fight against terrorism, defence and security agencies are turning to behaviour prediction software - but just how far can this technology go and will it ever be able to truly predict the unpredictable?
The use of prediction software as a security tool has been labelled 'predictive policing'. This form of counterterrorism is increasingly widespread, with US-based Intelligent Software Solutions' (ISS) behaviour analysis tool used in more than 40 countries to help determine where the next terror attack might take place.
Known as Dfuze, the software was used to investigate the bombings in Boston in April 2013, and also at the London 2012 Olympics - where UK police forces increased the security presence at areas that the software indicated could be at risk of attack.
Dfuze works by analysing past attacks to give an idea of where attack hotspots might be in the future, as well as the types of explosive devices that might be used and how they might be deployed.
"A lot of it is based on previous MOs, on historic data," says Neil Fretwell, operations director and subject matter expert at ISS Global, a subsidiary of ISS that works with military, counter-improvised explosive devices (IED), public safety and law enforcement agencies.
"In the past a lot of that work was done by a team of analysts in an office trawling through reams and reams of data, initially on paper but then on computer systems.
"If you bolt on a computer algorithm that can actually analyse that data for you, based on a number of parameters that you give it, it will break it down into the percentages and likelihood of something happening in a certain place at a certain time."
In 2014, it was announced that the UK's National Counter Terrorism Network would use the Dfuze intelligence management tool, allowing its counterterrorism units to store and maintain data on incidents involving IEDs, firearms, criminal gangs and terrorism.
This data is shown in a centralised view, allowing experts to work through large volumes of critical information and use analytical tools to help predict future trends and patterns.
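As a rough illustration of that kind of centralised store and the simple trend queries run against it, consider the Python sketch below. The schema, categories and records are assumptions made for this article, not the actual Dfuze data model.

    # Minimal sketch of a centralised incident store and one trend query.
    # The schema and data are hypothetical.
    from dataclasses import dataclass
    from collections import Counter

    @dataclass
    class Incident:
        year: int
        category: str   # e.g. "IED", "firearms", "criminal gang"
        region: str

    store = [
        Incident(2012, "IED", "north"),
        Incident(2013, "IED", "north"),
        Incident(2013, "firearms", "south"),
        Incident(2014, "IED", "north"),
    ]

    # One simple analytical tool: count IED incidents per year to expose a trend.
    trend = Counter(i.year for i in store if i.category == "IED")
    for year in sorted(trend):
        print(year, trend[year])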
'The wave of the future'
"[It's] about pulling together big data [and] lots of different data sources to give you the best chance of getting your best guess of where something is going to happen and when," adds Fretwell.
This path to a "best guess" or 'predictive policing' has been hailed by New York City Police Commissioner William Bratton as "the wave of the future". He also declared that "the Minority Report of 2002 is the reality of today" - a reference to the film where psychics predict future crimes.
While this candid and somewhat optimistic view may not be shared by everyone, it is certainly true that other organisations are seeing the benefit and potential of tapping into predictive analysis.
Tech giant Microsoft announced in January that it had partnered with a number of cities to develop the Domain Awareness System, which analyses public safety and open source data. The Pacific Northwest National Laboratory has also developed its own form of predictive analysis.
Another solution is PredictifyMe, launched in 2014, which, in partnership with the UN, is using data to help protect schools in Pakistan, Nigeria and Lebanon.
"[We] have the largest data set on earth when it comes to suicide bombings"
PredictifyMe co-founder Dr Zeeshan-ul-Hassan Usmani was quoted by CNN Money as saying: "[We] have the largest data set on earth when it comes to suicide bombings," while Rob Burns, the company's CEO and co-founder, told the same publication: "We're sitting here with technology that's easy to deploy and can help predict an attack and secure schools against it."
It works by analysing 200 indicators, such as the weather, sporting events, holidays, attacks in nearby countries, and the release of videos on social media platforms, to predict the likelihood of a suicide bombing attack. According to Usmani, it can predict an attack within three days with 72% accuracy.
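One common way to turn a set of indicators into a likelihood is a weighted score passed through a logistic function. The Python sketch below shows that general idea only; PredictifyMe's actual model, its roughly 200 indicators and their weights are not public, and the three indicators and numbers here are invented.

    # Hypothetical indicator scoring: not PredictifyMe's model, just the
    # general idea of combining weighted indicators into a likelihood.
    import math

    def attack_likelihood(indicators, weights, bias=-2.0):
        """Combine weighted indicator values into a probability via a logistic curve."""
        score = bias + sum(weights[name] * value for name, value in indicators.items())
        return 1 / (1 + math.exp(-score))

    weights = {"public_holiday": 1.2, "recent_attack_nearby": 2.0, "new_propaganda_video": 1.5}
    today = {"public_holiday": 1.0, "recent_attack_nearby": 0.0, "new_propaganda_video": 1.0}

    print(f"Estimated likelihood: {attack_likelihood(today, weights):.0%}")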
Problem: bad data in equals bad data out
There is clearly a need for additional tools to combat terrorism as agencies try to counter the threat of groups such as Islamic State (IS), which are highly active on social media.
But Fretwell, who previously worked at Scotland Yard's Bomb Data Centre, says it's important to throw in a pinch of realism.
"I am a big believer in predictive software but only if you have somebody with experience - once the software has predicted it - to give it a critical eye," he explains.
"The limitation is you are relying on data. If it's bad data in it could be bad data out, so you have to be very sure of your data source.
This view is shared by Margaret Gilmore, a senior associate fellow at the Royal United Services Institute for Defence and Security Studies, who adds that "software may not see that one tiny different clue that someone thinking laterally about what they are looking at might spot".
"Also there is a limit to what technology can learn about terrorist groups. Infiltration of a terrorist organisation, as happened eventually with the IRA, can undermine everything a terrorist group is planning.
"Nothing beats human intelligence and human instinct."
Fretwell illustrates the importance of this with an encounter from his time working in a live control room.
"Somebody sent in an image of something that had been found in a search. Before I knew it or had a chance to look at it, somebody had dispatched an explosives officer to the scene.
"When I looked at it, it was nothing of the sort, but only because I had that experience, where upon the first person didn't.
"It [prediction] can take you so far but it still needs a human decision at the end of it, to [decide] whether or not [the] prediction is likely to happen or not."
An 'increasingly sophisticated' cyber battle
As mentioned earlier, groups such as IS have made much headway on social media platforms, using instantaneous communication to propagate their images, videos, and ultimately terror. Last year, it was reported that IS was planning to create a 'cyber caliphate' using its own encryption software.
In addition, a Sky News investigation in August uncovered the sheer scale of the cyber operation when a freelance journalist and Sky created two fictional online characters on Twitter and in chatrooms. They soon received 'terror guidebooks' - including information on how to raise funds and make weapons - as well as millions of other messages.
"I believe advanced technologies must have a role to play in counterterrorism efforts - the terrorists are increasingly using the internet to recruit, plan attacks, communicate and spread propaganda," says Gilmore.
"Some of this is done openly [and] some is happening via increasingly sophisticated encrypted methods and our intelligence and police agencies need to keep across all this. They need to keep ahead of the game technologically against a terrorist enemy who is trying to do the same and has considerable expertise in this field."
"I believe advanced technologies must have a role to play in counterterrorism efforts - the terrorists are increasingly using the internet to recruit, plan attacks, communicate and spread propaganda."
Unsurprisingly, ISS' prediction software is tapping into social media, looking for trends and attempting to anticipate what will happen next.
And, with more and more avenues of data to explore, Fretwell envisages an ever-expanding role for predictive software.
"There are so many issues going on worldwide that you don't have enough manpower to do it without the aid of prediction software.
"[It is seen] as a very good tool to sort out the wheat from the chaff and leave the nuggets at the end, which they will then look at with a critical eye."
However, Fretwell urges caution about its future, saying he is unaware of anyone who sees predictive software as the "all singing all dancing [solution]" that will be able to predict to the "nth degree" what will happen.
"I don't think prediction software will ever make decisions for you; there will always be a critical eye at the end of it. I could be proved wrong, but I would be surprised," he says.
"I don't think [it] can predict the unpredictable. It can predict what may happen, within a reasonable timeframe."