August 14, 2014
ARLINGTON: “Big data” is big business nowadays. Defense contractor Lockheed Martin, for example, boasts that its analytical tools have successfully predicted everything from Arab Spring uprisings to the onset of sepsis in hospital patients.
But big data can also go wrong in big ways. Set a powerful program loose on a large enough data set and it can come up with spectacularly specious correlations that have nothing to do with cause and effect. More people tend to drown in swimming pools, for example, in years when Nicolas Cage appears in multiple movies. That example is easily caught by common sense; far more dangerous are correlations that look plausible to policymakers while still being wrong.
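For a concrete sense of how such specious correlations arise, here is a minimal sketch in Python. The yearly figures are invented for illustration, not actual film counts or drowning statistics; the point is simply that two unrelated series can still produce a strong correlation coefficient.

```python
# Illustrative only: two unrelated series can still show a strong Pearson
# correlation. The yearly figures below are invented, not real statistics.
import numpy as np

cage_films = np.array([2, 2, 2, 3, 1, 1, 2, 3, 4, 1, 4])
pool_drownings = np.array([109, 102, 102, 98, 85, 95, 96, 98, 123, 94, 110])

r = np.corrcoef(cage_films, pool_drownings)[0, 1]
print(f"Pearson r = {r:.2f}")  # a high r says nothing about cause and effect
```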
“It’s like baseball statistics,” said Jason O’Connor, vice president of analysis at Lockheed’s Defense & Intelligence Solutions (DIS) division. Number-crunching fans can come up with fallacies like “It’s a sunny day in July, a left-hander, on an AstroTurf field, [so] he’s going to hit a double.”
“We hear a lot about big data,” O’Connor told reporters yesterday. “I tend to think about it slightly differently[:] data, algorithms, and tradecraft. All three pieces are critical.”
Data refers to the quality and quantity of the information itself. “Garbage in, garbage out” still applies, but O’Connor said one advantage of big data is the ability to draw on so many different sources that their errors cancel each other out, letting the underlying patterns show through. “The data doesn’t have to be as good if it’s big,” he said.
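As a rough illustration of that claim, the sketch below (with synthetic numbers, not Lockheed’s data) averages many noisy, independent “sources” measuring the same underlying signal; the pooled estimate lands far closer to the truth than any single source does.

```python
# Minimal sketch of the "errors cancel out" idea: averaging many noisy,
# independent measurements recovers the underlying pattern better than
# any single source does. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 2 * np.pi, 100))           # the underlying pattern
sources = truth + rng.normal(0.0, 1.0, size=(50, 100))   # 50 noisy "sources"

single_error = np.abs(sources[0] - truth).mean()
pooled_error = np.abs(sources.mean(axis=0) - truth).mean()
print(f"one source error: {single_error:.2f}, averaged error: {pooled_error:.2f}")
```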
Algorithms refers to the quality of the software that analyzes the data. (Lockheed’s marquee product is immodestly called “LM Wisdom,” which analyzes social media and other online “open source” data.) The quality of the software, however, ultimately depends on the human beings writing and testing it.
Finally, tradecraft refers to the old-fashioned intelligence analysis skills of the human beings using the software. No matter how beautifully the algorithms are written, you can’t just dump the output on a decision-maker’s desk. It takes expertise and experience to make sense of big-data findings, just as with any other source of intelligence. The goal of LM Wisdom and related tools is “to enable the analyst,” not to replace him or her.
“We’re not suggesting that the human be out of the loop,” O’Connor said. “We’re not suggesting that an algorithm is the ultimate outcome. It is an input to the tradecraft, and the tradecraft is the human being.”
The real value of the algorithm is to “scour endless data” that would drive human beings blind or mad with boredom – from social media feeds to satellite imagery – and highlight what might be worth further investigation, either by the analyst or by additional intelligence collection resources.
Lockheed is naturally leery of discussing just what they do for the intelligence community. They do claim one version of their software had a 100 percent success rate in predicting which Arab governments would avoid uprisings, which would make concessions, and which would be overthrown – although they only ran this analysis once the Arab Spring was underway, and only as an internal test of the software rather than on behalf of any intelligence agency.
When it comes to real-world applications, Lockheed cybersecurity teams use a version of LM Wisdom to track so-called insider threats – disgruntled or traitorous employees – at Lockheed and client companies. (It’s unclear how this software handles employees’ privacy concerns.) Lockheed also adapted counterterrorist techniques to help an unnamed “large pharmaceutical firm” track down the criminal ring counterfeiting its products, identifying key players and tracing money-laundering flows that the client then turned over to US authorities. And Lockheed uses automated analytics for the Missile Defense Agency’s Command, Control, Battle Management and Communications (C2BMC) system, which must correlate inputs from widely scattered sensors of different types: infrared satellites detect the flame of a launch, ground-based and ship-based radars track the missile, and commanders must quickly make sense of the data and decide whether to expend some of their limited supply of interceptors – or, in the future, fire a laser or rail gun.
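Lockheed has not published how C2BMC’s correlation logic actually works, but the general idea of fusing reports from scattered sensors can be sketched as follows. The field names, thresholds, and greedy grouping here are assumptions chosen purely for illustration.

```python
# A hedged sketch of sensor fusion: reports from different sensor types are
# associated into a single track when they agree in time and position.
# Field names and thresholds are invented, not C2BMC's actual design.
from dataclasses import dataclass

@dataclass
class Report:
    sensor: str   # e.g. "IR satellite", "ship radar"
    t: float      # seconds since first detection
    lat: float
    lon: float

def associate(reports: list[Report], max_dt: float = 5.0, max_deg: float = 0.5) -> list[list[Report]]:
    """Greedily group reports that are close in time and position."""
    tracks: list[list[Report]] = []
    for r in sorted(reports, key=lambda rep: rep.t):
        for track in tracks:
            last = track[-1]
            if (abs(r.t - last.t) <= max_dt
                    and abs(r.lat - last.lat) <= max_deg
                    and abs(r.lon - last.lon) <= max_deg):
                track.append(r)
                break
        else:
            tracks.append([r])
    return tracks

fused = associate([
    Report("IR satellite", 0.0, 39.0, 125.7),
    Report("ship radar", 3.0, 39.1, 125.9),
    Report("ground radar", 12.0, 40.2, 127.5),
])
print(f"{len(fused)} fused track(s) from {sum(len(t) for t in fused)} reports")
```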
Remarkably, Lockheed adapted those missile defense algorithms for medical purposes: by looking for patterns in temperature, blood pressure, white blood cell count, and other vital signs, O’Connor said, they were able to warn doctors that a patient was going septic 14 hours earlier than traditional methods could. That’s not to say that doctors and nurses can’t do the job the old-fashioned way, nor that the algorithms can somehow replace them, but that the humans and the software together work better and faster than either alone.
“That 14 hours can be life-saving,” O’Connor said. That’s the goal for the national security applications of the software as well.
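Lockheed has not described its sepsis algorithm in any detail, so the following is only a toy sketch of the general idea: software scoring vital-sign patterns (here loosely modeled on the familiar SIRS criteria, which is an assumption on my part) and flagging a worrying trend for a clinician to review.

```python
# Toy illustration of pattern-spotting in vital signs. This is NOT
# Lockheed's algorithm, just a sketch of software flagging a trend
# for a human to review, using SIRS-style thresholds as a stand-in.
from dataclasses import dataclass

@dataclass
class Vitals:
    temp_c: float        # body temperature, Celsius
    heart_rate: int      # beats per minute
    resp_rate: int       # breaths per minute
    wbc_k_per_ul: float  # white blood cell count, thousands per microliter

def sirs_flags(v: Vitals) -> int:
    """Count how many SIRS-style criteria a reading meets."""
    flags = 0
    flags += v.temp_c > 38.0 or v.temp_c < 36.0
    flags += v.heart_rate > 90
    flags += v.resp_rate > 20
    flags += v.wbc_k_per_ul > 12.0 or v.wbc_k_per_ul < 4.0
    return flags

# Hypothetical hourly readings for one patient
readings = [
    Vitals(37.1, 82, 16, 8.5),
    Vitals(37.9, 94, 19, 9.8),
    Vitals(38.4, 102, 24, 13.1),
]
for hour, v in enumerate(readings):
    if sirs_flags(v) >= 2:
        print(f"hour {hour}: possible early sepsis -- alert clinician")
```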