
13 November 2016

The Risk to Civil Liberties of Fighting Crime With Big Data


NOV. 6, 2016

SAN FRANCISCO — People talk about online security as a cat-and-mouse game of good guys and bad guys. It’s true for good old-fashioned crime, too.

Technology, particularly rapid analysis and sharing of data, is helping the police be more efficient and predict possible crimes. Some would argue that it has even contributed to an overall drop in crime in recent years.

But this type of technology also raises issues of civil liberties, as digital information provided by social media or the sensors of the internet of things is combined with criminal data by companies that sell this information to law enforcement agencies.

The American Civil Liberties Union, citing reports that the Chicago Police Department used a computer analysis to create a “heat list” that unfairly associated innocent people with criminal behavior, has warned about the dangers of the police using big data. Even companies that make money doing this sort of work warn that it comes with civil rights risks.

“We’re heading to a world where every trash can has an identifier. Even I get shocked at the comprehensiveness of what data providers sell,” said Courtney Bowman, who leads the privacy and civil liberties practice at Palantir Technologies, a company in Palo Alto, Calif., that sells data analysis tools. He has lectured on the hazards of predictive policing and the need to prove in court that predictive models follow understandable logic and do not reinforce stereotypes.

Some of this shift to data-based policing seems to be a matter of simple automation. The RELX Group, formerly Reed Elsevier, has for some years been buying and building up databases of police information. One product, called Coplogic, is used by 5,000 police departments in the United States.

Coplogic automates filling out accident reports. When a police officer enters a license plate number, many other fields on the report, like the registered address associated with the car, are automatically filled in. The company says this can halve the time an officer spends in traffic filing a report.

Cities can also use the service to identify their most dangerous traffic spots or, in much the way driving maps predict the fastest route home, predict where road repairs are needed.

“This frees up time and resources for higher-value activities, like predictive policing,” said Roy Marler, vice president of Coplogic. “The state can use this data to get federal funding for roadway improvements.”

RELX has become something like the Ticketmaster of insurance reporting. The company processes about 500,000 requests a month for digital accident reports, mostly from insurance companies, charging a $7 “convenience fee” to provide the information. Cities receive a cut of each fee for distributing the reports.

Thomson Reuters and Dun & Bradstreet also do a big business selling data to law enforcement.

In much the way combining different databases has helped people who place online ads gain insight and make predictions, traffic data now provides a window into crime.

“Criminals are citizens, too,” said William Hatfield, a former Secret Service agent working with RELX. “Even with an outstanding warrant, their car is their pride and joy. When they file with an insurance company, they give accurate information about their address that police can use to find them.”

Sharing data, both among the parts of a big police department and between the police and the private sector, “is a force multiplier,” he said.

Companies working with the military and intelligence agencies have long practiced these kinds of techniques, which the companies are bringing to domestic policing, in much the way surplus military gear has beefed up American SWAT teams.

Palantir first built up its business by offering products like maps of social networks of extremist bombers and terrorist money launderers, and figuring out efficient driving routes to avoid improvised explosive devices.

Palantir used similar data-sifting techniques in New Orleans to spot individuals most associated with murders. Law enforcement departments around Salt Lake City used Palantir to allow common access to 40,000 arrest photos, 520,000 case reports and information like highway and airport data — building human maps of suspected criminal networks.

People in the predictive business sometimes compare what they do to controlling the other side’s “OODA loop,” a concept developed by John Boyd, a fighter pilot and military strategist.

OODA stands for “observe, orient, decide, act” and is a means of managing information in battle.

“Whether it’s war or crime, you have to get inside the other side’s decision cycle and control their environment,” said Robert Stasio, a project manager for cyberanalysis at IBM, and a former United States government intelligence official. “Criminals can learn to anticipate what you’re going to do and shift where they’re working, employ more lookouts.”

IBM sells tools that also enable police to become less predictable, for example, by taking different routes into an area identified as a crime hot spot. It has also conducted studies that show changing tastes among online criminals — for example, a move from hacking retailers’ computers to stealing health care data, which can be used to file for federal tax refunds.

But there are worries about what military-type data analysis means for civil liberties, even among the companies that get rich on it.

“It definitely presents challenges to the less sophisticated type of criminal, but it’s creating a lot of what is called ‘Big Brother’s little helpers,’” Mr. Bowman said. For now, he added, much of the data abundance problem is that “most police aren’t very good at this.”
