
22 January 2019

The Week in Tech: How Google and Facebook Spawned Surveillance Capitalism

By Natasha Singer

Each week, technology reporters and columnists from The New York Times review the week’s news, offering analysis and maybe a joke or two about the most important developments in the tech industry. Want this newsletter in your inbox? Sign up here.

Greetings, I’m Natasha Singer, your resident privacy reporter. And I’m writing to you from wintry New York City as the government shutdown increases financial pressure on federal workers and the tech elites jet off to Davos, Switzerland, to hobnob at the World Economic Forum.

For the last few years, the forum has been heralding the “Fourth Industrial Revolution.” That’s the idea that today’s digital innovations are generating entirely new industries — in much the way electricity enabled the mass production of the Model T Ford in the early 20th century.

But a provocative new book, “The Age of Surveillance Capitalism,” by Shoshana Zuboff, a professor emerita at the Harvard Business School, offers a more sobering counternarrative.

Published on Tuesday, the book argues that digital services developed by the likes of Google and Facebook should not be viewed as the latest iteration of industrialization. Instead, Dr. Zuboff writes, they represent a new and problematic market form that trades in predicting and influencing human behavior.

“Surveillance capitalism has taken human experience, specifically private human experience, and unilaterally claimed it as something to be bought and sold in the marketplace,” Dr. Zuboff told me during a visit to The Times’s office. “This new kind of marketplace trades in behavioral futures. It’s like a form of derivative. But it’s about us.”

Yet most of us are not aware that platforms like Google and Facebook may track and analyze our every search, location, like, video, photo, post and punctuation mark, the better to try to sway us, she said.

In fact, a new study on Facebook from the Pew Research Center illustrates how opaque this behavior marketplace can be to consumers.

The study, my colleague Sapna Maheshwari writes, reported that about three-fourths of Facebook users were unaware that the social network maintained lists of their personal interests, such as their political leanings, for advertisers. And about half of users who looked at their “ad preferences” — the Facebook pages displaying these details — said they were uncomfortable with the company’s creating lists of categories about them.

The technologies that power the behavior speculation market, of course, have spread far beyond online ads.

They enable auto insurers to surveil drivers and offer discounts based on their driving performance. They allow workplace wellness programs to charge higher health insurance premiums to employees who decline to wear fitness trackers. They helped Kremlin-linked groups mount political influence campaigns on Facebook (although, as my colleague John Herrman pointed out this past week, we have yet to learn how effective those campaigns were).

The flash-trading in human behavioral data was not inevitable.

In her book, Dr. Zuboff describes how Google, in its early days, used the keywords that people typed in to improve its search engine, even as it paid scant attention to the collateral data — like users’ keyword phrasing, click patterns and spellings — that came with it. Pretty soon, however, Google began harvesting this surplus information, along with other details like users’ web-browsing activities, to infer their interests and target them with ads.

The model was later adopted by Facebook.

The companies’ pivot — from serving to surveilling their users — pushed Google and Facebook to harvest more and more data, Dr. Zuboff writes. In doing so, the companies sometimes bypassed privacy settings or made it difficult for users to opt out of data-sharing.

“We saw these digital services were free, and we thought, you know, ‘We’re making a reasonable trade-off with giving them valuable data,’” Dr. Zuboff told me. “But now that’s reversed. They’ve decided that we’re free, that they can take our experience for free and translate it into behavioral data. And so we are just the source of raw material.”

Of course, tech companies tend to bristle at the word “surveillance.” They associate it with government spying on individuals — not with their own snooping on users and trying to sway them at scale.

“When organizations do surveillance, people don’t have control over that,” Mark Zuckerberg, Facebook’s chief, said in April during a Senate hearing on Cambridge Analytica, the voter-profiling company that improperly harvested the data of millions of Facebook users. “But on Facebook, everything that you share, you have control over.”

Surveillance, however, simply means observation or supervision, often with the intent of channeling the surveilled in a particular direction. As Dr. Zuboff’s book points out, that is at the core of Facebook’s panopticon of a business model.

In other news:

■ Microsoft pledged nearly $500 million in loans to help build affordable housing in the Seattle area. The money is “the most ambitious effort by a tech company to directly address the inequality that has spread” in its backyard, Karen Weise writes.

■ Facebook is facing continued pressure to more aggressively counter the spread of divisive disinformation and user manipulation on its platform, Adam Satariano writes. The company said Thursday that it had identified two new Russian-linked misinformation campaigns. “Up to 1,200 people expressed interest in attending one of the roughly 190 events organized by those behind the fake pages,” the article notes. “Facebook couldn’t say whether any of the events had taken place.”

■ Legislators must find a new way to regulate social media, Natasha Tusikov, a criminology professor, and Blayne Haggart, a political-science professor, argue in The Conversation. One suggestion, they write: “It’s time to consider noncommercial ownership of social-media entities — including nonprofit or some form of public ownership.”

■ And finally, how might countries grapple with a global labor market in which there are few protections for many digital workers? Read the opinion piece in the New Statesman by Mark Graham, a professor at the Oxford Internet Institute.
