25 June 2019

Who Can Stop Facebook? Limiting the Power of Social Media


Venture capitalist Roger McNamee was an early investor in Facebook and claims he mentored founder Mark Zuckerberg. He still sees value in the platform, but he’s concerned about the growing economic and political power behind Facebook and other social media sites. He shares those concerns in his new book, Zucked: Waking Up to the Facebook Catastrophe. McNamee joined the Knowledge@Wharton radio show on SiriusXM to talk about why he’s speaking out against the company he once championed. “I became an activist because I was among the first to see a catastrophe unfolding, and my history with the company made me a credible voice,” he wrote in an opinion piece for Time. McNamee is a co-founder of the private equity firm Elevation Partners and of the Center for Humane Technology.

An edited transcript of the conversation follows.

Knowledge@Wharton: Facebook has obviously been a much-discussed company over the last couple of years. What do you think has gone wrong?

Roger McNamee: I’d spent 34 years as a tech investor and a tech optimist. I was lucky enough to get there before the personal computer industry had really gotten started. When I met Mark Zuckerberg in 2006, he was 22. The company was two years old. He was facing a huge business challenge, and I was asked to come in and give him some advice. It happened that I was able to help him through that problem, and that began a three-year period where I was one of the advisers who was close to him.

I thought he was amazing. I thought that Facebook had absolutely solved a core network issue. What we had discovered in the past was that when networks get larger than tens of thousands of people, if you allow anonymity, then trolls — bad guys — basically take over and bully everybody else. Mark [established] real identity and real privacy controls. The problem was, he also had an opportunity to be the largest network in the history of humanity. The goal of connecting everyone in the world conflicted with that original design criterion of requiring real identity and having real privacy, so those things went away.

By the way, when I first saw the issues in 2016 and reached out to Mark and Sheryl Sandberg that October, I thought the issues were specific to Facebook and centered on the business model. There was this notion of advertising, where what you’re really trying to do is manipulate people’s attention, using things like likes and notifications to entice them to come back, and appealing to fear and anger in the way you organize the news feed to get people to share stuff. That model was not unique to Facebook…. It exists on Instagram and WhatsApp, to a degree. It exists on YouTube and parts of Google as well.

It’s actually an industry-wide problem. The business model has been so successful that these companies have legitimate economic power, but they’ve also gained political power where they dominate the public square in every democracy. That’s really unhealthy because they’re not elected and not accountable, and they haven’t yet shown the kind of maturity to deal with that kind of responsibility to protect the people who use their products. I don’t blame them as people. I really think that this is a combination of the larger culture that Silicon Valley has evolved to, and these business models that created so much success that they got in their own little bubble and just didn’t realize the damage they were doing.

“There’s no regulatory restraint. There are no countervailing forces. No government is in a position to change the behavior of these companies.”

Knowledge@Wharton: In the beginning, social media sites were not considered to be part of the capitalist system, but now they are. Can you see them also changing that dynamic?

McNamee: I think they came along at a moment in time when things were possible that had never been possible before. You could never build a global network like Facebook until exactly the time they showed up. And we did not have an economic environment before then, either, where literally anything goes. There’s no regulatory restraint. There are no countervailing forces. No government is in a position to change the behavior of these companies.

The truth is, they’ve evolved from what I would describe as capitalism to some new business model. In advertising, the person who reads the magazine or watches the TV show isn’t the customer. They’re the product, and the customer is the advertiser. In the model that Facebook and Google have evolved, they’ve gone beyond that. Now they gather data not to make their product and service better for the person who’s using it, but rather they gather the data to do other things that the person whose data they’re using may not even get a benefit from. There’s a Harvard scholar named Shoshana Zuboff who calls this surveillance capitalism. Whatever word you want to put to it, it’s a new business model. And it has this fundamental flaw in that the vast majority of the population doesn’t enjoy enough of the benefits to justify the harm that’s being done.

Knowledge@Wharton: Looking at the future, what does all of this mean for society?

McNamee: This is the real thing that scares the heck out of me. I think there are four classes of problems that we need to address. I started at the beginning thinking these guys were perfect, right? I was a cheerleader. I’m a little bit like Jimmy Stewart in Rear Window. I see something that looks totally wrong, and I pull on the thread. The book describes my journey of discovery and helps you, as the reader, understand what’s going on.

“It’s impossible to do a startup today to compete in their markets, and we’re long past the time when there should be dozens of people creating alternative visions.”

There are four problems. One relates to public health. Essentially, these products manipulate attention. In really little kids, the consequences are horrible. In teenagers, there’s a lot of bullying on products like Instagram. With adults, you have what are known as filter bubbles, which is this concept of reinforcing the things you already like to the point that you live in your own reality. It has really aggravated the nature of conflict in the U.S. so that left and right can no longer talk to each other.

The second class of problems deals with democracy. It comes directly from that public health problem, because these products essentially allow politics to change from politicians persuading us to campaigns using the advertising tools of these companies to manipulate us, appealing to the weakest issue for each voter. You can either suppress votes or, in some cases, change votes by appealing to things that are unique to each voter, and that may have absolutely nothing to do with the electoral process going on.

The third issue is privacy. Privacy is the thing that we’ve had the most conversation about, and people really don’t understand it. They say, “Hey, my data’s already out there.” They say, “I’ve got nothing to hide.” Those things are demonstrably true, but they’re not the point. The problem is the way they use your data to affect other people, and that’s a big problem.

The last thing is these guys are acting like monopolists. It’s impossible to do a startup today to compete in their markets, and we’re long past the time when there should be dozens of people creating alternative visions — more decentralized, more user-friendly, more humane approaches to this technology.

Knowledge@Wharton: One of the other big questions surrounding Facebook and other social media platforms is whether they will be regulated by the U.S. government. What do you think will happen?

McNamee: I think we have to look at a lot of different things. The good news is that we, the human beings formerly known as users, have more power in this thing than we realize. We do have the power to change our behaviors. I was hopelessly addicted. I still love Facebook as a product. But we do have the power to change our behavior in ways that reduce their power over us. And we have the ability to influence our elected representatives to do the right thing.

“The good news is that we, the human beings formerly known as users, have more power in this thing than we realize.”

The challenge we have here, from a regulatory point of view, is that the current products — Facebook and Google and the properties they own — have created the equivalent of digital chemical spills. They’re artificially profitable because there are side effects from their success that they are not paying the cost of. As with the chemical industry, we’re going to have to match the costs better to the people who create them. I think that’s a form of regulation that really matters.

We have to really look at how data is used. Why is it legal to collect data on minors? Why is it legal to sell credit card information or your transaction records? We’ve never had a debate about that. Essentially, these companies have taken a notion like eminent domain and said, “We have this data, so we own it.” I’m not sure that’s right. I’m not sure it’s right to sell geolocation data, either, because between credit card data and geolocation you can create a really, really high-resolution map of anybody, even people who are not on Facebook and Google. I find that very troubling.

The other issues are the ones looking forward. With artificial intelligence, we’ve had all these examples of AI products that carry over the biases of the real world because they’ve trained the system with data from the real world. So, they don’t correct for implicit biases. AI recruiting apps have gender bias and racial bias. Mortgage applications often have the equivalent of redlining, where people of certain religions and certain races aren’t able to get loans in certain neighborhoods. That’s ridiculous. It’s totally unnecessary and shows a lack of care and judgment. I think we should require a technology solution that demonstrates safety, efficacy and lack of bias. These things have to be testable in real time. We have to be able to understand how they work. This is much easier than it would be with pharmaceuticals because we can do this in code and make it standard for everybody.

When you look at products like Alexa or Google Home, they’re collecting data in places we’ve never been before. And there’s already evidence that the systems are hackable. Our military is very concerned about the manufacturers of some of the hardware. Even if you trust Amazon and Google, there is at least some evidence that you want to be cautious. There are a lot of other failure modes in these things, and the value they provide is relatively small in comparison to the potential risk.

Lastly, I just want to remind everybody that our kids are being subjected to forces that their developing brains are just not ready for. Now that we’ve had a whole generation of kids exposed super-early to technology, pediatricians are saying, “Hang on, maybe that wasn’t such a good idea.” Maybe computers in a classroom are not a good idea, except for special-needs kids. You want to use the classroom to teach kids how to pay attention, to teach them socialization skills. You want to keep little kids off screens entirely because their brains can’t handle the overstimulation.

So, there are a lot of different issues. Take the issue that matters to you — whether it’s kids, whether it’s democracy, whether it’s privacy, whether it’s entrepreneurship and innovation — and let’s start conversations with lots of people. And let’s figure out how to solve these problems.

“I just wish the teams at those companies were a little bit more aware that they live in a big world and have a duty to protect the people who use their products.”

Knowledge@Wharton: You were around in 2005, when the idea of Facebook being sold to Yahoo first came up. If that had happened, would Facebook be any different than it is now?

McNamee: It was actually 2006 when I first met Mark. The nature of our conversation, which I describe in the book as part of my Jimmy Stewart journey from stupidity to figuring this thing out, is that yes, I thought Yahoo and Microsoft and others were likely to buy the company. It turns out the first offer came from Yahoo. Had any other large company bought Facebook, it would not have become the company we see today. It took Mark’s genius and that extraordinary team and the fact that they were so intensely motivated. If you had stuck it inside a place with lots of corporate bureaucracy, it would have caused a lot of bad decisions to get made along the way.

In fairness, I think a lot less harm would have been done. But I like the good parts of Facebook and Google, and I want to find a way to preserve those things. I just wish the teams at those companies were a little bit more aware that they live in a big world and have a duty to protect the people who use their products.
