By LAURIE SEGALL
Techies pride themselves on their optimism. It couldn’t be any other way. After all, Silicon Valley was built on the idea that technology is a force for good. In the nearly nine years I’ve spent covering tech, we’ve watched these dreamers and idealists transform society. Underlying it all is the notion that information wants to be free, that the more we share, the better off we are. That was the deal we made with companies that promised to connect the world.
Things haven’t quite worked out that way.
Given all the bad news that’s come out of the Valley, it’s worth taking a moment to marvel at the profound changes we’ve seen in just a generation. Apple made the smartphone ubiquitous. Facebook connected over 2 billion people. Google put the answer to almost any question just a few keystrokes away. Companies like Amazon, Netflix and Uber upended entire industries and made themselves indispensable to many.
But these changes brought unintended consequences. We’ve seen hackers steal our data and identities. We’ve witnessed the weaponization of social media to influence elections and sow discord. There are mounting fears about the impact of artificial intelligence and facial recognition.
Things may get worse before they get better as we grapple with this era of unintended consequences. Even as innovation races ahead at an ever faster pace, ethicists and futurists are asking profound questions about where we’re headed. What will it mean to be human as technology becomes an extension of us? How will these ubiquitous systems shape our children’s growth and their capacity to love?
The good news is that some in Silicon Valley are joining academics, policymakers and consumers in pondering these essential questions. The sooner we join the conversation about the bad, the better our chances of coding a future that’s good. It will take a diverse group of people to solve the problems that lie beyond today’s techlash. Everyone must ask: Where should we draw boundaries? What is the next set of problems we should anticipate? Here, I asked a few key thinkers about the challenges ahead.
Segall is senior technology correspondent for CNN Business. Her special report “Facebook at 15: It’s Complicated” will debut on the network in February in conjunction with the 15th anniversary of Facebook’s launch.
Facial recognition
This powerful technology is already creating new and important benefits for people around the world, but we must be clear-eyed about its risks. So far, the technology is outpacing the ability of governments to keep up. Governments must adopt new laws to protect against discrimination, threats to privacy and the potential impact on democratic rights. We need a future that doesn’t force companies to choose between social responsibility and business success.
Brad Smith, Microsoft president
Neural inequality
Up to now we have experienced inequality in areas such as finances and opportunity. Neural inequality could be next. Some people would be able to enhance their thinking with a chip implanted in the brain, making themselves disproportionately smarter than average. There is also the risk of thought manipulation. As neuroscientists get better at accessing brains and altering thinking, we are approaching a world where someone could quite literally change your mind. It’s a scary idea that someone could “write” into our minds, creating thoughts and ideas in our own brains that we wouldn’t be able to distinguish from ones we generated ourselves. Would there even be any difference?
Moran Cerf, neuroscientist and business professor, Northwestern University
Protecting kids
Every aspect of human development, health and well-being depends on our ability to navigate and form loving social relationships. Several recent studies, however, suggest that adults compromise those relationships when they divert their attention from their infants to their cell phones. In one, infants became more negative and less exploratory when parents picked up their phones. Society’s unintended 12-year experiment since the iPhone was introduced may be the culprit behind tweens who are less socially attuned, and behind the 74% of pre-K-to-8 principals who said their biggest concern was the stark increase in children suffering from emotional problems. Our digital habits may be getting in the way of our interpersonal relationships.
Kathy Hirsh-Pasek, Ph.D., Temple
Joshua Sparrow, M.D., Harvard
Hospital safety
I launched a volunteer hacker collective to save lives through security research. We’ve had a profound impact on hospital safety, most notably with changes to FDA guidance and to medical devices. But risks remain. So now I’ve started killing patients. Not literally, of course: in hacking simulations with physicians, we shut down vital medical equipment, destroy necessary patient data and create enough havoc, such as delays in time-sensitive treatments, that patients would die if it were real. We’re doing this to catalyze medical reform before there is real patient harm. Doctors implicitly trust technology, but there is a cost to connectivity. Hospitals need to conduct similar exercises and adapt to these emerging risks.
Joshua Corman, cybersecurity strategist
National security
Will American technology companies work to provide the U.S. and its allies with the best defense technology, or will they allow China and Russia to take the lead? If we don’t do the job, others will have the power in the defense space to set ethical norms opposed to our fundamental values.
Palmer Luckey, founder of Anduril, a defense company
Everyday ethics
What good was law when rogue Chinese scientist He Jiankui claimed to have edited the DNA of twin girls? Innovation-friendly tech regulation is critical but often ineffective. Ethics requires thinking first and acting second. Ethics should be a habit of decision and commitment, driving every choice at every level of an organization.