Contents

  • Data Privacy
  • Addiction
  • Culture

Friends,

It is a matter of life or death.

It is the most extreme question being asked in Silicon Valley today, and thus the best place to start in a discussion of Silicon Valley ethics. It is the question that is asked when programming autonomous cars: in the case of an accident, whose life should the car try to save?

Does it save the driver? The passenger? Pedestrians?

In order to make such a judgement, AI will need to understand human values.

But whose values?

In a survey of 2.3 million people around the world, researchers asked who should be saved if an autonomous car were forced to make a decision. The answers varied greatly by region, influenced by political and religious ideologies as well as economic realities.

That’s a ton of decisions on the design side. And even if automakers were to land on the perfect algorithm, hackers could tamper with it and end up making those life-and-death calls themselves.
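To make the design problem concrete, here is a deliberately toy sketch of how a collision policy might be parameterized by region-specific value weights. Everything in it (the outcome categories, the weights, the scoring rule) is invented for illustration; no automaker has published such a system, and real survey data would feed something far more nuanced than a pair of numbers.

    from dataclasses import dataclass

    @dataclass
    class Outcome:
        """One possible crash response and who it harms."""
        label: str
        passengers_harmed: int
        pedestrians_harmed: int

    # Hypothetical region-specific value weights (higher = worse to harm).
    # Surveys like the one above suggest these priorities genuinely differ
    # across cultures; the numbers here are made up.
    REGION_WEIGHTS = {
        "region_a": {"passenger": 1.0, "pedestrian": 3.0},
        "region_b": {"passenger": 5.0, "pedestrian": 1.0},
    }

    def choose_outcome(outcomes, region):
        """Return the outcome with the lowest weighted harm score for a region."""
        weights = REGION_WEIGHTS[region]

        def harm(o):
            return (o.passengers_harmed * weights["passenger"]
                    + o.pedestrians_harmed * weights["pedestrian"])

        return min(outcomes, key=harm)

    if __name__ == "__main__":
        options = [
            Outcome("stay in lane", passengers_harmed=0, pedestrians_harmed=2),
            Outcome("swerve into barrier", passengers_harmed=1, pedestrians_harmed=0),
        ]
        for region in REGION_WEIGHTS:
            print(region, "->", choose_outcome(options, region).label)

Run it and the two regions pick different outcomes from the same crash. Even this toy version makes the point: whoever sets the weights is legislating ethics in code, and anyone who can alter them can change who gets hurt.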

This is a real-life version of a 50-year-old thought experiment in ethics, the trolley problem.

With autonomous cars we are dealing with human lives, but the ethical dilemmas behind them are the same ones other technologies face: the ethics of the technology itself, the ethics of the culture the technology is used in, and what happens when bad actors get hold of the tech.

While the dilemmas posed by autonomous cars might be the most explicit, many areas of tech are under ethical scrutiny. February’s newsletter will focus on a few other examples of the intersection of ethics and Silicon Valley technology.

Data Privacy

Data privacy has been a concern for a long time, but Cambridge Analytica and its fallout put it center stage. Millions of words have been spilled over this issue, so instead of writing any more about it, we encourage you to watch Stealing UR Feelings, a fun six-minute documentary that shows how apps can use (and may already be using) your camera to analyze your emotional reactions … the better to profit off them. To manipulate you, in other words. It’s best to go into the video spoiler-free, but trust us, it is fascinating and frightening at the same time.

Addiction

In critiques of social media, you hear about the dopamine hit all the time. Any time you get a Facebook Like, a retweet, or come across a piece of cool content somewhere, a bit of dopamine is released. Even High Times, a magazine dedicated to drug culture, is writing about it. That’s, um, ironic.

Making your app or site addictive is actually the subject of the best-selling book Hooked: How to Build Habit-Forming Products. (The book’s intro, which spells out the methodology, is free to read at the link.)

Addiction is not the only neurological problem. Research is beginning to suggest that smartphone use might affect the brain’s function and even change its structure. That is not good.

Culture

According to The Ringer, Stanford University Computer Science students are questioning the ethics behind the big tech companies they entered college wanting to work for. This is important for several reasons:

  • Stanford is the primary feeder school for Silicon Valley
  • it is Silicon Valley’s intellectual and technical capital
  • as an elite school, Stanford is a prime vehicle of class ascension/confirmation

So, even if the students The Ringer profiles do not end up changing the world, the notion that one should at least try will probably become part of Gen Z’s upper-class aspirations, just as it was for the ok-boomers who preceded them.

The students profiled in The Ringer article face a problem most of humanity never will – ethics vs. prestige. But anyone with an internet connection can access the excellent free intro/first session of Stanford Continuing Studies’ class, The Ethics of Technological Disruption: A Conversation with Silicon Valley Leaders and Beyond. It spells out the major contemporary concerns and is a great primer for anyone wanting to wade into the murky issues Silicon Valley faces today.

Also of interest:

Why your new work colleague could be a robot

‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Ethical Intelligence: What does an ‘AI ethicist’ do, anyway?

MIT administrators knew about Jeffrey Epstein’s $850K in donations

Silicon Valley’s Crisis of Conscience

While some of these topics require a much broader societal discussion, SVSG can help you build more ethical technology.

Contact us to speak with one of our CTOs about how.