Eric Schmidt, then CEO and now chairman of Google, with cofounders Sergey Brin and Larry Page in 2008 (Joi Ito)

A recent Radiolab episode considered the ethical implications of self-driving cars. These vehicles are usually lauded for how safe they’ll be: human error is removed, and precise programs will safely coordinate the high-speed movement of literal tons of metal and glass.

But a complication arises: even with this technology, it’s not possible to avoid all loss of life, so cars need to be programmed to minimize deaths when preventing them isn’t an option. Sometimes, for example, they’ll have to “decide” whether to run into a tree (likely killing the driver) or into a pedestrian, or whether to slam into a bus full of kids or a four-door sedan. And because these programs will be written by humans, the developers have to do some moral math: How many kids outweigh one adult? What about a family vs. a group of friends? Locals vs. out-of-towners? Someone who looks healthy or looks ill? And take it further: a CEO or a janitor? A woman or a man? A black person or a white person? Right now, there are no industry standards for how to write this decision-making into self-driving technology. But in speaking to Radiolab, Carnegie Mellon professor Raj Rajkumar made clear just who should not have the ultimate say:

We do not think that any programmer should be given this major burden of deciding who survives and who gets killed. I think these are very fundamental, deep issues that society has to decide at large. I don’t think a programmer eating pizza and sipping Coke should be making that call.

Sara Wachter-Boettcher would agree. In her book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, Wachter-Boettcher looks at who designs the websites, apps, and products we use every day, in every area of our lives, and what that means for the people who use them. Just one group, she writes, is in charge of the industry: young, privileged white men.

This is not exactly news. Rajkumar’s pizza-and-Coke programmer pretty much nails the stereotype of “tech bros” (sometimes called “brogrammers”), a stereotype Wachter-Boettcher confirms. She describes a culture dominated by highly educated, high-achieving men whose employment in the tech industry gives them license to do whatever they want. The result is “a group of mostly white guys from mostly the same places [that] believes it deserves to be at the top.” The environment she depicts is hyper-sexualized and male-dominated, one that devalues the contributions of women; recent reporting details drug-fueled parties and rampant sexual harassment.

Wachter-Boettcher notes that tech companies have made a show of increasing diversity, but that little has come of these efforts. They blame “the pipeline,” she explains, claiming that not enough women or people of color apply, so it’s not their fault that their hires aren’t diverse. But the country’s top universities graduate black and Hispanic computer-science and computer-engineering students at twice the rate that tech companies hire them. One black woman who hasn’t been able to find a tech job laments, “Instead of putting in the effort to look for us, Facebook is ignoring the fact that we even exist.”

This is bad enough. But Wachter-Boettcher is concerned with something else: how this lack of diversity affects the tech products we use, and how our societal biases are reinforced by their use. Having worked in the tech industry herself, she observes, “The more I started paying attention to how tech products are designed, the more I started noticing how often they’re full of blind spots, biases, and outright ethical blunders—and how often those oversights can exacerbate unfairness and leave vulnerable people out.” Because the industry is dominated by a very particular type of person, it’s easy for them to overlook the needs and concerns of other groups of people.

What blind spots does Wachter-Boettcher identify? Subscription software that automatically codes the title “Dr.” as male. Social networks that don’t accept Native American names. Forms that require the selection of “male” or “female” to join, or that have confusing racial categories. Housing ads that target by race. Software that labels pictures of black people as gorillas.

These oversights would be less frequent, Wachter-Boettcher argues, if the tech industry were run by a more diverse group of people, with the concerns of a larger and more diverse group of users in mind. She explains that when technology is created “by a homogeneous team that hasn’t taken the time to understand the nuances of its audience…they often end up designing products that alienate audiences, rather than making them feel at home.” Of course, she allows, everyone makes mistakes. “Individually,” these oversights are “just a paper cut.” But “put together, they’re a constant thrumming pain, a little voice in the back of your head: This isn’t for you. This will never be for you.”

Such alienation amounts to more than personal insult; it can affect the culture at large. When technology treats certain traits as the default and everything else as an undesirable anomaly, “the biases already in our culture are quietly reinforced.”

Knowing all of this, what should users of technology do? Wachter-Boettcher tells us to hold tech companies accountable. Stop letting the industry see itself as an enclave of visionaries that can’t be hindered by outsiders’ concerns. “I hope you’ll feel comfortable asking hard questions of the digital products you use, and the people who make them,” she writes. “Tech has spent too long making too many people feel like they’re not important enough to design for…There’s nothing wrong with you. There’s something wrong with tech.”

It’s easy to feel shut out of decisions about technology that are made without our consent or even our awareness. Raj Rajkumar’s call for society-wide participation in deciding how to program self-driving cars is admirable, but how does one even start?

In a world where Facebook has political power, Amazon gets in on the health-care industry, and Twitter can steal your online presence to boost celebrities’ follower count, we must get over that paralysis. Wachter-Boettcher encourages us, “Send customer service complaints. Tell the media. Write your congressperson. Support an alternate product.” Technically Wrong reminds us that we need a voice in tech decisions, and that if nothing changes, those of us who are most vulnerable might only suffer more.

Regina Munch is an associate editor at Commonweal.
