Ethics Are For Everyone

Published in Online Spin, March 17th, 2017.

I admit to an overwhelming amount of schadenfreude at the challenges recently faced by Uber.
After all, its behavior has been egregious for ages. The company lied about how much drivers would make. It lied about its safety fees. It investigated critics and threatened a female journalist who dared to speak negatively about it. In New Zealand, where I live, Uber has encouraged drivers to break the law without guaranteeing it would pay any fines or provide legal cover.

So the past few weeks -- which have included Susan Fowler’s explosive account of systemic sexism at Uber, a potentially devastating lawsuit from Waymo, and an unsympathetic video of Travis Kalanick fighting with a driver -- have seemed like a well-deserved comeuppance.

It’s satisfying to point the finger at Uber precisely because its actions are so brazen. But doing so also carries a risk: the risk of distracting us from the more subtle ways technology impacts our lives, and the profound ethical implications thereof.

Luckily, people are thinking about this. People like Abe Gong, chief data officer at Aspire Health, who recently gave a powerful talk about ethics for algorithms. These algorithms, he argues, act as gatekeepers to just about everything we want to do in life -- whether it’s renting a house, getting a car loan, or making parole -- and the way they’re designed has real-world consequences for the real-world people on the other end.

But it’s not just algorithms we should be thinking about. What are the ethics of Airbnb driving up rental prices in areas with lots of listings? What are the ethics of the gig economy? What are the ethics of the increasing income inequality felt so intensely in Silicon Valley, the home of technological innovation? What are the ethics of us using our exponentially expanding capabilities to create apps for the affluent, while millions suffer or starve or flee from the conflict in their homelands?

These are important questions, and they deserve robust discussion. Three days ago, a friend of mine started a thread in a Google group with this provocation: “What is our vision of a ‘better’ civilization? What are the foundations of what we believe to be good? What are our values? When we talk about ‘impact’ and ‘transform’, what do we mean? What kind of impact is positive, and why? What do we envision as a desirable future for millions of people in a specific reality and a specific culture?”

Seventy-seven replies later, the conversation is going strong. Turns out not everybody measures success exclusively by how much capital they raise. Not everybody thinks the best use of new technologies is to create a startup with a good exit strategy. And not everybody wants to be the next Steve Jobs; some want to be the next Malala Yousafzai.

Abe Gong proposes that we ask the following four questions when designing algorithms: One: Are the statistics solid? Two: Who wins? Who loses? Three: Are the changes in power structures helping? And four: How can we mitigate harms?

He also proposes that we make ethics reviews a standard part of algorithm development, so routine that it would be weird not to do one.

It’s not only algorithms that need ethics reviews. We need them throughout product design, as well as in marketing, sales, HR, finance -- right up to the very top of the organization. What are we in business for? Who wins and who loses? What is the impact on power structures? And how can we mitigate harm?

It’s not just the programmers who need to care. Ethics are for everyone. And it’s time for everyone to get involved.

Kaila Colbin