What Skills Do Your Kids Need For The Future? Try This

Published in Online Spin, August 19, 2016.

We stood staring at the painting on the wall of the fancy gallery. We were both a bit flummoxed. Finally, my friend said, “I’m trying to avoid the whole, ‘My eight-year-old could have done that’ thing.”
The conversation moved on. “Did you see that headline about the artificially intelligent teaching assistant helping students online for a whole semester? Nobody noticed,” he said.

“It’s true,” he added. “Computers can be lawyers, doctors, accountants. The only jobs left for humans are the creative ones.

“But computers have composed music, written screenplays, generated art. I’m pretty sure a computer could have created that monstrosity on the wall over there. If you told me a computer did it,” he said, “I’d be a lot more impressed.”

Of course a computer could do it. So what are the safe jobs? What should our kids study?

I don’t know what the jobs of the future look like. There may not even be jobs. But regardless, there are some things the world urgently needs people to study: philosophy, ethics, behavioral science, psychology and the like. We need to better understand why we do what we do, and what our moral frameworks are.

It’s not that we shouldn’t teach technology. Technology is, in fact, the very reason we need more kids to study philosophy and ethics.

Think about the philosophical and ethical implications of having our lives run by algorithms. We tend to think algorithms are far fairer than humans. But what algorithms are actually great at is consistency -- and consistency has nothing to do with fairness.

Consistency and fairness might sound like the same thing, but they’re not at all. Our criminal justice system consistently discriminates against minorities -- but that doesn’t mean it’s fair.

And sure, an algorithm doesn’t “care” if you’re a minority. But if the initial dataset is biased, the algorithm will continue to reflect that bias -- generating the same result as if it did care.
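That point fits in a few lines of code. Here is a minimal sketch with synthetic numbers and hypothetical neighborhood names: a rule that allocates patrols purely from historical stop counts never looks at anyone’s race, yet it faithfully reproduces -- and feeds back -- whatever bias those counts already contain.

```python
# Illustrative sketch only: synthetic data, hypothetical neighborhoods.
# The rule below is "objective" -- it sees only stop counts, never
# demographics -- but biased history in means biased output out.

# Historical stops per neighborhood. Suppose past practice over-policed
# southside at twice the rate, despite identical underlying crime rates.
historical_stops = {
    "northside": 100,
    "southside": 200,  # over-policed in the historical record
}

def recommend_patrols(stops, total_patrols=30):
    """Allocate patrols proportionally to past stop counts."""
    total = sum(stops.values())
    return {hood: round(total_patrols * n / total) for hood, n in stops.items()}

allocation = recommend_patrols(historical_stops)
# Twice as many patrols go to southside, which generates more stops
# there, which skews the next round's data even further: a feedback loop.
```

The algorithm never “cares” about who lives where; consistency alone is enough to launder the old bias into new decisions.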

Cory Doctorow articulated this last week, writing, “Cities that use data from racist frisking practices to determine who the police should stop end up producing algorithmic racism; court systems that use racist sentencing records to train a model that makes sentencing recommendations get algorithmic racism, too.”

Doctorow cites Cathy O’Neil, who pointed out that Donald Trump behaves like a biased machine learning algorithm. “Trump’s goal is simply to ‘not be boring’ at Trump rallies. He wants to entertain, and to be the focus of attention at all times… What that translates to is a constant iterative process whereby he experiments with pushing the conversation this way or that, and he sees how the crowd responds. If they like it, he goes there. If they don’t respond, he never goes there again, because he doesn’t want to be boring. If they respond by getting agitated, that’s a lot better than being bored. That’s how he learns.”

Back to Doctorow: “Trump's algorithm is to say semi-random things until his crowd roars its approval, then he iteratively modifies those statements, seeking more and more approval, until he maxes out and tries a new tack.”

O’Neil: “[But] he’s got biased training data, because the people at his rallies are a particular type of weirdo… In that sense he’s perfectly objective, as in morally neutral. He just follows the numbers. He could be replaced by a robot that acts on a machine learning algorithm with a bad definition of success – or in his case, a penalty for boringness – and with extremely biased data.”
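What O’Neil describes -- try lines, keep the ones that get a roar, drop the ones that bore -- is essentially what machine learning people call a multi-armed bandit. A toy sketch, with hypothetical talking points and a simulated crowd standing in for the biased training data:

```python
import random

random.seed(42)

# Hypothetical talking points and how this *particular* crowd rewards
# each one. The simulated crowd is the biased training data: it happens
# to roar loudest for outrage.
crowd_response = {"policy detail": 0.1, "applause line": 0.6, "outrage": 0.9}

topics = list(crowd_response)
score = {t: 0.0 for t in topics}  # running estimate of each topic's payoff
tries = {t: 0 for t in topics}

def pick(epsilon=0.1):
    """Epsilon-greedy: mostly repeat what worked, occasionally experiment."""
    if random.random() < epsilon:
        return random.choice(topics)
    return max(topics, key=lambda t: score[t])

for _ in range(1000):
    t = pick()
    roar = 1 if random.random() < crowd_response[t] else 0  # crowd reacts or not
    tries[t] += 1
    score[t] += (roar - score[t]) / tries[t]  # incremental average of the payoff

best = max(topics, key=lambda t: score[t])
# The learner converges on whatever this crowd rewards most, not on
# anything resembling truth: a bad definition of success plus biased data.
```

Swap in a different crowd and the same loop converges somewhere else entirely; the algorithm is “perfectly objective” in exactly the hollow sense O’Neil means.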

This is why we need our young people to study philosophy, and ethics, and behavioral science, and psychology. We need people to understand how humans work, and how easily manipulated we are. We need people to program compassion into our algorithms and machines, to imbue them with the ability to evolve as our society evolves, to shift away from institutionalized bias and bigotry as surely as we shifted away from slavery.

I don’t care if computers paint crummy paintings. But humanity should belong to humans.