Zuckerberg's Talk About User Control Is Meaningless
Published in Online Spin, April 13th, 2018.
“This is the most important principle for Facebook,” said the man sitting before the Senate's Commerce and Judiciary committees. “Every piece of content that you share on Facebook, you own, and you have complete control over who sees it… and how you share it, and you can remove it at any time.”
This was a repeated theme during Mark Zuckerberg’s two days of hearings on Capitol Hill: You own your data. You decide who sees what. You can change any setting whenever you want. You can stop third parties from using your data to target you.
It’s not our fault, he seemed to say. It’s yours.
Here is a question I would have asked: What percentage of Facebook users access or adjust their controls more than once a month?
I suspect it’s a small number. We’re set-it-and-forget-it creatures; we don’t want to think too hard before we reshare the latest American Chopper meme.
Behind Zuckerberg’s overly solicitous proclamations of user control sits a hard truth: You can give people as much control as you want over who sees their information. It won’t make a dang bit of difference, because Facebook is asking the wrong questions.
The company is asking you whether you want this post to be seen by all friends, by specific friends, by all friends except specific ones, by a custom list of included and excluded friends, by everyone, by only you, by people you went to high school with, by people you went to college with, by close friends, by family.
You could specify those settings for each and every post — and it still wouldn’t stop the likes of Cambridge Analytica.
These are the questions Facebook should be asking:
Would you like us to show you content that makes you angry?
Would you like us to show you content that challenges your world view, or, instead, reinforces what you already believe?
Would you like us to show you content, including advertising, that makes you feel better about yourself — or worse?
Would you like us to encourage you to buy things you don’t need?
Would you like us to show you content that makes you more informed — or less?
Would you like us to encourage you to be more compassionate — or less?
In other words, what kind of person do you want to become as a result of using Facebook?
These suggestions may seem shocking. Why would we ask Facebook to intentionally manipulate our emotions? Because we’re lying to ourselves: it already does.
I uninstalled the Facebook app on my phone, but I still access the platform through the phone’s browser. On a regular basis, I get a message at the top of the screen: “There are posts from your friends you can only see on the app. Download it now!”
You are missing out! Your friends are having fun without you! You should feel worse about yourself now, but you could feel better about yourself if you use our product MORE!
Facebook will never ask the kind of questions I’m proposing, because its business model would crumble. Facebook doesn’t benefit if you feel better about yourself, or if you’re a more informed, thoughtful person. It benefits if you spend more time on its site and buy more stuff. Giving users control over who sees their posts offers the illusion of individual agency while protecting the prime directive.
I don’t want control over who sees my posts. I want control over the person I am, and the person I want to become. Give me that, and maybe I’ll reinstall the app on my phone.