I spend tons of time working at the intersections of marketing, artificial intelligence, data, and privacy. That means I spend lots of time working with — and worrying about — the role played by Amazon, Google, Apple, Microsoft, and, especially given their recent missteps with regard to data and privacy, Facebook. Which is why Mark Zuckerberg’s recent opinion piece in the Washington Post proved so fascinating. Zuckerberg talked about Facebook’s challenges and, to address them, asked for government regulation in a number of areas:
“But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone…From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.“ [Emphasis added]
Why does Facebook think that’s important? Mainly because, as Zuckerberg continues:
Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own. So we’re creating an independent body so people can appeal our decisions.
Whoo-boy. There’s a lot going on here that needs unpacking. So, let’s dive in.
First, kudos to Facebook for recognizing two facts:
- Whether intentionally or not, the social giant has actively contributed to divisive, harmful conduct on the Internet.
- They probably shouldn’t be the final arbiter of the solution.
I agree wholeheartedly with both points. And good on Facebook for acknowledging their mistakes. Seriously. As the saying goes, “the first step is admitting you have a problem.”
At the same time, I have a number of issues with the rest of the op-ed because of its potential effects on consumers and competitors alike.
For starters, as Brian Heater and Josh Constine write at TechCrunch:
“The op-ed rings somewhat hollow, though, because there’s plenty that Facebook could do to improve in these four areas without help from the government.”
Yep. Facebook is wise to turn over disputes around its policies to a third party, but why does it need a third party — in this case, the government — to tell it what those policies should be? In part, I suspect, because Zuckerberg and Facebook want to shape whatever form those regulations take.
Don’t misunderstand: I’m not opposed to government regulating customer privacy, election integrity, or the use of customer data. I would argue governments haven’t done enough in those areas. But I have a huge issue with Facebook driving that discussion.
As the joke goes about where an 800-lb gorilla sits (answer: anywhere it wants to), Facebook’s size almost certainly guarantees them a seat at the table when it’s time to shape policy in these areas. But, ignoring that reality for a moment, given their past actions, do you really think Facebook has demonstrated it’s the right company to shape regulations around customer data and privacy? Yes, we’d hope they can provide plenty of lessons for others. The question is whether they’ve learned those lessons themselves. Offering them a role in the process feels a lot like letting the fox guard the henhouse after that fox has already helped himself to an all-you-can-eat chicken buffet.
Facebook has continually failed to demonstrate that they’re a trustworthy advocate for consumers or competition. And please don’t misunderstand: I don’t think they’re actively evil. They’re simply untrustworthy in the same way a small child is untrustworthy. After all, you wouldn’t let your three-year-old play with matches or sharp knives, would you? Of course not. Except in this case, the “three-year-old child” is a $55 billion company, which makes it hard to make them sit in the corner.
Still, the evidence is compelling for why that’s necessary. As recently as early February, TechCrunch reported this about Facebook:
“Since 2016, Facebook has been paying users ages 13 to 35 up to $20 per month plus referral fees to sell their privacy by installing the iOS or Android “Facebook Research” app. Facebook even asked users to screenshot their Amazon order history page…[Update 11:20pm PT: Facebook now tells TechCrunch it will shut down the iOS version of its Research app in the wake of our report. The rest of this article has been updated to reflect this development.] Facebook’s Research program will continue to run on Android.“ [emphasis added]
Remember, these are often the accounts of minors. The company also appears to have shared data about users’ health without their consent and stored “hundreds of millions of user passwords…in plaintext.”
Um, wow.
They also bought Instagram and WhatsApp when faced with competition they couldn’t defeat. And, frankly, they flat-out copied Snapchat’s most innovative features when they could, most notably Stories, which reappeared as Instagram (and later Facebook) Stories.
Is this the kind of company you can trust to guide regulations that will affect your privacy and personal data as an individual, to say nothing of the environment your company must compete in?
Remember, data is an increasingly valuable commodity in today’s business and marketing landscape. Would Facebook’s proposed solutions really protect consumers? Or would they simply pull up the ladder behind themselves now that they’ve climbed it and already have access to, oh, I dunno, more data than just about anyone?
Again, beware three-year-olds with sharp knives.
(By the way, I’m scrupulously avoiding the topic of government regulation of “harmful content.” My thoughts are summarized best here.)
So, what should you do about all of this? Basically, there are two things you should focus on:
- Don’t wait for regulation to do the right thing by your customers. The worst excuse you could make for treating your customers badly is “well, technically, it was legal.“ GDPR exists because marketers did not treat customer data or customer privacy with the attention and respect it deserved. Facebook simply exhibits the worst of these tendencies, but they’re hardly alone in acting less than perfectly in this regard. Don’t be “that guy.”
- Continue to pay attention to what’s happening with data privacy regulations. And then try to do better. This story has a long way to go. Between Facebook, Google, next year’s US elections, GDPR, the beginnings of the California Consumer Privacy Act, and other efforts around the world, we’re not done with this yet. You owe it to your customers — and your business — to stay informed.
Again, Facebook deserves credit for recognizing that there’s an issue in the way that it — and plenty of other marketers and businesses — treats customer data and privacy. And government undoubtedly has a role in helping to protect consumers’ best interests. However, just because both of those statements are true doesn’t mean that Mark Zuckerberg’s proposed solution is the right way to get there.
Instead, look out for your customers both because it keeps you on the right side of the law and because it’s the right thing to do. Better self-regulation is a strong first step towards doing what’s right by customers. And strong self-regulation practices will likely reduce the impact any government oversight will have on your business. Plus, I don’t know about you, but I find that customers generally prefer companies that treat them with respect.
Facebook has provided a roadmap for what not to do. Learn from their lessons. In the long run, your customers — and your bottom line — will thank you for it.