At the recent F8 conference, Mark Zuckerberg joked, “We don’t exactly have the strongest reputation on privacy.” No one laughed. Clearly, his developer audience took user demands for privacy protection more seriously than Mr. Z.
Other than the desperate need for a new speechwriter, what is going wrong at Facebook?
In March, Zuckerberg published “A Privacy-Focused Vision for Social Networking,” a 3,000-word epiphany on privacy from the man himself. In the piece he claims to have discovered that “We should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by whom they want to see it and [it] won’t all stick around forever.” And, “Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.”
I always thought he built Facebook to collect, package and sell users’ private information to advertisers. At least that is what drives 99 percent of the revenue sustaining Mr. Z’s net worth.
To square this public stance with the true perspective of the company, we now turn to comments from a Facebook attorney defending the company in the Cambridge Analytica case. According to an article in Law360, Facebook counsel Orin Snyder argued that “You have to closely guard something to have a reasonable expectation of privacy. [At Facebook], there is no invasion of privacy at all, because there is no privacy.”
Thus, a user’s right to privacy could not have been violated, since the user never had any privacy to begin with. One curious aspect of this position is its implicit acknowledgement that Facebook’s security features are a joke, and that users should accordingly expect their private information to be open to the world.
Foolishly posting on this insecure network is tantamount to granting Facebook consent to share your information with third parties. It’s a bad deal we opted into, willingly letting our information be sold to the highest bidder…and receiving none of the proceeds of that sale. Children growing up now will likely look back at our collective deal with Facebook and shake their heads at our gullibility.
How can any company survive treating their customers’ greatest concern with such disdain?
The simple answer is that Facebook’s users are not customers. They are suppliers. They supply the product that is re-sold to advertisers.
Like any other commercial enterprise, Facebook pushes their suppliers to provide more and better product for the same or lower prices. Their attempts to get users’ financial data from banks and the announced entry into shopping and payment processing are simply efforts to expand their product set to make it more attractive to advertisers, political consultants and others. The best aspect of Facebook’s business model is the price the 2.3 billion suppliers charge for the product. Nothing.
The situation reminds me of Warren Buffett’s quote about the cigarette business. “It costs a penny to make. You sell it for a dollar. It’s addictive. And there is a fantastic brand loyalty.”
Facebook is a business, and business is great. In the U.S. they make about $25 from each of 190 million users. Selling advertisers access to their global user base produced a bodacious $29 billion in operating cash flow last year. Given these numbers, it is difficult to imagine Zuckerberg honoring a “commitment to privacy” that would necessarily degrade the quality of his product. Clearly, giving up the ethical flexibility this business model requires would be a challenge.
At times, we wonder how Facebook ended up with such a contentious relationship with their users/suppliers. We think there may be a deeper issue at play.
All successful startups face a peculiar problem – how to preserve their unique organizational culture in the face of hyper growth. After lift-off, the struggle to find talent becomes extreme. One year in, you realize that 80% of the team was not around at the outset. The oft-retold stories of the creation myth are just stories to them. How is it possible to infuse these newbies with that sense of wonder, excitement and do-or-die commitment?
Naturally, in haste, hiring mistakes are made. In the early days it is easy to identify the misfits, and when they are sent away an enhanced sense of calm productivity returns to the team. With growth, the mistakes multiply and are more difficult to discern. To teach and preserve the culture, companies develop guiding principles and organize HR processes for hiring and performance reviews around those principles.
Now, suppose those founding events aren’t such great examples on which to build a company culture. They include stolen ideas, cheated partners and other ethically dubious behavior – followed by tremendous success. In a likely misguided effort to preserve that culture, you define a company mantra: “Be a hacker.”
“Be a hacker” is a peculiar phrase with two meanings. It could mean being aggressively inventive, rapidly iterating toward product success. Or, for most people, it means the person who took over your email account or stole your credit card numbers.
Reports are beginning to appear claiming that Facebook used certain control measures to preserve their hacker culture. According to an article by S. Rodriguez at CNBC, Facebook enforces its culture using a stacked ranking performance evaluation system augmented by anonymous peer reviews.
Employees are expected to be 100% positive about the company, both internally and in their private lives. According to Rodriguez’ account, dissent is not encouraged, and company products and direction are not to be questioned. Teams race headlong to increase “user engagement,” regardless of the long-term impact on the trust and security of those users.
Not surprisingly, managers review the social network accounts of their teams to assess compliance. Employees can become desperate to get good peer reviews, driving less-than-authentic office behavior. According to comments from former employees, one bad peer outcome can crater a career.
Given this environment (and that amazing cash flow), it is difficult to imagine Facebook executing a cultural shift toward caring about their user/suppliers.
It also provokes another question. Suppose Facebook offered a truly secure private network. What would users be willing to pay for that protection?
– Jim Anderson