I have decided that deprecating Facebook, migrating off of it as a social network, is the most prudent response to a series of actions by management. Because my decision is based on a lack of trust in the current leadership, I will revisit it every six months or so to see if things have changed sufficiently to warrant a re-evaluation. I deleted about 80% of my connections tonight and don’t plan to add any until I re-evaluate the service in early 2011.
“You have one identity… The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly… Having two identities for yourself is an example of a lack of integrity” – Zuckerberg, 2009
Deprecating Facebook Because I Can No Longer Trust It
Social networks are part of the “mission critical infrastructure” for a firm like ours. We had not been actively using Facebook for marketing or networking; for the most part, I joined out of curiosity and accepted invites, many of which seemed to be generated by folks dumping their e-mail address books into the system, something I have never done on any social network.
Mark Zuckerberg, Facebook’s CEO, lent his name to an opinion column in the Washington Post, “From Facebook, Answering Privacy Concerns Over New Settings,” which outlined the principles under which Facebook operates:
- You have control over how your information is shared.
- We do not share your personal information with people or services you don’t want.
- We do not give advertisers access to your personal information.
- We do not and never will sell any of your information to anyone.
As far as I can tell, none of these are actually measured or enforced. All of them, depending upon your definition of “personal,” seem to be violated routinely.
Zuckerberg was recently interviewed by Walt Mossberg and Kara Swisher at the All Things D D8 conference. He didn’t evidence much candor or willingness to engage substantively on what were, for the most part, softball questions. This indicates to me not just a lack of media training but some fairly serious deficiencies in his understanding of his responsibilities as the CEO of a technology company used by hundreds of millions of people. The fact that he was able to do this also indicates he hasn’t hired anybody who either understands Facebook’s obligations to its users or has the ability to influence the CEO to do the right thing. This makes Facebook dangerous as a supplier of mission critical infrastructure.
Brad Templeton has written two good posts on Facebook and privacy in the last month that I find personally very persuasive. His post “The Peril of the Facebook Anti-Privacy Pattern” makes these observations:
There’s been a well justified storm about Facebook’s recent privacy changes. The EFF has a nice post outlining the changes in privacy policies at Facebook which inspired this popular graphic showing those changes.
But the deeper question is why Facebook wants to do this. The answer, of course, is money, but in particular it’s because the market is assigning a value to revealed data. This force seems to push Facebook, and services like it, into wanting to remove privacy from their users in a steadily rising trend. Social network services often will begin with decent privacy protections, both to avoid scaring users (when gaining users is the only goal) and because they have little motivation to do otherwise. The old world of PC applications tended to have strong privacy protection (by comparison) because data stayed on your own machine. Software that exported it got called “spyware” and tools were created to rout it out.
Users will demand the rich experience, but what they need is a way to assure that sites that want to make use of personal information only ask for, and only get, what they truly need in order to make that experience work. If they don’t need your birthday, or all your friends’ names, they should not get it. And this must happen all the time, not just when you take the time to use a complex privacy console to control what they will be given. This is not something individual users can or will negotiate on a site by site basis. They don’t have the power to negotiate it and the companies don’t have the time. Negotiation requires parties of equal power to get real give and take.
If we don’t solve this, the two forces (market pressure to reduce privacy, and natural monopolies in identity provision) will drive us in a direction we don’t want to go. As I have written before, I believe the only answer is to move social apps back closer to our own computers and away from the cloud, as tempting as the cloud is. Only if the data never leaves our hands will they remain under our control. We need a resurgence of the belief that software that took our data and exported it for inappropriate purposes was spyware. Facebook and its partners are now purveyors of spyware, yet no anti-spyware program is yet ready to delete it from your browser for you. Indeed, the new way the protections work, your friends are offering up information about you when they visit the partner sites, and you have even less control over that.
Facebook argues that their whole service is “opt in” because you have to join it. That’s true to an extent, but ignores the fact that if social apps are going to be useful, we should find a way to do them without the pressure to strip users of all privacy, and not only offer people the choice of living in a glass house or never leaving the house at all.
In “When is ‘Opt Out’ a ‘Cop Out’” Templeton critiques the Zuckerberg column in the Washington Post:
Coming soon, you will be able to opt out of having your basic information defined as “public” and exposed to outside web sites. Facebook has a long pattern of introducing a new feature with major privacy issues, being surprised by a storm of protest, and then offering a fix which helps somewhat, but often leaves things more exposed than they were before.
We’ll only be able to convince web sites to truly protect our rights if we can sit down and negotiate with them. Users can’t negotiate, and privacy control panels create the illusion of negotiating by letting you tweak the terms. But you can only choose among the options they have decided they like. Opt-out control panels may seem like they enable user choice but they can actually harm it. Real choice comes only in being able to put your terms forward in negotiations.
In an “Open Letter to Mark Zuckerberg, Step Down” Shel Israel addresses Zuckerberg directly:
Kara Swisher and Walt Mossberg are topnotch interviewers, but they were also your hosts. They asked tough questions in a nice way. You had to know those questions were coming. On your level, you should have smart people in the back room asking you those same questions.
Mark, watch the above video. You swung and missed at every important question. Often, you just answered different questions than the ones being asked, and Mark, you did yourself and your company no good. I would say you did yourself some damage.
Mark, the tech industry has a long history of young entrepreneurs who were challenged to grow as fast as the companies they had created. Some succeeded and are still at the helms of their corporate ships. Others did not and wisely stepped down to allow firmer hands to guide the ship.
It is time for you to do exactly that, Mark. You will be remembered as a brilliant founder. You will have planted seeds to a mighty tree that will live on.
Danah Boyd has been a perceptive observer of online social networking trends since early in this century (21st, not 20th). Her “Facebook and Radical Transparency (A Rant)” concludes with some excellent suggestions for how Facebook could demonstrate commitment to its stated values:
If Facebook wanted radical transparency, they could communicate to users every single person and entity who can see their content. They could notify them when the content is accessed by a partner. They could show them who all is included in “friends-of-friends” (or at least a number of people). They hide behind lists because people’s abstractions allow them to share more. When people think “friends-of-friends” they don’t think about all of the types of people that their friends might link to; they think of the people that their friends would bring to a dinner party if they were to host it. When they think of everyone, they think of individual people who might have an interest in them, not 3rd party services who want to monetize or redistribute their data. Users have no sense of how their data is being used and Facebook is not radically transparent about what that data is used for. Quite the opposite. Convolution works. It keeps the press out.
The battle that is underway is not a battle over the future of privacy and publicity. It’s a battle over choice and informed consent. It’s unfolding because people are being duped, tricked, coerced, and confused into doing things where they don’t understand the consequences. Facebook keeps saying that it gives users choices, but that is completely unfair. It gives users the illusion of choice and hides the details away from them “for their own good.”
I have no problem with Scoble being as public as he’d like to be. And I do think it’s unfortunate that Facebook never gave him that choice. I’m not that public, but I’m darn close. And I use Twitter and a whole host of other services to be quite visible. The key to addressing this problem is not to say “public or private?” but to ask how we can make certain people are 1) informed; 2) have the right to choose; and 3) are consenting without being deceived. I’d be a whole lot less pissed off if people had had to opt in in December. Or if they could have retained the right to keep their friends lists, affiliations, interests, likes, and other content as private as they had when they first opted into Facebook. Slowly disintegrating the social context without choice isn’t consent; it’s trickery.
Jeff Jarvis has some interesting observations in “Confusing A Public With The Public”:
I will argue that we face choices today about keeping something private or sharing it with our public or with the public at large and that we need to see the benefits of sharing—the benefits of being public—as we make that calculation. I will argue that if we default to private, we risk losing the value of the connections we can make today. I will argue that we need institutions—companies and governments—to default to public. And I will argue that the more we live in public, the more we share, the more we create collective wisdom and value. I will defend publicness. But I will also defend privacy—that is, control over this decision.
And if you don’t think deprecating Facebook by not using your account is enough, WikiHow explains how to “Permanently Delete a Facebook Account.” I am keeping it on file.