The Federal Trade Commission intervened this week to stop Facebook from doing things like making private user information public… back in 2009. Besides being years too late to impact key product decisions, the proposed settlement doesn’t go far in controlling what Facebook can do in the future.
Which is a good thing.
The company has willfully changed its product countless times over the years, ignoring its users, the developers on its platform, its advertisers, privacy advocates, politicians, etc. Facebook protest pages (um), scathing articles, and petitions to quit have pressured Facebook to revert whatever it was trying to do.
But they’ve all been forgotten, except for stragglers like the FTC case. What remains for most people is a vague sense that Facebook has “privacy issues.” But no one can agree on what exactly they are, and no one really seems to mind, anyway.
Anyway, the site has 800 million users worldwide now, half of whom log in every day. It has gotten most of these users in the past few years precisely because it has gotten more open.
Would the FTC have prevented Facebook from being such an important part of so many people’s lives if it had intervened sooner? It actually could have, when you look at Facebook’s evolution and its results. One of the single biggest issues that the FTC had, for example, was that Facebook had forced friend lists to become public in December of 2009. But those friend lists have made it much easier for all of us to do things like see what our friends are reading when we visit news sites.
The way Facebook has succeeded with users is that it has changed the product based on how it has seen people using it, or how it thinks they’d want to use it. By convincing a closed community of college users to share all of their intimate details, then slowly making key parts of its site more public, it has been able to get all of its users sharing their identities online over the years.
The problem with that plan was that Facebook didn’t always know what it could be or how it could get there. Friend lists, for example, started out private because Facebook didn’t imagine itself becoming the social layer of the internet.
So, until November 2009, Facebook told users that “You choose what information you put in your profile, including contact and personal information, pictures, interests and groups you join. And you control the users with whom you share that information through the privacy settings on the Privacy page.”
Along with an interface that allowed users to keep any piece of information about themselves private, those terms led users who cared to reasonably assume that Facebook would not force them to make information they shared public.
But then it changed the terms that November, and in December required everyone to make their profile name, profile picture, list of friends, current city, gender, networks, and fan pages public. It offered some ways to partially obscure this information, like making it so that your profile didn’t appear in searches, but the main result was still lots of private information becoming public. Facebook fundamentally deceived users in this case, even if it did so for what it believed were the right reasons.
A massive outcry followed, focusing on all the horrors that might come (but never really materialized). That triggered what became a key issue in the FTC investigation and, eventually, a main change in the settlement this week.
Facebook now agrees to make all privacy changes to existing products opt-in: “giving consumers clear and prominent notice and obtaining consumers’ express consent before their information is shared beyond the privacy settings they have established,” in the words of the FTC. It can still launch new products with opt-out privacy settings, Facebook and the FTC clarified to us earlier this week, but it’s hard to think of anything the company might do under those terms that would be very scary for most users.
In the meantime — the past two years between the product change and the settlement — Facebook has pushed all sorts of new features that take direct advantage of all this public information.
The change around friend lists and that data, in particular, has allowed it to launch the graph API, and the Like buttons and other social plugins that populate the web today. By letting your friends share the fact that they’re friends with you when they join other services, Facebook can make it easier for them to share what they do with you. The Like button requires just a click to let you share something with friends.
It’s hard to know exactly how things would have gone differently if Facebook hadn’t made the changes when it did. Maybe people who keep friend lists private wouldn’t be able to receive shared items from friends? Or maybe everyone who wants to share anything would have to go through some sort of extra sharing approval window?
More broadly, the types of changes the FTC has been pushing could have blocked broader background uses that we take for granted. For example, Facebook now takes the place of a phone book. You can look up anyone you’ve ever met in the world, and you’ll be able to see enough information about them to figure out whether you’d like to be friends.
If Facebook hadn’t experimented, or if it had listened to criticism and pulled back, or if it had been prevented from making any changes by the FTC or other government bodies, every Facebook user might be a little less informed and more bored today.
The FTC has fair points. Yes, Facebook shouldn’t have misled users about its intentions at some points in the past. No, it shouldn’t have made security errors that at times have let some private data out. Yes, it should try to provide more clarity to users when new changes happen. No, Facebook still shouldn’t have let deleted content be accessible. (Yes, its efforts to explain its changes over the years make for a hilarious read.)
But Facebook was right to try to change its product to become somewhat more public. The world has been slowly getting more comfortable with coming online over the last couple of decades. Like online payments before it, social data sharing has turned out to be something that people are fundamentally okay with doing. If Facebook was going to become a social layer that made every product it touched more valuable, it needed to make some of that social sharing public as well.
Doing something new and different is inherently full of mistakes. There will always be critics. But great results make everything else fade into the background.
So thank you, Federal Trade Commission, thank you for taking nearly two years to tell Facebook how to handle privacy issues that were already over and done with. And thank you for not getting in the way of whatever Facebook is planning to launch next.
Source: http://techcrunch.com/2011/12/04/thankyouftc/