The WSJ ran a great article on issues with FB's current privacy position. It seems FB positions itself as a repair mechanic, not as a professional architect, when it works on privacy controls.
The newspaper story opened with an example of involuntary disclosure of sexuality: when a teenager joined a chorus's FB group, her parents were informed of her sexuality via FB. The reporter, Geoffrey A. Fowler, then explained some inevitable changes to privacy: "For much of human history, personal information spread slowly, person-to-person if at all."; "Personal worlds that previously could be partitioned—work, family, friendships, matters of sexuality—become harder to keep apart."; "Facebook is committed to the principle of one identity for its users."; "increasing privacy settings may actually produce what they call an 'illusion of control' for social-network users."
After reading this article, I noticed that although FB is responsive in fixing technical issues, they never discuss how they design and verify privacy BEFORE launch. Millions of FB users test the system for FB, for free: the largest software test I have ever known. FB improves its system only after its users have already suffered the system's misbehavior.
Privacy settings affect every user, and FB should design each new function or each disclosure with a systematic impact analysis. There should be a clear document listing how each activity is displayed to friends and to the public, and FB should notify the user community of the impact a new system feature will have on such disclosure.
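To make the idea concrete, here is a minimal sketch of what such a "disclosure document" could look like if it were expressed as testable data rather than prose. Everything here is hypothetical: the names (`Visibility`, `DISCLOSURE_MATRIX`, `audience_for`) are invented for illustration and have nothing to do with FB's real internals. The point is that an unknown activity fails loudly, forcing an impact analysis before a feature ships.

```python
# Hypothetical sketch of a pre-launch privacy impact check.
# All names and values below are invented for illustration only.
from enum import Enum

class Visibility(Enum):
    ONLY_ME = 1
    FRIENDS = 2
    PUBLIC = 3

# The "clear document" from the post, expressed as data:
# which audience sees each activity type by default.
DISCLOSURE_MATRIX = {
    "status_update": Visibility.FRIENDS,
    "profile_photo_change": Visibility.PUBLIC,
    "group_join": Visibility.FRIENDS,  # the activity from the WSJ story
}

def audience_for(activity: str) -> Visibility:
    """Look up the declared audience for an activity.

    Unknown activities raise, so a new feature cannot launch
    without first being added to the disclosure document.
    """
    if activity not in DISCLOSURE_MATRIX:
        raise ValueError(
            f"No disclosure entry for {activity!r}: "
            "run an impact analysis before launch"
        )
    return DISCLOSURE_MATRIX[activity]

def check_against_user_max(user_max: Visibility) -> list:
    """Return activities that are MORE visible than the user's
    chosen maximum, i.e. candidates for involuntary disclosure."""
    return [
        activity
        for activity, audience in DISCLOSURE_MATRIX.items()
        if audience.value > user_max.value
    ]
```

With a table like this, the kind of check that could have flagged the chorus-group story becomes a one-liner: `check_against_user_max(Visibility.FRIENDS)` lists every activity that leaks beyond a friends-only setting, before any real user is exposed.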
The idea that we let FB continuously patch its system after the fact scares me. Privacy should start with impact analysis and robust testing before things go wrong.