
Last Thursday Tim O’Reilly posted a contrarian view on the privacy flap over Facebook’s new face recognition feature, which auto-suggests tags for photos. As Tim put it:

Face recognition is here to stay. My question is whether to pretend that it doesn’t exist, and leave its use to government agencies, repressive regimes, marketing data mining firms, insurance companies, and other monolithic entities, or whether to come to grips with it as a society by making it commonplace and useful, figuring out the downsides, and regulating those downsides.

Tim makes it clear that this is just an example of the overall stance he advocates with regard to cutting “the Gordian Knot on this thorny privacy problem”:

This is part of my general thinking about privacy. We need to move away from a Maginot-line like approach where we try to put up walls to keep information from leaking out, and instead assume that most things that used to be private are now knowable via various forms of data mining. Once we do that, we start to engage in a question of what uses are permitted, and what uses are not.

He even suggests a specific metaphor for thinking about privacy this way:

Overall, I think our privacy regimes need to move to a model similar to that applied to insider trading. It’s not possession of secret information that is criminalized; it is misuse of that information to take advantage of the ignorance of others.

Tim’s insider trading analogy illustrates why the heart of privacy is actually trust: company insiders have access to insider information precisely because they are the ones entrusted with that information to run the company. So the criminal act is not possessing the information; it is violating that trust by using the information for personal gain instead of for the company’s best interest.

Companies that aggregate and sell personal data without the knowledge or consent of the individuals it describes are doing the same thing: violating those individuals’ trust by using the data for the company’s gain instead of for their best interest.

That, in a nutshell, is the purpose of the Respect Trust Framework that Connect.Me announced last month (and which received the Privacy Award at the European Identity Conference). It lets us move beyond site-specific privacy policies, which are all but impossible for individuals to comprehend, to a simpler and more general set of social expectations about control of personal data. These expectations, captured in the five core principles of the Respect Trust Framework, make it in every company’s best interest to follow “the golden rule of data”: treat a customer’s data the same way you would want your own data to be treated.

Connect.Me agrees with Tim’s fundamental point: “Let’s stop trying to chase the personal data horse back into the barn — that’s impossible. Instead let’s start building fences governing where the horse can and can’t go.” The Respect Trust Framework stakes out those fences.