Privacy Mirror: A privacy experiment in Facebook

As I previously blogged, I read Canada’s Assistant Privacy Commissioner Elizabeth Denham’s findings on Facebook, and it got me thinking about 3rd party applications. I wondered what 3rd party app developers could see in my profile. In my estimation, the easiest way to find out what a 3rd party application developer could see was to become a 3rd party application developer. Enter Privacy Mirror. I built a basic Facebook application called Privacy Mirror. The goal of Privacy Mirror was to see, as a 3rd party developer, just what information I could glean from my profile via Facebook’s APIs. At first, I used two Facebook API calls:

Laplace’s Demon, Santa Claus and TSA’s Secure Flight

No doubt you frequent fliers out there have received emails from your airline of choice talking about TSA’s Secure Flight. As you make air travel reservations in the future, your airline will communicate with TSA to get, essentially, a fly/no-fly decision from the Secure Flight system. As the TSA explains in the “How it works” section of their website dedicated to Secure Flight: Secure Flight matches the name, date of birth and gender information for each passenger against government watch lists to:

Personal Privacy Impact Assessments for Facebook

I’m reading Canada’s Assistant Privacy Commissioner Elizabeth Denham’s recently released findings into complaints levied against Facebook. (Report of Findings into the Complaint Filed by the Canadian Internet Policy and Public Interest Clinic (CIPPIC) against Facebook Inc. Under the Personal Information Protection and Electronic Documents Act.) My first reaction to this is, frankly, one of jealousy. I wish we had a similar commissioner/czar/wonk here in the US. I suppose elements of the FTC work in this regard, but without the same charter, which is too bad.

Privacy Risks Get Real — California Privacy Laws, Octomom, and Kaiser Permanente

No organization wants to be the first to be fined because of a new regulation. Unfortunately, that’s exactly where Kaiser Permanente finds itself. After some high profile cases of unauthorized access to celebrities’ medical records, the California legislature adopted two new privacy laws (SB 541 and AB 211); these regulations were so swiftly enacted that they contained spelling errors. Both regulations went into effect on January 1 of this year. Five months later, Kaiser Permanente has become the first enterprise to be fined under this new regime. Regulators have levied the maximum fine, $250,000, for the recent incident involving Nadya “Octomom” Suleman. (Kevin commented on this previously.) All in all, 23 individuals looked at Ms. Suleman’s records without authorization. Of these, 15 have either been fired or resigned. And although the state regulators have fined Kaiser, they have yet to penalize any of these 23 individuals - which they can do under state law. As reported in the LA Times, Suleman’s lawyer said:

Two Bonuses for Privacy Professionals

There are plenty of reasons to come to Catalyst. Engaging workshops, great sessions, interesting speakers, the chance to see the entire Identity and Privacy Strategies team on stage with bags on their heads - you know, the kinds of things you’d expect. For those of you with a Certified Information Privacy Professional (CIPP) certification, this year we’ve got a little something extra for you – continuing education credits. By attending IdPS’ Privacy Risks Get Real track, you’ll earn 3.5 hours of continuing privacy education (CPE) credit. Attend SRMS’ Risk Management: Programs You Can’t Afford to Cut and receive another 3.5 hours of credit.

The beginning of the beginning: our privacy report publishes

Over the last 6 or so months, Bob Blakley and I have been doing a lot of listening and thinking about privacy. To successfully re-launch our privacy coverage, we needed to lay a wide foundation that would serve to support future research. We needed to provide a meaningful starting point for our customers. Since our customers’ jobs are not typically focused on privacy, we needed to start with a form of first principles and build outward. I’ve learned that it is generally frowned upon to use the second person in our reports – too informal, I am told. Use the blog if you want to address the audience directly. Normally, I don’t have a problem avoiding the second person, but this report proved to be a challenge. We had to work hard to write without using “you.” And why was that? Privacy discussions are and must be inclusive. They involve each of us on a far more personal level than a discussion of, say, account lifecycle management. Whether or not you are cognizant of the privacy implications, the decisions you make on a daily basis affect the privacy of your customers and partners. Because privacy is personal, because it requires concerted behavior throughout the enterprise, discussions about privacy must include everyone. You. Me. Everyone. To guide concerted behavior, in our recently released privacy report, we put forth a Golden Rule as a means of developing and evaluating privacy principles leading to practices and behaviors:

Privacy risks get real

When you think of “the usual” privacy risks you think of things like brand and reputation damage, fines, and increased regulations. You don’t think of jail time for executives. But jail time is exactly what some Google executives face if an Italian prosecutor has his way. The arrest of Peter Fleischer, Google’s Paris-based Global Privacy Counsel, in Milan on January 23 stems from video that was briefly available on Google’s site in Italy. The video showed high school students bullying a classmate with Down Syndrome. Google took down the video in less than 24 hours after receiving complaints about it. The view of Milan’s public prosecutor is that permitting posting of the video for any period of time was a criminal offense. Fleischer and three other Google employees have been charged with defamation and failure to control personal information. In our forthcoming report, Bob and I explore the contextual nature of privacy. Google clearly operates in multiple geographic and legal contexts. In the US, Google enjoys protections similar to those afforded “common carriers”. However, in Italy, Google is being treated as a content provider and not a content distributor, and thus is not receiving any such protection. The contextuality of privacy requires that you evaluate your business from all relevant contexts. In this case, Google may find that it should have looked at its video services from the perspective of an Italian user as well as an Italian regulator. This examination from all relevant contexts would highlight not only conflicts between contexts (someone’s desire to publish a video versus a state’s definition of what constitutes offensive or inappropriate content) but also conflicts between contexts and the organization’s business model. Google’s business of allowing anyone to post a video is in this case colliding with an Italian regulator’s desire to treat Google as a content provider, holding Google to an unanticipated set of requirements. 
There’s no way that a small privacy team will be able to know everything about every context the company does business in. As a result, a side effect of doing business in multiple contexts can be a budgetary one. Organizations may need to budget for external legal counsel - counsel that specializes in privacy for the contexts they are working in - to aid privacy teams in their evaluation of relevant contexts. We don’t expect criminal penalties for privacy violations to become common, and it’s not at all clear that the action against Google’s executives will be sustained by the Italian courts. That being said, we do expect privacy regulations to become stricter and subsequent penalties to become more severe. Privacy risks are getting real. Join us at Catalyst this summer and learn how to adapt, and thrive, in the face of this new reality. (Cross-posted from Burton Group’s Identity Blog.)

Putting privacy controls in the hands of your users

I mentioned yesterday that Bob and I have just finished up some research on privacy. In this upcoming report, we stress the importance of establishing privacy principles and then using those principles to guide privacy practices. I happened to see this NY Times article (via Nishant’s Twitter stream) and had a bit of a Baader-Meinhof moment. The article talks about how social networking sites are giving their end-users more and more control over how information is disclosed. Giving users choice as to how their information is disclosed and used is important. Giving users meaningful choice as to how their information is used is much better.

International Privacy Day: Synchronicity

Today is International Privacy Day (and also National Data Privacy Day here in the USA and maybe where you are too). The day is set aside to celebrate the anniversary of the Council of Europe Convention on Data Protection. Put on your reading list for today both the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data as well as the Organisation for Economic Co-operation and Development’s Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.

Stripping Search

In response to regulatory pressure and to apply some pressure on their competition, Yahoo has announced that after 90 days it will anonymize search queries and remove personally identifiable information (PII) from them as well. Specifically, Yahoo will delete the last eight bits from the IP address associated with a search. Further, Yahoo will remove some PII data, like names, phone numbers, and Social Security numbers, from the searches. The goal is to (eventually) destroy the ties between a person and what that person searches for, which could include embarrassing, compromising, or sensitive items such as information about medical conditions, political opposition materials, adult entertainment, etc. There are two points I want to draw your attention to. The first point is related to the amount of time search providers, like Yahoo, hold identifiable search queries. Regulators have recommended that search vendors reduce how long they hold identifiable searches. The EU has recommended 6 months, for example. Yahoo has taken a laudable step, reducing its retention time from 13 months to 90 days. In the future, the time it takes a search provider to extract whatever goodness it wants to out of a search query (to feed its varied businesses) and anonymize that query will reach zero. External pressures aside, the Googles and Yahoos of the world will achieve near-instantaneous goodness-extraction/anonymization of search queries simply because it reduces what they have to store, maintain, and worry about. That being said, even though search providers will be able to achieve near-instantaneous extraction and anonymization, they will never be able to put it into practice. Why? Because there will always be a desire on the part of law enforcement to gain access to those identifiable searches. The second point relates to the methods and outputs of the anonymization process. 
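To make the first point's PII removal concrete, here is a minimal sketch of what scrubbing phone numbers and Social Security numbers from a logged query might look like. The patterns and placeholder are my own illustration, not Yahoo's actual implementation, which has never been published in this detail.

```python
import re

# Illustrative patterns only - not Yahoo's real scrubbing rules.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def scrub_query(query: str) -> str:
    """Replace SSN-like and phone-number-like tokens with a placeholder
    before the query is retained."""
    query = SSN_RE.sub("[REDACTED]", query)   # SSNs first: more specific pattern
    query = PHONE_RE.sub("[REDACTED]", query)
    return query

# scrub_query("call 555-867-5309 re 078-05-1120")
# -> "call [REDACTED] re [REDACTED]"
```

Real-world scrubbing is much harder than this sketch suggests - names, addresses, and free-text identifiers don't follow neat patterns - which is part of why the transparency argument in the second point matters.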
The industry needs to provide greater transparency in their anonymization methods to ensure that the scrubbed queries are truly anonymous. Consider AOL Stalker. AOL thought it had scrubbed its searches, but in reality those searches were fairly trivial to de-anonymize. Removing the last 8 bits of the IP address, as Yahoo and Google are doing, certainly helps to anonymize a search, but it does not do so completely. In fact, all removing the last 8 bits does is render my IP address indistinguishable from the 255 other IP addresses that differ only in those last 8 bits - hardly anonymized. Once a search provider extracts what it needs to from a search, I question why it has to retain any IP information at all. I applaud Yahoo’s announcement; decreasing retention time is a good thing. But I’d like to ask two more things of the search providers. First, work together, in a transparent manner, to ensure that the anonymization methods produce truly anonymous search query data. Make sure that when you strip a search of PII of all sorts, including IP address, it cannot be transformed back into an identifiable search. Second, work with browser makers to create an anonymous search mode. Akin to the private browsing mode of better browsers everywhere, an anonymous search mode would indicate to the search provider that the search being submitted from the browser must be anonymized immediately. With announcements like Yahoo’s, 2009 may shape up to be a great year for privacy. (Cross-posted from Burton Group’s Identity Blog.)
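The "last 8 bits" technique above is easy to sketch, and the sketch makes the weakness obvious: zeroing the final octet of an IPv4 address leaves the query hidden among only the 256 addresses of one /24 block. This is my own illustration of the general technique, not the providers' actual code.

```python
import ipaddress

def drop_last_octet(ip: str) -> str:
    """Zero the last eight bits of an IPv4 address.

    The result is shared by all 256 addresses in the same /24 block,
    so a query is only hidden within that small crowd - it is not
    truly anonymous."""
    addr = ipaddress.IPv4Address(ip)
    masked = ipaddress.IPv4Address(int(addr) & 0xFFFFFF00)
    return str(masked)

# drop_last_octet("203.0.113.87") -> "203.0.113.0"
```

Note that a /24 block often maps to a single ISP, neighborhood, or even one organization, so combining the truncated address with query content (as in the AOL case) can still narrow a search down to an individual.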