This past summer, the University of Amsterdam launched a new, week-long Privacy Law and Policy Summer Course focused on the Internet, electronic communications, and online and social media. Course faculty included European and U.S. academics, European regulators and the head of the global privacy law practice at an international law firm, among others. Course participants consisted of 25 legal practitioners and post-graduate researchers from the Netherlands, Spain, Italy, Slovakia, the United States, Japan, Brazil, Kenya and other countries. I was lucky enough to serve as a co-organizer and faculty member for the course.
Taken together, the nine mini-seminars that constituted the backbone of the course provided a snapshot of developments in privacy law and policy in Europe and in the United States, and of how the two relate to one another. This should be of interest to U.S. lawyers and others who work in the areas of privacy law, compliance and management. What follows is a brief description of some key takeaways from the week, and an attempt to pull them together into a broader perspective.
Doing business over the Internet
Daniel Cooper, head of the Global Privacy Practice at Covington & Burling, discussed emerging legal and policy challenges facing European companies that seek to do business over the Internet. Cooper’s comprehensive presentation stressed that companies face a wide array of legal issues, including privacy questions raised by online behavioral advertising, business use of social media, facial recognition technology, mobile apps, and big data. The 1995 Data Protection Directive pre-dates these technological developments and fits awkwardly with them.
For example, facial recognition technology currently allows social media providers to translate a user’s profile picture into a template, then employ that template to suggest “tags” when others upload photos of that user. Under the directive, both an image that clearly identifies a person’s face and a template that contains the distinctive measurements of that face likely qualify as personal data. This means that a social media provider must obtain consent before enrolling an individual’s facial template into an identification database.
In 2011, the Hamburg (Germany) DPA ruled that Facebook should have obtained opt-in consent for its facial recognition function, rather than providing users with the opportunity to opt-out by changing their privacy settings. The Hamburg DPA ordered Facebook to delete its entire database of faces collected in Germany. This is but one example of the potential conflicts between existing data protection law and new technologies and business applications.
The status of data protection in Europe
Jan Philipp Albrecht, a German member of the European Parliament who serves as the rapporteur for the proposed European General Data Protection Regulation; Giovanni Buttarelli, the assistant European data protection supervisor; and Joris van Hoboken of the University of Amsterdam’s Institute for Information Law (IViR) each discussed the current status of the proposed European Data Protection Regulation (as of July 2013, when the course took place). Together, their presentations indicated that the regulation was progressing well and would likely become law. Unlike the 1995 EU Data Protection Directive, the regulation will not be implemented through national law. Instead, it will apply directly to regulated parties. The regulation will, accordingly, harmonize data protection law throughout Europe.
Dutch authority’s enforcement efforts
Sjoera Nas, Internet and Telecom Expert for the Dutch Data Protection Authority and the Dutch DPA’s representative to the Article 29 Working Party’s Technology Subgroup, discussed how European DPAs enforce privacy and data protection rules related to the Internet. Nas explained that the Dutch DPA focuses its enforcement efforts on violations that are serious, structural, and/or affect a large number of people. For example, the Dutch DPA recently concluded an investigation of the mobile messaging application WhatsApp for its collection, without permission, of the phone numbers contained in its users’ address books.
Do Not Track Initiative focuses on negotiation and self-regulation
Peter Swire, the Nancy J. and Lawrence P. Huang professor in law and ethics at Georgia Tech’s Scheller College of Business, was, at the time of the course, serving as the co-chair of the World Wide Web Consortium’s (W3C) Do Not Track initiative. Swire explained the genesis of the DNT process and how it reflects the United States’ interest in multi-stakeholder negotiation and privacy self-regulation. He engaged the participants in a mock negotiation that simulated the discussion among parties engaged in the Do Not Track initiative.
The Internet of the future
Ian Brown, senior research fellow at the Oxford Internet Institute (UK); and Mireille Hildebrandt, professor of smart environments, data protection and the rule of law at Radboud University (The Netherlands) each spoke about the technologies of the future and the challenges they will pose for privacy law and policy. Brown described several different ways in which the future Internet could evolve.
Possibilities include optimistic scenarios in which Internet access is democratized, individuals gain increasing access to online education and services, the spread of Internet-connected appliances and objects (the so-called “Internet of things”) produces a more energy-efficient and productive economy, and government policies emphasize privacy and consumer protection and so increase user trust. They also include more pessimistic scenarios in which mergers among ISPs, search engines, social media networks and entertainment conglomerates result in a fragmented, sanitized Internet in which users are intensively profiled and subtly controlled.
The policy choices we make today will influence which of these scenarios more accurately describes the Internet of the future.

Professor Hildebrandt analyzed the relationship between big data, the smart grid and data protection law. She identified some of big data’s troubling implications for privacy and individual rights, including the tendency to subject individuals to automated and invisible decision-making based on their user profiles; the potential for discrimination; and the lessening of the presumption of innocence. Professor Hildebrandt then explained how the smart grid generates data that can facilitate these very tendencies. She showed how existing data protection law does not sufficiently address the issues that big data and the smart grid raise, and stressed the importance of data protection by design.
Privacy by design and security by design
Four presenters with expertise in privacy by design — Seda Gürses of Catholic University in Leuven (Belgium), Jaap-Henk Hoepman of Radboud University (The Netherlands), Kristina Irion of Central European University (Hungary), and Jeroen Terstegge of Privasense (The Netherlands) — offered insights into where this critical field stands today, the opportunities it presents and the challenges it faces. Among other things, these speakers identified the importance of addressing privacy during the technology development stage, of considering privacy by design and security by design together, and of maintaining privacy by design’s flexibility so that it can address many different types of issues.
Several themes emerge from the course as a whole. The European Union remains committed to its comprehensive approach to privacy law. The General Data Protection Regulation will harmonize privacy regulation throughout the EU. By contrast, the W3C Do Not Track initiative demonstrates the United States’ continuing reliance on self-regulation as an important aspect of privacy governance. Emerging technologies, including the smart grid, the Internet of things and the growing field of data analytics, will generate new and profound privacy issues in the years to come. Society has important choices to make about whether, and how, it will seek to influence these developments — and what it wants for the Internet of the future. Privacy by design can play an important role in meeting these challenges.