Google Prevails In Privacy Case Before Third Circuit - But Court’s Decision May Leave Door Open For Future Video Privacy Suits – On June 27, the U.S. Court of Appeals for the Third Circuit, in a precedential opinion, rejected allegations that Google and Viacom violated the Video Privacy Protection Act (“VPPA”) and federal and state wiretapping statutes by tracking children’s Internet activity. The decision is one of the first rulings under the VPPA to decline to stretch the definition of personally identifiable information (“PII”) to include attenuated digital identifiers, such as IP addresses, as other courts have done. However, the appeals court’s refusal to establish a clear rule for what constitutes PII may leave the door open for future challenges to companies’ data collection practices.
In In re: Nickelodeon Consumer Privacy Litigation, a multidistrict consolidated class action, the plaintiffs were children younger than 13 who alleged that the defendants, Google and Viacom, unlawfully collected personal information about them on the Internet, including what webpages they visited and what videos they watched on Viacom’s websites. The court found that plaintiffs’ claims overlapped substantially with claims the court rejected in November 2015 in In re: Google. However, the court concluded that two of the plaintiffs’ claims – one for violation of the federal VPPA, and one for invasion of privacy under New Jersey law – raised questions of first impression for the court.
The VPPA, passed by Congress in 1988, prohibits the disclosure of PII relating to viewers’ consumption of video-related services. Interpreting the Act for the first time, the Third Circuit first held that the law permits plaintiffs to sue only a person who discloses such information, not a person who receives such information. The court then held that the VPPA’s prohibition on the disclosure of PII applies only to the kind of information that would readily permit an ordinary person to identify a specific individual’s video-watching behavior. According to the court, the information that Viacom allegedly provided Google – which included links for viewed videos and “static digital identifiers” such as IP addresses and unique device identifiers – could not be considered PII under the VPPA.
In reaching this decision, the panel noted that the 1988 statute is “not well drafted” and stressed that its interpretation of the term PII was intended to “articulate a more general framework” rather than establish a “single-sentence holding capable of mechanistically deciding future cases.” While the panel’s reading of the statute foreclosed the plaintiffs’ VPPA claims in the case before it, the court’s decision not to define precisely which types of disclosures run afoul of the VPPA may leave the door open for future plaintiffs to allege that other methods companies use to maximize the value of data can in fact be linked to a specific user and thus constitute PII under the statute.
Reporter, Drew Crawford, Washington, DC, +1 202 626 5512, email@example.com.
ACLU Suing To Limit The Scope Of The Computer Fraud And Abuse Act And Promote Research Of Online Discrimination – On June 29, 2016, the American Civil Liberties Union (“ACLU”) filed a lawsuit against Loretta Lynch in her official capacity as United States Attorney General, challenging the constitutionality of a provision of the Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, et seq. The case is captioned Sandvig v. Lynch, Civ. No. 1:16-cv-01368 (D.D.C. June 29, 2016). The lawsuit is brought on behalf of several academics and a media organization, who argue that courts’ current interpretation of the CFAA prohibits and chills their research efforts to identify discrimination on the Internet.
As alleged by the ACLU in Sandvig, with the rise of big data analytics, more and more commercial websites are relying on confidential and proprietary algorithms to collect and track people’s data, cull the data, and use the results to determine access to services for their customers. However, because the process of big data analytics is typically considered proprietary and confidential, Plaintiffs fear that the websites are secretly taking into consideration information that leads to discrimination on the basis of race, gender, or other protected characteristics. In order to test their theories, Plaintiffs want to conduct audit testing and investigations of online discrimination. Audit testing is the practice of pairing individuals of different races who pose as potential users of a website in order to identify whether they are treated differently. This practice has historically been used to identify potential civil rights abuses in the employment and housing contexts.
As a result, the ACLU, on behalf of the Plaintiffs, has brought claims alleging that the current interpretation of the CFAA is (1) a violation of the First Amendment as an unconstitutional restriction on Plaintiffs’ protected speech activities; (2) a violation of the First Amendment right to petition the government for redress of grievances regarding violations of the Fair Housing Act, Title VII, and other civil rights laws; (3) a violation of the Fifth Amendment’s due process clause because the statute is vague; and (4) an unconstitutional delegation of law-making authority to website owners. The ACLU is seeking declaratory relief, asking the court to declare that the CFAA on its face violates the United States Constitution as alleged above. The ACLU also seeks injunctive relief to stop the Attorney General’s office from enforcing Section 1030(a)(2)(C) of the CFAA.
The concerns raised by the Plaintiffs and the ACLU in Sandvig regarding the potential discriminatory use of big data appear to be shared by the Federal Trade Commission (“FTC”). On January 6, 2016, in advance of the FTC’s PrivacyCon conference, the FTC released a Commission Report entitled, Big Data: A Tool for Inclusion or Exclusion? The Report was the result of an FTC public workshop and details the FTC’s concerns with what it characterizes as the “era of big data.” The Report focused on how the commercial use of big data can impact low-income and underserved populations. While it details the potentially positive impacts that these analytics can have, such as advancements in medicine, education, and health, it also outlines the many ways in which big data can be used to perpetuate existing disparities, such as by making assumptions that deny individuals access to opportunities like credit based on analysis of non-traditional credit information (e.g., analysis of online shopping histories). Accordingly, the FTC cautioned companies that collect, store, sell, and use big data to be mindful of how their activities could result in discriminatory actions that could subject them to liability.
Reporter, Julie A. Stockton, San Francisco and Silicon Valley, +1 650 422 6818, firstname.lastname@example.org.
The Impact Of Brexit On Implementation Of The GDPR In The UK – As many of you will have read in the international press, the British public has voted in a historic referendum to leave the European Union. The decision has wide-ranging implications for many areas of law, the economy, and government. These challenges must be worked through, and so far there is a significant lack of clarity on many of the implications of the vote. King & Spalding has already updated its clients in general terms on the potential impact of a British exit (“Brexit”) via the Firm’s client alert on Brexit. In the meantime, what is the likely status of the General Data Protection Regulation (GDPR) as a result of Brexit? This article provides a high-level summary and should be read in conjunction with our broader Brexit client alert.
The GDPR is an EU Regulation. Regulations rely on the principle of direct effect, which means that they apply directly in the law of each member state, without the need for domestic implementing legislation, and have immediate effect. The GDPR is due to apply from May 2018 and will have direct effect in each EU member state.
The UK has now voted to leave the EU. The exit will take some time to achieve; the likelihood is that it will happen in mid-to-late 2018. It is currently very unclear what form Brexit may take, so we cannot make firm predictions, particularly about specific pieces of legislation. It is unclear whether the Regulation will be implemented in the UK at all, as Brexit may occur in the same year as the anticipated application of the GDPR, and the Regulation can only have direct effect in EU member states.
We anticipate that the UK Parliament will consider the implementation of any pending legislation in advance of an actual Brexit. Parliament may decide to implement the GDPR by way of an independent UK statutory instrument, which could mean that the GDPR is adopted in its current form in the UK. Alternatively, a modified version of the GDPR may be adopted in the UK – potentially the more likely scenario, as the UK has traditionally, where able to do so, declined to implement the most stringent versions of European law.
So far there is little clarity around what kind of future relationship the UK will maintain with the European Union or how that relationship will be implemented and regulated. Unsurprisingly, there is currently no specific comment on the future of the GDPR in the UK. For the time being, the UK maintains its current data protection regime, having implemented its own domestic legislation compliant with the existing EU Data Protection Directive as applicable across the European Union. King & Spalding will keep our readers up to date as we learn of developments which affect our DPS practice.
Reporter, Kim Roberts, London, UK, +44 20 7551 2133, email@example.com.