
August 28, 2011

Article 29 Working Party Publishes Letter Criticizing the Proposed Online Behavioral Advertising Self-Regulatory Framework

Earlier this week, the Article 29 Working Party published a letter it sent to the Interactive Advertising Bureau Europe (IAB Europe) and the European Advertising Standards Alliance (EASA) regarding their proposed self-regulatory framework for online behavioral advertising (OBA), which is intended to satisfy the EU’s ePrivacy Directive. The letter was sent in advance of a meeting between the Working Party and the OBA industry scheduled for September, to inform the industry of the Working Party’s main concerns with the proposed framework.

The Article 29 Working Party expressed several concerns with the current OBA industry proposals:

• The absence of action by a consumer cannot be presumed to indicate consent. Thus, the proposed opt-out scheme, which would allow consumers to be tracked before they exercise an objection, does not comply with the requirement set forth in Article 2(h) of Directive 95/46/EC that consent be “freely given, specific and informed.”

• The OBA industry’s concern that consumers’ online experiences will be negatively affected by having to continually agree to ad network cookies in order to view content is overstated. The Working Party indicated that once consent has been given to an ad network, the ad network will not have to seek consent again. The Working Party also suggested that the OBA industry could create a centralized method for consumers to provide consent to multiple ad networks.

• The proposed browser methods for objecting to cookies are insufficient because they all currently default to accepting cookies. For a browser to provide valid consent, it must reject third-party cookies by default and require consumers to “engage in an affirmative action to accept cookies from specific websites for a specific purpose.”

• The proposed icon on online advertisements is insufficient because (i) the icon currently does not mean anything to average internet users, and (ii) it does not make the required information regarding tracking and profiling directly available (the Working Party noted that once users click on the icon, they will still need to click at least two more times to obtain the additional information and be able to opt out).

The Working Party’s letter also attached a letter from David Vladeck, the Director of the FTC Bureau of Consumer Protection, responding to the Working Party’s request for information regarding the FTC’s position on transparency and consumer choice in connection with OBA.  Mr. Vladeck noted the guidance the FTC provided regarding OBA, including the 2009 FTC Staff Report, Self-Regulatory Principles for Online Behavioral Advertising, and the 2010 Preliminary Staff Report, Protecting Consumer Privacy in an Era of Rapid Change. Notably, the letter recognized the recent efforts of the OBA industry by stating “[s]ince the issuance of these reports, industry has taken a number of steps to improve transparency and consumer choice in the context of behavioral advertising.” Mr. Vladeck also summarized the FTC’s recent guidance regarding what it considers to be essential components of a successful Do Not Track mechanism: any mechanism should be easy to use, effective, universal and persistent.

August 24, 2011

Class Action Filed Against comScore Over Alleged Privacy Violations

A putative class action was filed yesterday (8/23/11) against comScore, Inc., an internet research and analytics company. The plaintiffs allege that comScore violated federal law and the Illinois mini-FTC Act by collecting personal information from consumers’ computers without the consumers’ knowledge or consent. The complaint, Dunstan et al. v. comScore, Inc. (No. 1:11-cv-05807), was filed in the federal district court for the Northern District of Illinois.

The complaint alleges that comScore induced consumers to download its surveillance software by bundling it with free third-party software products such as screensavers, games, and CD-burning software, while failing to clearly disclose the extent to which the software would monitor a consumer’s internet activity and its ability to change privacy and security settings. The complaint also alleges that comScore intentionally made the surveillance software difficult to disable or uninstall, as it was not deleted when the freeware with which it was bundled was deleted.

The claims asserted include violations of the federal Stored Communications Act (18 U.S.C. § 2701 et seq.), the Electronic Communications Privacy Act (18 U.S.C. § 2510 et seq.), the Computer Fraud and Abuse Act (18 U.S.C. § 1030 et seq.), and the Illinois Consumer Fraud and Deceptive Business Practices Act (815 ILCS 505/1 et seq.), as well as common law unjust enrichment. The plaintiffs seek actual, statutory, and punitive damages; an injunction against comScore’s allegedly illegal practices; disgorgement of profits; and attorneys’ fees.

August 19, 2011

Ninth Circuit: DPPA Does Not Forbid Buying Driver’s Data in Bulk

The Ninth Circuit found, in Howard v. Criminal Information Services, Inc., that the Driver’s Privacy Protection Act (DPPA), 18 U.S.C. §§ 2721–2725, does not prohibit the buying in bulk of state driver databases for future use of the information therein.

Two different groups of plaintiffs had filed suit seeking to represent classes in Oregon and Washington, claiming damages on the ground that the defendants, among them a newspaper company and a company performing background checks, had obtained their personal information in violation of the DPPA.

The DPPA provides that personal information from state driver license databases can be obtained, disclosed, or used only for certain specified purposes, such as verifying the accuracy of personal information submitted by the individual, or to use in connection with matters of motor vehicle or driver safety and theft.  

However, the plaintiffs did not complain that the ultimate use of their information was for purposes not permitted by the DPPA; rather, they argued that the DPPA forbids bulk purchasing of drivers’ personal information for future use, because future use is not itself a permitted purpose under the DPPA. Indeed, the defendants had not requested drivers’ records individually, but instead bought the entire database from the state, a practice the plaintiffs characterized as “stockpiling.” The defendants’ ultimate use of the information, however, was for purposes permitted under the DPPA.

The Ninth Circuit concluded that the plaintiffs had failed to state a claim, holding that stockpiling information for a future permitted use does not violate the DPPA, as the statute is concerned with the purpose for which information is obtained, not with when, or whether, it is ultimately used:

“The DPPA does not contain a temporal requirement for when the information obtained must be used for the permitted purpose. Nor is there a requirement that once the information is obtained for a permitted purpose that it actually be used at all. The DPPA only requires that Defendants obtained the information for a permitted purpose.”


August 17, 2011

U.K. Equality and Human Rights Commission Publishes “Protecting Information Privacy” Report

The United Kingdom Equality and Human Rights Commission (EHRC) published this week a report, “Protecting information privacy,” written by Charles Raab of the University of Edinburgh and Benjamin Goold of the University of British Columbia. The report represents the views of the two authors and does not necessarily represent the views of the Commission.

The report claims that current U.K. privacy laws and regulation do not adequately protect human rights and that fundamental reform is needed, especially as data security breaches happen regularly (see pp. 9-10 for examples). Such breaches are bound to happen more frequently as demand for personal information increases and new technology facilitates its collection. Indeed, “personal information privacy is under particular threat in today’s ‘information economy’ and ‘information-age government’” (p. 10).

The public sector has increased its use of personal information, and the state plays an expanded role in collecting and processing it. The U.K. legal framework takes “a weak, fractured and piecemeal approach to [privacy] regulation” (p. 12), and it is increasingly difficult for individuals to understand how their personal information is used and what they should do when it is misused.

The Data Protection Act 1984 (DPA) was the U.K.’s first statutory information privacy protection law. In addition, Article 8 of the European Convention on Human Rights (ECHR) protects an individual’s ‘right to respect for his private and family life, his home and his correspondence’; the ECHR is incorporated into U.K. law by the Human Rights Act 1998 (HRA) (for an overview of current laws, see p. 25 and following).

According to the report, U.K. legislation has not kept pace with technological change, and the state has failed to adequately protect the right to privacy. The report states that “[n]ew strategies must continually be developed to cope with the increasingly novel ways in which privacy, including information privacy, is at risk” (p. 75).

The report makes four main recommendations:

(1) The government should develop a clear set of ‘privacy principles’ to be used as a basis for future legislation and as a guide to regulators and government agencies concerned with information privacy and data collection.

(2) Existing privacy legislation should be reformed to be consistent with the ‘privacy principles’ in order to enhance existing provisions of the HRA.

(3) There should be greater regulatory coherence; that is, the U.K. needs to rationalize and consolidate its current approach to the regulation of surveillance and data collection.

(4) Technological, organizational, and other ways to protect privacy should be improved, and the government should encourage the development and use of technological and non-legal solutions to the problem of information privacy protection.

August 15, 2011

Settlement in FTC’s First Case Involving Mobile Applications

The Federal Trade Commission announced today that W3 Innovations, LLC, a developer of mobile applications, will pay $50,000 to settle charges that it violated the Children’s Online Privacy Protection Act (COPPA) and the FTC’s COPPA Rule (the Rule). The case, United States of America v. W3 Innovations, LLC, is the first FTC case involving mobile applications.

The Rule applies to any operator of a commercial website or online service directed to children that collects, uses and/or discloses children’s personal information. A website operator must obtain “verifiable parental consent prior to collecting, using, and/or disclosing personal information from children.”

The Complaint alleged that the defendant had offered some forty apps for download from Apple’s App Store, which allowed users to play games and share information online. These apps, which the defendant listed in the Games-Kids section of the App Store and which are similar to games played by elementary school children, were targeted to children.

The Complaint also alleged that the defendant had collected over 30,000 email addresses and had collected, maintained, and/or disclosed personal information from about 600 users, but had failed to provide direct notice to parents about these practices and had failed to maintain, or link to, an online notice of its data collection practices. The defendant also had not obtained verifiable consent from parents prior to collecting, using, or disclosing children’s personal information.

The Consent Order (the Order) requires the defendant, within five days of the date of entry of the Order, to delete all personal information collected and maintained in violation of the Rule, and to pay a $50,000 penalty.


August 12, 2011

Spain Enforces “Right to Be Forgotten”

Spain’s Data Protection Agency has ordered Google to delete personal information regarding approximately 90 individuals from Google’s search engine indexes. These individuals filed formal complaints with the Agency alleging that certain personal information, such as decades-old arrest records and the current address of a domestic violence victim, should not be accessible through the Internet. In ordering Google to delete the information, the Agency indicated that every individual has the “right to be forgotten” and to have certain information deleted from the Internet.

The Agency and Google are now engaged in litigation over whether Google can be required to remove certain information from its search indexes. Privacy experts have expressed concern that requiring search engines to delete certain personal information could restrict access to public information. Regardless of the outcome, however, the European Union is expected to draft legislation later this year that could include a “right to be forgotten” provision allowing individuals to have certain information deleted from search indexes or websites.

August 11, 2011

Connecticut Enacts Law Restricting Access to Credit Reports

In late July 2011, Connecticut enacted a law restricting employers’ access to employees’ or prospective employees’ credit reports. Public Act No. 11-223 prohibits employers from requiring an employee or prospective employee to consent to a credit report request as a condition of employment, unless one of the following conditions is met:

  • The employer is a financial institution;
  • A credit report is required by law;
  • The employer reasonably believes that the employee has engaged in specific activity that constitutes a violation of the law related to the employee’s employment; or
  • A credit report is substantially related to an employee’s current or potential job or the employer has a bona fide purpose for requesting or using information in the credit report that is substantially job-related.

The new statute defines “substantially related to an employee’s current or potential job” to include a number of situations in which an employee or prospective employee would have managerial or fiduciary responsibilities, or would have access to personal information, confidential business information, or other sensitive data. Connecticut’s statute becomes effective October 1, 2011. This law is similar to employer credit report restrictions that have recently been enacted in other states, such as Illinois and Oregon.

August 3, 2011

Upcoming Privacy Panels at the ABA Annual Meeting in Toronto

Attending the ABA Annual Meeting in Toronto and interested in privacy?  Then don't miss these two important panels on the afternoon of Saturday August 6th:

New Restrictions on U.S. Internet Sales: Data Passes, Negative Options, Automatic Renewals and Recurring Charges (if you don’t know what they are, you should attend), on Saturday, August 6, from 2:00 to 3:30 p.m., in the Metro Toronto Convention Centre, South Building, Room 716A, 700 Level. The panel will address hot topics in data sharing practices involving personal information. Speakers include Damier Xandrine, Senior Counsel at Wells Fargo; Holly Towle, partner at K&L Gates LLP; and Alysa Hutnik, partner at Kelley Drye & Warren LLP.

"Can the Law Keep Up with Technology? Can Self Regulation Help?" on Saturday, August 6, from 3:45 to 5:15 p.m., in Room 713B, 700 Level, in the South Building of the Metro Toronto Convention Centre. Saira Nayak will moderate a discussion on the meaning of privacy self-regulation with FTC Commissioner Julie Brill; Jennifer Stoddart, Privacy Commissioner of Canada; Stuart Ingis of Venable LLP; and Dr. Paolo Balboni of the European Privacy Association.

 A complete listing of the ABA Annual Meeting programs is available at: http://www2.americanbar.org/annual/pdfs/2011TorontoProgramFinal.pdf

Massachusetts AG Announces $7500 Settlement with Bank for Data Breach

The Massachusetts Attorney General recently announced a $7,500 settlement with Belmont Savings Bank following a data breach in which an unencrypted backup computer tape was lost after an employee failed to follow the bank's policies and procedures.  This tape contained the names, Social Security numbers, and account numbers of more than 13,000 Massachusetts residents.

The tape was lost in May 2011, when an employee left it on a desk rather than storing it in a vault for the night.  Surveillance footage showed that the tape was then thrown away by the cleaning crew.  The tape was most likely incinerated by the bank's waste disposal company, and the bank has indicated that it has no evidence that the Massachusetts residents' personal information had been acquired or used by an unauthorized person.

In addition to the $7,500 penalty, the settlement requires Belmont Savings Bank to mitigate the risk of future data breaches by:

  • Ensuring the proper transfer and inventory of backup computer tapes containing personal information;
  • Storing backup computer tapes containing personal information in a secure location; and
  • Effectively training its employees on the bank's policies and procedures for maintaining the security of personal information.

This is the second announcement this year by the Massachusetts Attorney General’s office of a settlement as a result of a data breach. 

August 2, 2011

FTC Withdraws FCRA Commentary

Recently, the FTC withdrew its Statement of General Policy or Interpretations under the Fair Credit Reporting Act ("FCRA"), including the FTC's Commentary on the FCRA (the "Commentary"), the day before authority to enforce and administer the FCRA transferred to the new Consumer Financial Protection Bureau ("CFPB").

The FTC also released a staff report entitled "Forty Years of Experience with the Fair Credit Reporting Act."  This report provides background on the FTC's role in enforcing the FCRA, and includes a section-by-section summary of the agency’s interpretations of the FCRA. 

In announcing the withdrawal of the Commentary and the release of the staff report, the FTC stated that the Commentary "has become partially obsolete since it was issued 21 years ago."  The new staff report deletes several interpretations in the Commentary that have since been repealed, modified, or otherwise amended, and adds updated interpretations to reflect changes in the law since the Commentary was issued in 1990.  Given the Commentary's staleness, the FTC stated that it "does not believe it is appropriate to transfer the Commentary."

The staff report highlighted several sections where the report differed from the Commentary, including:

• Departments of Motor Vehicles: The report does not adopt the Commentary’s position that a DMV providing motor vehicle reports for insurance underwriting purposes can be a “consumer reporting agency” for purposes of the FCRA.

• Commercial Transactions: The report adopts the positions of staff opinion letters issued in 2000 and 2001 that (1) a report by a consumer reporting agency is a “consumer report” even if it is used for commercial purposes, and (2) an application for business credit does not give rise to a permissible purpose unless the report is on an individual who will be personally liable for the debt.

• Joint Users: The staff report recognizes that the term “joint user” does not appear in the FCRA and therefore removes that terminology.

• Identified Information: The staff report clarifies that, given advances in technology and the availability of consumer data, information contained on “credit guides” may still constitute a “credit report” even if it does not identify the consumer by name, so long as it could “otherwise be reasonably linked to the consumer.”

• Address in Adverse Action Notices: The staff report permits consumer reporting agencies to list a post office box designated to receive disputes from consumers.

Since much of the FTC’s authority to enforce the FCRA has now been transferred to the CFPB, it remains to be seen what impact this new staff report will have on how the CFPB chooses to interpret and enforce the FCRA.