Archives: Online Advertising

Written by Jake Romero

California district judge Lucy Koh has rejected a motion to dismiss brought by Facebook, Inc. in response to a lawsuit brought by plaintiffs Angel Fraley, et al. alleging that Facebook’s “Sponsored Stories” violate California’s right of publicity statute (Cal. Civil Code § 3344).

Sponsored Stories are paid advertisements containing the name and profile picture of one of the friends of the Facebook user viewing the advertisement, suggesting that the friend “likes” the sponsor’s product or service.  Aside from clicking the “Like” button on the web page associated with the sponsor’s product or service, the user featured in the Sponsored Story does not otherwise consent to the use of his or her name or likeness.  Although Facebook’s privacy settings can be customized in a number of ways, the plaintiffs allege that it is impossible to opt out of participation in Sponsored Stories entirely.

Facebook has previously prevailed in petitioning to have a number of privacy-related lawsuits dismissed.  However, the court held that Fraley may be distinguishable from such prior cases in a number of ways.  First, the plaintiffs were able to allege injury with sufficient specificity by asserting that Facebook violated their right of publicity.  California’s right of publicity statute prohibits the nonconsensual use of another’s name, voice, signature, photograph, or likeness for advertising, selling, or soliciting purposes.  Because the plaintiffs described precisely the information of each plaintiff that Facebook used (primarily names and profile photos), the court held that they alleged injuries concrete and imminent enough to survive Facebook’s motion to dismiss.  The plaintiffs in Fraley allege that, in violating the rights granted by California’s right of publicity statute, Facebook caused each plaintiff an injury similar to that of a celebrity whose image or likeness is misappropriated to sell a product or service for the commercial gain of another.  “[I]n essence,” the plaintiffs argued, “[Facebook’s users] are celebrities – to their friends.”

In its motion to dismiss, Facebook claimed that the plaintiffs lacked standing because the alleged injury was conjectural or hypothetical, and cited a number of privacy cases that were dismissed for failure to show injury.  However, the court rejected this argument because the plaintiffs alleged that the economic injury to each of them can be quantified and valued on the basis of Facebook’s own profit from, and valuation of, sponsored advertising.  According to published articles, the sale of targeted advertising such as Sponsored Stories is a primary source of revenue for Facebook.  Plaintiffs in Fraley supported this argument, in part, by quoting Facebook CEO Mark Zuckerberg as stating that “[a] trusted referral influences people more than the best broadcast message.”  In other words, while plaintiffs in prior privacy claims failed to show, for example, that the collection of an individual plaintiff’s browsing history or the targeting of advertisements toward that plaintiff had caused economic injury, the plaintiffs in Fraley used Facebook’s own valuation of sponsored advertisements to allege specific economic harm on an individual basis.

Finally, the court also rejected Facebook’s argument that it is shielded from liability by Section 230 of the Communications Decency Act (CDA), which provides broad immunity to websites that primarily publish the content of third parties.  The court held that while Facebook falls under the CDA’s definition of an interactive computer service, by compiling Sponsored Stories Facebook may itself be an information content provider, and therefore may not be entitled to immunity under the CDA.

This case hits at the heart of the Facebook revenue model, and bears close watching with an anticipated IPO for the company in 2012.

Most of the legislative privacy buzz this session has centered around online behavioral advertising (OBA) — along with the Federal Trade Commission’s proposal for a universal “do-not-track.”

The center of discussion for U.S. legislators and regulators has been requiring clear and conspicuous disclosure to users about OBA and allowing users to opt out.  Regulators in the EU are engaged in the “great cookie debate” over whether EU privacy laws require a user’s explicit consent to OBA or whether opt-out would be sufficient.

Two privacy experts argue in a new paper that they are all missing the point, and the paper is excellent reading.  Omer Tene, a fellow at the Center for Democracy and Technology, and Jules Polonetsky, co-chair and director of the Future of Privacy Forum think tank, write:

Unless policymakers address this underlying normative question — is online behavioral tracking a social good or an unnecessary evil — they may not be able to find a solution for implementing user choice in the context of online privacy. … This is not to say that a value judgment needs to be as stark as a binary choice between privacy and efficiency. On the contrary, a more nuanced equilibrium is needed taking into account the benefits of not only privacy rights but also access to information and services, freedom of speech and economic efficiency.

Written by Kevin McGinty

In a mixed decision, a federal court judge in New York dismissed federal statutory claims arising from Web-based advertisers’ use of cookies that tracked users’ Web browsing activities, but denied a motion to dismiss claims under state law.  The plaintiff in Bose v. Interclick alleged that Interclick and clients McDonald’s, CBS, Mazda and Microsoft violated the federal Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, and the New York deceptive business practices statute, N.Y. General Business Law § 349 (“Section 349”), through their use of such cookies.  The court reached different results based on the respective statutes’ damages requirements.

The CFAA claim was dismissed for failure to meet the statute’s minimum damages requirement.  The CFAA permits civil claims where the defendants’ conduct caused “loss to 1 or more persons during any 1-year period . . . aggregating at least $5,000 in value.”  Id., § 1030(c)(4)(A)(i)(I).  Plaintiff, however, failed to allege that she had suffered losses sufficient to meet the statutory threshold.  Her complaint alleged three types of injury: damage due to impairment of computer processing, losses resulting from Interclick’s collection of personal information, and loss due to interruption of her Internet service.  As to the first, while costs associated with impairment of computer functioning, including the cost of repair, can be cognizable damages under the CFAA, plaintiff failed to allege any specific cost associated with such injury to her computer, let alone that the cost exceeded $5,000.  Similarly, plaintiff did not and could not ascribe any value to the mere possession of her personal information, nor is there anything inherently wrongful in business collection of demographic information about customers.  In addition, plaintiff did not specifically allege the extent of purported service interruptions, nor did she allege facts showing that the cost of such interruptions exceeded $5,000.  Finally, the court concluded that the statutory language of the CFAA did not permit aggregation of putative class injuries for purposes of meeting the $5,000 threshold, and that in any event there were insufficient allegations in the complaint to permit the court to infer that even aggregated injuries would exceed that amount.

Conversely, the court denied Interclick’s motion to dismiss plaintiff’s state law claim under Section 349.  The court rejected Interclick’s argument that plaintiff had the burden to allege reliance under Section 349.  Moreover, unlike the CFAA, Section 349 does not require a plaintiff to allege or prove a pecuniary loss.  In support of the latter point, the court noted that prior New York state court decisions have held that complaints alleging a purported breach of privacy stated claims under Section 349 even in the absence of any allegation of pecuniary loss.  Left unresolved by this decision is the question of whether injury can be established for any class member without engaging in the individualized fact finding that would preclude certification of a plaintiff class.  Even so, this case demonstrates how allegations under state consumer protection statutes can, in some jurisdictions, provide a basis at the pleading stage to keep a privacy class action alive even in the absence of actual damages.  In other states – including, notably, California – actual damages requirements under state consumer protection laws pose an obstacle to maintaining such claims.

The court did grant the motions of McDonald’s, CBS, Mazda and Microsoft to dismiss the Section 349 claims against them, based on plaintiff’s failure to allege that they had engaged in any of the purportedly deceptive conduct set forth in the complaint.

The Federal Trade Commission’s public comment period on its preliminary staff report, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers, has closed.   The FTC received over 300 comments during the extended comment period, including comments from several states.

It is looking more likely that some form of privacy regulation — either at the FTC or Congress — will develop in 2011.   Several bills have been introduced in this Congress and both the FTC and the Commerce Department are working on their proposals.   “Self-regulation,” the mantra of the online advertising industry, may no longer be a viable option, unless industry acts and acts quickly to provide consumers with the level of choice and transparency that the FTC’s Privacy Framework outlines.

In fact, FTC Chairman Jon Leibowitz today recommended exactly that in an interview with Multichannel News, posted here.  “I guess I would say that the business community really has it in its hands to avoid regulation, it just has to step up to the plate,” he said.


The Federal Trade Commission has extended the public comment period on its December 1, 2010 FTC Privacy Report.  The FTC press release says that, in light of the complex issues raised by the report, a number of organizations requested an extension of the original January 31, 2011 deadline.  Stakeholders now have until February 18, 2011 to submit their comments.

We reviewed some of the questions posed by the FTC in the Report in earlier posts.  It is important that the FTC hear from stakeholders in this process in order to have the broadest input possible.  Parties with significant interest in these issues should pay close attention to these questions and participate by filing a comment with the FTC.

[UPDATED] – Over 200 comments have been filed with the FTC; however, nearly all of them were filed by individuals supporting the FTC’s “do not track” proposal.   Industry participants have filed requests for additional time to comment.

While we are on the “do-not-track” thread: since Sunday, both Google and Mozilla have announced that they will be adding “do-not-track” options to their browsers.  Mozilla announced its plan on Sunday, and Google’s announcement, called “Keep My Opt-Outs,” came today.

Microsoft announced a similar plug-in for Internet Explorer back in December.
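For readers curious how these browser options work mechanically, Mozilla’s approach signals the user’s preference by sending an HTTP request header (`DNT: 1`) with every request, which sites and ad networks can then choose to honor.  The sketch below is purely illustrative — the `DNT` header is real, but the function names and the ad-serving logic are hypothetical examples of how a server might respect the signal:

```python
def honor_do_not_track(headers):
    """Return True when the request carries the DNT: 1 opt-out header.

    `headers` is any dict-like mapping of HTTP request headers.
    Browsers implementing Mozilla's proposal send "DNT: 1" on every
    request once the user opts out of tracking.
    """
    return headers.get("DNT") == "1"


def serve_ad(headers):
    """Hypothetical ad-serving decision: when the do-not-track signal
    is present, fall back to an untargeted ad instead of consulting a
    behavioral profile or setting a tracking cookie."""
    if honor_do_not_track(headers):
        return "untargeted ad"
    return "behaviorally targeted ad"
```

Note that, unlike a cookie-based opt-out (the approach taken by Google’s “Keep My Opt-Outs”), the header itself does nothing to block tracking — it merely expresses a preference that each recipient server must voluntarily respect.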


(UPDATED to include links to report and press release from FTC website)

The Federal Trade Commission (FTC) has just released its long-awaited (and 123-page long) report on consumer privacy:  “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers”  (the “Report”).

This Report is the result of a year-long effort by the FTC, through a series of roundtables, to explore the privacy issues and challenges associated with 21st century technology and business practices.   According to the Report, “many companies — both online and offline — do not adequately address consumer privacy issues.   Industry must do better.”

The Report proposes a new framework for addressing the commercial use of consumer data which builds upon the current so-called “notice-and-choice” and “harm-based” models of consumer privacy, the FTC’s law enforcement experience, and the record from the roundtables.   This new framework (the “Framework”) would apply “broadly” to online and offline commercial entities “that collect, maintain, share, or otherwise use consumer data that can be reasonably linked to a specific consumer, computer or device.”  

The three main components of the Framework are:

1)   Companies should adopt a “privacy by design” approach — The Framework suggests that companies should build privacy into their everyday business practices, including providing reasonable security for consumer data, collecting only the data needed for a specific business purpose, retaining the data only as long as necessary to fulfill that purpose, safely disposing of data no longer being used, and implementing reasonable procedures to promote data accuracy.

2)  Companies should simplify the choices presented to consumers about data practices — The Framework proposes that consumer choice not be required for data practices that are “commonly accepted,” but otherwise, consumers should be able to make informed and meaningful choices. Importantly, the Framework suggests that “this may entail a ‘just-in-time’ approach, in which the company provides the consumer with a choice at the point the consumer enters his personal data or before he accepts a product or service.”   Further, the Report states: “The most practical method of providing such universal choice would likely involve the placement of a persistent setting, similar to a cookie, on the consumer’s browser signaling the consumer’s choices about being tracked and receiving targeted ads.  Commission staff supports this choice, sometimes referred to as ‘Do-Not-Track.’”

3)  Companies should take measures to make data practices more transparent to consumers – these measures include (a) reviewing and improving privacy policies to make them more clear, concise and easy to read, (b) providing consumers with reasonable access to the data that companies maintain about them, with particular mention of data brokers, (c) providing “robust notice” and obtaining affirmative consent for material, retroactive changes to data collection policies (note that this does not apply only to published privacy policies), and (d) undertaking a broad effort to educate consumers about commercial data practices and the choices available to them.

The Commission staff is seeking comment on the Framework by January 31, 2011. 

We will have more on the details of the Framework in coming posts.

 (Added 12:10 pm) The Commission’s press release can be found here.




The Federal Trade Commission (FTC) has reached a settlement with EchoMetrix over charges that it failed to inform parents that information it was collecting about their children would be disclosed to third-party marketers.  The company’s website says that EchoMetrix is a publicly traded systems development company that “understands and interprets content on the digital web.”

According to its press release, the FTC claimed that EchoMetrix did not adequately disclose to parents that information collected by its Sentry software program, which allows parents to monitor their children’s online activities, was also being shared with marketers through EchoMetrix’s Pulse marketing research program. EchoMetrix also advertised Pulse, a web-based market research software program that it claimed would allow marketers to see “unbiased, unfiltered, anonymous” content from social media websites, blogs, forums, chats and message boards. One source of content available to Pulse users, the FTC alleged, was portions of the online activity of children recorded by the Sentry software.

Continue Reading EchoMetrix Settles FTC Complaint Over Disclosure of Children’s Information to Marketers

As part of Global Entrepreneurship Week USA, the Digital Media SIG is holding a panel discussion tonight that will be thought-provoking (or at least the panel — including your author — hopes so) and takes on the issue of online privacy as it relates to the advertising world.  Tracking, following, assembling, analyzing, dicing and slicing — all of this activity happens behind the browser, mostly without the end user’s knowledge and certainly without the end user’s consent.   The panel includes voices from all sides of the debate.

Join us!  Registration information is here.


On June 24, 2010, the European Union’s body that addresses data protection issues, the so-called Article 29 Working Party, adopted Opinion 2/2010 (the “Opinion”) providing further clarification on the amended e-Privacy Directive (below) as applied to online behavioral advertising. The Working Party also issued a press release on this topic.

Continue Reading Online Behavioral Advertising: The European Union Controversy