A challenge to the use of cy pres charitable donations to settle privacy claims against Google will be heard by the Supreme Court. In Frank v. Gaos, petitioners seek reversal of lower court decisions rejecting their objection to an $8.5 million settlement of claims arising from Google’s transmission of users’ search terms to third-party websites. Because the proposed settlement amount could not feasibly be distributed to the estimated 129 million class members, the settlement called for Google to pay the settlement proceeds, less class counsel fees, to certain privacy-related charities. The trial court awarded 25% of the settlement ($2.125 million) to class counsel; the balance went to the charities. The petitioners’ objections to the settlement were overruled.
As we predicted in our post late last month, Google’s YouTube Kids app has attracted more than just the “curious little minds” Google was hoping for. Yesterday, a group of privacy and children’s rights advocates (including the Center for Digital Democracy and the American Academy of Child and Adolescent Psychiatry) asked the Federal Trade Commission “to investigate whether Google’s YouTube Kids app violates Section 5 of the FTC Act . . . .”
The advocacy group downloaded the YouTube Kids app onto an Android device and two iOS devices. It then reviewed and assessed the app as it functioned, watching content Google says caters to children while protecting them from questionable or troubling content.
The advocacy group claims this review identified three features of the app it believes are unfair or deceptive. First, the group faults Google for offering content “intermixed” with advertising content in a manner the group claims “would not be permitted to be shown on broadcast or cable television” under Federal Communications Commission guidelines. Second, the group worries that much of the advertising violates FTC Endorsement Guidelines because it is user-generated in a way capable of masking relationships with product manufacturers. Finally, the group claims the advertising content violates the YouTube Kids app’s stated policies and procedures.
Taken together, the advocacy group’s issues all collapse into the same core argument: very young children (generally under 5 years of age) cannot distinguish between actual content and advertising, which makes them “uniquely vulnerable to commercial influence.” This argument has a lot of emotional appeal: who wouldn’t want to protect small children? But the implications of this argument extend far beyond the YouTube Kids app, and would call into question any free, advertising-supported video platform, including network television. As such, the advocacy group’s position seems to face significant First Amendment hurdles.
Although the advocacy group does not (yet) take issue with YouTube Kids’ data collection practices, it does question how the app is able to generate video recommendations. And its letter to the FTC explicitly asks the Commission to investigate whether children are being tracked without verifiable parental consent.
The ball is now squarely in the FTC’s court. It could launch a non-public investigation regarding the app’s practices, or it could do nothing. However, as the Commission has recently signaled a renewed interest in protecting children online (including entering a $19 million settlement with Google over children’s in-app purchases last September), it seems likely the Commission will have at least some questions for Google following the advocacy group’s letter.
We’ll be sure to keep you posted.
Google made good on the rumors and the company’s subsequent promise last December to create a family-friendly version of its popular YouTube service with its launch on Monday of the YouTube Kids app. Available on both the App Store and Google Play free of cost and only in the United States, the YouTube Kids app is described by Google as an “app designed for curious little minds to dive into a world of discovery, learning, and entertainment…delightfully simple and packed full of age-appropriate videos, channels, and playlists.”
When small and mid-size companies start expanding their apps or web presence into Europe, they need to start thinking about EU data protection laws. It’s tempting to take a look at what one or two of the “big guys” do about EU data protection compliance and think that whatever the big guys do in Europe must be good enough. But the ongoing saga between Google and the EU’s data protection authorities shows that this approach shouldn’t be adopted uncritically.
- Providing “clear, unambiguous and comprehensive information” regarding its data processing, including an “exhaustive list of the types of data . . . and purposes.”
- Providing more information about its use of anonymous identifiers (a next-generation tracking/behavioral profiling technology that’s being developed and may eventually replace cookies).
- Educating its employees better concerning notice and consent requirements.
- Making sure that users are equally protected regardless of what device they are using (mobile phones, tablets, desktops, and any new devices that are invented).
Google has committed to putting these changes into effect by June 30, 2015. In the meantime, Google’s undertaking provides a useful spotlight on the areas of EU data protection compliance that the ICO (and other data protection offices) think require significant attention.
Last week the United States District Court for the District of New Jersey dismissed, with prejudice, class action claims against Google and Viacom concerning targeted advertising and the online tracking of children through cookies. Perhaps surprisingly, the claims did not involve allegations that the parties violated the Children’s Online Privacy Protection Act (COPPA). The suit arose from allegations that when users register on Viacom’s Nick.com website, they are asked to input their gender and birthday and create a username. Viacom collects this information and gives each user a unique internal code that reflects their gender and age. Viacom then places a cookie on each user’s computer, which tracks the user’s IP address, browser settings, unique device identifier, certain system and browser information, and the URLs and videos requested from Viacom’s children’s websites. Viacom would share with Google its unique internal code, along with the record of what parts of the site users interacted with, and Google would place its own cookie on each user’s computer. Google and Viacom would then use this information to target the users with advertising.
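The data flow alleged in the complaint can be modeled in a few lines. This is a purely hypothetical sketch of the mechanics described above; the function names, the format of the internal code, and the profile fields are all illustrative, not Viacom’s or Google’s actual systems:

```python
import uuid

def internal_code(gender: str, birth_year: int) -> str:
    """Stand-in for the 'unique internal code' alleged in the complaint:
    an opaque token reflecting the registrant's gender and age.
    The real encoding scheme was not public; this format is invented."""
    return f"{gender[0].upper()}{birth_year}"

def register_user(gender: str, birth_year: int) -> dict:
    """Model the alleged flow: registration produces a profile, keyed to a
    cookie, that combines the internal code with device/browser identifiers
    and a running log of requested URLs and videos -- the combination that
    could then be shared with an advertising partner."""
    return {
        "code": internal_code(gender, birth_year),
        "device_id": str(uuid.uuid4()),  # stand-in for a unique device identifier
        "requested_urls": [],            # URLs/videos tracked over time
    }
```

The court’s VPPA holding turned on exactly this kind of profile: standing alone, a code like the one sketched here plus an IP address was held insufficient to identify a particular person and the videos they watched.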
The plaintiffs alleged violations of the Wiretap Act, the Stored Communications Act, California’s Invasion of Privacy Act, the Video Privacy Protection Act (VPPA), New Jersey’s Computer Related Offenses Act (CROA), and two New Jersey torts, including Intrusion upon Seclusion. The plaintiffs did not allege violations of the Children’s Online Privacy Protection Act (COPPA). In July 2014, the Court dismissed with prejudice all claims except the VPPA claim against Viacom and the CROA and Intrusion upon Seclusion claims against both defendants, as to which the court allowed the plaintiffs to amend their complaint.
In January 2015, the court dismissed the amended complaints with prejudice. With regard to the VPPA claim against Viacom, the court found that the plaintiffs had not alleged sufficient facts to show that the information collected by Viacom could actually identify the plaintiffs. The Court noted that the VPPA requires disclosure of personally identifiable information (PII) concerning a consumer, but that there is no support for the proposition that PII includes the kind of information Viacom collected and shared, such as IP address, gender, and age. Further, the court found that this information was insufficient to identify an individual plaintiff and a video that plaintiff watched, as required for a violation of the VPPA to be found. Therefore, the court held that the VPPA claim failed.
Written by Julia Siripurapu, CIPP/US
According to recent media reports, Google is allegedly designing a Google account for children under 13 which would permit children in this age group to officially create their own Gmail account and to access a kid-friendly version of YouTube. Google currently prohibits children 12 and under from creating a Google account by implementing an age-neutral verification mechanism in the account creation process and using cookies to ensure that children cannot bypass the age screen on a subsequent try. As reported by the Wall Street Journal, “now Google is trying to establish a new system that lets parents set up accounts for their kids, control how they use Google services and what information is collected about their offspring… Google wants to make the process easier and compliant with the rules.”
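A neutral age screen backed by a cookie, of the kind described above, is commonly implemented along the following lines. This is a simplified, hypothetical sketch; the function name, cookie name, and flow are illustrative assumptions, not Google’s actual implementation:

```python
from datetime import date
from typing import Optional

AGE_GATE_COOKIE = "age_gate_failed"  # illustrative cookie name
MIN_AGE = 13

def years_old(birth: date, today: date) -> int:
    """Whole years elapsed between a birth date and today."""
    years = today.year - birth.year
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1
    return years

def age_gate(birth: date, cookies: dict, today: Optional[date] = None) -> bool:
    """Neutral age screen: ask for a full birth date (rather than a leading
    'Are you over 13?' prompt), and once a visitor fails, record that fact
    in a cookie so entering a different date on a later try is still blocked.
    Returns True if account creation may proceed."""
    today = today or date.today()
    if cookies.get(AGE_GATE_COOKIE) == "1":
        return False  # already failed once; do not allow a retry
    if years_old(birth, today) < MIN_AGE:
        cookies[AGE_GATE_COOKIE] = "1"  # remember the failure
        return False
    return True
```

The cookie is what gives the screen its teeth: without it, a child who fails the gate could simply reload the page and enter an earlier birth year.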
The reported initiative, which has not yet been confirmed by Google, is certainly very interesting and would clearly require the tech giant to comply with the Children’s Online Privacy Protection Act (“COPPA”) and its implementing rule (as amended, the “COPPA Rule”). It will be especially interesting to see how Google handles the advertising component of the service, which is a major piece of its business. In order to comply with COPPA, Google would have to engineer and design the new service based on the requirements of the COPPA Rule. PC Magazine reported in its story that “as part of the move, Google will also introduce a dashboard where parents can oversee their kids’ activities.” This seems like a step in the right direction for Google, but it will be a long journey! As we all know, the COPPA Rule goes far beyond giving parents the right and ability to monitor their children’s online activities and includes, among other requirements, complex parental notice and verifiable consent requirements. You can link here for a copy of the Mintz Levin Guide to COPPA.
As the first company that would offer an online service specifically targeting children under 13, Google would certainly be in the spotlight, and the new service would be closely monitored by the privacy community and the FTC. In fact, privacy advocacy groups, like the Center for Digital Democracy (CDD), have already voiced concern, as reported by the Wall Street Journal, that “Unless Google does this right it will threaten the privacy of millions of children and deny parents the ability to make meaningful decisions about who can collect information on their kids.” CDD’s executive director, Jeff Chester, informed the Wall Street Journal that the CDD shared its concerns with the Federal Trade Commission on Monday and that the organization is in the process of creating an action plan for monitoring how Google rolls out the service to children. The Wall Street Journal also reported that the FTC declined to comment on the matter, “saying the agency does not comment on specific companies’ plans.”
- Google Is Planning to Offer Accounts to Kids Under 13, Wall Street Journal
- Google is Planning to Target Kids with Child-Friendly Version of Gmail and YouTube, International Business Times – Australia
- What Google Can Gain From Special Accounts For Children, Newsy – USA
- Google ponders child online accounts, ITWeb
- Google Eyes Kid-Friendly Accounts, PCMag.com
- Google reportedly working to offer Gmail, YouTube accounts to kids under 13, TechSpot – USA
- Google has a clever plan to get your kids hooked on Gmail and YouTube, BGR (Boy Genius Report) – USA
Written by Susan Foster, Solicitor England & Wales/Admitted in California
(LONDON) Google – along with the rest of us – is still considering the implications of the European Court of Justice’s May 13, 2014 decision that Google must remove links to a newspaper article containing properly published information about a Spanish individual on the basis that the information is no longer relevant or accurate. This decision by Europe’s highest court is unappealable, so the Google Spain case is law throughout the European Economic Area (EEA) until changed by legislation (unlikely) or modified by the ECJ in a later decision (also unlikely).
To reach this conclusion, the ECJ found that:
- Google is a data controller (and not merely a data processor) because it indexes information gleaned from the Internet in order to create its search results.
- The information in question (which had to do with a government order that a house be put up for auction due to its owner’s failure to pay certain taxes) is protected personal data despite the information having been properly published at the time of its initial publication. (Ironically, the Spanish newspaper that initially published the information was not required to remove the article – Google just can’t include the article in its search results.)
- Countervailing considerations such as the potential burden on Google that will arise from having to consider “right to be forgotten” requests and the interest of the public in having access to past public information are outweighed by the right of the individual to be forgotten.
From one perspective, this is just a search engine case, and the only companies that need to worry about it are search engine companies with some kind of business presence or technical facilities in Europe (which creates the nexus for the EU’s legal jurisdiction). And of course, historians might be worried, along with anyone else who thinks that public information should stay publicly available to safeguard freedom of expression, or the integrity of the historical record, or the democratic process, or the like. And EEA residents might even wonder what their life would be like if all search engines blocked off European results because the compliance burden outweighed the ad revenues – or, because, now that they are deemed to be data controllers, they couldn’t work out a way to comply with the Eighth Principle restricting transfers of personal data outside of the EEA . . .
No, the reasons that other (non-search) businesses, particularly in the US, should be concerned about the Google Spain decision are the following:
- The EU notion of personal data is not the same as the US notion of private information. It is far broader and includes information obtained from public sources as well as information that an individual has voluntarily disclosed to the world. When you evaluate your company’s data collection and processing activities, you need to remember that, in Europe, personal data is virtually everything about, or written by, an individual, whether or not the information has already been made public.
- The EU is unconcerned about imposing huge burdens on companies. Well, at least it’s unconcerned about imposing huge burdens on large companies that aren’t headquartered in the EEA – but it would be unwise to look at the Google Spain case as inherently exceptional. There’s a draft Data Protection Regulation making its way through the EU legislative pipeline that will levy fines for breaches on the order of up to 5% of global turnover. The draft Data Protection Regulation imposes very strict standards and processes on businesses that process personal data, and the Google Spain decision simply underscores that the balance of rights and interests in the EU is tipped firmly in the direction of the individual. Message to business? Get ready for the hammer. The Google Spain decision shows where it’s going to strike.
Written by Julia Siripurapu, CIPP/US
Just two months after Apple’s settlement with the FTC over lax parental controls over children’s in-app purchases (see our prior blog post), Google takes the spotlight with claims of unauthorized children’s in-app purchases in the Google Play Store! This time, it’s not an FTC action, but a class action. The suit was filed on March 7 in the U.S. District Court for the Northern District of California. The suit was brought by a New York mother (“Plaintiff”) on behalf of herself and other parents whose minor children downloaded free or relatively inexpensive child-directed games from the Google Play store and then incurred charges for purchasing items that cost money within the app without parental consent or authorization. For example, the Plaintiff’s five-year-old son spent over $65 on virtual Crystals while playing the game “Marvel Run Jump Smash!” on an Android device.
According to the complaint, the apps directed to children that are offered for sale in the Google Play store are “designed to induce purchases of what Google refers to as ‘In-App Purchases’ or ‘In-App Content,’ i.e. virtual supplies, ammunition, fruits and vegetables, cash, and other fake ‘currency,’ etc. within the game in order to play the game as it was designed to be played (‘Game Currency’)”. As noted in the complaint, while Google required users to enter a password to authenticate their account before purchasing and downloading an app or Game Currency, once the account was authenticated, any user, including a child, could purchase “several hundreds of dollars” in Game Currency during a 30-minute window without having to re-enter a password. This billing practice allowed Google to automatically charge the account holder’s credit or debit card or PayPal account, without notifying the account holder or obtaining further consent of the account holder.
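The billing flow the complaint challenges can be modeled roughly as follows. This is a hypothetical sketch of the logic as the complaint describes it, not Google’s actual code; the class and method names are invented, and only the 30-minute figure comes from the complaint:

```python
REAUTH_WINDOW_SECONDS = 30 * 60  # 30 minutes, per the complaint

class PurchaseSession:
    """Models the challenged flow: a password is required for the first
    purchase, after which any purchase made within the window is charged
    without re-prompting for the password."""

    def __init__(self) -> None:
        self._authenticated_at: float | None = None

    def purchase(self, password_entered: bool, now: float) -> bool:
        """Return True if the purchase is charged, False if it is declined.
        `now` is a timestamp in seconds (e.g. from time.time())."""
        if (self._authenticated_at is not None
                and now - self._authenticated_at < REAUTH_WINDOW_SECONDS):
            return True  # inside the window: charged with no password prompt
        if password_entered:
            self._authenticated_at = now  # (re)open the window
            return True
        return False  # outside the window and no password: declined
```

Under this logic, a child handed the device just after a parent authenticates can keep charging purchases for the full 30 minutes, which is the gap the Plaintiff says made the parental password protection illusory.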
Written by Julia Siripurapu
Earlier this month, Google, Inc. (“Google” or “Company”) entered into an agreement with the Attorneys General of 37 states and the District of Columbia, settling allegations of violation of the participating states’ consumer protection or applicable computer abuse statutes (the “Settlement Agreement”).
Here’s what got the tech giant in trouble: on a web page describing the DoubleClick opt-out plugin, Google told users of Apple Inc.’s Safari web browser (“Safari Browser”) that the Safari Browser is set by default to block all third-party cookies, and that a user who does not change those default settings accomplishes the same thing as setting the opt-out cookie. After making this statement, from June 1, 2011 through February 15, 2012, Google intentionally altered its DoubleClick advertising platform code to circumvent the Safari Browser’s default cookie-blocking settings, enabling the placement of DoubleClick cookies on users’ Safari Browsers without the users’ knowledge or consent. The browsing history collected from Safari Browser users was then sent to advertisers.
In addition to agreeing to pay a hefty civil penalty of $17,000,000.00 (“Penalty”) to settle the allegations, as part of the settlement, Google has also agreed to modify its privacy practices as follows: