Federal Trade Commission

“Don’t make promises that you don’t intend to keep” is an admonishment received by every child and delivered by every parent. This pithy maxim is equally applicable to consent orders entered into with regulatory authorities. Indeed, Upromise’s failure to abide by it is costing the company $500,000 in the form of a civil penalty from the Federal Trade Commission (FTC).

Haul out the holly, fill up the stockings, even though it’s just one week past Thanksgiving Day…

 

Rather than look back at 2013, next week the Privacy & Security blog will count down The 12 Days of Privacy, looking ahead to what we might expect in 2014. The editor’s muse for this series came from our friend and partner, Len Weiser-Varon, who riffed on yesterday’s post regarding the latest password hack:

 

  • 318,000 Facebook accounts
  • 70,000 Gmail, Google+ and YouTube accounts
  • 60,000 Yahoo accounts
  • 22,000 Twitter accounts
  • 8,000 ADP accounts (ADP says it counted 2,400)
  • 8,000 LinkedIn accounts
  • Three French hens
  • Two turtle doves
  • And a password in a pear tree.

In Len’s words: This year, a brand new password in an unhacked stocking is a holiday must.

Don’t miss our series starting on Monday.

As the summer winds down, we find that privacy and security issues remain top of mind for companies, hackers, and regulators alike.

 

EMPLOYEE PERSONAL INFORMATION EXPOSED AT FED

Bloomberg is reporting today on a large-scale exposure of the personal information of every employee of the Federal Reserve Bank. According to the article, a website associated with the hacktivist group Anonymous has posted what the group said were “full details of every single employee at Federal Reserve Bank of America,” adding that central banks have “systematically defrauded the planet.” The post included a spreadsheet containing phone numbers, e-mails and other employee information that the Federal Reserve said today was probably accessed more than six months ago. According to the Fed, the bank’s critical operations were not affected.

Read more:  Bloomberg Business News

PIN THIS

A security researcher has discovered a vulnerability in Pinterest. According to Threatpost, the researcher found a hole that enables an attacker who knows a target’s username or user ID to discover that user’s email address. The bug could give an attacker a ready-made (and huge) target list for phishing attacks.

Read more: Threatpost

BIG DATA IS ON THE FTC RADAR – THE “LIFEGUARD ON THE BEACH”

In her keynote address delivered at the Technology Policy Institute Aspen Forum last week, FTC Chairwoman Edith Ramirez spoke about the privacy challenges of Big Data. While recognizing that “Big Data,” the aggregation of decades of information collected, analyzed, and stored in innovative ways, offers tangible benefits to consumers and businesses alike, Chairwoman Ramirez expressed concern that these Big Data-driven innovations also offer great potential for misuse of personal information and pose significant privacy risks to consumers.

In speaking about the FTC’s role in Big Data, Ramirez described the enforcement tools the Commission can use to protect consumers against the privacy challenges posed by Big Data and promised to continue to use those tools aggressively against entities that breach their commitments to safeguard consumer information. As examples, the Chairwoman cited the FTC’s actions against data giants Google, Facebook, and Myspace for deceiving consumers by breaching commitments to keep their data confidential, and the forty-plus actions brought by the Commission under its unfairness and deception authority against large data companies like LexisNexis, ChoicePoint, and Twitter for failing to provide reasonable security safeguards.

Ramirez urged businesses that use Big Data to design their data collection and use practices around the three core principles set forth in the FTC’s 2012 Privacy Report: privacy by design, simplified choice, and greater transparency. These principles are intended to ensure that consumers understand who is collecting their data and how it is used, and that they have a choice about whether and how their data is collected and used.

Ramirez ended her address by noting that while the FTC will not stand in the way of innovation, the Commission will play an active, central role in ensuring that consumer privacy is respected. In fact, the Chairwoman characterized the FTC as a “lifeguard at the beach” that “will remain vigilant to ensure that while innovation pushes forward, consumer privacy is not engulfed by that wave.”

If you are interested in reading more about the privacy considerations of Big Data, please see two recently published articles with competing views on this point. The Underwhelming Benefits of Big Data, authored by Paul Ohm, a professor of law at the University of Colorado and former senior policy advisor to the FTC, was published this month in the Penn Law Review. Big Data for All: Privacy and User Control in the Age of Analytics, co-authored by Omer Tene, Vice President of Research and Education at the IAPP, and Jules Polonetsky, Director of the Future of Privacy Forum, was published in the Northwestern Journal of Technology and Intellectual Property in April.

 

Written by Julia Siripurapu, CIPP

Yesterday, the FTC published a Federal Register notice requesting public comment on the first new method for obtaining verifiable parental consent submitted for FTC approval by AssertID, Inc. under the Voluntary Commission Approval Process provision of the COPPA Rule. The FTC is particularly interested in receiving comments on whether the AssertID, Inc. method (“AssertID VPC Method”): (1) is already covered by the existing methods in Section 312.5(b) of the COPPA Rule, (2) meets the requirements for parental consent in 16 CFR § 312.5(b)(1), and (3) poses a risk to consumers’ personal information and, if so, whether the benefits to consumers and businesses of using this method outweigh those risks. As the FTC notes, the mere publication of the Federal Register notice does not indicate approval of the AssertID VPC Method, and the FTC has 120 days to review and approve or reject the method.

By way of background, the Voluntary Commission Approval Process provision of the COPPA Rule permits interested parties to submit written applications to the FTC requesting approval of verifiable parental consent methods that are not currently enumerated in the COPPA Rule. The COPPA Rule enumerates the following methods for obtaining verifiable parental consent: (1) sending a consent form to the parent that must be signed by the parent and returned via U.S. mail, fax, or electronic scan; (2) requiring the parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each transaction to the primary account holder; (3) requiring the parent to connect to trained personnel via telephone or video-conference; or (4) verifying a parent’s identity by checking a form of government-issued identification against databases of such information. Further, if a child’s information is used solely for internal purposes and not made publicly available or provided to third parties, parental consent may also be obtained by using the “e-mail plus” method, which consists of sending an email message to the parent, requesting that the parent indicate consent in a return message, and taking certain additional steps prescribed by the COPPA Rule to confirm the consent. The FTC made clear in the COPPA FAQ updated last month that this is a non-exhaustive list and that the use of other methods is permitted as long as the selected method is reasonably calculated, in light of available technology, to ensure that the person providing the consent is the child’s parent. See our previous blog posts on the COPPA Rule here.

The AssertID VPC Method consists of the following six processes which are intended to collectively ensure compliance with the COPPA Rule:

  1.  A process for parental notification of a consent request;
  2.  A process for presenting the consent request to parents;
  3.  A process for recording and reporting a parent’s response to a consent request;
  4.  A process for recording and reporting a parent’s request to revoke consent(s) and to request the deletion of their child’s personal information;
  5.  A process for verification of the parent-child relationship; and
  6.  A process to ensure that only the parent of the child for whom consent is being requested accesses and responds to consent requests.

The AssertID VPC Method is incorporated into a web service called ConsentID, designed to be used by COPPA-covered entities. Once an entity completes the ConsentID self-registration process, the entity can initiate the verifiable parental consent process via an API. Consent requests are then sent to parents through a password-protected parent portal where parents can access and respond to consent requests from multiple entities, review previously granted consents, revoke consents, and request that their child’s information be deleted. Parents are directed to the parent portal via email or other optional notification methods. To verify that the individual providing the consent is in fact the parent of the child for whom consent is being requested, ConsentID creates a unique digital credential for each parent-child pair, which is then assigned a trust score that increases as the parent-child relationship is verified by friends and family. A minimum trust score is required before a parent is permitted to grant or revoke consent for his/her child.
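For readers who want a more concrete picture, below is a minimal sketch of how a trust-score-gated consent flow of this general shape might be modeled. The class name, method names, and threshold are illustrative assumptions for this post only; they are not drawn from AssertID’s actual ConsentID API.

```python
from dataclasses import dataclass, field

# Illustrative threshold only; AssertID's actual minimum trust score is not public.
MIN_TRUST_SCORE = 50

@dataclass
class ParentChildCredential:
    """Hypothetical stand-in for a per-pair digital credential like ConsentID's."""
    parent_email: str
    child_name: str
    trust_score: int = 0                          # grows as friends and family verify the relationship
    consents: dict = field(default_factory=dict)  # operator_id -> True (granted) / False (revoked)

    def add_verification(self, points: int) -> None:
        """Record a friend-or-family verification of the parent-child relationship."""
        self.trust_score += points

    def respond_to_request(self, operator_id: str, grant: bool) -> bool:
        """Grant or revoke consent, but only once the relationship is sufficiently verified."""
        if self.trust_score < MIN_TRUST_SCORE:
            return False            # the consent request stays pending until the pair is verified
        self.consents[operator_id] = grant
        return True

# Usage: an operator's consent request is held until the pair clears the trust threshold.
cred = ParentChildCredential("parent@example.com", "Alex")
assert cred.respond_to_request("kids-game-app", True) is False   # not yet verified
cred.add_verification(60)                                        # relationship vouched for
assert cred.respond_to_request("kids-game-app", True) is True    # consent now recorded
```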

To learn more about the AssertID VPC Method, you can read AssertID’s 85-page application filed with the FTC on June 28. Comments on the AssertID VPC Method may be filed with the FTC online or on paper until September 20, 2013.

UPDATE — The Federal Trade Commission has published its promised COPPA FAQs here.   

 

Volley #1 Trade Associations to FTC:  Please Delay!

The long-awaited amendments to the Children’s Online Privacy Protection Act (COPPA) Rule have been the subject of much discussion and debate. Last week, Federal Trade Commission (FTC) Chairwoman Edith Ramirez received letters from 19 trade organizations, including the Interactive Advertising Bureau, the Application Developers Alliance, the Toy Industry Association, and the Direct Marketing Association, urging the FTC to consider a six (6) month extension of the effective date of the amendments to the COPPA Rule (the “Amendments”), pushing out the effective date from July 1, 2013 to January 1, 2014.

The common concern voiced by these trade organizations in their letters to the FTC is the inability of their members to comply with the Amendments by July 1, since, they claim, the Amendments significantly expand the scope of COPPA and the obligations of covered entities. The Toy Industry Association described compliance with the Amendments by July 1 as a potentially “monumental task.” The Direct Marketing Association noted in its letter to the FTC that the “final amendments released in December 2012 contained several unanticipated material changes from previous versions” that “significantly impact the long standing business model that [companies subject to COPPA] have relied upon in planning the capabilities of their products and services since COPPA’s inception,” and the Application Developers Alliance stated in its letter that “the changes create significant new obligations for app developers and their partners that are still not well understood.” The request to extend the effective date to January 1, 2014 is based on the argument that a longer implementation timeline will give the industry more time to understand the effect of the Amendments, to implement and quality-check the changes necessary to comply (both internally and with respect to third-party relationships), and to assure widespread compliance with the Amendments.

Volley #2 — Consumer Advocacy Groups:  Don’t Delay!

This week, several consumer privacy and children’s advocacy groups — including the Center for Digital Democracy, Common Sense Media, Consumer Watchdog, and the Electronic Privacy Information Center (collectively, the “Advocacy Groups”) — wrote to Chairwoman Ramirez to oppose the compliance delay requested by the trade associations. Noting that the FTC’s process of amending COPPA began in 2011 and included industry participation and input (with the Amendments being issued in December 2012), and that industry has had sufficient time to adjust its business practices and make the necessary changes for compliance, the Advocacy Groups characterized the requested delay as unwarranted and harmful to children. The Advocacy Groups urged the FTC to remain firm on the July 1 enforcement date, as a delay would “undermine the goals of both Congress and the FTC.”

No word from the FTC yet on any of these requests; however, the Commission is expected to release further guidance on compliance with the Amendments in the form of FAQs.

 

Written by Jake Romero

Perhaps we are being cynical, but if we imagine the current conversation between consumers and the makers of mobile payment applications, it would be something along the lines of:

Mobile Payment Industry: “Hello Consumer, would you like to start using your mobile device to transmit payments and make purchases?”

Consumer: “Thank you, but no . . . I have serious concerns about my privacy and the security of my financial and purchasing data. I’m just not comfortable with mobile device payments.”

Mobile Payment Industry: “I see, but did you know that you can use mobile payments to more easily purchase Girl Scout Cookies and Starbucks coffee?”

Consumer: “Mmmm . . . Thin Mints.”

In other words, the willingness of consumers to bargain away an increasing amount of their privacy and accept certain data security risks in exchange for the latest in mobile device services suggests that widespread acceptance of mobile payments is inevitable.  The Federal Trade Commission (the “FTC”), in a Staff Report titled “Paper, Plastic . . . or Mobile?” (the “Staff Report”), agrees.  The Staff Report cites a survey of industry executives in which 83% of those surveyed agree that mainstream consumer adoption of mobile payments will be achieved by 2015, and notes that in the past year, a number of the mobile industry’s largest companies, as well as many start-ups, have taken actions to claim a portion of the mobile payment market.

The main purpose of the Staff Report is to highlight certain primary issues affecting the mobile payment industry that, if not addressed early, could potentially cause great harm to consumers and hinder the industry’s development.  For example, the FTC points out that the payment source that underlies a mobile payment (such as a credit card, debit card, bank account or mobile phone account) can have a significant impact on the potential liability of consumers who wish to dispute fraudulent charges, and in some cases leave consumers with no statutory protection.  The FTC also discusses the difficulties involved in international transactions.  Not surprisingly, however, the bulk of the FTC’s discussion of key issues addresses data security and privacy.

Data Security

A study conducted by the Federal Reserve found that data security, and specifically the theft or interception of financial information, was the reason most cited by consumers who have chosen not to adopt mobile payments.  However, the FTC’s discussion of data security in connection with mobile payments is noteworthy because the FTC argues that if this issue is addressed correctly at the outset, the use of mobile payments could ultimately benefit consumers by allowing for greater security in the transmission of information.

Under a traditional payment regime, the FTC argues, the consumer’s financial information is at some point transmitted or stored in an unencrypted format. In addition, if the information on the magnetic strip of a payment card is acquired, it can be used repeatedly to make purchases. Mobile payments, on the other hand, can implement technology that allows for “end-to-end encryption” (meaning the data remains encrypted at every point in the process) and can utilize an authentication system that generates unique payment information for each transaction (which would prevent thieves from using stolen information for multiple transactions). As a result, mobile payments can not only reduce the likelihood that financial information will be acquired, but also minimize potential losses from stolen or intercepted information.
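To make the second idea more concrete, here is a minimal sketch of a single-use payment token, the sort of per-transaction credential the FTC describes. It is a toy example under our own simplifying assumptions, not a description of any actual provider’s tokenization scheme.

```python
import secrets
from typing import Optional

class TokenVault:
    """Toy issuer of single-use payment tokens (illustration only)."""

    def __init__(self) -> None:
        self._tokens = {}   # token -> reference to the stored payment source

    def issue_token(self, card_reference: str) -> str:
        # Each transaction gets a fresh, unpredictable token instead of the card number itself.
        token = secrets.token_hex(16)
        self._tokens[token] = card_reference
        return token

    def redeem(self, token: str) -> Optional[str]:
        # A token can be redeemed exactly once, so a stolen or replayed token is useless.
        return self._tokens.pop(token, None)

vault = TokenVault()
t = vault.issue_token("card-on-file-1234")
assert vault.redeem(t) == "card-on-file-1234"   # first use succeeds
assert vault.redeem(t) is None                  # a replay of the same token fails
```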

The Staff Report urges the adoption of strong data security measures by all companies in the mobile payment chain, and warns that the industry as a whole may suffer if the lax measures of any one provider result in widespread harm to consumers. The FTC also advocates educating consumers to use certain common-sense security measures, such as password-protecting each mobile device and setting separate passwords to access mobile payment applications.

Privacy

Although the FTC argues that mobile payments, if implemented correctly, could increase data security, there is no denying the adverse effect that mobile payments will have on consumer privacy.  The privacy concerns raised by mobile payments are significant for a number of reasons.  The number of third parties that will have access to the consumer’s information is much greater than in a traditional payment system where only banks, merchants and payment card networks are involved.  As the Staff Report notes, mobile payments potentially involve the aforementioned third parties, as well as operating system manufacturers, hardware manufacturers, mobile phone carriers, application developers and coupon and loyalty program administrators.    Also, mobile payments will likely increase the amount of information that can be collected by each third party involved in the payment process.  Under a traditional point-of-sale process, the FTC argues, merchants and financial institutions receive access to some, but not all, of the information generated by the purchase.  Mobile payments, on the other hand, will generate data that could potentially be broadly collected and consolidated by third party processors.

The FTC’s recommendations for addressing the privacy concerns associated with mobile payments are set forth in its report “Protecting Consumer Privacy in an Era of Rapid Change” :

  • Use a “privacy by design” approach to the development of applications and services.  “Privacy by design” means that a consumer’s privacy has been considered at each step of developing, designing and implementing a website, service or application.  This typically includes providing consumers with warnings when highly sensitive information is being collected, making easy-to-understand resources available that describe what is being collected, and limiting the collection of information to only what is necessary to deliver the product or service (a brief sketch of that last point follows this list).  In mobile applications, privacy by design is particularly important because mobile devices generate a greater amount of sensitive data.
  • Simplify the choices that are presented to consumers in connection with the collection of personal information.  The FTC warned that the solution to privacy issues cannot be to inundate the consumer with lengthy disclosures where the collection of information is obvious, but in all other instances the consumer should be permitted to restrict how and when data is collected by third parties.
  • Increase transparency and educate consumers about the transaction process.  Without greater transparency through meaningful disclosures, the mobile payment industry will be unlikely to win the trust of the general public.
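As a rough illustration of the data-minimization element of privacy by design mentioned in the first bullet, the sketch below filters a payment request down to a declared set of required fields before anything is transmitted. The field names are invented for this example and do not come from the Staff Report.

```python
# Invented field names, for illustration only; real mobile payment requests will differ.
REQUIRED_FIELDS = {"payment_token", "amount", "currency", "merchant_id"}

def minimize(raw_request: dict) -> dict:
    """Keep only the fields needed to complete the transaction."""
    return {key: value for key, value in raw_request.items() if key in REQUIRED_FIELDS}

raw = {
    "payment_token": "tok_abc123",
    "amount": "4.00",
    "currency": "USD",
    "merchant_id": "corner-coffee-shop",
    "contacts": ["..."],                   # not needed to complete the purchase
    "precise_location": "42.36,-71.06",    # not needed to complete the purchase
}
assert set(minimize(raw)) == REQUIRED_FIELDS
```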

It is worth noting that the FTC’s recommendations for addressing data security and privacy in mobile payments follow a strategy similar to the FTC’s recent recommendations regarding mobile applications  in that the FTC is advocating for a combination of industry self-regulation and consumer education.  This approach suggests that the FTC believes that the development and adoption of technology moves too quickly to rely solely, or even primarily, on a statutory framework.  Knowing how much you love Girl Scout cookies, the FTC is likely correct.

As we continue our “new year, new look” series into important privacy issues for 2013, we boldly predict:

Regulatory Scrutiny of Data Collection and Use Practices of Mobile Apps Will Increase in 2013

Mobile apps are becoming a ubiquitous part of the everyday technology experience.  But consumer apprehension over data collection and personal privacy with respect to mobile applications has been growing.  And as consumer apprehension grows, so does regulatory scrutiny.  In 2012, the Federal Trade Commission (FTC) offered guidance to mobile app developers to “get privacy right from the start.”  At the end of 2012, the California Attorney General’s office brought its first privacy complaint against Delta Air Lines, Inc., alleging that Delta’s mobile app “Fly Delta” failed to have a conspicuously posted privacy policy in violation of California’s Online Privacy Protection Act.  Also in December, SpongeBob SquarePants found himself in the middle of a complaint filed with the FTC by a privacy advocacy group alleging that the mobile game SpongeBob Diner Dash collected personal information about children without obtaining parental consent.

In 2013, we expect to see new regulatory investigations into the privacy practices of mobile applications.  Delta was just one of 100 recipients of notices of non-compliance from the California AG’s office, and the first to be the subject of a complaint.  Expect to see more complaints filed early this year as the AG’s office plows through responses from the lucky notice recipients.  We can also expect to hear more from the FTC on mobile app disclosure of data collection and use practices, and perhaps some enforcement actions against the most blatant offenders.

Recommendation for action in 2013:  Take a good look at your mobile app and its privacy policy.  If you have simply ported your website privacy policy over to your mobile app – take another look.  How is the policy displayed to the end user?  How does the user “accept” its terms?  Is this consistent with existing law, such as California’s Online Privacy Protection Act, and does it follow the FTC guidelines?

 

 

Written by Amy Malone

The Center for Digital Democracy (CDD) filed a complaint yesterday asking the Federal Trade Commission (FTC)  to investigate violations of the Children’s Online Privacy Protection Act (COPPA) by Nickelodeon and mobile app-maker PlayFirst.

The CDD alleges that the mobile game SpongeBob Diner Dash collects personal information about children, including full names, e-mail addresses and online identifiers, without obtaining parental consent, as required under COPPA.

The online identifiers collected by the app, such as unique device identifiers and device tokens, allow the app to track a child’s behavior and send the child push notifications.  These types of online identifiers are considered personal information under COPPA.

This is the second complaint the CDD has filed with the FTC concerning COPPA violations in the last two weeks.  The previous CDD complaint asked the FTC to investigate the Mobbles app for gathering and sharing the precise location of children without obtaining parental consent.  Both apps were quickly ripped from the app store following the complaints.

In a letter accompanying its new complaint, the CDD states that SpongeBob Diner Dash and Mobbles are representative of widespread disregard of COPPA requirements.  Earlier this month, the FTC reported that of the 400 children’s apps it surveyed, 59% shared geolocation, device ID or phone numbers with developers or third parties, while 80% failed to disclose any information about their privacy practices, including whether parental consent was required.

Proposed amendments to COPPA have been the subject of much discussion and comment over the past year.  According to a release from the Federal Trade Commission today, the final amendments are scheduled to be released tomorrow in a press conference at noon.   The press conference can be viewed live via webcast at the US Senate Commerce Committee website.

Written by Amy Malone

Amid the chatter regarding the elections, the Berkeley Center for Law and Technology recently released a report on Americans’ views of “Do Not Track.”  The report found that 87% of the 1,200 people surveyed had never heard of Do Not Track.  The Do Not Track regulations that privacy groups began advocating for more than five years ago call for an opt-out mechanism that would allow people to opt out once from all behavioral advertising.  The Federal Trade Commission supported this measure, and in 2010 it offered testimony to Congress stating that industry self-regulation had fallen short and that legislation may be the best course for a uniform approach.

While the government, privacy advocates and the ad industry sort out the details, it’s time to focus on users.  The survey report demonstrates a clear need for user education: almost half of those surveyed answered “don’t know” to survey questions about online tracking and the sharing of collected information, and those who did answer often gave the wrong answers.  For example, almost half of those surveyed said that if a company wants to follow your internet use across multiple websites, it must first obtain your permission.  In addition, 25% of those surveyed thought they had a right to require websites to delete the information the websites hold about them, and 22% thought permission was needed for advertisers to track them on medical websites such as WebMD.

In addition to highlighting participants’ general misunderstanding of online tracking, the survey also shed light on what participants want from a Do Not Track mechanism.  When asked what they wanted Do Not Track to do, 60% of participants responded that they want it to prevent websites from collecting information about them, while 20% said they want it to stop all ads and 14% said they want it to prevent websites from sending targeted advertisements.

The survey report shows a disconnect among what users want (not to have information collected about them), what is actually happening (information is being collected about them), and what users think is happening (that privacy laws limit the collection and use of information about them).  The report concludes by promoting a revised approach to tracking and targeted advertising, endorsing a model proposed in 2010 that allows highly targeted ads without creating databases of internet behavior held by third parties.

Written by Amy Malone

This week the Federal Trade Commission released a best practices guide that outlines how companies using facial recognition can protect consumer privacy.  The Commission continued to push the “privacy by design” model that it first promoted in its March 2012 report, “Protecting Consumer Privacy in an Era of Rapid Change.”

The key areas targeted in the report are facial recognition technologies used on social networking sites, digital signs, and sites that allow people to upload pictures and use them to simulate changes in their appearance, like a haircut.  In each area, the Commission stressed the privacy by design pillars of data security, transparency, consumer choice and data retention limits, and gave examples of how companies could incorporate privacy by design.  For example, stores using digital signs to snap pictures of customers could post information at the entrance telling customers that photos may be taken in the store, so customers can decide whether or not they want to enter.  And social networking sites can inform users of how uploaded images will be used and give users a chance to opt out of certain features.  For instance, most members of Facebook upload images to share with friends and family (the Commission reported that in one month 2.5 billion photos were uploaded to Facebook) and do not realize they are freely contributing to the largest facial recognition database on the planet.  The Commission suggested letting users opt out of being part of that database.

The report also stressed the importance of obtaining “affirmative express consent” in at least two situations: (1) before using biometric data in a way that materially differs from the purpose for which it was collected and (2) before using the data to identify anonymous images.  The goal of obtaining affirmative consent is to prevent consumers from being blindsided by unintended uses of their information, such as your Facebook photo being used by a mobile app to reveal your identity to a stranger who shot a picture of you in a restaurant.  It sounds like something out of The Matrix, but it could happen today if the app had access to Facebook’s database of billions of identified pictures.

Despite pushing the importance of these practices, the Commission pointed out that it encourages self-regulation and that this report does not serve as a “template for law enforcement actions or regulations under laws currently enforced by the FTC.”

Although the report claims that it does not serve as a template, it may draw attention to the issue and help legislators like Senator Al Franken gather support to pass legislation limiting the use of facial recognition and other biometric technologies.