Archives: Mobile Privacy

On the Ninth Day of Privacy, my true love gave to me… a tracking device in my car… she is now my ex-true love…

Written by Jonathan Cain

A year ago, privacy and data security issues in the media were all about credit cards and identity theft.  Concerns about privacy related to location data were, at least among the general public and Congress, somewhere in a galaxy far, far away.  Users of mobile devices had relatively few complaints about the scraping, aggregation and sale of location data, if they were even aware that it was occurring.

What a difference a year makes.

Continue Reading On the Ninth Day of Privacy, my true love gave to me….

Our series last year was a reader favorite, so we decided to put our prognosticator hats on again and present The 12 Days of Privacy, v.2014.

 

Rather than look back at 2014, starting tomorrow, the Privacy & Security blog will count down The 12 Days of Privacy, looking ahead to what we might expect in 2015 and what we might be talking about in the year to come.

Don’t miss a day starting tomorrow!

Day One – 12/9 – Does Santa Claus Have to Comply with EU Data Protection Laws: 2015 Compliance Considerations for Non-EU Companies

Day Two – 12/10 – Through the Looking Glass: Privacy Litigation

Day Three – 12/11 – What the 2015 Proxy Season Might Bring…

Day Four – 12/12 – Cyberliability Policies: What to Expect in 2015

Day Five – 12/15 – California Dreaming … New Legislation Effective January 1

Day Six – 12/16 – Hacks and the State Actor: What Sony Portends…

Day Seven – 12/17 – Questions of Authority: Who is “the cop” on the Privacy and Data Security Beat?

Day Eight – 12/18 – Health Data Sharing: How much is too much?

Day Nine – 12/19 – OCR Corrective Action Planning in 2015: The Gift That Keeps on Giving

Day Ten – 12/22 – Wearables: What will that new gadget be spilling about you?

Day Eleven – 12/23 – ISO and the Courts: How Your Coverage is Likely to Narrow in 2015 (and why…)

Day Twelve – 12/24 – On the Twelfth Day…

 

Join us each day as we celebrate the 12 Days of Privacy, v.2014!

Written by Stephanie D. Willis

As the world recovers from the excitement leading up to Tuesday’s Apple Live Event announcement of the new iPhone 6 and Apple Watch, mobile app developers are chomping at the bit to create software that leverages the new operating system and Apple’s widely anticipated “HealthKit,” a purportedly secure platform that allows mHealth apps to share users’ health and fitness data with the new Health app and with each other.  In fact, according to some reports, more than 300 apps have been created per day in recent years.  But because the mobile app market is supersaturated, the quantity of available mobile apps does not translate into quality, secure apps appropriate for use at an organization with a high privacy and security risk profile.  The draft Technical Considerations for Vetting 3rd Party Mobile Applications (the Vetting Report), issued by the National Institute of Standards and Technology (NIST) in August 2014, is an essential document for any organization seeking to weed out the mobile apps that may create unnecessary IT risks.

Continue Reading NIST Issues Draft Report Enumerating Risks and Protections to Consider When Evaluating Mobile Apps for Your Enterprise

Welcome to the first Monday in April.

Our Privacy Monday is a report on the Federal Trade Commission’s latest privacy notice-related settlements with Fandango and Credit Karma.  These settlements should be reviewed by any company with (or planning to have) mobile applications, and they reinforce our mantra:  Say what you do, and do what you say.  And make sure you know what that is.

Stop Phoning it In on Mobile Security:  What Your Business Needs to Know About the FTC Settlements with Fandango and Credit Karma 

 

Of all the “Days of Privacy” looking forward to 2014, we believe that the issues surrounding mobile applications and privacy will see some of the most intense regulatory focus …. read on, and be prepared….

Written by Jake Romero, CIPP/US

One could argue that the guiding principle behind the exponential growth and pervasive influence of the mobile application industry has been that it is limited only by our imagination’s ability to identify tiny new problems to solve, like how to pick the ripest watermelon and how to be a less terrible person while using Google+ video chat.  And yet, 2014 could be a year in which regulatory and market forces are as notable as the technological innovations, as increased enforcement, greater consumer awareness and compliance difficulties unique to mobile applications create a “perfect storm” of potential liability.

In the past few months, state and federal regulatory agencies have continued to publish policy rationales for directing resources toward mobile application privacy.  In February, the Federal Trade Commission released a Staff Report focused entirely on mobile technology, in which the FTC made it clear that all parties involved in the mobile industry share responsibility for ensuring that appropriate consumer protections are in place.  California released a similar report after laying groundwork for more than a year to ensure that California’s data privacy regulations, which are among the toughest in the nation, would apply to mobile applications.  Both reports hinted strongly at increased enforcement, and in the past few weeks we have been given a clearer view of what that enforcement will look like.

The office of New Jersey Acting Attorney General John J. Hoffman announced a settlement agreement with California-based mobile application developer Dokogeo, Inc. in connection with its “Dokobots” application.  The Dokobots app was a scavenger hunt game that utilized geolocation data to direct users toward animated cartoon Dokobots and other digital items.  The state alleged that Dokogeo’s collection of information through its mobile application violated the federal Children’s Online Privacy Protection Act (COPPA) and the Federal Trade Commission’s COPPA Rule because the Dokobots application was directed to children and failed to obtain verifiable parental consent prior to the collection of personal information from children.  Dokogeo also allegedly failed to link to its privacy policy on its homepage so that parents and other users would be able to find information regarding Dokogeo’s data collection practices.  Under the terms of the settlement agreement, Dokogeo is required to (i) clearly and conspicuously disclose, in its mobile applications and on the homepage of its website, information regarding its collection, disclosure and use of information, (ii) verify that all persons using any of its mobile applications that collect personal information are over the age of 13, (iii) remove certain information regarding children from its website and (iv) otherwise comply with the requirements of COPPA and the COPPA Rule as they pertain to online services directed to children.  The settlement also includes a suspended settlement payment of $25,000, which Dokogeo will be required to pay if it fails to comply with the terms of the settlement or otherwise violates COPPA or New Jersey’s Consumer Fraud Act at any point during the 10-year period following the date of the settlement.

Less than a month later, the FTC announced a proposed consent agreement with Goldenshores Technologies, LLC.  Goldenshores Technologies marketed “Brightest Flashlight Free,” a free mobile application that, according to the proposed consent agreement, has been downloaded tens of millions of times.  The FTC alleged that Goldenshores Technologies engaged in unfair and deceptive practices by failing to disclose that certain data, including geolocation data and persistent device identifiers, would be collected by the application and shared with third parties.  The FTC also alleged that the collection of data by the application commenced prior to the user’s acceptance of the app’s end user license agreement.  Under the terms of the proposed consent agreement, Goldenshores Technologies is required to, among other things, (i) update its disclosures with respect to the collection, use and disclosure of information, (ii) specifically disclose how geolocation information is used, why it is collected and with whom it is shared, (iii) delete personal information of users collected prior to the date of the consent agreement and (iv) maintain, for a period of 5 years following the date of the consent agreement, certain advertising and promotional materials containing representations about data collection, user complaints and inquiries, and documentation showing compliance with the consent agreement.

The varying allegations in the Dokogeo and Goldenshores Technologies agreements highlight the difficulties mobile applications face in complying with state and federal regulations.  On one hand, the extremely personalized nature of data collected by mobile phones mandates heightened protections and disclosure.  On the other hand, the complex and multi-layered support structure for most mobile applications (not to mention the smaller screen size) can make it difficult to fully describe the extent to which data is shared with third parties, and can create unforeseen security risks.  One recent report, for example, found that the majority of mobile applications are vulnerable to hackers because of serious security flaws related to a combination of over-collection of personal data and incorrect implementation of encryption measures, while another report found vulnerabilities in apps that access data using public Wi-Fi networks.  In addition, the application of certain sector-specific laws, such as COPPA in the Dokogeo matter, is a particular risk for mobile applications because animated characters and kid-friendly themes (both of which were a factor in the Dokogeo settlement agreement) are commonly used by mobile applications to entice adults.
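To make the “incorrect implementation of encryption measures” point more concrete, the sketch below contrasts a risky pattern with a safer one in Python. The endpoint URL, the pinned CA-bundle path and the choice of the requests library are illustrative assumptions on our part, not details drawn from the reports cited above.

```python
# Illustration of one common encryption flaw class: disabling TLS certificate
# verification, which leaves app traffic open to interception on untrusted
# networks such as public Wi-Fi. All names below are placeholders.
import requests

API_URL = "https://example.com/"  # hypothetical endpoint for demonstration only

# Risky pattern sometimes shipped just to "make the connection work":
# requests.get(API_URL, verify=False)  # accepts any certificate -- avoid

# Safer pattern: verify against the default trust store, or pin a known CA bundle.
response = requests.get(API_URL, verify=True, timeout=10)
# response = requests.get(API_URL, verify="/path/to/pinned_ca_bundle.pem", timeout=10)
print(response.status_code)
```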

At the same time, we have seen an increase in media coverage of mobile data privacy issues.  The information leaked by Edward Snowden has kept stories related to collection of personal information squarely on the front pages, and over Black Friday there were numerous stories describing the use of mobile device tracking by retailers.  On the editorial pages, concerns persist that consumers do not realize how much private information is being collected by smartphones.  As a result, a substantial increase in the number of class action claims against mobile applications in 2014 is not merely possible, but likely.

With that in mind, on this 8th day of privacy, here are 8 steps you can take to help reduce liability in connection with your mobile application or online service in 2014:

  1. Reassess your security measures. 2014 will bring new data breach notification requirements.  If your business is newly subject to any such requirements, understanding your risk profile will require a fresh look at how secure your system is.
  2. Understand how your customers’ information is used and shared. Providing full and accurate disclosure to users requires understanding who you share information with and how those third parties use and share the information.  This includes sharing with service providers in ways that may otherwise be considered “routine” in your industry. Also, be sure that you understand how those third parties protect the information you share.
  3. Not just “how” and “what,” but “why.”  The requirements under the Goldenshores Technologies consent agreement show that it isn’t enough to simply disclose that information may be collected.  Effective notice requires that users be informed why information is being collected.
  4. Consider deleting what you don’t need.  The easiest way to reduce your risk profile is to limit what you collect and retain. Consider putting processes in place to collect only what your service requires and to delete information that you no longer need, such as information related to closed accounts (a rough sketch of this kind of retention cleanup appears after this list).
  5. Consider context, not just consent.  As we discussed after New York Comic Con was criticized for hijacking attendee Twitter accounts, the expectations of users regarding data collection should be considered as a separate issue from obtaining user consent.  In considering whether to bring an enforcement action, it is likely that the FTC considered how many of Goldenshores Technologies’ flashlight users would be surprised to learn how much geolocation data the “free” flashlight required.
  6. Consider your audience.  Whether a service is “targeted to children” may seem like a simple concept generally, but it can be difficult to apply to specific examples, particularly in the realm of games and entertainment.  As we describe in our guide to compliance with the amended COPPA Rule, there are a number of factors that should be considered when determining whether a Web site or online service or portion thereof is directed to children.
  7. Have a plan for when the worst occurs.  Data breaches are considered “one of the unfortunate realities of doing business today”.  The moment when you discover there has been a data breach is not the time to figure out your plan for what to do when you have a data breach. There’s no time like the present to put a game plan in place that can be used in the event of an emergency.
  8. Subscribe to our blog. Throughout the year we’ll continue to cover ongoing issues related to data privacy and security. Let your Mintz Levin privacy & security team help you keep up with the most recent developments.
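As a purely illustrative follow-up to step 4, here is a minimal Python sketch of a scheduled retention cleanup. The SQLite database, the accounts table, its column names and the 180-day window are all hypothetical assumptions; any real retention period and deletion scope should be set with counsel and mapped to your actual systems.

```python
# Hypothetical retention-cleanup job for data tied to closed accounts.
# Table and column names, storage engine and retention window are assumptions
# made for this sketch, not a prescribed approach.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 180  # assumed retention window; set per your own policy

def purge_closed_accounts(db_path: str) -> int:
    """Delete records for accounts closed longer ago than the retention window."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM accounts WHERE status = 'closed' AND closed_at < ?",
            (cutoff,),
        )
        conn.commit()
        return cur.rowcount  # purged record count, useful for an audit log

if __name__ == "__main__":
    print(purge_closed_accounts("example.db"), "closed-account records purged")
```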

 

Written by Amy Malone, CIPP/US
In 2013 geolocation and biometrics were hot topics.  Apple included a fingerprint reader on the new iPhone, which was either really cool or an epic fail depending on your viewpoint, and we learned that Google and the NSA are tracking our every move.

While Edward Snowden’s revelations may have been eye opening (and headline-grabbing), the government has long been first in line to develop and use technology like geolocation and biometrics.  Homeland Security insists that biometrics are essential in national defense – identify and stop the bad guys.  The feds have also pushed biometrics in immigration reform bills for over a decade and continue to push that legislation forward.  And your location?  Well, law enforcement has been conducting warrantless geolocation tracking for years!

States have also been active in this area, passing legislation to allow the storage of the high-resolution photos they take of you at the DMV in a searchable database.  Many states allow federal and state law enforcement officials to search those databases.  Most legislation is aimed at limiting government use of this information, but the winds may be shifting…

Biometrics

Currently, no federal law limits a private entity’s ability to collect, use or disclose biometric information.  Cybersecurity has been a hot button issue over the last few years and legislation has been introduced, but no legislation regarding private use of biometric data has been passed.  The Cyber Privacy Fortification Act has been introduced a few times and was reintroduced in March.  This legislation could be passed in 2014; it would require covered entities to provide notice to the FBI or the United States Secret Service of “major” security breaches of “sensitive personally identifiable information,” which by definition in the legislation includes unique biometric data.

Despite the current lack of legislation specifically regulating private-sector use of biometric data, legislators are definitely paying attention to this area.  Senator Franken has repeatedly taken aim at the use of biometrics, recently questioning Apple about its use of fingerprint readers on the iPhone and urging the Department of Commerce to develop best practices for facial recognition technology.  The National Telecommunications and Information Administration responded to Franken’s request by announcing the kick-off of a privacy multistakeholder process to implement the Consumer Privacy Bill of Rights in the field of facial recognition.

With Senator Franken pushing and the multistakeholder process moving forward, there’s a good chance we will see new legislation aimed at regulating biometric information in 2014.

As this technology has flowed into our everyday lives, we’ve seen some states take action by regulating the collection and use of biometric information.  Both Illinois and Texas have laws restricting a private entity’s use and disclosure of biometric information, and several other states have laws governing the disposal of biometric information.  A few states also include biometric data in their definition of “personal information” and require notice to data owners in the event of a data breach involving that information.

In 2014 Alaska may pass its proposed House Bill No. 144, which is similar to the laws in Illinois and Texas.  The bill would require covered entities to provide notice and obtain written consent from individuals prior to the collection of their biometric information, and it would provide for an individual cause of action.  It would not be a surprise to see other states move forward in the biometric regulation area in 2014.

Geolocation

With the advent of smartphones came the love-hate relationship with geolocation.  We love when Siri gives us the name of a great restaurant that is up the street, but we are creeped out when we discover she’s been tracking our every move, even when we aren’t trying to locate that hip hangout.

As with biometrics, the government has been all over geolocation technology for some time now, and courts are playing catch-up.  The big question today is whether police need warrants to obtain the location information of suspects.  Decisions around the country have been all over the map.  In July the New Jersey Supreme Court overturned an appellate decision and ruled that the use of cell phone location information obtained by police from a wireless provider without a warrant violates the suspect’s rights under the New Jersey Constitution.  It’s possible that in 2014 the US Supreme Court will take this matter up for review.

Most legislation in this area has focused on limiting the government’s ability to collect and use geolocation information.  The Geolocation Privacy and Surveillance Act was reintroduced in 2013; the bill would require government agencies to obtain a warrant for geolocation information in the same way they currently obtain warrants for wiretaps.

On the state level, both Maine and Montana have laws requiring law enforcement agencies to get a warrant before they can obtain location information of an electronic device.  Texas, Maryland, Ohio, Colorado, California, and Illinois introduced similar bills this year, and we expect to see more state legislative activity in this area in 2014.

In the private sector, geolocation is an exploding industry.  In an attempt to compete with online competitors (who can easily track your every move), brick-and-mortar retailers use geolocation tracking via your mobile device to gather specific information on your shopping habits – like how long you stayed in the store, whether you went to the register, how long you waited in line and where the store hotspots are located.  In 2013 we saw this type of tracking blow up in Nordstrom’s face, but that did not stop Apple from rolling out its iBeacon in its own company stores in the U.S., or Macy’s from piloting the iBeacon technology in a few of its stores this holiday season.  We expect that 2014 will bring more new and creative technology to retailers, who will use it to find new ways to find us – and to monetize mobile location information.

Mobile app providers are also trying to get your geolocation information to improve their bottom line.  The New Year rings in with Twitter tapping into its location data.  Twitter just entered into an agreement with a provider of location intelligence technology, which Twitter will use to support location sharing in tweets.  A news source reports, “Twitter will have an option to combine that location data for tweets with buying patterns, behaviors, preferences and influencers, and cross-reference it with nearby stores or other mobile users within an individual’s social network. It uses a smartphone’s GPS signal to pinpoint a location.”

Although we have not seen laws regulating the private sector’s collection of geolocation information, we blogged recently about the release of the Mobile Location Analytics Code of Conduct.  The Code is a self-regulatory framework of seven principles for services provided to retailers by mobile location analytic companies.

If a voluntary framework doesn’t ease your worried mind, maybe an app to block location tracking will?   Android users can now download an app  to do just that!

 

Written by Jake Romero

If you’ve ever dealt with that pushy salesperson at Bed Bath & Beyond who won’t take your word for it that you’re just browsing and not ready to commit to a high-end home espresso machine, you know that being followed around at a retail store can be unsettling and intrusive. “Unsettling” and “intrusive” are also the words that Senator Charles Schumer used to describe using mobile phones to track customer movement, a practice that an increasing number of retail outlets are beginning to implement. In response to an increase in scrutiny over the past few months, companies that enable tracking of customers through Wi-Fi-enabled smartphones have published a code of conduct to help bolster transparency and customer data security.

In March the New York Times profiled Euclid Analytics, which collects mobile location analytics (“MLA”) data for approximately 100 customers, including Nordstrom and Home Depot. According to Euclid’s CEO Will Smith, in some cities between 40% and 60% of users can be tracked in this manner. The information provided can include how long a customer was in the store, which parts of the store the customer visited, or whether the customer walked by the store but declined to go in.

In July, Senator Schumer authored a letter to Federal Trade Commission Chairwoman Edith Ramirez, asking that the FTC investigate the practice of consumer tracking as an unfair and deceptive trade practice if a retailer fails “to notify shoppers that their movements are being tracked in a store or to give them an opportunity to opt out” of being tracked.

Now, in an effort to calm concerns and avoid potentially onerous regulations, 8 of the 10 major MLA firms have agreed to abide by a code of conduct. The Code of Conduct will place restrictions on MLA firms, as well as the retailers who use their services. Except in certain cases where data is aggregated or not unique to the individual, companies that utilize MLA technology will be required to notify consumers that their data is being collected, and provide information about the use of the information and the company collecting it. MLA companies will be required to either limit data collection to non-unique or aggregated data, promptly de-identify personal data or obtain the consumer’s prior consent. Although the Code does not require consumers to opt in to MLA tracking, MLA companies that collect unique or personal data will be required to allow consumers to opt out through a central site that will be effective across all participating MLA companies. Additional restrictions in the Code further limit the use, transfer and retention of MLA data.
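For readers curious what “promptly de-identify personal data” might look like in practice, here is a minimal sketch of hashing a Wi-Fi device identifier before it is stored. The keyed-hash approach, the salt handling and the record fields are assumptions made for illustration; the Code itself does not prescribe a particular technique.

```python
# Illustrative de-identification of a Wi-Fi device identifier (e.g., a MAC
# address) so that the raw identifier is never retained. All names and the
# hashing scheme are assumptions for this sketch.
import hashlib
import hmac
import os

# In production the secret key would live in a managed key store and be rotated;
# a random value is generated here purely for demonstration.
SECRET_KEY = os.urandom(32)

def deidentify_mac(mac_address: str) -> str:
    """Return a keyed SHA-256 hash of a normalized MAC address."""
    normalized = mac_address.strip().lower().replace("-", ":")
    return hmac.new(SECRET_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# Example analytics record that stores dwell time without the raw identifier.
record = {
    "device": deidentify_mac("AA:BB:CC:DD:EE:FF"),
    "dwell_seconds": 240,
    "zone": "entrance",
}
print(record)
```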

Although the Code is a voluntary framework, its widespread adoption could help to establish an industry standard that would make it easier for regulators like the FTC to treat the collection and use practices of non-adopting firms as unfair. In the meantime, with these guidelines in place you can focus on more important things when using your mobile device in a mall — like whether the mall has a fountain.

 

Privacy tidbits and bytes for this Monday —

App Developers – Put this on your calendar!

Now that the US government shutdown is over, the Federal Trade Commission (FTC) has announced its participation in a workshop with the Application Developers Alliance and the California Attorney General’s office on best practices for mobile app privacy.  The Mobile Privacy Summit will be held on Wednesday, October 23 in Santa Monica, California.  The agenda includes a day of panels to help mobile app developers understand industry best practices, regulatory requirements, and the role and responsibility of publishers, platforms and advertisers to ensure the privacy of mobile application users.   You have been invited.

Manitoba Adds Breach Notification Requirement

Soon, we will have to expand the Mintz Matrix to include Canadian provinces with provincial data breach notification requirements.  Manitoba has joined Alberta, recently passing the Personal Information Protection and Identity Theft Prevention Act (PIPITPA – try saying that 3 times fast…).  PIPITPA includes a rather broad breach notification obligation requiring an organization to notify an individual when personal information “in its custody or under its control” is stolen, lost or accessed in an unauthorized manner.  Notice is not required if it is “not reasonably possible for the personal information to be used unlawfully.”  Unlike in Alberta, this is not a “real risk of significant harm” test, and notice must be given directly to individuals.  In Alberta, notice is required to be given to the privacy commissioner, who then makes the determination on individual notice.

EU Set to Vote Today

European Union lawmakers are set to record their final vote today on potentially sweeping changes to data protection.  We will update you when we have more information.  While you are waiting, read our Privacy blog post from Friday.

 

(LONDON) Who is on the ICO’s radar these days?  August seems to be the month for getting new guidance documents out the door at the United Kingdom’s Information Commissioner’s Office.  The UK ICO has just published guidance as to when it is likely to take regulatory action.

The new guidance should be reassuring to companies that are making good faith efforts to comply with the UK’s data protection laws.  Companies that haven’t yet engaged fully with the data protection laws, on the other hand, would be well advised to review the regulatory action guidance, which (along with the ICO’s other guidance documents) puts the law into practical context.

The ICO’s guidance states that regulatory action is likely to be triggered by:

  1. Issues of public concern (including those in the media).
  2. The novel or intrusive nature of specific data processing activities.
  3. Complaints made by the public to the ICO.
  4. Issues that emerge from the ICO’s other activities (such as audits).

Interestingly, the ICO has said that it is less likely to take action where market forces are likely to put pressure on data processors to comply with the data protection laws.  This pro-market approach distinguishes the ICO from other EU data protection regulators, who typically take a more skeptical view of the free market’s effectiveness in incentivizing companies to protect personal data.

By way of contrast, the ICO notes that the public sector may require more regulatory action since public sector data protection practices are less transparent, individuals have less choice as to their relationship with public sector data collection and processing, and the nature of the data being processed is frequently more sensitive (such as health data).

The ICO’s current priority areas are:

  • Health
  • Criminal justice
  • Local government
  • Online and mobile services

Three out of the four current priority areas are largely served by the public sector, but the for-profit sector should also stay alert:  The ICO’s enforcement notices page lists Glasgow City Council right next to Google.

Written by Amy Malone

Digital marketing company PulsePoint entered into a Consent Order with the New Jersey Attorney General and agreed to pay $1 million, following an investigation of claims that PulsePoint bypassed privacy settings of Apple’s Safari browser to allow tracking of consumer activity.

Last year, Google settled similar claims with the Federal Trade Commission for $22.5 million (see our blog post here).  The allegations against PulsePoint mirror those that the FTC brought against the search engine giant:  the NJ AG’s complaint alleged that PulsePoint placed cookies on Apple Safari web browsers without the knowledge or consent of New Jersey consumers.  PulsePoint allegedly did this by bypassing privacy settings that were chosen by Safari users.  The Safari settings allow users to choose to “always” accept cookies, “never” accept cookies, or accept cookies only from “sites I visit,” blocking cookies from third parties and advertisers.

According to the complaint, PulsePoint circumvented the user settings by using a form that made the Safari browser act as if the user had clicked on the advertisement, when in fact the user had not.  Once the form was sent, the Safari browser allowed PulsePoint to set its cookies on the browser, even when the user had opted to block cookies.

This activity occurred between June 2009 and February 2012, and in the press release the state claims that PulsePoint may have placed up to 215 million targeted ads on the browsers of New Jersey consumers.

The $1 million settlement includes (1) a civil penalty of $556,196.96, (2) reimbursement of the state’s attorneys’ fees in the amount of $32,048.00, (3) reimbursement of the state’s investigative costs in the amount of $1,755.04, (4) a payment of $150,000.00 to be used in the state’s discretion for the promotion of consumer privacy programs and (5) a payment of $250,000.00 to be used by the state for in-kind advertising services.

PulsePoint also agreed to, among other things, implement numerous privacy controls and procedures to protect the privacy and confidentiality of consumer information.  PulsePoint agreed not to override or change a consumer’s browser settings without her affirmative consent, and it must provide information on its website explaining what information it collects and how it uses that information.

The future may be a difficult one for PulsePoint, as more state attorneys general may launch their own investigations.