Data Compliance & Security

As our readers know, we maintain a summary of U.S. state data breach notification laws, which we refer to as the “Mintz Matrix.” Our latest update is available here, and it should be part of your incident response “toolbox” and planning.

During 2016, amendments to breach notification laws in five states went into effect (California, Nebraska, Oregon, Rhode Island and Tennessee). By the end of last year, well over twenty states had introduced or were considering new regulations or amendments to their existing security breach laws. We expect significant regulatory activity in the data security space to continue during 2017. As always, we will keep you abreast of changes and will release updated versions of our Mintz Matrix to keep pace with developments in the states.

We are keeping an eye out for signs of support for a national breach notification law.  So far, there does not appear to be much political motivation for undertaking this effort.  A key sticking point is anxiety among a number of states that a federal law would offer less protection than their existing state law.  This is a valid concern since a national standard will only alleviate the significant burden of complying with the present patchwork of state laws if it has broad pre-emptive effect.  Only time will tell if state and federal lawmakers can work together to develop a comprehensive nationwide regime for security breach notification and remediation.

In the meantime, we must keep tabs on the forty-seven states (along with the District of Columbia, Guam, Puerto Rico and the Virgin Islands) with their own security breach laws.  Here is what’s been happening since our previous update in the Fall:

 California

California amended its security breach law to require disclosure to affected residents (and to the Attorney General if more than 500 Californians are affected) when encrypted personal data is acquired by an unauthorized person together with an encryption key or security credential that could render the personal data readable or usable.

We note also that former Congressman Xavier Becerra recently took over as Attorney General in California, replacing Kamala Harris who aggressively pursued regulation in the privacy arena during her tenure as AG and who now serves California as one of its U.S. Senators.  Given this change in leadership, it will be interesting to see if the state continues to be a leader in pushing for stringent data security and privacy measures at the state and federal level.

 Illinois

Last summer Illinois passed an amendment to its Personal Information Protection Act (“PIPA”) that significantly broadened protections for personal information and the obligations imposed on businesses that handle such data.  The amendment became effective on January 1, 2017 and made several key changes to PIPA:

  • Definition of Personal Information. PIPA’s definition of “personal information” has now been expanded to include medical information, health insurance information, and unique biometric data used for authentication purposes (examples cited in the statute are a fingerprint, retina or iris image, or unique physical representations or digital representations of biometric data). The amended definition also encompasses a user name or email address in combination with a password or security question and answer that would permit access to an online account when either the user name or email address, or password or security question and answer, are not encrypted or redacted.
  • Encryption Safe Harbor. While PIPA already provided a safe harbor for data collectors if data disclosed due to a security breach was fully encrypted or redacted, the amendment clarifies that the safe harbor does not apply if the keys needed to decrypt, unredact, or otherwise read the compromised data have also been acquired in connection with the security breach.
  • Nature of Notification. For security breaches involving a user name or email address in combination with a password or security question and answer, data collectors may now provide notice in electronic or other form to affected Illinois residents. Such notice must direct individuals to promptly change their user name or password and security question and answer, or to take other appropriate steps to protect all online accounts for which the affected resident uses the same user name or email address/password or security question and answer. The amended statute also provides an additional option for substitute notice when residents affected by a security breach are confined to one geographic area.
  • New Exemptions. The amendment added an exemption for data collectors who meet their obligations under applicable provisions of the Health Insurance Portability and Accountability Act (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act (“HITECH”). Any data collector that provides notice of a security breach to the Secretary of Health and Human Services pursuant to its obligations under HITECH must also provide this notification to the Illinois Attorney General within five business days of notifying the Secretary. This exemption will primarily apply to certain entities operating in the healthcare space. The amended statute also deems financial institutions subject to applicable provisions of the Gramm-Leach-Bliley Act in compliance with PIPA’s data security requirements.
  • Security Requirements. Beyond addressing breach notification, the amendment requires covered entities to implement and maintain reasonable security measures to protect records containing personal information of Illinois residents and to impose similar requirements on recipient parties when disclosing such personal information pursuant to a contract. The amended statute also requires state agencies to report security breaches affecting more than 250 Illinois residents to the Illinois Attorney General.

 Massachusetts

For those information junkies out there: the Office of Consumer Affairs and Business Regulation (the “OCABR”) in Massachusetts has created a public web-based archive of data breaches reported to the OCABR and the Massachusetts Attorney General since 2007. The data breach notification archive is available at www.mass.gov/ocabr and includes information about which entity was breached, how many Massachusetts residents were affected, whether the breach was electronic or involved paper records, and the nature of remediation services offered to affected residents.

It is always a good time to review your incident response plan and data privacy policies to bring everything in line with changes happening at the state level.

 And now for the disclaimer: The Mintz Matrix is for informational purposes only and does not constitute legal advice or opinions regarding any specific facts relating to specific data breach incidents. You should seek the advice of the Mintz Levin privacy team or other experienced legal counsel when reviewing options and obligations in responding to a particular data security breach.

Make sure to get your February 2017 Mintz Matrix!  Available here for downloading and always linked through the blog’s right-hand navigation bar.

It’s a new year, and time for the annual Regulatory and Examination Priorities Letter (the “2017 Letter”) from the Financial Industry Regulatory Authority (FINRA). We remind regulated entities of this list of examination priorities every year because cybersecurity appears high on the list every year. 2017 is no exception.

The 2017 Letter

FINRA has been increasing its on-site examinations and enhancing its risk-based surveillance “to apply a nationally consistent approach to identify and focus on material conduct at firms…” Among the operational risks listed in the 2017 Letter, cybersecurity is listed first and, according to FINRA, “remain[s] one of the most significant risks many firms face, and in 2017, FINRA will continue to assess firms’ programs to mitigate those risks.”

Firms should be prepared for FINRA reviews of their methods for preventing data loss, including their understanding of the data itself (e.g., its degree of sensitivity, where it is stored, and how it flows through the firm and possibly to vendors). FINRA may assess the controls firms use to monitor and protect this data, for example through data loss prevention tools. In some instances, FINRA has been known to review how firms manage their vendor relationships, including the controls used to manage those relationships, and this line of examination is expected to continue. Importantly, the 2017 Letter recognizes the nature of the “insider threat” and expresses FINRA’s intent to inquire into what controls firms have in place to acknowledge and manage it. According to the 2017 Letter: “The nature of the insider threat itself is rapidly changing as the workforce evolves to include more employees who are mobile, trusted external partnerships and vendors, internal and external contractors, as well as offshore resources.”

The WORM Actions

As if to emphasize the seriousness of these inquiries, FINRA issued a series of Letters of Consent at the end of December, levying fines totaling $14 million against 12 firms, and it discussed the record-keeping requirements at the core of those December regulatory actions in its 2017 Letter.

Specifically, Securities and Exchange Commission and FINRA rules require member firms to maintain certain electronic records in a non-erasable, non-rewritable format, known by the acronym WORM, for “Write Once, Read Many.” This format prevents the alteration or destruction of records stored electronically.
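For readers who want to see the concept rather than just the acronym, the behavior is easy to illustrate. Below is a minimal, hypothetical Python sketch of a WORM-style store: records can be written once and read many times, but never overwritten or deleted. It illustrates the storage property only; it is not a compliant books-and-records system, and the class and record names are invented for the example.

```python
class WormStore:
    """Illustrative write-once, read-many (WORM) record store.

    Records may be added and read, but never modified or deleted,
    mirroring the non-erasable, non-rewritable property described above.
    """

    def __init__(self):
        self._records = {}

    def write(self, record_id: str, content: bytes) -> None:
        # "Write once": reject any attempt to overwrite an existing record.
        if record_id in self._records:
            raise PermissionError(f"Record {record_id!r} already exists and cannot be rewritten")
        self._records[record_id] = content

    def read(self, record_id: str) -> bytes:
        # "Read many": reads are unrestricted.
        return self._records[record_id]

    def delete(self, record_id: str) -> None:
        # Deletion is never permitted in a WORM store.
        raise PermissionError("WORM records are non-erasable")


store = WormStore()
store.write("trade-0001", b"BUY 100 XYZ @ 42.50")
print(store.read("trade-0001"))
# store.write("trade-0001", b"SELL ...")  # would raise PermissionError
```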

In its press release, FINRA explained that WORM format requirements are essential to its investigative duties. FINRA noted that the volume of sensitive financial data stored electronically by members has risen exponentially in the past decade, and that this increase coincides with increasingly aggressive attempts to hack into electronic data repositories. “These disciplinary actions are a result of FINRA’s focus on ensuring that firms maintain accurate, complete and adequately protected electronic records. Ensuring the integrity of these records is critical to the investor protection function because they are a primary means by which regulators examine for misconduct in the securities industry.”

FINRA found that each of the 12 fined firms failed to follow required document retention regulations in various ways outlined in the Letters of Consent.

Brad Bennett, FINRA’s current chief of enforcement, will be stepping down shortly. #MLWashingtonCyberWatch will be keeping an eye on what changes, if any, may come with the new administration in 2017. Only time will tell whether FINRA will continue its aggressive enforcement actions or soften its approach. Regardless of the regulatory inquiries, firms should continue to take action to improve cybersecurity resilience and investor protection. For a quick review of the FINRA Report on Cybersecurity Practices, check out our webinar recording.

For the past few months, the Mintz Levin Privacy Webinar Series has focused on the upcoming EU General Data Protection Regulation (GDPR) to help businesses understand the reach and scope of the GDPR and prepare for the potentially game-changing privacy regulation. The GDPR will affect how US businesses handle and process personal data originating in the EU and may require changes to business processes.

This week, we’ll present a webinar examining the criteria that determine whether or not your organization needs to appoint a Data Protection Officer. We will discuss the role of the DPO, the significance of the “independence” requirement, and the qualifications required to hold the position. Make sure to join us for this important webinar!

Registration link is here.

 

Developers and operators of educational technology services should take note. Just before the election, California Attorney General Kamala Harris issued a guidance document for those providing education technology (“Ed Tech”). “Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data” provides practical direction that operators of websites and online services used for K-12 purposes can use to implement best practices for their business models.

Ed Tech, per the Recommendations, comes in three categories: (1) administrative management systems and tools, such as cloud services that store student data; (2) instructional support, including testing and assessment; and (3) content, including curriculum and resources such as websites and mobile apps. The Recommendations recognize the important role that educational technology plays in classrooms, citing a Software & Information Industry Association estimate that the U.S. market for PreK-12 Ed Tech was worth $8.38 billion in 2015.

The data that may be gathered through Ed Tech systems and services can be extremely sensitive, including medical histories, social and emotional assessments, and test results. At the federal level, the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Rule (COPPA) govern the use of student data. However, according to the Recommendations, these laws “are widely viewed as having been significantly outdated by new technology.”

Recognizing this, California has enacted laws to fill gaps in the protection. Cal. Ed. Code § 49073.1 requires local education agencies (county offices of education, school districts, and charter schools) that contract with third parties for systems or services that manage, access, or use pupil records to include specific provisions regarding the use, ownership, and control of pupil records. On the private side, the Student Online Personal Information Privacy Act (SOPIPA) requires Ed Tech providers to comply with baseline privacy and security protections.

Building on this backdrop of legislation, Attorney General Harris’ office provided six recommendations for Ed Tech providers, especially those that provide services in the pre-kindergarten to twelfth grade space.

  • Data Collection and Retention: Minimization is the Goal 

Describe the data being collected and the methods used to collect it, keeping in mind that data can include everything from behavioral data to persistent identifiers. If your service links to another service, disclose this in your privacy policy and provide a link to the privacy policy of the external service. If you operate the external service, maintain the same privacy and security protections for the external service that users enjoyed with the original service. Minimize the data collected to only what is necessary to provide the service, retain the data only for as long as necessary, and be able to delete personally identifiable information upon request.

  • Data Use: Keep it Educational

Describe the purposes for which you are collecting data. Do not use any personally identifiable data, including persistent identifiers, for targeted advertising, whether within the original service or any other service. Do not create profiles other than those necessary for the school purposes your service was intended for. If you use collected data for product improvement, aggregate or de-identify the data first.

  • Data Disclosure: Make Protections Stick 

Specifically describe any third parties with whom you share personally identifiable data. If disclosing for school purposes, do so only to further the school-specific purpose of your site. If disclosing for research purposes, disclose personally identifiable information only if you are required by federal or state law, or if allowed under federal and state law and the disclosure is under the direction of a school, district, or state education department. Service providers should be contractually required to use any personally identifiable data only for the contracted service, not disclose the information, take reasonable security measures, delete the information when the contract is completed, and notify you of any unauthorized disclosure or breach. Do not sell any collected information, except as part of a merger or acquisition.

  • Individual Control: Respect Users’ Rights 

Describe procedures for parents, legal guardians, and eligible students to access, review and correct personally identifiable data.  Provide procedures for students to transfer content they create to another service, and describe these procedures in your privacy policy.

  • Data Security: Implement Reasonable and Appropriate Safeguards

Provide a description of the reasonable and appropriate security you use, including technical, administrative and physical safeguards, to protect student information.  Describe your process for data breach notification.  Provide training for your employees regarding your policies and procedures and employee obligations.

  • Transparency: Provide a Meaningful Privacy Policy

Make available a privacy policy, using a descriptive title such as Privacy Policy, in a conspicuous manner that covers all student information, including personally identifiable information.  The policy should be easy for parents and educators to understand.  Consider getting feedback regarding your actual privacy policy, including from parents and students.  Include an effective date on the policy and describe how you will provide notice to the account holder, such as a school, parent, or eligible student.  Include a contact method in the policy, at a minimum an email address, and ideally also a toll-free number.

Given the size of the California market, any guidance issued by the California Attorney General’s office should be carefully reviewed. If you are growing an ed tech company, this is the time to build in data privacy and security controls. If you are established, it’s time to review your privacy practices against this guidance and see how you match up. If you have any questions or concerns about how these recommendations could apply to your company, please do not hesitate to contact the team at Mintz Levin.

Imagine you are the CEO of a company sitting across from an interviewer. The interviewer asks you the age-old question, “So tell me about your company’s strengths and weaknesses.” You start thinking about the competitive advantages that distinguish you from your competitors. You decide to talk about how you know your customers better than the competition does, including who they are, what they need, and how your products and services fit their needs and desires. The interviewer, being somewhat cynical, asks, “Aren’t you worried about the liabilities involved with collecting all that data?”

In honor of National Cyber Security Awareness Month, we at Mintz Levin wanted to take the chance to remind our readers of data’s value as an asset and the associated liabilities that stem from its collection and use, as well as provide guidelines for maximizing its value and minimizing its liabilities.  Continue Reading 3 Guidelines to Maximize Value of Data

Cloud computing, a process by which remote computers are used to store, manage, and process data, is no longer an unfamiliar term. By at least one estimate, approximately 90 percent of businesses use the cloud in some fashion. American Airlines is assessing major providers of cloud services for an eventual relocation of certain portions of its customer website and other applications to the cloud.

What some may not realize is that there are actually three main types of clouds: public, private and hybrid.  Public clouds are those run by a service provider, over a public network.  For example, Amazon Web Services offers public cloud services, among others.  A private cloud is operated for a single entity, and may be hosted internally or by a third-party service provider.  A hybrid cloud is a composition of two or more clouds, such as a private cloud and a public cloud, such that the benefits of both can be realized where appropriate.  Each of these cloud infrastructure types has different advantages and disadvantages.

For a given company looking to migrate to the cloud, the appropriate option will be motivated in part by business considerations; however, data privacy and security laws, compliance best practices, and contractual obligations will provide mandatory baselines that companies cannot ignore. As such, relevant laws, best practices, and contractual obligations serve as a useful starting point when evaluating the appropriate cloud option.

Almost every organization has data flow systems that receive data and then process and use that data to deliver a service. Below are three initial steps a decision maker should take when evaluating a potential cloud infrastructure choice.

 

First, consider the statutory implications of the types of data being processed

For example, is the system collecting social security numbers and driver’s license numbers? Pursuant to California Civil Code Section 1798.81.5, businesses that “own or license” personal information concerning a California resident are required to “implement and maintain reasonable security procedures and practices . . . to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.” Of course, many other state and federal laws may impose additional obligations, such as the HIPAA Security Rule, which applies to certain health information under certain circumstances.

Deciding which relevant laws apply, and then interpreting language such as “reasonable security procedures and practices” is a complicated process. Companies should consult experienced legal counsel regarding these risks, especially in light of potential liability.

Second, consider any relevant contractual obligations

For example, many companies have contracts that impose service level agreement (SLA) obligations, such as availability commitments, for the services they provide. These contracts may also contain their own security requirements that must be met.

Third, decide which cloud architecture option makes sense in light of the first two steps as well as business considerations

After senior decision makers, with the benefit of experienced legal counsel, have decided which elements of applicable laws, best practices, and contractual obligations apply, further business considerations may need to be addressed from an operational standpoint. For example, interoperability with other services or scalability may be an issue.

 

With these requirements in hand, and in conjunction with the appropriate information technology stakeholders, the appropriate cloud architecture can be chosen. Private clouds can offer the strongest security controls, as they are operated for a single entity and can offer security options not present in public clouds. As such, a private cloud may be appropriate where a very strong security stance is deemed necessary. Public clouds are often less expensive but offer a more limited range of security options. A hybrid cloud may be appropriate where an entity hosts certain high-security data flow systems alongside other systems with less severe security requirements. For example, an entity that has an HR system containing social security numbers as well as an employee shift scheduling system might choose to host the HR system on a private cloud while hosting the scheduling system on a public cloud, with limited crossover and interoperability between the two systems. A simplified sketch of this kind of triage appears below.
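To make the mapping concrete, here is a minimal, hypothetical Python sketch of that triage: each internal system is tagged with the sensitivity of the data it holds, and the sketch suggests a cloud placement. The sensitivity tiers, system names, and placement rules are all assumptions made for illustration; an actual decision should be driven by the legal, contractual, and operational analysis described above.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tiers mapped to cloud placements; real
# classifications should come from your legal, security, and IT review.
SENSITIVITY_TO_CLOUD = {
    "high": "private cloud",    # e.g., Social Security numbers, health data
    "medium": "hybrid cloud",   # mixed or partially regulated data
    "low": "public cloud",      # e.g., low-risk or public-facing data
}

@dataclass
class DataSystem:
    name: str
    sensitivity: str  # "high", "medium", or "low"

def suggest_placement(system: DataSystem) -> str:
    """Suggest a cloud placement based on the system's data sensitivity."""
    return SENSITIVITY_TO_CLOUD[system.sensitivity]

# Illustrative systems drawn from the example above.
systems = [
    DataSystem("HR system (contains Social Security numbers)", "high"),
    DataSystem("Employee shift scheduling system", "low"),
]

for s in systems:
    print(f"{s.name}: {suggest_placement(s)}")
```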

Once you have chosen which cloud suits your business and data flow, the real work of getting appropriate contract documents in place begins.   We’ll discuss those issues in a future blog post.

 

Last week the clothing retailer Eddie Bauer LLC issued a press release to announce that its point of sale (“POS”) system at retail stores was compromised by malware for more than six months earlier this year.  The communication provided few details but did specify that the malware allowed attackers to access payment card information related to purchases at Eddie Bauer’s more than 350 locations in the United States, Canada and other international markets from January 2 until July 17, 2016.  According to the company, its e-commerce website was not affected.

In an open letter posted online, Eddie Bauer’s CEO Mike Egeck explained that the company had conducted an investigation, involved third party experts and the FBI, and now is in the process of notifying customers and reviewing its IT systems to bolster security.  These are customary and important steps following a security breach to mitigate harm to customers, protect against future threats, and comply with state data breach notification laws.    Read on to find out more ….. Continue Reading Eddie Bauer Latest Victim of POS Malware Attack

On Friday, the heads of the Federal Trade Commission overruled the decision of the Administrative Law Judge (“ALJ”) in In the Matter of LabMD, Inc. The FTC concluded that the ALJ had erred in dismissing the Commission’s case against the lab testing company LabMD and had misapplied the unfairness standard. The key determination by the FTC was that the mere disclosure of sensitive medical information is a cognizable harm under Section 5 of the FTC Act, 15 U.S.C. § 45(a), irrespective of whether there is further economic or physical harm. What does this mean for privacy enforcement? Read on. Continue Reading FTC Plants A Flag With LabMD Ruling: What This Means for Enforcement

In a decision favorable to the airline industry—but not helpful to other companies—the California Court of Appeal said that a privacy enforcement action against Delta is not going to fly.  On May 25, 2016, the Court of Appeal tossed the California Attorney General’s CalOPPA enforcement action against Delta Airlines, affirming the lower court’s 2013 dismissal of the case with prejudice.

As we previously wrote, the California AG’s office has been taking incremental steps toward ensuring that mobile applications comply with CalOPPA. As early as 2012, the office began sending notices of non-compliance to mobile application developers. When some companies failed to respond, the Attorney General chose Delta as its pilot case, promptly filing its first-ever enforcement action under CalOPPA. Over the past three years, we have followed the Attorney General’s CalOPPA compliance campaign, including the Delta case. Continue Reading Delta Wins CalOPPA Case – But Your Mobile App May Not Fly

Mintz Levin’s Immigration Law Blog is running a series titled “Innocents Abroad” addressing issues in an increasingly globalized economy where employers assign employees all over the globe.

These are big questions, reflecting some of the practical concerns in our international marketplace.  The series focuses on the well-intentioned Global HR Director, Ned Help, who will raise hot topics and difficulties his company faces when sending their employees abroad.  We will then explore the common pitfalls and offer practical solutions to the difficulties Ned Help faces.   This month’s edition:   Privacy Considerations – follow the rest of the series at Innocents Abroad.


 

From:            Carrie Counselor

To:                  Ned Help

Date:              May 24, 2016

RE:     Privacy considerations for employees working abroad

Dear Ned,

I understand that one of your employees will be undertaking a six-month temporary assignment in Europe to scope market opportunities, and you’d like to have a better understanding of what to think about in terms of privacy. Great question! This is an area where many employers struggle because other jurisdictions protect privacy and personal data quite differently than we do here in the United States.

Generally speaking, federal and state laws applicable to employee information do not have “extraterritorial” effect beyond the information that remains in the United States, meaning that American employees working abroad (even temporarily) will not benefit from US legal protections with respect to personal information collected, stored or transmitted outside of the country.

What makes this area of the law particularly crucial and daunting for employers is that non-US countries frequently offer greater protections to employees and impose far higher compliance obligations on employers. Of particular concern for you should be the data protection landscape across the European Economic Area (referred to as the “EEA,” encompassing all European Union (EU) Member States as well as Iceland, Liechtenstein and Norway), because each country has passed its own set of national laws governing the collection, use, retention, and transmission of personal data. Companies must consider these local laws before electronically monitoring an employee outside the United States or transferring an employee’s personal information back home. Let’s talk specifics: Continue Reading Innocents Abroad: Privacy Considerations for Employers