Data Compliance & Security

 

If you are one of the many businesses licensed by the New York Department of Financial Services (DFS), and cannot avail yourself of the (very) limited exemptions, you must be ready for the first compliance transition date for the stringent DFS cybersecurity regulations – August 28, 2017.

Just in case you’d forgotten, the DFS cybersecurity regulations became effective March 1, 2017, and you can refresh your memory here.

If you are a retailer with locations in New Jersey, you will need to review your procedures in anticipation of a new law effective October 1, 2017. 

New Jersey Governor Chris Christie has signed the Personal Information Privacy and Protection Act (we can now add #PIPPA to the alphabet soup of privacy acronyms…), which limits retailers’ ability to collect PII scanned from customer driver’s licenses and identification cards and restricts the use of any PII collected to the purposes identified in the Act.

In recent years, retailers have commonly adopted the practice of scanning the barcodes on customer ID cards to verify that an ID presented is authentic, to verify identity when credit cards are used, or to prevent and control fraudulent merchandise returns (and to identify consumers who abuse return policies).

Under PIPPA, retailers will only be permitted to scan ID cards to:

  • Verify the card’s authenticity or the person’s identity, if the customer pays for goods or services with a method other than cash; returns an item; or requests a refund or exchange.
  • Verify the customer’s age when providing age-restricted goods or services to the customer.
  • Prevent fraud or other criminal activity if the person returns an item or requests a refund or an exchange and the retailer uses a fraud prevention company or service.
  • Establish or maintain a contractual relationship.
  • Record, retain, or transmit information as required by state or federal law.
  • Transmit information to a consumer reporting agency, financial institution, or debt collector to be used as permitted by federal laws, including the Fair Credit Reporting Act, Gramm-Leach-Bliley Act, and Fair Debt Collection Practices Act.
  • Record, retain, or transmit information by a covered entity under HIPAA and related regulations.

PIPPA prohibits retailers from sharing the information with marketers or other third parties that are unknown to consumers.   It is unlikely that an online privacy notice describing sharing of scanned ID information with third parties would comply with PIPPA.  In-store notice of any such practices will likely be required.

The big “however” in this legislation is its restriction on retention of the information collected for the permitted purposes.  Under PIPPA, businesses cannot retain information about how the customer paid for the goods, whether the customer returned an item or requested a refund, or the customer’s age.  Retailers will only be permitted to collect the customer’s name, address, and date of birth; the issuing state; and the ID card number.  Any of this information collected from scanned ID cards is required to be “securely stored,” and PIPPA makes clear that any security breach of this information is subject to New Jersey’s data breach notification law and must be reported to any affected individual and the New Jersey State Police.
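
For retailers that build or configure their own ID-scanning systems, these retention limits translate naturally into a whitelist of fields kept from each scan. Below is a minimal, hypothetical Python sketch of that idea; the field names and the permitted list are informal paraphrases of the statute, not legal definitions.

```python
# Illustrative sketch only: field names are hypothetical and the permitted list
# paraphrases PIPPA's retention limits; confirm the actual scope with counsel.
PERMITTED_FIELDS = {"name", "address", "date_of_birth", "issuing_state", "id_card_number"}

def filter_scanned_record(scanned: dict) -> dict:
    """Drop everything captured by a scan except fields a retailer may retain
    (e.g., discard payment method, return history, and age)."""
    return {k: v for k, v in scanned.items() if k in PERMITTED_FIELDS}

raw_scan = {
    "name": "Jane Doe",
    "address": "1 Main St, Trenton, NJ",
    "date_of_birth": "1980-01-01",
    "issuing_state": "NJ",
    "id_card_number": "D1234567890",
    "payment_method": "credit",   # may not be retained
    "age": 37,                    # may not be retained
}
print(filter_scanned_record(raw_scan))
```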

And there are penalties.  PIPPA provides civil penalties of $2,500 for a first offense and $5,000 for any subsequent offense.  Further, the law allows “any person aggrieved by a violation” to bring an action in New Jersey Superior Court to recover damages.

 

Recently, the United States Computer Emergency Readiness Team (US-CERT), an organization within the Department of Homeland Security’s (DHS) National Protection and Programs Directorate (NPPD) and a branch of the Office of Cybersecurity and Communications’ (CS&C) National Cybersecurity and Communications Integration Center (NCCIC), encouraged users and administrators to review a recent article from the Federal Bureau of Investigation (FBI), “Building a Digital Defense with an Email Fortress.”

As we have discussed in many posts before, phishing (the fraudulent practice of sending emails purporting to be from a reputable entity to induce an individual to reveal privileged information such as a password) remains a major security threat.  In the article, the FBI provides several actions businesses can take to reduce their risk of being phished, including reporting and deleting suspicious emails and making sure that countermeasures such as firewalls, antivirus software, and spam filters are robust and up to date.
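
As one concrete, low-cost illustration in the same spirit (not drawn from the FBI article itself), organizations can check that their domains publish SPF and DMARC records, which make spoofed phishing emails easier for receiving mail systems to detect. The sketch below assumes the third-party dnspython package and a placeholder domain, and it only checks that such records exist, not that they are well configured.

```python
# Minimal sketch: check whether a domain publishes SPF and DMARC TXT records.
# Requires the third-party "dnspython" package (pip install dnspython).
import dns.resolver

def has_txt_record(name: str, marker: str) -> bool:
    """Return True if any TXT record at `name` contains `marker`."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return False
    return any(marker in rdata.to_text() for rdata in answers)

domain = "example.com"  # placeholder domain
print("SPF published:  ", has_txt_record(domain, "v=spf1"))
print("DMARC published:", has_txt_record("_dmarc." + domain, "v=DMARC1"))
```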

We encourage each of our readers to review the FBI’s guidance and consider whether their organization could benefit from any of the methods of protection provided.

Companies with any questions regarding any of these issues should not hesitate to contact the team at Mintz Levin.

At last week’s Health Care Compliance Association’s annual “Compliance Institute,”  Iliana Peters, HHS Office for Civil Rights’ Senior Advisor for HIPAA Compliance and Enforcement, provided a thorough update of HIPAA enforcement trends as well as a road map to OCR’s current and future endeavors.

Continuing Enforcement Issues

Ms. Peters identified ten key enforcement issues that OCR continues to encounter in its enforcement of HIPAA.  Do any of them look familiar to you? These issues include:

  1. Impermissible Disclosures. HIPAA’s Privacy Rule prohibits covered entities and business associates from disclosing PHI except as permitted or required under HIPAA. Impermissible disclosures identified by Ms. Peters all center on the need for authorization, and include:
    • Covered entities permitting news media to film individuals in their facilities prior to obtaining a patient’s authorization.
    • Covered entities publishing PHI on their website or on social media without an individual’s authorization.
    • Covered entities confirming that an individual is a patient and providing other PHI to reporters without an individual’s authorization.
    • Covered entities faxing PHI to an individual’s employer without the individual’s authorization.
  2. Lack of Business Associate Agreements. OCR continues to see covered entities failing to enter into business associate agreements.
  3. Incomplete or Inaccurate Risk Analysis. Under HIPAA’s Security Rule, covered entities are required to conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic PHI (ePHI). According to Ms. Peters, organizations frequently underestimate the proliferation of ePHI throughout their environment, including into systems related to billing, faxing, backups, and medical devices, among others.
  4. Failure to manage identified risks. HIPAA requires regulated entities to put in place security measures to reduce risks and vulnerabilities. According to the presentation, several OCR breach investigations found that the causes of reported breaches were risks that had previously been identified in a risk analysis but were never mitigated. In some instances, encryption was included as part of the remediation plan, but was never implemented.
  5. Lack of transmission security. While encryption is not required in every case, HIPAA does require that ePHI be encrypted whenever doing so is reasonable and appropriate. The presentation identified a number of applications in which encryption should be considered when transmitting ePHI, including email, texting, application sessions, file transmissions (e.g., FTP), remote backups, and remote access and support services (e.g., VPNs). (A minimal sketch of encrypting data before transmission appears after this list.)
  6. Lack of Appropriate Auditing. HIPAA requires the implementation of mechanisms (whether hardware, software, or procedural) that record and examine activity in systems containing ePHI. HIPAA-regulated entities are required to review audit records to determine whether additional investigation is warranted. The presentation highlighted certain activities that could warrant such additional investigation, including: access to PHI during non-business hours or during time off, access to an abnormally high number of records containing PHI, access to PHI of persons for whom media interest exists, and access to PHI of employees. (A toy example of flagging this kind of activity also appears after this list.)
  7. Patching of Software. The use of unpatched or unsupported software on systems which contain ePHI could introduce additional risk into an environment. Ms. Peters also pointed to other systems that should be monitored, including router and firewall firmware, anti-virus and anti-malware software, and multimedia and runtime environments (e.g., Adobe Flash, Java, etc.).
  8. Insider Threats. The presentation identified insider threats as a continuing enforcement issue. Under HIPAA, organizations must implement policies and procedures to ensure that all members of their workforce have appropriate access to ePHI and to prevent workforce members who are not authorized to have such access from obtaining it. Termination procedures should be put in place to ensure that access to PHI is revoked when a workforce member leaves.
  9. Disposal of PHI. HIPAA requires organizations to implement policies and procedures that ensure proper disposal of PHI. These procedures must guarantee that the media has been cleared, purged or destroyed consistent with NIST Special Publication 800-88: Guidelines for Media Sanitization.
  10. Insufficient Backup and Contingency Planning. Organizations are required to ensure that adequate contingency planning (including data backup and disaster recovery plans) is in place and would be effective when implemented in the event of an actual disaster or emergency situation. Organizations are required to periodically test their plans and revise as necessary.
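
On the transmission security issue (item 5 above), the core idea is simply that ePHI should be unreadable while in transit. Below is a minimal, illustrative sketch (not taken from OCR’s materials) that assumes the third-party cryptography package and encrypts a payload before it is handed to any transmission channel; key management, which is the hard part in practice, is deliberately out of scope.

```python
# Minimal sketch: encrypt ePHI before it leaves the organization so the payload
# is unreadable if intercepted in transit. Uses the third-party "cryptography"
# package (pip install cryptography). Key distribution/storage is omitted.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, keys live in a key management system
cipher = Fernet(key)

ephi = b"patient: Jane Doe; MRN: 000000; result: ..."   # stand-in payload
token = cipher.encrypt(ephi)     # opaque ciphertext, safe to email/FTP/back up
assert cipher.decrypt(token) == ephi                    # holder of the key recovers the data
```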
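
And on the auditing issue (item 6 above), the red flags highlighted in the presentation lend themselves to simple automated triage of access logs. The toy sketch below uses hypothetical field names, business hours, and thresholds; it is illustrative only.

```python
# Toy sketch: flag ePHI access events that occur outside business hours or touch
# an unusually large number of records. Field names and thresholds are illustrative.
from datetime import datetime

BUSINESS_HOURS = range(7, 19)    # 07:00-18:59 local time
VOLUME_THRESHOLD = 200           # records per event treated as abnormal

def flag_suspicious(events):
    for event in events:
        ts = datetime.fromisoformat(event["timestamp"])
        if ts.hour not in BUSINESS_HOURS:
            yield event, "access outside business hours"
        if event["records_accessed"] > VOLUME_THRESHOLD:
            yield event, "abnormally high record count"

access_log = [
    {"user": "jsmith", "timestamp": "2017-03-01T02:14:00", "records_accessed": 3},
    {"user": "mlee",   "timestamp": "2017-03-01T10:05:00", "records_accessed": 950},
]
for event, reason in flag_suspicious(access_log):
    print(event["user"], "->", reason)
```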

Upcoming Guidance and FAQs

OCR also identified upcoming guidance and FAQs that will address the following areas:

  • Privacy and security issues related to the Precision Medicine Initiative’s All of Us research program
  • Text messaging
  • Social media
  • Use of Certified EHR Technology (CEHRT) & compliance with the HIPAA Security Rule (to be released with the Office of the National Coordinator for Health Information Technology (ONC))
  • The Resolution Agreement and Civil Monetary Penalty process
  • Updates of existing FAQs to account for the Omnibus Rule and other recent developments
  • The “minimum necessary” requirement

Long-term Regulatory Agenda

The presentation also identified two long-term regulatory goals to implement certain provisions of the HITECH Act. One regulation would provide individuals harmed by HIPAA violations with a percentage of any civil monetary penalties or settlements collected by OCR, while the second would implement a HITECH Act provision related to the accounting of disclosures of PHI.

Audit Program Status

The presentation discussed the current status of OCR’s audit program. As we have previously discussed, OCR is in the process of conducting desk audits of covered entities and business associates. These audits consist of a review of required HIPAA documentation that is submitted to OCR. According to Ms. Peters, OCR has conducted desk audits of 166 covered entities and 43 business associates. Ms. Peters also used the presentation to confirm that on-site audits of both covered entities and business associates will be conducted in 2017 after the desk audits are completed. We will continue to follow and report on developments in the audit program.

Commentary

The list of continuing enforcement issues provides covered entities and business associates with a helpful reminder of the compliance areas that are most likely to get them into trouble. Some of the enforcement issues may require HIPAA-regulated entities to revisit decisions that they previously made as part of a risk analysis. Transmission security (#5, above) is an example of an area that may warrant reexamination. In the past, encrypting data was often too expensive or too impractical for many organizations. However, the cost of encryption has decreased while it has become easier to implement, and a covered entity or business associate that suffers a breach because it transmitted unencrypted PHI over the internet will likely garner little sympathy from OCR going forward.

The presentation is also notable for the long list of guidance and FAQs that OCR will be publishing, as well as its plan to issue regulations addressing changes ushered in by the HITECH Act that were not captured by the 2013 Omnibus Rule. These regulations, particularly those related to accounting for disclosures of PHI, could have a far-reaching impact on how covered entities and business associates comply with HIPAA in the future.

 

Last week, Snap Inc. (“Snap” or the “Company”) – the parent company of the wildly popular app Snapchat (“Snapchat” or the “App”) – became a publicly traded company on the New York Stock Exchange in the biggest tech IPO since Alibaba in 2014.  Priced at $17 per share, Snap’s stock opened at $24 per share on Thursday morning and closed at $24.48 per share, bringing the Company’s market capitalization to approximately $28 billion. In today’s post, we’re taking a closer look at Snap’s S-1 filing (“Snap S-1”) with the U.S. Securities and Exchange Commission (SEC), with a particular focus on the Company’s disclosures of risk factors associated with cybersecurity and privacy.

In an effort to combat the growing prevalence of large-scale corporate cyberattacks, the New York Department of Financial Services (“NYDFS”) is rolling out a revamped cybersecurity regulation for financial services companies to take effect TODAY (March 1, 2017). This ambitious regulation is broadly drafted and carries a heavy compliance burden intended to protect consumers and ensure the safety and soundness of New York State’s financial services industry.  Even if you are not directly in banking or insurance, read on to see how these regulations may affect your company.

As our readers know, we maintain a summary of U.S. state data breach notification laws, which we refer to as the “Mintz Matrix.”  Our latest update is available here, and it should be part of your incident response “toolbox” and part of your planning.

During 2016, amendments to breach notification laws in five states went into effect (California, Nebraska, Oregon, Rhode Island, and Tennessee).  And by the end of last year, well over twenty states had introduced or were considering new regulations or amendments to their existing security breach laws.  We expect significant regulatory activity in the data security space to continue during 2017.  As always, we will keep you abreast of changes and will release updated versions of our Mintz Matrix to keep pace with developments in the states.

We are keeping an eye out for signs of support for a national breach notification law.  So far, there does not appear to be much political motivation for undertaking this effort.  A key sticking point is anxiety among a number of states that a federal law would offer less protection than their existing state law.  This is a valid concern since a national standard will only alleviate the significant burden of complying with the present patchwork of state laws if it has broad pre-emptive effect.  Only time will tell if state and federal lawmakers can work together to develop a comprehensive nationwide regime for security breach notification and remediation.

In the meantime, we must keep tabs on the forty-seven states (along with the District of Columbia, Guam, Puerto Rico and the Virgin Islands) with their own security breach laws.  Here is what’s been happening since our previous update in the Fall:

 California

 California amended its security breach law in order to require disclosure to affected residents (and to the Attorney General if more than 500 Californians are affected) when encrypted personal data is acquired by an unauthorized person together with an encryption key or security credential that could render the personal data readable or useable.

We note also that former Congressman Xavier Becerra recently took over as Attorney General in California, replacing Kamala Harris who aggressively pursued regulation in the privacy arena during her tenure as AG and who now serves California as one of its U.S. Senators.  Given this change in leadership, it will be interesting to see if the state continues to be a leader in pushing for stringent data security and privacy measures at the state and federal level.

 Illinois

Last summer Illinois passed an amendment to its Personal Information Protection Act (“PIPA”) that significantly broadened protections for personal information and the obligations imposed on businesses that handle such data.  The amendment became effective on January 1, 2017 and made several key changes to PIPA:

  • Definition of Personal Information. PIPA’s definition of “personal information” has now been expanded to include medical information, health insurance information, and unique biometric data used for authentication purposes (examples cited in the statute are a fingerprint, retina or iris image, or unique physical representations or digital representations of biometric data). The amended definition also encompasses a user name or email address in combination with a password or security question and answer that would permit access to an online account when either the user name or email address, or password or security question and answer, are not encrypted or redacted.
  • Encryption Safe Harbor. While PIPA already provided a safe harbor for data collectors if data disclosed due to a security breach was fully encrypted or redacted, the amendment clarified that the safe harbor does not apply if the keys to unencrypt or unredact or otherwise read compromised encrypted or redacted data have also been acquired in connection with the security breach.
  • Nature of Notification. For security breaches involving a user name or email address in combination with a password or security question and answer, data collectors may now provide notice in electronic or other form to affected Illinois residents. Such notice must direct individuals to promptly change their user name or password and security question and answer, or to take other appropriate steps to protect all online accounts for which the affected resident uses the same user name or email address/password or security question and answer. The amended statute also provides an additional option for substitute notice when residents affected by a security breach are confined to one geographic area.
  • New Exemptions. The amendment added an exemption for data collectors who meet their obligations under applicable provisions of the Health Insurance Portability and Accountability Act (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act (“HITECH”). Any data collector that provides notice of a security breach to the Secretary of Health and Human Services pursuant to its obligations under HITECH must also provide this notification to the Illinois Attorney General within five business days of notifying the Secretary. This exemption will primarily apply to certain entities operating in the healthcare space. The amended statute also deems financial institutions subject to applicable provisions of the Gramm-Leach-Bliley Act in compliance with PIPA’s data security requirements.
  • Security Requirements. Beyond addressing breach notification, the amendment requires covered entities to implement and maintain reasonable security measures to protect records containing personal information of Illinois residents and to impose similar requirements on recipient parties when disclosing such personal information pursuant to a contract. The amended statute also requires state agencies to report security breaches affecting more than 250 Illinois residents to the Illinois Attorney General.

 Massachusetts

 For those information junkies out there!  The Office of Consumer Affairs and Business Regulation (the “OCABR”) in Massachusetts has created a public web-based archive of data breaches reported to the OCABR and the Massachusetts Attorney General since 2007.  The data breach notification archive is available at www.mass.gov/ocabr and includes information about which entity was breached, how many Massachusetts residents were affected, if the breach was electronic or involved paper, and the nature of remediation services offered to affected residents.

 It is always a good time to review your incident response plan and data privacy policies to bring everything in line with changes happening on the state level. 

 And now for the disclaimer: The Mintz Matrix is for informational purposes only and does not constitute legal advice or opinions regarding any specific facts relating to specific data breach incidents. You should seek the advice of the Mintz Levin privacy team or other experienced legal counsel when reviewing options and obligations in responding to a particular data security breach.

Make sure to get your February 2017 Mintz Matrix!  Available here for downloading and always linked through the blog’s right-hand navigation bar.

It’s a new year, and time for the Financial Industry Regulatory Authority (FINRA)’s annual Regulatory and Examination Priorities Letter (the “2017 Letter”).  We remind regulated entities of this list of examination priorities every year, because cybersecurity appears high on the list every year.  2017 is no exception.

The 2017 Letter

FINRA has been increasing its on-site examinations and enhanced risk-based surveillance “to apply a nationally consistent approach to identify and focus on material conduct at firms…”  Among the operational risks listed in the 2017 Letter, cybersecurity is listed first and, according to FINRA, “remain[s] one of the most significant risks many firms face, and in 2017, FINRA will continue to assess firms’ programs to mitigate those risks.”

Firms should be prepared for FINRA reviews of their methods for preventing data loss, including their understanding of their data (e.g., its degree of sensitivity and the locations where it is stored) and how it flows through the firm, and possibly to vendors.  FINRA may assess the controls firms use to monitor and protect this data, for example through data loss prevention tools. In some instances, FINRA has been known to review how firms manage their vendor relationships, including the controls to manage those relationships, and this line of examination is expected to continue.  Importantly, the 2017 Letter recognizes the nature of the “insider threat” and expresses FINRA’s intent to inquire into what controls firms have in place to acknowledge and manage it.  According to the 2017 Letter:  “The nature of the insider threat itself is rapidly changing as the workforce evolves to include more employees who are mobile, trusted external partnerships and vendors, internal and external contractors, as well as offshore resources.”

The WORM Actions

As if to emphasize the seriousness of the inquiries, FINRA issued a series of Letters of Consent at the end of December, levying fines totaling $14 million against 12 firms, and discussed the record-keeping requirements at the core of the December regulatory actions in its 2017 Letter.

Specifically, Securities & Exchange Commission and FINRA rules require member firms to maintain certain electronic records in a non-erasable, non-rewritable format, known by the acronym WORM, for  “Write Once, Read Many”.  This format prevents the alteration or destruction of records stored electronically.
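
Conceptually, WORM semantics simply mean that a record, once written, can be read any number of times but never overwritten or deleted. The toy, in-memory Python sketch below illustrates that behavior; actual compliance archives enforce it at the storage or vendor level (write-once media, retention locks), not in application code like this.

```python
# Toy illustration of WORM ("Write Once, Read Many") semantics: records can be
# written once and read many times, but never altered or deleted. Real broker-
# dealer archives enforce this at the storage level, not in application code.
class WormStore:
    def __init__(self):
        self._records = {}

    def write(self, record_id: str, data: bytes) -> None:
        if record_id in self._records:
            raise PermissionError(f"{record_id} already written; WORM forbids overwrites")
        self._records[record_id] = data

    def read(self, record_id: str) -> bytes:
        return self._records[record_id]

store = WormStore()
store.write("trade-0001", b"BUY 100 XYZ @ 10.00")
print(store.read("trade-0001"))
# store.write("trade-0001", b"altered record")   # would raise PermissionError
```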

In its press release, FINRA explained that WORM format requirements are essential to its investigative duties. FINRA noted that the volume of sensitive financial data stored electronically by members has risen exponentially in the past decade, and that this increase coincides with increasingly aggressive attempts to hack into electronic data repositories. According to the release: “These disciplinary actions are a result of FINRA’s focus on ensuring that firms maintain accurate, complete and adequately protected electronic records. Ensuring the integrity of these records is critical to the investor protection function because they are a primary means by which regulators examine for misconduct in the securities industry.”

FINRA found that each of the 12 fined firms failed to follow required document retention regulations in various ways outlined in the Letters of Consent.

Brad Bennett, FINRA’s current chief of enforcement, will be stepping down shortly.  #MLWashingtonCyberWatch will be keeping an eye on what, if any, changes may come with the new administration in 2017. Only time will tell whether FINRA will continue its aggressive enforcement actions or if we will see a softening of FINRA’s actions.   Regardless of the regulatory inquiries, firms should continue to take actions to improve cybersecurity resilience and investor protection.   For a quick review of the FINRA Report on Cybersecurity Practices, check out our webinar recording.

For the past few months, the Mintz Levin Privacy Webinar Series has focused on the upcoming EU General Data Protection Regulation (GDPR) to help businesses understand the reach and scope of the GDPR and prepare for the potentially game-changing privacy regulation. The GDPR will affect how US businesses handle and process personal data originating in the EU and may require changes to business processes.

This week, we’ll present a webinar examining the criteria that determine whether or not your organization needs to appoint a Data Protection Officer (DPO). We will discuss the role of the DPO, the significance of the “independence” requirement, and the qualifications required to hold the position. Make sure to join us for this important webinar!

Registration link is here.

 

Developers and operators of educational technology services should take note.  Just before the election, California Attorney General Kamala Harris released a document laying out guidance for providers of education technology (“Ed Tech”).  “Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data” provides practical direction that operators of websites and online services used for K-12 purposes can use to implement best practices for their business models.

Ed Tech, per the Recommendations, comes in three categories: (1) administrative management systems and tools, such as cloud services that store student data; (2) instructional support, including testing and assessment; and (3) content, including curriculum and resources such as websites and mobile apps.  The Recommendations recognize the important role that educational technology plays in classrooms, citing a Software & Information Industry Association estimate that the U.S. market for preK-12 Ed Tech was $8.38 billion in 2015.

The data that may be gathered through Ed Tech systems and services can be extremely sensitive, including medical histories, social and emotional assessments, and test results.  At the federal level, the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Rule (COPPA) govern the use of student data.  However, according to the Recommendations, these laws “are widely viewed as having been significantly outdated by new technology.”

Recognizing this, California has enacted laws to fill gaps in the protection.  Cal. Ed. Code § 49073.1 requires local education agencies (county offices of education, school districts, and charter schools) that contract with third parties for systems or services that manage, access, or use pupil records to include specific provisions regarding the use, ownership, and control of pupil records. On the private side, the Student Online Personal Information Privacy Act (SOPIPA) requires Ed Tech providers to comply with baseline privacy and security protections.

Building on this backdrop of legislation, Attorney General Harris’ office provided six recommendations for Ed Tech providers, especially those that provide services in the pre-kindergarten to twelfth grade space.

  • Data Collection and Retention: Minimization is the Goal 

Describe the data being collected and the methods being used, while understanding that “data” should be read broadly to include everything from behavioral data to persistent identifiers.  If your service links to another service, disclose this in your privacy policy and provide a link to the privacy policy of the external service.  If you operate the external service, maintain the same privacy and security protections for the external service that users enjoyed with the original service.  Minimize the data collected to only what is necessary to provide the service, retain the data for only as long as necessary, and be able to delete personally identifiable information upon request.

  • Data Use: Keep it Educational

Describe the purposes for which you are collecting data.  Do not use any personally identifiable data, including persistent identifiers, for targeted advertising, whether within the original service or any other service.  Do not create profiles other than those necessary for the school purposes your service is intended for.  If you use collected data for product improvement, aggregate or de-identify the data first (a brief illustrative sketch appears after this list of recommendations).

  • Data Disclosure: Make Protections Stick 

Specifically describe any third parties with whom you share personally identifiable data. If disclosing for school purposes, only do so to further the school-specific purpose of your site.  If disclosing for research purposes, only disclose personally identifiable information if you are required to by federal or state law, or if allowed under federal and state law and the disclosure is under the direction of a school, district, or state education department.  Service providers should be contractually required to use any personally identifiable data only for the contracted service, not disclose the information, take reasonable security measures, delete the information when the contract is completed, and notify you of any unauthorized disclosure or breach.  Do not sell any collected information, except as part of a merger or acquisition.

  • Individual Control: Respect Users’ Rights 

Describe procedures for parents, legal guardians, and eligible students to access, review and correct personally identifiable data.  Provide procedures for students to transfer content they create to another service, and describe these procedures in your privacy policy.

  • Data Security: Implement Reasonable and Appropriate Safeguards

Provide a description of the reasonable and appropriate security you use, including technical, administrative and physical safeguards, to protect student information.  Describe your process for data breach notification.  Provide training for your employees regarding your policies and procedures and employee obligations.

  • Transparency: Provide a Meaningful Privacy Policy

Make available a privacy policy, using a descriptive title such as Privacy Policy, in a conspicuous manner that covers all student information, including personally identifiable information.  The policy should be easy for parents and educators to understand.  Consider getting feedback regarding your actual privacy policy, including from parents and students.  Include an effective date on the policy and describe how you will provide notice to the account holder, such as a school, parent, or eligible student.  Include a contact method in the policy, at a minimum an email address, and ideally also a toll-free number.
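
On the “aggregate or de-identify first” point in the Data Use recommendation above, one simple pattern is to replace direct identifiers with salted hashes and report only aggregate figures for product analytics. The Python sketch below uses made-up field names and is only a starting point; genuine de-identification also has to account for quasi-identifiers, small cell sizes, and re-identification risk.

```python
# Toy sketch: pseudonymize direct identifiers and aggregate usage before using
# student data for product analytics. Field names are hypothetical, and salted
# hashing alone does not amount to full de-identification.
import hashlib
from collections import Counter

def pseudonymize(student_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash (keep the salt secret)."""
    return hashlib.sha256((salt + student_id).encode()).hexdigest()[:12]

def minutes_by_grade(records):
    """Aggregate usage so no individual student appears in the output."""
    totals = Counter()
    for record in records:
        totals[record["grade"]] += record["minutes_used"]
    return dict(totals)

records = [
    {"student_id": "S-001", "grade": 5, "minutes_used": 34},
    {"student_id": "S-002", "grade": 5, "minutes_used": 12},
    {"student_id": "S-003", "grade": 6, "minutes_used": 50},
]
deidentified = [{**r, "student_id": pseudonymize(r["student_id"], "rotate-this-salt")}
                for r in records]
print(minutes_by_grade(deidentified))   # {5: 46, 6: 50}
```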

Given the size of the California market, any guidance issued by the California Attorney General’s office should be carefully considered and reviewed.  If you are growing an Ed Tech company, this is the time to build in data privacy and security controls.  If you are established, it’s time to review your privacy practices against this guidance and see how you match up.  If you have any questions or concerns as to how these recommendations could be applied to your company, please do not hesitate to contact the team at Mintz Levin.