Developers and operators of educational technology services should take note.  Just before the election, California Attorney General Kamala Harris issued guidance for providers of education technology (“Ed Tech”).  “Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data” offers practical direction that operators of websites and online services used for K-12 purposes can use to implement best practices for their business models.

Ed Tech, per the Recommendations, comes in three categories: (1) administrative management systems and tools, such as cloud services that store student data; (2) instructional support, including testing and assessment; and (3) content, including curriculum and resources such as websites and mobile apps.  The Recommendations recognize the important role that educational technology plays in classrooms, citing the Software & Information Industry Association’s estimate that the U.S. market for preK-12 Ed Tech was $8.38 billion in 2015.

The data that may be gathered through Ed Tech systems and services can be extremely sensitive, including medical histories, social and emotional assessments, and test results.  At the federal level, the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA) govern the use of student data.  However, according to the Recommendations, these laws “are widely viewed as having been significantly outdated by new technology.”

Recognizing this, California has enacted laws in this space to fill gaps in that protection.  Cal. Ed. Code § 49073.1 requires local education agencies (county offices of education, school districts, and charter schools) that contract with third parties for systems or services that manage, access, or use pupil records to include specific provisions regarding the use, ownership, and control of pupil records. On the private side, the Student Online Personal Information Protection Act (SOPIPA) requires Ed Tech providers to comply with baseline privacy and security protections.

Building on this backdrop of legislation, Attorney General Harris’ office provided six recommendations for Ed Tech providers, especially those that provide services in the pre-kindergarten to twelfth grade space.

  • Data Collection and Retention: Minimization is the Goal 

Describe the data being collected and the methods being used, understanding that data can encompass everything from behavioral data to persistent identifiers.  If your service links to another service, disclose this in your privacy policy and provide a link to the external service’s privacy policy.  If you operate the external service, maintain the same privacy and security protections for it that users enjoyed with the original service.  Minimize the data collected to only what is necessary to provide the service, retain the data for only as long as necessary, and be able to delete personally identifiable information upon request.

  • Data Use: Keep it Educational

Describe the purposes for which you collect data.  Do not use any personally identifiable data, including persistent identifiers, for targeted advertising, whether within the original service or any other service.  Do not create profiles other than those necessary for the school purposes your service is intended to serve.  If you use collected data for product improvement, aggregate or de-identify the data first.
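
The Recommendations do not prescribe any particular technique for aggregation or de-identification.  Purely as an illustrative sketch of the “aggregate or de-identify first” point, the short Python example below replaces a student identifier with a salted hash, coarsens grade level into bands, and reports only counts and averages; the field names and the specific approach are our own assumptions, not anything specified in the guidance.

```python
import hashlib
import secrets

# Illustrative sketch only -- the Recommendations do not mandate any specific
# approach, and the field names (student_id, grade, lessons_completed) are hypothetical.

SALT = secrets.token_hex(16)  # fresh salt per analysis run, so pseudonyms cannot be re-linked later


def deidentify(record: dict) -> dict:
    """Drop direct identifiers; keep only coarse, non-identifying fields."""
    pseudonym = hashlib.sha256((SALT + record["student_id"]).encode()).hexdigest()
    grade = record["grade"]  # 0 for kindergarten through 12
    return {
        "pseudonym": pseudonym,
        "grade_band": "K-5" if grade <= 5 else "6-8" if grade <= 8 else "9-12",
        "lessons_completed": record["lessons_completed"],
    }


def usage_summary(records: list[dict]) -> dict:
    """Aggregate for product-improvement analysis: report only counts and averages."""
    rows = [deidentify(r) for r in records]
    total = len(rows)
    avg = sum(r["lessons_completed"] for r in rows) / total if total else 0.0
    return {"students": total, "avg_lessons_completed": round(avg, 1)}
```

Bear in mind that genuine de-identification is harder than dropping names; quasi-identifiers can still allow re-identification, so a sketch like this is a starting point, not a compliance conclusion.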

  • Data Disclosure: Make Protections Stick 

Specifically describe any third parties with which you share personally identifiable data. If disclosing for school purposes, only do so to further the school-specific purpose of your site.  If disclosing for research purposes, only disclose personally identifiable information if required by federal or state law, or if allowed under federal and state law and the disclosure is made at the direction of a school, district, or state education department.  Service providers should be contractually required to use any personally identifiable data only for the contracted service, not disclose the information, take reasonable security measures, delete the information when the contract is completed, and notify you of any unauthorized disclosure or breach.  Do not sell any collected information, except as part of a merger or acquisition.

  • Individual Control: Respect Users’ Rights 

Describe procedures for parents, legal guardians, and eligible students to access, review and correct personally identifiable data.  Provide procedures for students to transfer content they create to another service, and describe these procedures in your privacy policy.

  • Data Security: Implement Reasonable and Appropriate Safeguards

Describe the reasonable and appropriate security measures you use to protect student information, including technical, administrative, and physical safeguards.  Describe your process for data breach notification.  Train your employees on your policies and procedures and on their obligations under them.

  • Transparency: Provide a Meaningful Privacy Policy

Make a privacy policy conspicuously available, under a descriptive title such as “Privacy Policy,” covering all student information, including personally identifiable information.  The policy should be easy for parents and educators to understand.  Consider getting feedback on your actual privacy policy, including from parents and students.  Include an effective date on the policy and describe how you will provide notice to the account holder, such as a school, parent, or eligible student.  Include a contact method in the policy, at a minimum an email address, and ideally also a toll-free number.

Given the size of the California market, any guidance issued by the California Attorney General’s office should be carefully considered and reviewed.  If you are growing an Ed Tech company, this is the time to build in data privacy and security controls.  If you are established, it’s time to review your privacy practices against these Recommendations and see how you match up.  If you have any questions or concerns as to how the recommendations could apply to your company, please do not hesitate to contact the team at Mintz Levin.

Colorado is the latest state to revisit, and expand upon, its laws pertaining to the use and protection of student data. Colorado Governor John Hickenlooper recently signed into law House Bill 16-1423 (the “Bill”), which is designed to increase the transparency and security of personal information about students enrolled in Colorado’s public education system (K-12).  Described by its sponsors and the media as “nation-leading” with respect to the extremely broad scope of its definition of “student personally identifiable information,” the Bill imposes additional, detailed requirements on the Colorado Department of Education, the Colorado Charter School Institute, school districts, public schools, and other local education providers (each, a “Public Education Entity”), as well as on commercial software providers (including education application providers), with respect to the collection, use, and security of student data. In this blog post, we focus only on the duties of commercial software and education application providers.

The recent data breach at Hong Kong-based electronic toy manufacturer VTech Holdings Limited (“VTech” or the “Company”) is making headlines around the world for good reason: it exposed sensitive personal information of over 11 million parent and child users of VTech’s Learning Lodge app store, Kid Connect network, and PlanetVTech in 16 countries! VTech’s Learning Lodge website allows customers to download apps, games, e-books, and other educational content to their VTech products; the Kid Connect network allows parents using a smartphone app to chat with their children using a VTech tablet; and PlanetVTech is an online gaming site. As of December 3, VTech has suspended all of its Learning Lodge sites, the Kid Connect network, and thirteen other websites pending investigation.

California again has provided a model of privacy legislation for other states to follow.  New Hampshire Governor Maggie Hassan recently signed into law House Bill 520 (the “Bill”), a bipartisan effort to establish guidelines for the protection of student online personal information.

Who is covered by the Bill?

Modeled after California’s Student Online Personal Information Protection Act (SOPIPA), the Bill applies to operators of Internet websites, online services (including cloud computing services), and mobile applications with actual knowledge that their website, service or application is used primarily for K-12 school purposes and was designed and marketed for K-12 school purposes (“Operators”). Like SOPIPA, the Bill imposes certain obligations and restrictions on Operators with respect to the collection, use, storage and destruction of student personal information and becomes effective on January 1, 2016. We discuss SOPIPA in more detail here and provide recommendations for preparing to comply with the SOPIPA requirements.

The Bill does not apply to general audience websites, online services, and mobile applications, even if login credentials created for a covered site, service, or application may be used to access the general audience sites, services, or applications. The Bill also makes it clear that it is not intended to:

  • limit Internet service providers from providing Internet connectivity to schools or students and their families;
  • prohibit operators of websites, online services, or mobile applications from marketing educational products directly to parents, so long as the marketing does not result from the use of “Covered Information” under the Bill;
  • impose a duty upon a provider of an electronic store, gateway, marketplace, or other means of purchasing or downloading software or applications to review or enforce compliance with the Bill on those applications or software;
  • impose a duty upon a provider of an interactive computer service, as defined in 47 U.S.C. section 230, to review or enforce compliance with the Bill by third-party content providers; or
  • impede the ability of students to download, export, or otherwise save or maintain their own student created data or documents.

What information is covered by the Bill?

The Bill defines “Covered Information” very broadly to include personally identifiable information or materials, in any media or format, created or provided to an Operator by either a student (or his/her parent or guardian) while using the Operator’s site, service, or application or by an employee or agent of the K-12 school, school district, local education agency, or county office of education, as well as information gathered by the Operator that is related to the student, such as information that is “descriptive of a student or otherwise identifies a student, including, but not limited to, information in the student’s educational record or email, first and last name, home address, date of birth, telephone number, unique pupil identifier, social security number, financial or insurance account numbers, email address, other information that allows physical or online contact, discipline records, test results, special education data, juvenile dependency records, grades, evaluations, criminal records, medical records, health records, biometric information, disabilities, socioeconomic information, food purchases, political affiliations, religious information, text messages, documents, other student identifiers, search activity, photos, voice recordings, or geo-location information.”

What do you have to do to comply with the Bill?


Avoid the following prohibited activities:

  • Using any information (including persistent identifiers) created or collected through your site, service, or application to create a profile about a K-12 student;
  • Engaging in targeted advertising (either on your site, service, or application or any other site, service, or application) when the targeting is based on any information (including covered information and persistent identifiers) that you have acquired as a result of the use of your site, service, or application;
  • Selling, leasing, renting, trading, or otherwise making available a student’s information (including covered information), except in connection with a sale of your business provided that the buyer continues to be bound by this restriction with respect to previously acquired student information; or
  • Disclosing protected information, except where the disclosure is mandated to “respond to or participate in judicial process”.

Implement and maintain the following security and deletion requirements:

  • reasonable security procedures and practices (appropriate to the nature of the Covered Information) to protect Covered Information from unauthorized access, destruction, use, modification, or disclosure; and
  • deletion of Covered Information when the school or district requests deletion of data under its control.
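
The Bill does not say how deletion must be carried out.  The following is a minimal, hypothetical sketch of a deletion-on-request routine; the table and column names are invented for illustration, and any real implementation would also need to reach backups and data held by service providers.

```python
import sqlite3

# Hypothetical sketch only -- the Bill does not prescribe an implementation.
# Table and column names are invented for illustration.

COVERED_TABLES = ("student_records", "activity_logs", "messages")


def delete_covered_information(conn: sqlite3.Connection, student_id: str) -> int:
    """Delete all covered information for one student at the school's or district's request."""
    removed = 0
    for table in COVERED_TABLES:
        cur = conn.execute(f"DELETE FROM {table} WHERE student_id = ?", (student_id,))
        removed += cur.rowcount
    conn.commit()
    return removed  # row count can be confirmed back to the requesting school or district
```

A provider would also want to verify that the request actually comes from the school or district and to keep its own record of the request and the resulting deletion.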

What can you do with Covered Information?

Although, as discussed above, there are many restrictions on the use of Covered Information, Operators are permitted to:

  • Use de-identified Covered Information within their sites, services, or applications (or other sites, services, or applications owned by the Operator) to improve educational products and to demonstrate the effectiveness of their products or services (including in their marketing), and
  • Share aggregated de-identified Covered Information for the development and improvement of educational sites, services, or applications.

Although the effective date is January 1, 2016, if you are an “Operator” under the Bill, now is the time to begin thinking about what changes you may need to make to your processes and procedures and to put an implementation plan in place so that you are compliant with the Bill by its effective date.

If your company has an online presence — or provides marketing or advertising services — you should be registered for the fifth webinar in our 2015 Wednesday Privacy Webinar series:  The Long Reach of COPPA.   Recall the recent FTC settlement agreement with Yelp — clearly a site not targeted at children — that cost the online review company $450,000.

 

Register online here – NY and CA CLE credit is available.

It’s Monday morning — do you know your privacy/security status?

Here are a few bits and bytes to start your week.

SEC to Registered Investment Advisers and Broker-Dealers:  It’s Your Turn to Pay Attention to Cybersecurity

The Division of Investment Management of the Securities & Exchange Commission (SEC) has weighed in on cybersecurity for registered investment companies (“funds”) and registered investment advisers (“advisers”), an important issue because both funds and advisers increasingly use technology to conduct their business activities and need to protect the confidential and sensitive information related to those activities, including information concerning fund investors and advisory clients, from third parties.   We’ve summarized key points from the recently issued Guidance below.

The Guidance recommends a number of measures that funds and advisers may wish to consider in addressing cybersecurity risk, including:

  • Conduct a periodic assessment of:
    • the nature, sensitivity, and location of information that the firm collects, processes, and/or stores, and the technology systems it uses;
    • internal and external cybersecurity threats to and vulnerabilities of the firm’s information and technology systems;
    • security controls and processes currently in place;
    • the impact should the information or technology systems become compromised; and
    • the effectiveness of the governance structure for the management of cybersecurity risk.
  • Create a strategy that is designed to prevent, detect, and respond to cybersecurity threats. Such a strategy could include:
    • controlling access to various systems and data via management of user credentials, authentication and authorization methods, firewalls and/or perimeter defenses, tiered access to sensitive information and network resources, network segregation, and system hardening;
    • data encryption (a minimal sketch of encrypting data at rest follows this list);
    • protecting against the loss or exfiltration of sensitive data by restricting the use of removable storage media and deploying software that monitors technology systems for unauthorized intrusions, loss or exfiltration of sensitive data, or other unusual events;
    • data backup and retrieval; and
    • the development of an incident response plan.
    Routine testing of strategies could also enhance the effectiveness of any strategy.
  • Implement the strategy through:
    • written policies and procedures; and
    • training that provides guidance to officers and employees concerning applicable threats and measures to prevent, detect, and respond to such threats, and that monitors compliance with cybersecurity policies and procedures.
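
As a concrete, purely illustrative example of the “data encryption” item above, the sketch below encrypts a sensitive record at rest using the Fernet recipe from the Python cryptography package.  The Guidance does not endorse any particular library, algorithm, or key-management approach; those choices are ours for the sake of the example.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative only -- the SEC Guidance does not prescribe a specific scheme.
key = Fernet.generate_key()             # in practice, keys belong in a KMS or HSM, not in code
cipher = Fernet(key)

record = b"advisory client 4711: account 123-456789"  # hypothetical sensitive data
token = cipher.encrypt(record)          # ciphertext safe to write to disk or a database
assert cipher.decrypt(token) == record  # readable only by holders of the key
```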

Most of this should not be a surprise to any business dealing with sensitive financial information these days, but a recent cybersecurity sweep examination by the SEC’s Office of Compliance Inspections and Examinations (OCIE) found that 88 percent of the broker-dealers (BDs) and 74 percent of the registered investment advisers (RIAs) it visited had experienced cyber-attacks, either directly or indirectly through vendors.

 

Penn State University Confirms Cyberattack Originated in China

If you’re studying at Penn State’s College of Engineering, you will not have access to the Internet for a while.  The University said last week that of two recent cyber attacks at the College, at least one was carried out by a “threat actor” based in China.   Penn State was alerted to a breach by the FBI in November and has been investigating since – during that time, a 2012 breach was also discovered.   The 2012 breach apparently originated in China, and compromised servers containing information on about 18,000 people.

For more:  Cyberattack on Penn State University

 

Digital Advertising Alliance to Enforce Mobile App Principles

Starting September 1, the Digital Advertising Alliance (DAA) will begin to enforce its Application of Self-Regulatory Principles to the Mobile Environment.   The DAA issued the mobile principles back in July of 2013 (see our post here), but delayed enforcement while the DAA implemented a choice mechanism for the mobile environment.  Mobile tools for consumers were released in February:  App Choices and the Consumer Choice Page for Mobile Web.

The Guidance addresses mobile-specific issues such as privacy notices, enhanced notices, and opt-out mechanisms for data collected from a particular device regarding app use over time and cross-app data; privacy notices, enhanced notices, and opt-in consent for geolocation data; and transparency and controls (including opt-in consent) for calendar, address book, photo/video, and similar data created by a user and stored on or accessed through a particular device.

After September 1, any entity that collects and uses any of this type of data will be required to demonstrate compliance with the Guidance or risk being subject to the DAA’s accountability mechanism.

 

REMINDER — UPCOMING PRIVACY WEDNESDAY WEBINAR

Don’t forget to register for the next in our Privacy Wednesday Webinar series:  The Long Reach of COPPA.   Webinar is eligible for NY and CA CLE credit — register here.

On this Privacy Monday, we have some upcoming events that you might want to add to your calendar.

Wednesday, May 13 – Mintz Employment Law Summit (Boston)

A discussion of hot topics facing employers, including Privacy in the Workplace.  Free event, breakfast and lunch included.   Register here.

Wednesday, May 13 – National Security, Privacy, and Renewing the USA PATRIOT Act, Hudson Institute, NY

Live streaming starts at noon. #PATRIOTAct.  More information here.

Wednesday, May 13 – Ninth Annual Law & Information Society Symposium – Fordham Law School

Trends in the global processing of data, developments in new technologies, privacy enforcement actions and government surveillance put international privacy at the center of the global law and policy agenda. Government regulators, policymakers, legal experts, and industry players need to find solutions to cross-border conflicts and to the issues presented by innovative technologies. This conference seeks to create a robust, but informal dialog that will explore possible solutions to current questions arising from the international legal framework, infrastructure architecture and commercial practices.   Information here.

Thursday, May 14 – IAPP KnowledgeNet (Boston area)

Learn about data privacy issues posed by wearables, wellness tracking apps, company wellness programs and other technologies and services here in the U.S. and abroad.   Register here.

Monday, May 18 – 36th IEEE Symposium on Security & Privacy – Fairmont Hotel (San Jose)

Since 1980, the IEEE Symposium on Security and Privacy has been the premier forum for presenting developments in computer security and electronic privacy, and for bringing together researchers and practitioners in the field. The 2015 Symposium will mark the 36th annual meeting of this flagship conference.  More information here.

Wednesday, May 27 – Mintz Privacy Wednesday Webinar – The Long Reach of COPPA

The fifth in our Wednesday Webinar series will focus on a discussion of COPPA, the long-awaited amendments, and related issues.   We’ll also discuss the latest Federal Trade Commission settlements and how to avoid being the next target.   Register here.

As we predicted in our post late last month, Google’s YouTube Kids app has attracted more than just the “curious little minds” Google was hoping for.  Yesterday, a group of privacy and children’s rights advocates (including the Center for Digital Democracy and the American Academy of Child and Adolescent Psychiatry) asked the Federal Trade Commission “to investigate whether Google’s YouTube Kids app violates Section 5 of the FTC Act . . . .”

The advocacy group downloaded the YouTube Kids app onto an Android device and two iOS devices.  It then reviewed and assessed the app as it functioned, watching content that Google says caters to children while protecting them from questionable or troubling content.

The advocacy group claims this review identified three features of the app that it believes are unfair or deceptive.  First, the group faults Google for offering content “intermixed” with advertising in a manner the group claims “would not be permitted to be shown on broadcast or cable television” under Federal Communications Commission guidelines.  Second, the group worries that much of the advertising violates FTC Endorsement Guidelines because it is user-generated in a way capable of masking relationships with product manufacturers.  Finally, the group claims the advertising content violates the YouTube Kids app’s own stated policies and procedures.

Taken together, the advocacy group’s issues all collapse into the same core argument: very young children (generally under 5 years of age) cannot distinguish between actual content and advertising, and that makes them “uniquely vulnerable to commercial influence.”  This argument has a lot of emotional appeal: who wouldn’t want to protect small children?  But the implications of the argument extend far beyond the YouTube Kids app and would call into question any free, advertising-supported video platform, including network television.   As such, the advocacy group’s position seems to face significant First Amendment hurdles.

Although the advocacy group does not (yet) take issue with YouTube Kids’ data collection practices, it does question how the app is able to generate video recommendations.  And its letter to the FTC explicitly asks the Commission to investigate whether children are being tracked without verifiable parental consent.

The ball is now squarely in the FTC’s court.  It could launch a non-public investigation regarding the app’s practices, or it could do nothing.   However, as the Commission has recently signaled a renewed interest in protecting children online (including entering into a $19 million settlement with Google over children’s in-app purchases last September), it seems likely that the Commission will have at least some questions for Google following the advocacy group’s letter.

We’ll be sure to keep you posted.

On Friday, the FTC published updates to the COPPA FAQs, the Commission’s compliance guide for businesses and consumers, to address the applicability of COPPA and the Amended COPPA Rule to educational institutions and to businesses that provide online services, including mobile apps, to educational institutions. Specifically, nearly a year after the last update to the “COPPA and Schools FAQs,” the Commission revisited its answers to FAQs M.1, M.2, and M.5 and deleted FAQ M.6 in an effort to streamline the FAQs and provide further clarity on the key topics of notice and consent, best practices for educational institutions, and the interplay between COPPA and other federal and state laws that may apply in the education space. To access our blog post on the prior update to the COPPA and Schools FAQs, please click here.

Google made good on the rumors, and on the company’s subsequent promise last December to create a family-friendly version of its popular YouTube service, with Monday’s launch of the YouTube Kids app. Available free of charge on both the App Store and Google Play, and only in the United States, the YouTube Kids app is described by Google as an “app designed for curious little minds to dive into a world of discovery, learning, and entertainment…delightfully simple and packed full of age-appropriate videos, channels, and playlists.”