In keeping with its position as the nation’s leader on privacy issues, the state of California recently enacted significant new laws on student privacy and education data. The Student Online Personal Information Protection Act (SOPIPA) places a variety of restrictions on how operators of online services offered in schools may use and disclose student information, and requires operators to implement reasonable security measures to protect student data. A separate law (A.B. 1584) sets forth privacy requirements for providers of digital storage services and educational software used in schools, and a third law (A.B. 1442) establishes privacy requirements for companies that collect students’ social media information on behalf of schools. Governor Jerry Brown signed the laws on September 29, 2014.
Continue Reading California Enacts Landmark Student Privacy Laws

Recent large-scale data breaches are a stark reminder of the risks and challenges of today’s data-driven economy. The exploding number of Internet-connected devices and the growing volume of information organizations collect about individuals make it increasingly important for officers, directors, and senior management to fully understand the privacy and data security risks their organizations face.

One of the most effective techniques for managing those risks is a comprehensive privacy and data security risk assessment. Organizations use such assessments to maintain an appropriate risk profile in light of their contractual, regulatory, and governance obligations. Regulatory schemes in some industries, including health and finance, may require risk assessments for compliance. Organizations that collect payment information to process payments as merchants or payment processors, or that handle data about individuals residing in specific states, may also have risk assessment obligations. Organizations commonly tailor risk assessments both to these obligations and to their own risk tolerance and profile. A comprehensive risk assessment addresses scope, documentation, timing, management, and oversight.
Continue Reading Privacy and Data Security Risk Assessments: An Overview

On March 27, 2014, the Federal Communications Commission (FCC) addressed an outstanding petition seeking guidance on compliance with the “prior express consent” requirement of the Telephone Consumer Protection Act (TCPA) for informational text messages. In a declaratory ruling, the FCC clarified this requirement and specifically addressed whether an intermediary may provide such consent. The FCC agreed with group texting service GroupMe, Inc. that, consistent with the TCPA, intermediaries may convey consent provided by others to receive informational text messages. The FCC made clear, however, that companies ultimately remain liable where intermediaries fail to obtain the required consent. The ruling reflects a current trend at the FCC toward allowing businesses that communicate with consumers by text message some flexibility in navigating the TCPA’s increasingly complex requirements.
Continue Reading FCC Clarifies That Consent May Be Provided by Intermediary for Informational Text Messages

In January 2014, President Barack Obama charged his counselor John Podesta with looking at: (a) how the challenges inherent in big data are being confronted in the public and private sectors; (b) whether the United States can forge international norms on how to manage big data; and (c) how the United States can continue to promote the free flow of information in ways that are consistent with both privacy and security. Two reports were published on May 1, 2014, in response to this charge, one focusing on policy and big data (the “Policy Report”) and the other complementing and informing the Policy Report with a focus on technology and big data (the “Technology Report”).

Both reports acknowledge that there is no single definition of “big data.” However, big data is distinguished from data historically collected about individuals (“small data”) in two ways: its quantity and variety, and the scale of analysis that can be applied to it. And while both reports view big data as potentially providing great benefits to the economy, society, and individuals, they also identify its potential to cause significant harm.
Continue Reading President’s Counselor Makes Recommendations on Privacy and Other Values in Big Data Age

Data may well be the asset of the 21st century, but selling access to certain data about individuals risks attracting unwanted attention from both regulators and class action litigants. As organizations collect more types of data about consumers, they are more likely to hold data that constitutes “consumer report” data under the Fair Credit Reporting Act (FCRA). Organizations that try to monetize such data by selling access to consumer profiles can easily run afoul of the FCRA.

This article discusses recent Federal Trade Commission (FTC) enforcement actions against two background check companies that allegedly tripped the FCRA’s wires and now face a combined $1.5 million in fines. The FTC enforces the FCRA aggressively, and violations commonly stem from a failure to create and implement adequate policies and procedures. This article also explains how the U.S. Supreme Court may review the Ninth Circuit’s recent decision to join other federal appellate courts in making FCRA class action lawsuits easier for plaintiffs to bring. Given the appellate courts’ interpretations of the FCRA, plaintiffs will likely assert FCRA claims with increasing frequency in an effort to obtain compensation for alleged general privacy violations. Any organization that sells access to data profiles about individuals should determine whether it must comply with the FCRA and, if so, implement policies and procedures that meet the FCRA’s requirements.
Continue Reading FTC Continues Its Aggressive FCRA Enforcement and Ninth Circuit Lowers Standing Threshold in FCRA Cases

A proposed California law, the Consumer Data Breach Protection Act (A.B. 1710), has the potential to upend the calculus of determining liability after retail data breaches, create additional data security requirements for retailers and other consumer-facing businesses operating in California, and establish new standards for reporting breaches affecting California residents. The bill, introduced by California State Assemblymen Bob Wieckowski and Roger Dickinson in February 2014 and currently pending before the California Assembly Committee on the Judiciary, may in part represent an effort to respond to the recent data breaches affecting Target Corp. and Neiman Marcus Ltd., and aims to strengthen what is already one of the most prescriptive state statutes of its kind.

The heightened concern over data privacy in recent months might enable passage of the bill, a variation of past bills vetoed by former Governor Arnold Schwarzenegger. If passed, A.B. 1710 would place California alongside Washington, Minnesota, and Nevada as states that mandate particular data security provisions for payment card data, and would increase the data breach reporting requirements and the liability associated with breaches for entities doing business in California.
Continue Reading Proposed California Law Would Impose Data Breach Liability on Retailers and Create More Stringent Data Security Requirements for Businesses