We saw a few developments on the privacy and security front these past few weeks, so rather than our usual approach of focusing on one issue, this post will highlight a few noteworthy stories.

Cookies

France’s data protection regulator (CNIL) slapped Google and Amazon with fines for dropping tracking cookies without users’ prior consent, as required under the ePrivacy Directive as incorporated into France’s Data Protection Act. Google was fined €100 million while Amazon received a €35 million fine – both in connection with their French (.fr) domains. In investigating the websites, the CNIL found that in both instances, tracking cookies were automatically dropped when a user visited the domains, in violation of the Data Protection Act. As a high-level reminder, EU law mandates that non-essential cookies not be dropped by a website operator until and unless a user consents to those cookies – meaning that a banner merely informing visitors that they “agree to the use of cookies” violates the law. Such was the case with Amazon’s banner, despite its use of tracking (i.e., non-essential) cookies. Moreover, transparency is required as to the use of cookies, and in both cases the CNIL found a lack of transparency in addition to improper consent mechanisms and implementation. Finally, with respect to Google, the CNIL also found that even when a user deactivated all personalized advertising, one of the advertising cookies remained in place, in violation of the law – highlighting the often overlooked importance of ensuring that language (and choice) are aligned with technical implementation. Continue Reading Cookies, Opt-Out Choices, IoT Security: Recent Developments in Data Protection
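To make the technical-implementation point concrete, the consent rule can be modeled as a simple gate: no non-essential cookie category is activated until the user has affirmatively opted in. The sketch below is purely illustrative – the category names and the `allowed_categories` helper are hypothetical, not drawn from any particular site or consent-management platform.

```python
# Hypothetical sketch: gate non-essential cookie categories on prior,
# affirmative consent, as the ePrivacy rules require. Category names
# are illustrative placeholders.

ESSENTIAL = {"session", "load_balancing"}            # exempt from consent
NON_ESSENTIAL = {"analytics", "advertising", "personalization"}

def allowed_categories(consent: dict) -> set:
    """Return the cookie categories that may be set right now.

    `consent` maps a non-essential category to True only after the user
    has affirmatively opted in; absence or False means no consent.
    Merely displaying a banner ("by browsing you agree") grants nothing.
    """
    granted = {c for c in NON_ESSENTIAL if consent.get(c) is True}
    return ESSENTIAL | granted

# Before any consent, only essential cookies may be dropped:
assert allowed_categories({}) == ESSENTIAL
# Opting in to analytics does not authorize advertising cookies:
assert "advertising" not in allowed_categories({"analytics": True})
```

The key design point matching the CNIL's findings: the default state sets nothing non-essential, and deactivating a category must actually remove it from the allowed set, so the user-facing choice and the technical behavior stay aligned.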

Data retention. It’s not something that excites and invigorates businesses. But it is a necessary cost of doing business: businesses must retain certain data for as long as each applicable law requires, and sound retention practices are an increasingly important risk mitigation strategy.

Determining how long to retain a type of data or record depends on several factors. Various laws and regulations mandate that businesses retain certain records and data for minimum time periods. Statutes of limitation on certain types of claims also guide businesses as to how long to retain certain data. Conversely, privacy laws have always included a component of data minimization, requiring businesses not to “hoard” data. Taken as a whole, these different rules require businesses to strike the right balance between retaining data for at least the legally mandated minimum period and not retaining it for longer than necessary. Getting this right is even more important now, given the increasing risk of class action lawsuits arising from data security breaches. Quite simply, the more data you have, the more data you can lose. Holding large stores of data will further complicate the tracking of data that may have been accessible to or taken by an unauthorized third party. This is one reason why, despite requirements to promptly notify individuals that their data was accessed in a security breach, businesses may take months to provide those notifications. Continue Reading Data Retention – More than Meets the Eye
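The balancing act described above – a legally mandated floor on one side, data minimization on the other – can be sketched as a per-record-type retention check. The record types and time periods below are invented placeholders for illustration; actual periods depend on the laws applicable to each business and are not legal advice.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: each record type carries a minimum
# retention period (statute/limitations driven) and a maximum
# (data-minimization driven). Periods are illustrative only.
SCHEDULE = {
    "payroll_record": (timedelta(days=365 * 4), timedelta(days=365 * 7)),
    "web_analytics": (timedelta(days=0), timedelta(days=365)),
}

def disposition(record_type: str, created: date, today: date) -> str:
    """Classify a record as 'retain', 'eligible_for_deletion', or 'delete'."""
    minimum, maximum = SCHEDULE[record_type]
    age = today - created
    if age < minimum:
        return "retain"                 # law still requires keeping it
    if age < maximum:
        return "eligible_for_deletion"  # may keep, but minimize when possible
    return "delete"                     # keeping it only adds breach exposure
```

A schedule like this also helps with the breach-notification problem the post mentions: if stale data is routinely purged at its maximum, there is simply less to inventory when tracing what an unauthorized party may have accessed.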

With the adoption of more stringent privacy laws across the globe, we have seen an exponential increase in privacy technology (or “tech”) vendors offering automated privacy compliance solutions. Privacy tech vendors provide software and services to assist companies with a whole range of tasks, including data inventory and mapping, privacy assessments, compliance reports and risk management, as well as policies, individual rights automation and other records that may be required as part of an organization’s compliance obligations. Many privacy tech vendors pitch their solutions as designed to “easily” assist with ongoing global compliance with the aid of automation and/or algorithms. Depending on the price you are willing to pay, these solutions can indeed help automate certain aspects of privacy compliance, including data mapping, consent mechanisms, records of processing and individual rights management. However, in their marketing materials, many privacy tech vendors – big or small, paid or free – caution companies that using legal professionals for privacy compliance will simply be too costly. Based on our experience, this is misleading at best, and a recent situation illustrates just that. Continue Reading Lessons on Hidden Costs of Privacy Tech Vendors

Consistent with California’s history of prioritizing consumer privacy protections, Proposition 24 (full text here), a.k.a. the California Privacy Rights Act (“CPRA”), was placed on the November ballot and handily approved by voters last week. The measure’s background itself indicates that the CPRA was put forward to make privacy more transparent to users, similar to “ingredient labels on foods.” The background information also indicates a willingness to strengthen privacy rights over time rather than dilute them (particularly with regard to children), and this push for increased transparency and protection is consistent with how certain platforms are requiring clearer policies (we discuss Apple’s new requirements here). While the CPRA will become fully effective and enforceable on January 1, 2023, certain provisions take effect earlier, and the law includes a look-back provision. Businesses should start to familiarize themselves with the new or updated definitions and additional requirements contained in the CPRA. Continue Reading Voters Approve the California Privacy Rights Act: What Businesses Need to Know

This past summer, Apple introduced significant changes for iOS 14 in the data privacy realm (we discussed these here). Among those changes are Apple’s so-called privacy “nutrition labels,” intended to better inform consumers of the data collection and privacy practices of individual applications. Apple announced a few days ago that developers will be required to provide these new privacy details to app users starting December 8. This applies both to new apps and to any apps already in the App Store that are updated, and developers may already submit details through App Store Connect. Apple has provided more information here. Continue Reading Apple’s Privacy ‘Nutrition Labels’ Will Be Required Starting on December 8

With COVID-19 driving so much business online, I, like most people, am increasingly seeing offers from companies vying for new customers to hand over my contact information in exchange for discounts or rewards. This includes businesses that seek to use personal information obtained through loyalty or rewards programs, those that offer price or service differences such as free versus paid subscriptions to a service (e.g., music streaming), and those that simply want to increase their marketing reach and attract new consumers by offering a discount in exchange for personal information. There is nothing new about these types of marketing strategies, but for companies subject to the California Consumer Privacy Act (CCPA), providing discounts, rewards or free-versus-paid services to California consumers has become trickier because the CCPA contains very specific – and quite stringent – obligations when it comes to financial incentives. The CCPA defines a “financial incentive” as a program, benefit, or other offering (including payments to consumers) related to the collection, retention, or sale of personal information – or, put simply, you give me your personal information and I will give you a discount code or rewards. Many businesses that are subject to the CCPA, however, are not complying with its complex requirements regarding financial incentives. Failing to comply could spell trouble. Below we explain the challenges of implementing the CCPA’s requirements with respect to financial incentives. Continue Reading Providing Financial Incentives Under CCPA

The Federal Trade Commission has broadly relied on Section 5 of the Federal Trade Commission Act (FTC Act) to investigate and enforce against consumer protection violations, including in the context of data privacy and security. Specifically, Section 5 of the FTC Act prohibits unfair or deceptive acts or practices in or affecting commerce. With respect to data privacy and security, the FTC has repeatedly taken the position that under Section 5 of the FTC Act, a company’s failure to implement and maintain appropriate measures to protect consumers’ information may constitute an unfair practice. Likewise, making false or misleading representations (including omissions) about a company’s data privacy and security practices – notably in consumer-facing privacy notices – has been deemed by the FTC to constitute a deceptive trade practice. In its enforcement actions for data privacy and security violations, the FTC has sought – and obtained – both injunctive and equitable monetary relief (e.g., restitution or disgorgement) against companies whose practices violated Section 5 of the FTC Act. But how the FTC obtains equitable monetary relief – and whether it may even continue to do so under Section 13(b) of the FTC Act – is now before the Supreme Court. Continue Reading How the FTC’s Enforcement of Data Privacy and Security May be Impacted by the U.S. Supreme Court’s Upcoming Review of the FTC’s Use of Section 13(b)

Long gone are the days when companies could claim ownership in their employees’ data, at least in California. As our prior posts have indicated, the definition of “consumer” under the CCPA is extremely broad and extends to employees. A consumer is not only a customer or user of a business’ services, products or websites, but also a business’ employees, contractors and job applicants.

However, despite taking effect on January 1, 2020, the CCPA’s application is currently limited with regard to personal information of employees, contractors, and job applicants collected and used in the employment context. This exemption delays application of some provisions of the CCPA with respect to personal information collected in the employment context (originally until January 1, 2021, and now extended to January 1, 2022 or 2023 as set forth below), including the rights to access and delete data. As a reminder, the exemption applies only to the extent that the employer collects and uses the personal information in the context of its employment relationship and for employment purposes. Thus, any use of such personal information by an employer outside the scope of the strict employment relationship remains covered by all of the provisions of the CCPA. For example, if an employer were to allow its insurance company to collect employee data in order to market other insurance services to those individuals, this would not be within the scope of employment and would therefore be subject to all of the consumer rights otherwise available under the CCPA. Continue Reading Employee Data under CCPA

As we (remotely) head back to school, we thought it timely to post our “annual” reminder that collecting, using and/or disclosing children’s personal information comes with some restrictions (see last year’s post here). With this unprecedented back-to-school season, nearly all children’s activities, products and services are moving online for the foreseeable future. As such, now more than ever organizations should really take the time to determine whether they collect any data from children (or have actual knowledge of doing so), and ensure that they are taking the proper steps to comply with applicable rules. Continue Reading Children’s Privacy Check-Up
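For operators trying to determine whether they collect data from children, a common first step is a neutral age screen. The sketch below is a generic illustration of the age calculation (U.S. COPPA applies to children under 13), with hypothetical function names; an age screen alone is only one piece of a compliance program, not a substitute for one.

```python
from datetime import date

COPPA_AGE = 13  # U.S. COPPA covers children under 13

def is_child(birthdate: date, today: date) -> bool:
    """Return True if the visitor is under the COPPA age threshold."""
    # Subtract one year if this year's birthday hasn't occurred yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < COPPA_AGE

def may_collect_without_parental_consent(birthdate: date, today: date) -> bool:
    # Hypothetical gate: if the visitor is a child, collecting personal
    # information generally requires verifiable parental consent first.
    return not is_child(birthdate, today)
```

Note that the screen should be neutral (asking for a birthdate rather than "are you over 13?") so that it does not encourage children to falsify their age, and a service directed at children cannot rely on such a screen at all.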

As we all know, the EU-U.S. Privacy Shield framework, the cross-border transfer mechanism relied upon by over 5,000 U.S. entities until just over a month ago, was recently invalidated by the CJEU in the Schrems II case (see here for our last post following the ruling). So what next? Continue Reading Addressing Cross-Border Transfers from the EU Following the Schrems II Ruling