Happy new year from our team at Hopkins & Carley! With each new year comes a host of bright new intentions. As each of us knows all too well, some will stick and others will quickly be forgotten. As a reminder to stay the course when it comes to data privacy and security, we kick off the year with our 2021 to-do (and not-to-do) list. Rather than focus on privacy and security predictions for 2021, we wanted to share a list of action items based on some of the hard-learned lessons of 2020, as well as trends that we expect to continue into 2021. 2020 was a busy and tumultuous year in the privacy and security world, and 2021 will certainly be no different. Companies that handle personal information must juggle an increasing number of laws, regulations, business-mandated requirements and risks. With that, here are a few things to keep in mind as we enter 2021:
Continue Reading Our Privacy & Security 2021 To-Do (and Not-To-Do) List: Lessons Learned From a Year Like No Other

We saw a few developments on the privacy and security front these past few weeks, so rather than our usual approach of focusing on one issue, this post will highlight a few noteworthy stories.

Cookies

France’s data protection regulator (the CNIL) slapped Google and Amazon with fines for dropping tracking cookies without users’ prior consent, as required under the ePrivacy Directive as incorporated into France’s Data Protection Act. Google was fined €100 million and Amazon €35 million, both in connection with their French (.fr) domains. In investigating the websites, the CNIL found that in both instances tracking cookies were automatically dropped when a user visited the domains, in violation of the Data Protection Act. As a high-level reminder, EU law mandates that a website operator not drop non-essential cookies until and unless a user consents to those cookies – meaning that a banner merely informing visitors that they “agree to the use of cookies” violates the law. Such was the case with Amazon’s banner, despite its use of tracking (i.e., non-essential) cookies. Moreover, transparency is required as to the use of cookies, and in both cases the CNIL found a lack of transparency in addition to improper consent mechanisms and implementation. Finally, with respect to Google, the CNIL also found that even when a user deactivated all personalized advertising, one of the tracking cookies remained active, in violation of the law – highlighting the often overlooked importance of ensuring that language (and choice) are aligned with technical implementation.
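To illustrate the distinction, here is a minimal consent-gating sketch (the cookie names and banner element IDs are hypothetical, not drawn from the CNIL decisions): strictly necessary cookies may be set immediately, while non-essential analytics and advertising cookies are set only after the visitor affirmatively opts in.

```typescript
// Minimal consent-gating sketch for a browser page. The cookie names and
// banner element IDs are hypothetical; a production site would typically
// rely on a consent-management platform and keep records of consent.

interface ConsentState {
  analytics: boolean;
  advertising: boolean;
}

// Strictly necessary cookies (e.g., a session identifier) may be set
// without prior consent under the ePrivacy rules.
function setSessionCookie(sessionId: string): void {
  document.cookie = `session_id=${encodeURIComponent(sessionId)}; path=/; Secure; SameSite=Lax`;
}

// Non-essential cookies are set only after an affirmative opt-in.
function applyConsent(consent: ConsentState): void {
  if (consent.analytics) {
    document.cookie = "analytics_id=abc123; path=/; Secure; SameSite=Lax; max-age=31536000";
  }
  if (consent.advertising) {
    document.cookie = "ad_tracking=on; path=/; Secure; SameSite=Lax; max-age=31536000";
  }
  // If the visitor declines, nothing is set; any previously set
  // non-essential cookies should also be expired here.
}

// Nothing non-essential runs before one of the banner buttons is clicked.
document.getElementById("accept-all")?.addEventListener("click", () =>
  applyConsent({ analytics: true, advertising: true })
);
document.getElementById("reject-all")?.addEventListener("click", () =>
  applyConsent({ analytics: false, advertising: false })
);
```

A banner that merely announces cookie use, with the tracking cookies already set on page load, skips the gating step that the CNIL found missing.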
Continue Reading Cookies, Opt-Out Choices, IoT Security: Recent Developments in Data Protection

Data retention. It’s not something that excites and invigorates businesses. But it is a necessary cost of doing business, not only to ensure that certain data is retained for as long as each applicable law requires, but also as an increasingly important risk mitigation strategy.

Determining how long to retain a type of data or record depends on several factors. Various laws and regulations mandate that businesses retain certain records and data for minimum time periods. Statutes of limitations on certain types of claims also guide businesses as to how long to retain certain data. Conversely, privacy laws have always included a data minimization component, requiring businesses not to “hoard” data. Taken as a whole, these different rules require businesses to strike the right balance between retaining data for at least the mandated period and not retaining it for longer than necessary. Getting this right is now even more important given the increasing risk of class action lawsuits over data security breaches. Quite simply, the more data you have, the more data you can lose. Holding large troves of data also complicates the task of tracking which data may have been accessible to or taken by an unauthorized third party. This is one reason why, despite requirements to promptly notify individuals that their data was accessed in a security breach, businesses may take months to provide those notifications.
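As a rough illustration of that balancing act (the record categories and retention periods below are hypothetical, not legal advice), a retention schedule can be reduced to a simple rule: keep each record at least as long as the longest applicable mandate, and flag it for deletion once that period has run.

```typescript
// Hypothetical retention schedule; actual periods depend on the specific
// laws, regulations and limitation periods that apply to the business.
const RETENTION_DAYS = {
  payroll_records: 4 * 365,   // e.g., wage-and-hour recordkeeping rules
  customer_accounts: 2 * 365, // e.g., limitation period on contract claims
  marketing_leads: 365,       // no legal mandate; minimize instead
} as const;

interface RecordMeta {
  category: keyof typeof RETENTION_DAYS;
  createdAt: Date;
}

// A record is eligible for deletion only after its minimum retention
// period has run; keeping it longer simply enlarges breach exposure.
function isEligibleForDeletion(rec: RecordMeta, now: Date = new Date()): boolean {
  const ageDays = (now.getTime() - rec.createdAt.getTime()) / 86_400_000;
  return ageDays > RETENTION_DAYS[rec.category];
}

// Example: a marketing lead collected two years ago should be purged.
console.log(
  isEligibleForDeletion({ category: "marketing_leads", createdAt: new Date("2019-01-01") })
); // true
```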
Continue Reading Data Retention – More than Meets the Eye

Consistent with California’s history of prioritizing consumer privacy protections, Proposition 24 (full text here), a.k.a. the California Privacy Rights Act (“CPRA”), was placed on the November ballot and handily approved by voters last week. The measure’s own background section indicates that the CPRA was put forward to make privacy practices more transparent to users, similar to “ingredient labels on foods.” The background materials also indicate a willingness to strengthen privacy rights over time rather than dilute them (particularly with regard to children), and in fact this push for increased transparency and protection is consistent with how certain platforms are requiring clearer policies (we discuss Apple’s new requirements here). While the CPRA will be fully effective and enforceable January 1, 2023, certain provisions take effect earlier, and the law includes a look-back period. Businesses should start to familiarize themselves with the new and updated definitions and additional requirements contained in the CPRA.
Continue Reading Voters Approve the California Privacy Rights Act: What Businesses Need to Know

This past summer, Apple introduced significant data privacy changes in iOS 14 (we discussed these here). Among those changes are Apple’s so-called privacy “nutrition labels,” intended to better inform consumers of the data collection and privacy practices of individual applications. Apple announced a few days ago that developers will be required to provide these new privacy details to app users starting December 8. This applies both to new apps and to any apps already in the App Store that are updated, and developers may already submit the details through App Store Connect. Apple has provided more information here.
Continue Reading Apple’s Privacy ‘Nutrition Labels’ Will Be Required Starting on December 8

With COVID-19 driving so much business online, like most people I am increasingly seeing offers from companies vying for new customers, asking me to hand over my contact information in exchange for discounts or rewards. This includes businesses that seek to use personal information obtained through loyalty or rewards programs, those that offer price or service differences such as free versus paid subscriptions to a service (e.g., music streaming), and those that simply want to increase their marketing reach and attract new consumers by offering a discount in exchange for personal information. There is nothing new about these types of marketing strategies, but for companies that are subject to the California Consumer Privacy Act (CCPA), providing discounts, rewards or free-versus-paid services to California consumers has become trickier because the CCPA contains very specific – and quite stringent – obligations when it comes to financial incentives. The CCPA defines a “financial incentive” as a program, benefit, or other offering (including payments to consumers) related to the collection, retention, or sale of personal information – or, put simply, you give me your personal information and I will give you a discount code or rewards. Many businesses that are subject to the CCPA, however, are not complying with its complex requirements regarding financial incentives. Failing to comply could spell trouble. Below we explain the challenges of implementing the CCPA’s requirements with respect to financial incentives.
Continue Reading Providing Financial Incentives Under CCPA

With the COVID-19 crisis, many companies that have traditionally done business only offline are transitioning and expanding into e-commerce. Others are starting new businesses and developing new technologies and platforms. A multitude of considerations go into these new ventures, an important one of which is security.
Continue Reading Data Security and the New York SHIELD Act: Going Beyond New York Companies

During a recent keynote presentation with the IAPP following the July 1 enforcement deadline of the CCPA, Stacey Schesser, Supervising Deputy Attorney General for the State of California (“Deputy AG”), provided a bit of a roadmap for CCPA enforcement actions from the California Attorney General (“AG”) that are both currently underway and expected in the near future.
Continue Reading CCPA Enforcement: What to Expect Next

Despite three annual reviews by European Union Commissioners, the Court of Justice of the European Union (CJEU) invalidated the Privacy Shield on July 16 and called into question many transfers of personal data made pursuant to the Standard Contractual Clauses. At stake are transfers of EU personal data to the thousands of U.S. companies that rely on such transfers. The case is colloquially known as “Schrems II,” as it is the second case involving Maximillian Schrems (Case C-311/18, Data Protection Commissioner v Facebook Ireland and Maximillian Schrems). Mr. Schrems’ first case resulted in the 2015 invalidation of the EU-US Safe Harbor, the Privacy Shield’s predecessor.
Continue Reading Schrems II: EU Personal Data Transfers to the U.S. and the Invalidation of the Privacy Shield

The California Attorney General’s final proposed regulations under the CCPA (the “Regulations”) have been submitted and, pending approval by the California Office of Administrative Law, will soon become enforceable. One often overlooked requirement of the CCPA is the obligation of covered businesses to provide notices that are “reasonably accessible.” Every draft of the Regulations has provided more detail about this accessibility requirement, and the final Regulations make clear that for notices provided online, businesses must follow generally recognized industry standards, such as the Web Content Accessibility Guidelines, version 2.1 (WCAG), from the World Wide Web Consortium. While companies have largely focused on updating the language and substance of their notices to comply with the CCPA, this requirement as to form has by and large slipped through the cracks, but it is certain to generate discussion (if not litigation) in the coming months.

By way of background, the Americans with Disabilities Act (ADA) requires, among other things, that places of “public accommodation” remove barriers to access for individuals with disabilities. While this has long been considered the rule for physical establishments, including privately owned, leased or operated facilities like hotels, restaurants, retail merchants, health clubs, sports stadiums, movie theaters and so on, virtual accessibility has been much less consistent, and generally the exception rather than the norm. In fact, web accessibility hardly ever appears on businesses’ radar, due perhaps to a short-sighted perception of what, in fact, qualifies as a disability, as well as a lack of overall guidance.

Web accessibility means ensuring that websites, mobile applications and other virtual platforms can be used by everyone, including those with disabilities such as impaired vision. However, what exactly is required remains a source of confusion. In 2019, the Department of Justice (DOJ), which is responsible for issuing regulations under the ADA, withdrew regulations that had been drafted for website accessibility and has yet to promulgate replacements. This has left courts with the task of determining how, and to what extent, the ADA requires web accessibility of businesses that offer goods and services online, with varying results.
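For businesses taking stock of where their online notices stand, automated tooling can surface at least some WCAG issues. The sketch below is a minimal illustration, not drawn from the Regulations, using the open-source axe-core library to scan the currently loaded page against WCAG 2.1 Level A/AA rules; automated checks catch only a portion of the success criteria, so manual review is still required.

```typescript
// Sketch of an automated WCAG scan of a loaded privacy notice page using
// the open-source axe-core library. Automated tools catch only a subset
// of WCAG 2.1 success criteria; manual review is still needed.
import axe from "axe-core";

async function auditPrivacyNotice(): Promise<void> {
  // Run axe against the current document, limiting the rule set to the
  // WCAG 2.0/2.1 Level A and AA tags.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa", "wcag21aa"] },
  });

  for (const violation of results.violations) {
    console.warn(`${violation.id} (${violation.impact}): ${violation.description}`);
    violation.nodes.forEach((node) => console.warn("  affected:", node.target));
  }
}

auditPrivacyNotice().catch((err) => console.error(err));
```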
Continue Reading CCPA and Web Accessibility