
Céline Guillou is a member of Hopkins & Carley’s Corporate Practice and focuses on data privacy law and compliance. Céline holds the Certified Information Privacy Professional/Europe (CIPP/E) certification from the International Association of Privacy Professionals (IAPP), the global standard in privacy certification.

In recent months, there has been increased chatter about “dark patterns” in user interfaces, and it’s only getting louder. When we think of dark patterns, we often think of features that make it more difficult to cancel subscriptions, or that (mis)lead us to sign up for a product or service despite our best intentions. However, dark patterns also impact data privacy in a number of ways.
Continue Reading How Dark Patterns May be Chipping Away at Your Company’s Privacy Compliance Efforts

The Commonwealth of Virginia is on the verge of becoming the second state with a consumer data protection law. The Consumer Data Protection Act (“CDPA”), which awaits signature by Governor Northam (who is expected to sign the bill into law), would go into effect on January 1, 2023. Like California’s CCPA (and CPRA, also set to take effect January 1, 2023), the CDPA establishes a “comprehensive” framework for the collection and use of personal data of Virginia residents while also (and ironically) not applying to companies across the board. The CDPA would apply “to persons that conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth and that (i) during a calendar year, control or process personal data of at least 100,000 consumers or (ii) control or process personal data of at least 25,000 consumers and derive over 50 percent of gross revenue from the sale of personal data.” Unlike CCPA, the CDPA does not contain an “annual revenue” threshold.
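The CDPA’s applicability test quoted above can be expressed as a simple two-prong check. The sketch below is purely illustrative (the type and field names are ours, and this is not legal advice): a business must operate in or target Virginia, and then meet either the 100,000-consumer threshold or the 25,000-consumer-plus-revenue threshold.

```typescript
// Illustrative sketch of the CDPA applicability thresholds described above.
// All names are hypothetical; this is not a compliance tool or legal advice.

interface BusinessProfile {
  operatesInVirginiaOrTargetsResidents: boolean;
  consumersProcessedPerYear: number;     // Virginia consumers whose data is controlled or processed
  revenueShareFromDataSales: number;     // fraction of gross revenue from selling personal data
}

function cdpaApplies(b: BusinessProfile): boolean {
  if (!b.operatesInVirginiaOrTargetsResidents) return false;
  const largeScale = b.consumersProcessedPerYear >= 100_000;
  const dataSaleFocused =
    b.consumersProcessedPerYear >= 25_000 && b.revenueShareFromDataSales > 0.5;
  return largeScale || dataSaleFocused;
}
```

Note that, unlike the CCPA, neither prong turns on total annual revenue; a very large company processing data on few Virginians would fall outside both.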
Continue Reading States Are “Stepping Up” to the Privacy Plate. Who’s Next? Virginia.

Online advertising – often referred to as “adtech” – does not mix well with many privacy laws, beginning with the GDPR. Since the GDPR went into effect, privacy advocates have stepped up their demands that EU regulators more deeply scrutinize targeting practices and how data is shared within the advertising ecosystem, in particular when it comes to real-time bidding (RTB). Many privacy-minded organizations have filed complaints, all alleging that, by its very nature, RTB constitutes a “wide-scale and systemic” breach of Europe’s privacy laws. This is because RTB relies on the massive collection, accumulation and dissemination of detailed behavioral data about individuals who use the internet.
Continue Reading Key Takeaways from the Recent Grindr Decision and “Tentative” $11M Fine

Happy new year from our team at Hopkins & Carley! With each new year comes a host of bright new intentions. As each of us knows all too well, some will stick and others will quickly be forgotten. As a reminder to stay the course when it comes to data privacy and security, this year we kick off our 2021 to-do (and not-to-do) list. Rather than focus on privacy and security predictions for 2021, we wanted to share a list of action items based on some of the hard-learned lessons from 2020, as well as trends that we expect to continue into 2021. 2020 was a very busy and tumultuous year in the privacy and security world, and this will certainly also be the case in 2021. Companies that handle personal information must juggle an increasing number of laws, regulations, business-mandated requirements and risks. With that, here are a few things to keep in mind as we enter 2021:
Continue Reading Our Privacy & Security 2021 To-Do (and Not-To-Do) List: Lessons Learned From a Year Like No Other

We saw a few developments on the privacy and security front these past few weeks, so rather than our usual approach of focusing on one issue, this post will highlight a few noteworthy stories.

Cookies

France’s data protection regulator (CNIL) slapped Google and Amazon with fines for dropping tracking cookies without users’ prior consent, as required under the ePrivacy Directive as incorporated into France’s Data Protection Act. Google was fined €100 million while Amazon received a €35 million fine – both in connection with their French (.fr) domains. In investigating the websites, the CNIL found that in both instances, tracking cookies were automatically dropped when a user visited the domains, in violation of the Data Protection Act. As a high-level reminder, EU law mandates that non-essential cookies not be dropped by a website operator until and unless a user consents to those cookies – meaning that a banner merely informing visitors that they “agree to the use of cookies” violates the law. Such was the case with Amazon’s banner, despite its use of tracking (i.e., non-essential) cookies. Moreover, transparency is required as to the use of cookies, and in both cases the CNIL found transparency violations in addition to improper consent mechanisms and implementation. Finally, with respect to Google, the CNIL also found that even after a user deactivated all personalized advertising, one advertising cookie remained active, in violation of the law – highlighting the often overlooked importance of ensuring that language (and choice) are aligned with technical implementation.
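The consent rule at issue – essential cookies may be set immediately, but non-essential ones only after an affirmative opt-in – can be sketched as a small gating function. This is an illustrative model only (the category names and cookie list are hypothetical), not a compliant consent-management implementation.

```typescript
// Hypothetical sketch of consent-gated cookie setting, per the EU rule
// described above: essential cookies are always permitted; analytics and
// advertising cookies require the user's prior opt-in for that category.

type CookieCategory = "essential" | "analytics" | "advertising";

interface CookieSpec {
  name: string;
  category: CookieCategory;
}

// Returns only the cookies a site may set, given the categories the user
// has affirmatively consented to. With no consent, only essential cookies pass.
function permittedCookies(
  cookies: CookieSpec[],
  consentedCategories: Set<CookieCategory>
): CookieSpec[] {
  return cookies.filter(
    (c) => c.category === "essential" || consentedCategories.has(c.category)
  );
}
```

The Google finding also underscores that the opt-out path must be wired up symmetrically: revoking the “advertising” category must actually stop every advertising cookie, not most of them.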
Continue Reading Cookies, Opt-Out Choices, IoT Security: Recent Developments in Data Protection

Data retention. It’s not something that excites and invigorates businesses. But it is a necessary cost of doing business, not only to ensure one retains certain data for as long as each applicable law requires, but also as an increasingly important risk mitigation strategy.

Determining how long to retain a type of data or record depends on several factors. Various laws and regulations mandate that businesses retain certain records and data for minimum time periods. Statutes of limitation on certain types of claims also guide businesses as to how long to retain certain data. Conversely, privacy laws have always included a component of data minimization, requiring businesses not to “hoard” data. Taken as a whole, these different rules require businesses to strike the right balance between retaining data for at least the mandated period and not retaining it for longer than necessary. Getting this right is even more important now given the increasing risk of class action lawsuits over data security breaches. Quite simply, the more data you have, the more data you can lose. Holding vast stores of data will further complicate the tracking of data that may have been accessible to or taken by an unauthorized third party. This is one reason why, despite requirements to promptly notify individuals that their data was accessed in a security breach, businesses may take months to provide those notifications.
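The balance described above – keep each record at least as long as mandated, then dispose of it – is often operationalized as a retention schedule. The sketch below is a toy model: the record types and retention periods are invented examples, not actual legal retention requirements for any jurisdiction.

```typescript
// Toy retention-schedule model for the balancing act described above.
// Record types and year counts are hypothetical illustrations only.

const retentionYears: Record<string, number> = {
  taxRecord: 7,       // illustrative period, not legal advice
  jobApplication: 2,  // illustrative period, not legal advice
};

// Year after which the record no longer needs to be kept, or null if
// no retention period is defined for this record type.
function disposeAfter(recordType: string, createdYear: number): number | null {
  const years = retentionYears[recordType];
  return years === undefined ? null : createdYear + years;
}

// True once the mandated minimum has elapsed, i.e., when data minimization
// says the record should now be disposed of rather than hoarded.
function shouldDispose(recordType: string, createdYear: number, currentYear: number): boolean {
  const deadline = disposeAfter(recordType, createdYear);
  return deadline !== null && currentYear >= deadline;
}
```

In practice a schedule like this also has to account for litigation holds and overlapping statutes, which is why unknown record types here return null rather than a default disposal date.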
Continue Reading Data Retention – More than Meets the Eye

With the adoption of more stringent privacy laws across the globe, we have seen an exponential increase in privacy technology (or “tech”) vendors offering automated privacy compliance solutions. Privacy tech vendors provide software and services to assist companies with a whole range of tasks, including data inventory and mapping, privacy assessments, compliance reports and risk management, as well as policies, individual rights automation and other records that may be required as part of an organization’s compliance obligations. Many privacy tech vendors pitch their solutions as designed to “easily” achieve ongoing global compliance with the aid of automation and/or algorithms. Depending on the price you are willing to pay, these solutions can indeed help automate certain aspects of privacy compliance, including data mapping, consent mechanisms, records of processing and individual rights management. However, in their marketing materials, many privacy tech vendors – big or small, paid or free – caution companies that using legal professionals for privacy compliance will simply be too costly. Based on our experience, this is misleading at best, and a recent matter illustrates exactly why.
Continue Reading Lessons on Hidden Costs of Privacy Tech Vendors

This past summer, Apple introduced significant changes for iOS 14 in the data privacy realm (we discussed these here). Among those changes are Apple’s so-called privacy “nutrition labels,” intended to better inform consumers of the data collection and privacy practices of individual applications. Apple announced a few days ago that developers will be required to provide these new privacy details to app users starting December 8. This applies both to new apps and to any apps already in the App Store that are updated, and developers may already submit details through App Store Connect. Apple has provided more information here.
Continue Reading Apple’s Privacy ‘Nutrition Labels’ Will Be Required Starting on December 8

With COVID-19 driving so much business online, like most people, I am increasingly seeing offers from companies vying for new customers to hand over my contact information in exchange for discounts or rewards. This includes businesses that seek to use personal information obtained through loyalty or rewards programs, those that offer price or service differences such as with free versus paid subscriptions to a service (e.g., music streaming), or those that simply want to increase their marketing reach and attract new consumers by offering a discount in exchange for personal information. There is really nothing new to these types of marketing strategies, but for companies that are subject to the California Consumer Privacy Act (CCPA), providing discounts, rewards or free-versus-paid services to California consumers has become trickier because the CCPA contains very specific – and quite stringent – obligations when it comes to financial incentives. The CCPA defines a “financial incentive” as a program, benefit, or other offering (including payments to consumers) related to the collection, retention, or sale of personal information – or, put simply, you give me your personal information and I will give you a discount code or rewards. Many businesses that are subject to CCPA, however, are not complying with the CCPA’s complex requirements regarding financial incentives. Failing to comply could spell trouble. Below we explain the challenges of implementing the CCPA’s requirements with respect to financial incentives.
Continue Reading Providing Financial Incentives Under CCPA

The Federal Trade Commission has broadly relied on Section 5 of the Federal Trade Commission Act (FTC Act) to investigate and enforce against consumer protection violations, including in the context of data privacy and security. Specifically, Section 5 of the FTC Act prohibits unfair or deceptive acts or practices in or affecting commerce. With respect to data privacy and security, the FTC has repeatedly taken the position that under Section 5 of the FTC Act, a company’s failure to implement and maintain appropriate measures to protect consumers’ information may constitute an unfair practice. Likewise, making false or misleading representations (including omissions) about a company’s data privacy and security practices – notably in consumer-facing privacy notices – has been deemed by the FTC to constitute a deceptive trade practice. In its enforcement actions for data privacy and security violations, the FTC has sought – and obtained – both injunctive and equitable monetary relief (e.g., restitution or disgorgement) against companies whose practices violated Section 5 of the FTC Act. But how the FTC obtains equitable monetary relief – and whether it may even continue to do so under Section 13(b) of the FTC Act – is now before the Supreme Court.
Continue Reading How the FTC’s Enforcement of Data Privacy and Security May be Impacted by the U.S. Supreme Court’s Upcoming Review of the FTC’s Use of Section 13(b)