We saw a few developments on the privacy and security front these past few weeks, so rather than our usual approach of focusing on one issue, this post will highlight a few noteworthy stories.

Cookie Enforcement in France

France’s data protection regulator (CNIL) slapped Google and Amazon with fines for dropping tracking cookies without users’ prior consent, as required under the ePrivacy Directive as incorporated into France’s Data Protection Act. Google was fined €100 million while Amazon received a €35 million fine – both in connection with their French (.fr) domains. In investigating the websites, the CNIL found that in both instances, tracking cookies were automatically dropped when a user visited the domains, in violation of the Data Protection Act. As a high-level reminder, EU law mandates that non-essential cookies not be dropped by a website operator until and unless a user consents to those cookies – meaning that a banner merely informing visitors that they “agree to the use of cookies” violates the law. Such was the case with Amazon’s banner, despite its use of tracking (i.e., non-essential) cookies. Moreover, transparency is required as to the use of cookies, and in both cases the CNIL found violations as to transparency (or lack thereof) in addition to improper consent mechanisms and implementation. Finally, with respect to Google, the CNIL also found that even when a user deactivated all personalized advertising, one of the advertising cookies remained in place, in violation of the law – highlighting the often overlooked importance of ensuring that language (and choice) are aligned with technical implementation.
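The consent rule described above – no non-essential cookies until the user affirmatively opts in – amounts to a simple gating check before any cookie is set. The sketch below is illustrative only; the type and function names (`ConsentState`, `canSetCookie`) are hypothetical and not taken from any particular consent-management library.

```typescript
// Hypothetical sketch of consent-gated cookie setting.

type CookieCategory = "essential" | "analytics" | "advertising";

interface ConsentState {
  // Categories the user has affirmatively opted into via the banner.
  granted: Set<CookieCategory>;
}

// Essential cookies are exempt; all other categories require prior opt-in.
function canSetCookie(category: CookieCategory, consent: ConsentState): boolean {
  return category === "essential" || consent.granted.has(category);
}

// Before any consent is given, only essential cookies may be dropped.
const beforeConsent: ConsentState = { granted: new Set() };
console.log(canSetCookie("essential", beforeConsent));   // true
console.log(canSetCookie("advertising", beforeConsent)); // false

// After the user opts into advertising cookies in the banner:
const afterConsent: ConsentState = { granted: new Set<CookieCategory>(["advertising"]) };
console.log(canSetCookie("advertising", afterConsent));  // true
```

The point the CNIL decisions drive home is that the check must actually run before the cookie is written – a banner that merely announces cookies while the tracking scripts fire regardless does not satisfy the law.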

We will be talking about the importance of ensuring that banners and other consent mechanisms are properly implemented in an upcoming short video (we will post it here). Cookies and their misuse remain a BIG issue in the EEA, and while many organizations have skirted the law’s requirements for a long time, this may be a signal that it’s time to get it right – because supervisory authorities can and will enforce those requirements.

CCPA Regulations Update

The CA AG has sent out yet another (now the fourth) set of proposed modifications to the CCPA Regulations. They are short but not entirely without significance, and the notable changes relate to the opt-out choices of California consumers – an area the AG has signaled as a priority for enforcement:

  • Opt-out for offline collection of personal information: if a business sells personal information that it collects in the course of interacting with consumers offline, the new language requires it to inform consumers by an offline method of their right to opt-out, as well as provide instructions on how to submit a request to opt-out by an offline method. This was already a requirement, but the AG evidently felt the need to clarify it. A couple of examples are included. A brick-and-mortar business subject to the CCPA can provide the notice by printing it on the paper forms that collect the personal information or by posting signage in the area where personal information is collected, directing consumers to where the opt-out notice can be found online. Likewise, where the sale concerns personal information collected over the phone, the notice may be provided orally during the call.
  • Methods for submitting requests to opt-out: this is probably the most “interesting” addition. The proposed modifications require that methods for submitting requests to opt-out must be easy for consumers to execute and require minimal steps. To summarize:
    • Businesses may not use a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out;
    • Nor may they use confusing language, such as double-negatives (e.g., “Don’t Not Sell My Personal Information”), when providing consumers the choice to opt-out, or require consumers to click through or listen to reasons dissuading them from opting-out;
    • In addition, the proposed modifications clarify that the process for submitting a request to opt-out should not require consumers to provide personal information that is not necessary to implement the request; and
    • Finally, when a consumer clicks on the “Do Not Sell My Personal Information” link, the business cannot send the consumer on a wild goose chase – which is to say, force the consumer to search the text of a privacy policy or other document in order to locate the mechanism for submitting a request to opt-out.
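To illustrate the spirit of these requirements, here is a minimal sketch of an opt-out flow that takes a single step and asks only for the information needed to honor the request. The names (`OptOutRequest`, `submitOptOut`, `maySell`) are hypothetical – they are not drawn from the regulations or any real implementation.

```typescript
// Hypothetical sketch of a minimal-steps, minimal-information opt-out flow.

interface OptOutRequest {
  // Only the identifier needed to locate the consumer's records;
  // no unnecessary personal information is requested.
  consumerId: string;
}

// Records which consumers have opted out of the sale of their data.
const optedOut = new Set<string>();

// The "Do Not Sell My Personal Information" link should lead directly
// here: one step, no searching through a privacy policy first.
function submitOptOut(req: OptOutRequest): string {
  optedOut.add(req.consumerId);
  return "Your opt-out request has been received.";
}

// Before any sale of personal information, check the opt-out list.
function maySell(consumerId: string): boolean {
  return !optedOut.has(consumerId);
}

console.log(maySell("c-123")); // true (no opt-out on file yet)
submitOptOut({ consumerId: "c-123" });
console.log(maySell("c-123")); // false
```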

Just today I attempted to opt out of the sale of my information by an advertising company, and when I clicked on the “Do Not Sell My Personal Information” link, it took me to the top of the company’s privacy policy – nowhere near the language on opting out. When I finally found that language, it redirected me to a web form that required far more information than needed to honor my request. So I gave up. These proposed changes from the AG seem to address one of the big issues online: dark patterns. In fact, the AG included among the underlying materials lengthy research about dark patterns and how they unfairly mislead consumers. In one of those documents (Shining a Light on Dark Patterns by Jamie Luguri and Lior Jacob Strahilevitz), the authors explain that dark patterns “are user interfaces whose designers knowingly confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions. They typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want, or to reveal personal information they would prefer not to disclose.” Worth a read.

  • Optional Opt-out button: businesses may include an opt-out button in addition to posting the notice of right to opt-out, but not in lieu of any requirement to post the notice of right to opt-out or a “Do Not Sell My Personal Information” link.

The text explains in more detail where the button should be placed and what it should do; however, it’s all a bit confusing. I’m not sure what additional purpose this optional button serves and, given the confusion to which it may give rise, would probably not recommend adding it.

IoT Security

On the security front, the IoT Cybersecurity Improvement Act was signed into law on December 4. The Internet-connected device industry is growing massively, but not without some major security issues. The new law addresses the supply chain risk to the Federal government stemming from insecure IoT devices and imposes minimum security requirements for IoT devices procured by the Federal government. Among other things, the law requires the National Institute of Standards and Technology (NIST) to establish security standards and guidelines on the use and management of IoT devices purchased by the Federal government, meaning that vendors will be required to adhere to higher security standards. As Chris Morales, head of security analytics at Vectra, explained to Security Magazine: “IoT manufacturers have been building devices based on cost and speed to market with no thought to security. The exposed attack surface of all these devices is crippling. There are some basic things that should be required, like an ability to patch devices, authentication, and secure coding practices … Vendors should also be held accountable for the data they collect and store from all these devices, which is held in some cloud storage.”

While the scope of the law is fairly limited in that it only raises security standards for vendors selling IoT devices to the Federal government, it is likely to have a broader reach, as it requires those vendors to adhere to security protocols – and it may ultimately serve as the impetus for further, much-needed legislation in the consumer space. Note that California has its own IoT law that applies to all devices manufactured or sold in California and includes similar security mandates; though limited to manufacturers, it does apply to consumer-directed devices.