Answering the most difficult question in IT Support

By: Special Guest Author, Robert Gillette

The number one question I get as an Outsourced IT Provider is “Am I secure?” This question is difficult to answer because I am on the outside looking in; I am not your IT Provider. Can’t I just provide a security audit? Yes, but it won’t actually make you any more secure. An audit will simply cost you thousands of dollars and bring to the forefront the problems your IT resource either (1) didn’t know were problems, or (2) has been ignoring because it can’t fix them. Even then, an audit gives you a sense of your security concerns today… but what about tomorrow, or a week from now?

The world of security is changing in real time, faster than you can imagine. Security is not just about best practices anymore; response matters too. Your IT resource must be as nimble as the bad actors. A list of IT vulnerabilities from weeks ago, when an audit was completed, will not make you secure.

Instead, organizations like the Center for Internet Security (CIS) have published helpful guidelines for security professionals to follow. Authorities like the California Attorney General have pointed to the CIS 18 as a reasonable standard for security. Yet again, though, these controls are highly technical and difficult for many business leaders to properly oversee.

Given this, I developed a targeted set of three questions to serve as a canary in the coal mine. These questions are a quick test of whether an IT environment might be compliant with the CIS 18 controls.

The goal is NOT to ask the IT resource to start doing these things. We are simply testing for evidence that the much harder to complete (and assess) work has already been done. The things these questions ask for are a reasonable outcome of properly deployed controls. If a business answers NO to any of these questions, the response should not be a project to turn that answer into a YES; a NO is a leading indicator that something essential to security is not being done. If all the answers are YES, I cannot say that you are definitely secure, but you pass the first test. If the answer to any of these is NO, then I can say it’s time for swift and dramatic action to address cyber security.


You will recall that last July, in the Schrems II case (“Schrems”), the Court of Justice of the European Union (“CJEU”) invalidated the European Union/United States (“EU/US”) Privacy Shield framework, while also reiterating that companies could rely on the standard contractual clauses (“SCCs”). However, the CJEU also made clear that transfers of personal data from the European Economic Area (“EEA”) to non-adequate countries were not always permissible, requiring supplemental measures and, in some cases, transfer impact assessments.

In order to address the Schrems II holding and to improve ill-adapted SCCs that pre-dated the General Data Protection Regulation (“GDPR”), amid an exponential increase in cross-border transfers, the European Commission adopted two new sets of SCCs on June 4, 2021: Third Country Transfer SCCs and Controller-Processor SCCs. These new SCCs must be used commencing September 27, 2021, for all new data transfers. Companies have until December 27, 2022 to amend contracts for data transfers that previously were made under the old SCCs.

This summer, Colorado passed a new data privacy law, the Colorado Privacy Act (“CPA”), granting Colorado residents new rights and creating new obligations for businesses that are located in, or conduct business with those in, Colorado. The CPA regulates the collection of personal data, meaning information relating or reasonably linkable to an identifiable person, such as a person’s name, Social Security number, email address, transaction data, Internet browsing history, and geolocation.

In April, Rob Bonta became the new California Attorney General. In swift form, and not taking any summer break, he has made it clear that privacy and CCPA compliance are priorities, and that enforcement won’t be limited to a handful of requirements under the CCPA, as many previously believed.

First, the Attorney General posted several examples of enforcement actions, including those addressing a number of specific issues.

A few weeks ago, many Americans on the east coast spent several days scrambling for gas when Colonial Pipeline halted systems for 5,500 miles of pipeline as a precautionary measure after being hit by a ransomware attack. Highly publicized, the Colonial Pipeline ransomware attack is just one of many that have been hitting companies small and large. Healthcare has been a prime target, but other industries are equally at risk, and critical national infrastructure now appears to be a target. In 2020, over 2,000 local governments, health care facilities and schools were victims of ransomware.

Organizations large and small across all industries collect and process personal information, be it user information, customer information or employee information. Some of this information may be sensitive; other information may be subject to stricter laws in other countries. In our practice, among the many data protection requirements to which an organization may be subject (depending on a number of circumstances), security is one that too many organizations overlook. To some extent, this may be due to the fact that – with the exception of some sector-specific rules – many laws relating to the protection of personal information are non-specific when it comes to security standards. For a variety of reasons, data protection laws tend to espouse a somewhat esoteric notion of “reasonable” security measures commensurate to the sensitivity of the data and the nature of the processing – much to the chagrin of organizations hoping to easily ascertain the practical scope of their obligation to protect data. However, encryption is one method that is often specifically cited when it comes to data protection standards.

What is Encryption?

At a high level, encryption is the process of scrambling information and rendering it unintelligible, such that only someone with a “key” can decipher and read it. Simply put, an algorithm encrypts the data and the encryption key enables the receiving party to decrypt it. Data prior to encryption is referred to as plaintext, while the scrambled information is referred to as ciphertext. Only an authorized party with a key should (in theory) be able to revert ciphertext to plaintext – hence the term “decipher” – in order to access and read the data in its original state.
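
For the technically inclined, here is a minimal sketch of the plaintext/key/ciphertext relationship, using Python and the widely available cryptography package’s Fernet recipe. The sample data and variable names are purely illustrative and are not drawn from the original post or any legal requirement.

```python
# Illustrative only: symmetric encryption with the "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                  # the secret key that must be kept safe
cipher = Fernet(key)

plaintext = b"Jane Doe, jane@example.com"    # readable data before encryption
ciphertext = cipher.encrypt(plaintext)       # scrambled, unintelligible bytes

# Only a party holding the key can revert the ciphertext to plaintext.
assert cipher.decrypt(ciphertext) == plaintext
```

Without the key, the ciphertext is effectively noise; with it, the original data can be recovered exactly.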

Data can be encrypted at rest and/or in transit, and there are different categories/levels of encryption, none of which we address here because this is, after all, a legal blog. Suffice it to say, however, that encryption, when properly implemented, is generally viewed as a strong way to secure personal information.

Encryption for Data Protection

Many, if not most, U.S. state data breach laws exempt companies from their notification requirements where the personal information subject to the unauthorized access is encrypted, provided of course that the encryption key has also not been accessed or acquired. This is because encrypted data is unintelligible to those who have nefariously gained access to it, so long as the encryption key was not also accessed. Likewise, personal information that is encrypted will not trigger the CCPA’s limited (and potentially costly) private right of action in the event of a data breach. On the flip side, if certain types of personal information are subject to an unauthorized access and exfiltration, theft, or disclosure and are unencrypted, the CCPA carries statutory damages ranging from $100-$750 per consumer per incident or actual damages, whichever is greater. This can add up very quickly. Organizations that appropriately encrypt personal information and suffer a data breach may therefore be significantly more protected from fines and litigation.
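
To see why this can add up very quickly, consider a rough, hypothetical calculation (the breach size below is invented; the $100-$750 range is the statutory one cited above):

```python
# Hypothetical illustration of CCPA statutory damages exposure.
consumers_affected = 50_000                        # assumed breach size (illustrative)
per_consumer_low, per_consumer_high = 100, 750     # statutory range per consumer per incident

print(f"${consumers_affected * per_consumer_low:,} to ${consumers_affected * per_consumer_high:,}")
# $5,000,000 to $37,500,000
```

Even a mid-sized incident involving unencrypted personal information can translate into eight-figure statutory exposure before actual damages are considered.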

On the other side of the pond, the GDPR specifically mentions the use of encryption as a technical and organizational measure (security), and in 2018, the then-Article 29 Working Party noted that the availability of strong and efficient encryption is a necessity in order to guarantee the protection of individuals with regard to the confidentiality and integrity of their data which are the elementary underpinning of the digital economy. More recently, the CJEU’s Schrems II case, which invalidated the Privacy Shield and generated confusion as to the validity of transfers of personal data outside of the EEA (read more here), brought encryption to the forefront. Among the technical supplementary measures identified to mitigate risks to data subjects, many pointed to the need for encryption, including the European Data Protection Board, which noted that encryption could constitute an adequate safeguard so long as keys remain within the EU or trusted third countries. Organizations that appropriately encrypt personal information may therefore also be in a better position to receive transfers of personal data from controllers in the EU. If your organization provides services that involve processing personal data to customers in the EEA, it will help as you navigate the increasingly prevalent requests for transfer impact assessments.

Other laws also specifically refer to encryption as a method for protecting personal information. Overall, while encryption is often a topic of debate especially when it comes to law enforcement, the general consensus is that appropriate encryption can enable organizations to demonstrate at least some level of data protection (though encryption alone is not sufficient).

What Should Organizations Consider with Respect to Encryption?

There are, of course, residual risks to encryption: even if a system uses encryption, certain data (e.g., metadata) may still be subject to an unauthorized access. In addition, encryption keys must be kept secure to avoid compromise. The loss of a key (even in the absence of an unauthorized access) is equally problematic because it will preclude anyone from accessing the data, and could, under certain circumstances, constitute a data breach. Additionally, key re-use should be avoided. In other words, using cryptography is one thing, but getting it right is just as important. Nevertheless, organizations should learn to love encryption – and properly implement it. Among other things, it can be a “get out of jail free” card for data breaches, and it will boost your customers’ confidence in your ability (and willingness) to secure data.
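
As a purely illustrative sketch of the “getting it right” point (the approach and names below are assumptions for illustration, not a recommendation from this post): generating a fresh data key per record, and storing keys separately from the ciphertext they protect, is one common way to avoid key re-use and to limit the damage if any single key is compromised.

```python
# Illustrative sketch: a unique data key per record, kept apart from the ciphertext.
from cryptography.fernet import Fernet

records = ["alice@example.com", "bob@example.com"]   # hypothetical personal data
encrypted_store, key_store = [], []                  # stand-ins for separate systems (e.g., a KMS)

for record in records:
    data_key = Fernet.generate_key()                 # fresh key per record; never re-used
    encrypted_store.append(Fernet(data_key).encrypt(record.encode()))
    key_store.append(data_key)                       # losing this key makes the record unrecoverable

# Decryption requires pairing each ciphertext with its own key.
recovered = [Fernet(k).decrypt(c).decode() for k, c in zip(key_store, encrypted_store)]
assert recovered == records
```

The same sketch also illustrates the flip side noted above: if a key is lost, the corresponding record cannot be recovered by anyone, including its rightful owner.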


Data privacy and security terms have become ubiquitous in software agreements, both hosted service agreements and on-premise software license agreements. Security terms have been the norm for many years in the SaaS world, where the software licensor is hosting a customer’s data. However, in more recent years, much to the chagrin of small start-up licensors of on-premise software, security terms and guarantees are now an expected part of the deal, even if the terms tend to be shorter in length and more limited in scope. Given the increasing importance – and inherent risks – of storing data, customers are understandably still concerned with data privacy and security even where a vendor is not actually hosting or storing their data.

In recent months, there has been increased chatter about “dark patterns” in user interfaces, and it’s only getting louder. When we think of dark patterns, we often think of features that make it more difficult to cancel subscriptions, or that (mis)lead us to sign up for a product or service despite our best intentions. However, dark patterns also impact data privacy in a number of ways.

The Commonwealth of Virginia is on the verge of becoming the second state with a consumer data protection law. The Consumer Data Protection Act (“CDPA”), which awaits signature by Governor Northam (who is expected to sign the bill into law), would go into effect on January 1, 2023. Like California’s CCPA (and CPRA, also set to take effect January 1, 2023), the CDPA establishes a “comprehensive” framework for the collection and use of personal data of Virginia residents while also (and ironically) not applying to companies across the board. The CDPA would apply “to persons that conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth and that (i) during a calendar year, control or process personal data of at least 100,000 consumers or (ii) control or process personal data of at least 25,000 consumers and derive over 50 percent of gross revenue from the sale of personal data.” Unlike CCPA, the CDPA does not contain an “annual revenue” threshold.

Online advertising – or “adtech”, as it is often referred to – does not mix well with many privacy laws, beginning with the GDPR. In recent years since GDPR went into effect, privacy advocates have increased their demands on EU regulators to more deeply scrutinize targeting practices and how data is shared within the advertising ecosystem, in particular when it comes to real-time bidding (RTB). Complaints have been filed by many privacy-minded organizations, and all of them allege that, by its very nature, RTB constitutes a “wide-scale and systemic” breach of Europe’s privacy laws. This is because RTB relies on the massive collection, accumulation and dissemination of detailed behavioral data about individuals who use the internet.