Thursday, September 19, 2019

Privacy by Degree

Some thoughts on privacy rights in the U.S., and the implications of impending privacy legislation.
Early this month, I had the opportunity to participate in continuing legal education on the future of privacy and privacy law. As I prepared to listen and learn, I wasn’t expecting to be jarred by an issue that, in all honesty (and said with a bit of guilt), had never crossed my mind: privacy isn’t a right in the U.S., and your position in society says a great deal about how much actual privacy you have.

To that latter point, if you need access to government services or support, you must disclose private information in order to receive assistance. Health and medical information? Financial information? Address information? Social Security numbers? Granted, no reputable bank is going to lend money to a borrower without first obtaining some personal and otherwise confidential information, but it’s clear that the more socially and personally vulnerable individuals are, the more likely it is that their private information will be “out there.” As a direct consequence of their vulnerabilities, the personal information (PI) of these individuals that could be expected to remain private isn’t.

California Leading the Way… Here

In the U.S., unlike in the European Union, privacy isn’t a guaranteed right. The California Consumer Privacy Act (CCPA) of 2018 is the closest we’ve come on this side of the pond to the EU’s General Data Protection Regulation (GDPR), which took effect more than a year ago to secure individual privacy rights.

The CCPA, which will take effect this coming January, will apply to consumers who are California residents (including households and individuals), and to three types of businesses: those that operate in the state with gross revenues in excess of $25 million; those that buy, receive, or otherwise obtain the PI of 50,000 or more consumers, households, or devices for commercial purposes; or those that derive 50% or more of their annual revenues from selling consumers’ personal information. Although there’s enough here for an entire new column, the CCPA broadly defines PI to mean “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.”

The act specifies that PI includes, but isn’t limited to:

  • Identifiers — these include names, aliases, addresses, and IP addresses
  • Characteristics of protected classifications — as specified under California or federal law
  • Commercial information — records of personal property, products or services purchased, and purchasing or consuming histories or tendencies fall into this category
  • Biometric information
  • Internet or other electronic network activity information, including browsing history
  • Geolocation data
  • Audio, electronic, visual, thermal, olfactory, or similar information
  • Professional or employment-related information
  • Education information
  • Any inferences drawn from any of the information identified to create a consumer’s profile
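
For readers inclined to translate legalese into logic, here is a minimal sketch (in Python) of how the three applicability thresholds above might be expressed as a rough self-check. The dollar, record-count, and revenue-share figures come from the statute as summarized above; the class, field, and function names are purely illustrative, and the sketch ignores the law’s exceptions and definitional subtleties.

    from dataclasses import dataclass

    # Illustrative only: names here are hypothetical, not taken from the CCPA itself.
    @dataclass
    class BusinessProfile:
        operates_in_california: bool
        annual_gross_revenue_usd: float
        ca_records_handled: int               # consumers, households, or devices per year
        revenue_share_from_selling_pi: float  # fraction of annual revenue, 0.0-1.0

    def ccpa_may_apply(b: BusinessProfile) -> bool:
        """Rough first-pass check against the three CCPA thresholds; any one suffices."""
        if not b.operates_in_california:
            return False
        return (
            b.annual_gross_revenue_usd > 25_000_000      # gross revenues in excess of $25 million
            or b.ca_records_handled >= 50_000            # PI of 50,000+ consumers, households, or devices
            or b.revenue_share_from_selling_pi >= 0.50   # 50% or more of revenue from selling PI
        )

    # Example: a modest business that earns over half its revenue selling customer data.
    print(ccpa_may_apply(BusinessProfile(True, 4_000_000, 12_000, 0.55)))  # True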

Like all laws, the CCPA has exceptions, but there isn’t room here to address them. The law applies to information collected by brick-and-mortar businesses as well as by electronic or Internet-based operations. And while California is the first state to take such action, other states will follow. Although privacy watchers expect that any new legislation will look very much like California’s, each law will be slightly different, making the management of state privacy regulation a looming nightmare for enterprises.

Striking a Balance

As states hash out their privacy laws, the balance between privacy and security will remain a topic of discussion in business, government, academia, and certainly the military. It’s a delicate balance (think “head of pin”) to be sure, and opinions change with the wind. I’ve seen this firsthand among students in a graduate-level IT ethics class I’ve taught for the past five years. Students from around the world, many of whom have military experience and strong opinions on the topic, attend this class. Near the end of the term, we discuss the balance between privacy and security, and we invariably end up talking about Edward Snowden, the former NSA contractor who blew the whistle on the agency’s surveillance programs back in 2013.

Five years ago, students, most of whom are mid-career, were apoplectic about Snowden, quick to call him a traitor, among other unkind words. With time, however, this has changed. This past spring term, students, while perhaps still disturbed by Snowden’s release of classified information, acknowledged almost to a person that his revelations about the extent of government surveillance opened our eyes to how much of the information we consider private really isn’t. Students also recognized that he took this action at great personal cost, and that it has changed the way we view information we thought was private and clearly wasn’t.

The big takeaway is this: Those who are most vulnerable for any number of reasons (economic, social, education, age, race, gender) are at greater risk of victimization when data breaches occur, because they tend to have been required to share more personal information to receive the services they need. The more personal data that’s out there, the greater the opportunity for hackers and miners to access, share, and abuse it, and the greater the risk to those who have shared information, wittingly or not. State regulation is a positive step, but with each state crafting its own nuanced legislation, privacy management for enterprises will become… um, challenging. The answer lies in federal regulation, but with so many constituencies arguing for their own points of view, don’t expect that to happen anytime soon.
