The Dawn of AI Litigation: What CCaaS and UCaaS Implementors Need to Know
November 14, 2024
[Your Name Here] in AI, No Jitter, Privacy/data security

While most vendors position themselves as the invisible layer facilitating customer communications for their clients, they’re about to find out their data-handling policies may bring them new visibility.
As AI continues to be deployed, it's only reasonable to assume that litigation around its use will become more common. We are at the dawn of an age in which AI-generated outcomes will provide the basis for litigable causes of action. There is also a very hungry group of class action lawyers looking to jump on the next fertile area of profitable litigation. In any case, it's time to consider the legal vulnerabilities likely to be associated with AI use, particularly in the UCaaS and CCaaS spaces, so that entities deploying these tools do so with their eyes open.

The first area of concern involves how these technologies have the potential to jeopardize individual privacy rights. An ad for a now frowned-upon company once carried the tagline, "if [the data is] not out there, it can't be stolen." I would add: if the data's not out there, it can't be misused either. The fact is that many of us are far too cavalier about sharing personal information in the name of convenience. We know the information is "out there." And whether or not a prompt warns us that the products and technologies supporting our interactions (as callers, chatbot users or texters) will be capturing and using our personal data, we are unlikely to know exactly who has our data and what they're planning to do with it.

According to Blair Pleasant, president & principal analyst, COMMfusion, and BCStrategies co-founder, “We know that some CCaaS vendors use the content of their customers’ calls to train their AI - that’s no industry secret. But – do the end customers know that their conversations are being used to train the CCaaS vendors’ models? The customers may have given consent to the company they’re interacting with, but not to the CCaaS platform that is intercepting their calls. It’s a very gray area.”

Many of the biggest names touting their AI-powered offerings in the CCaaS and UCaaS space have been sued, and more suits are guaranteed to follow. When there's a cause of action, the plaintiff's strategy is almost always "sue everyone and see what sticks." The same can be said about throwing spaghetti against the wall, but I digress. With this in mind, what follows highlights one recently filed case. I have reviewed the initial pleadings, and while the arguments made are somewhat unique (of the four causes of action, two seem valid and the remaining two, in my opinion and based on what I know, are a stretch), this case stands above the others because it illustrates what I see as the biggest issue: privacy. The case has yet to be settled or decided, but the facts, as presented in the initial pleadings, are both interesting and illustrative.

The class action case (and the fact that it's a class action is telling in and of itself) is Crowder v. Pre-Paid Legal Services, Inc., d/b/a LegalShield and Talkdesk, Inc., filed August 22, 2024. I am going to focus only on the portions that involve Talkdesk. According to the pleadings, LegalShield uses "Talkdesk's products to manage its customer service emails, chatbot, calls and texts," yet there is simply no way for a person calling LegalShield to know that its customer interaction vendor, Talkdesk, is not only listening to the calls but recording them and using those recordings for its own purposes. This process raises significant issues with respect to sacrosanct attorney-client privilege. However, let's focus on the issues involving the communications platform and AI: customers had every right to assume their data was confidential, only to learn that it is now in the hands of a third party (Talkdesk) that, according to the court filing, can use (and has used) the data for its own purposes. They were neither informed of this, nor did they consent to it.

The most compelling argument that the plaintiff has made is that she believed that the information she shared with LegalShield was, at all times, confidential and that, because LegalShield shared her information with Talkdesk, her privacy had clearly been violated without her consent.

While this argument seems valid, it's never as simple as it appears. First, nobody in the United States enjoys overarching privacy protections; the privacy protections in place have been written and adopted for specific purposes. HIPAA protects health care information, and other federal statutes provide their own siloed privacy regulations, but unlike the European Union, the United States has no national privacy law or standard. In addition, there are state privacy laws; the most well-known, and probably the most restrictive, are California's. While other states may have similar rules and regulations, many have no all-encompassing privacy statutes, creating a wild west of privacy enforcement requirements.

Second, the EU has intentionally taken a leadership role in drafting and enacting enforceable legislation regarding the use of AI. Like the GDPR that came before it, the EU's AI legislation imposes obligations and requirements that reach far beyond the EU's physical borders; both pieces of legislation cover EU citizens who may be out of the country as well as non-EU citizens who are present in the EU at the time the privacy violation occurs.

As end users continue to deploy AI-based tools in their operations (whether inward or outward facing), their technical and legal staffs should be acutely aware of how the data provided to the AI process will be not only used, but also stored and shared. Customers also need the opportunity to know about the security of the information being shared: who has it, what those actors are doing with it, and how safe the information really is.

Originally published in No Jitter, November 13, 2024

Article originally appeared on Martha Buyer Telecommunications Law (https://www.marthabuyer.com/).