House Committee Postpones Markup Amid New Privacy Bill Updates

On June 27, 2024, the U.S. House of Representatives cancelled the House Energy and Commerce Committee markup of the American Privacy Rights Act (“APRA” or “Bill”) scheduled for that day, reportedly with little notice. There has been no indication of when the markup will be rescheduled; however, House Energy and Commerce Committee Chairwoman Cathy McMorris Rodgers issued a statement reiterating her support for the legislation.

On June 20, 2024, the House posted a third version of the discussion draft of the APRA. On June 25, 2024, two days before the scheduled markup session, Committee members introduced the APRA as a bill, H.R. 8818. Each version featured several key changes from earlier drafts, which are outlined collectively below.

Notable changes in H.R. 8818 include the removal of two key sections:

  • “Civil Rights and Algorithms,” which required entities to conduct covered algorithm impact assessments when algorithms posed a consequential risk of harm to individuals or groups; and
  • “Consequential Decision Opt-Out,” which allowed individuals to opt out of being subjected to covered algorithms.

Additional changes include the following:

  • The Bill introduces new definitions, such as “coarse geolocation information” and “online activity profile,” the latter of which refines a category of sensitive data. “Neural data” and “information that reveals the status of an individual as a member of the Armed Forces” are added as new categories of sensitive data. The Bill also modifies the definitions of “contextual advertising” and “first-party advertising.”
  • The data minimization section includes a number of changes, such as the addition of “conduct[ing] medical research” in compliance with applicable federal law as a new permitted purpose. The Bill also limits the ability to rely on permitted purposes in processing sensitive covered data, biometric information, and genetic information.
  • The Bill now allows not only covered entities (excluding data brokers or large data holders), but also service providers (that are not large data holders) to apply for the Federal Trade Commission-approved compliance guideline mechanism.
  • Protections for covered minors now include a prohibition on first-party advertising (in addition to targeted advertising) if the covered entity knows the individual is a minor, with limited exceptions acknowledged by the Bill. It also restricts the transfer of a minor’s covered data to third parties.
  • The Bill adds another preemption clause, clarifying that APRA would preempt any state law providing protections for children or teens to the extent such laws conflict with the Bill, but does not prohibit states from enacting laws, rules or regulations that offer greater protection to children or teens than the APRA.

For additional information about the changes, please refer to the unofficial redline comparison of all APRA versions published by the IAPP.

The Privacy Patchwork: Beyond US State “Comprehensive” Laws

We’ve cautioned before about the danger of thinking only about US state “comprehensive” laws when assessing privacy and data security obligations in the United States. We’ve also mentioned that the US has a patchwork of privacy laws. That patchwork is found to a certain extent outside of the US as well. What laws exist in the patchwork that relate to a company’s activities?

There are laws that apply when companies host websites, including the most well-known, the California Online Privacy Protection Act (CalOPPA). It has been in effect since July 2004, thus predating the CCPA by 14 years. Then there are laws that apply if a company is collecting and using biometric identifiers, like Illinois’ Biometric Information Privacy Act.

Companies are subject to specific laws both in the US and elsewhere when engaging in digital communications. These laws include the US federal laws TCPA and TCFAPA, as well as CAN-SPAM. Digital communication laws exist in countries as wide-ranging as Australia, Canada, Morocco, and many others. Then we have laws that apply when collecting information during a credit card transaction, like the Song-Beverly Credit Card Act (California).

Putting It Into Practice: When assessing your company’s obligations under privacy and data security laws, keep activity-specific privacy laws in mind. Depending on what you are doing, and in what jurisdictions, you may have more obligations to address than simply those found in comprehensive privacy laws.

Understanding the Enhanced Regulation S-P Requirements

On May 16, 2024, the Securities and Exchange Commission adopted amendments to Regulation S-P, the regulation that governs the treatment of nonpublic personal information about consumers by certain financial institutions. The amendments apply to broker-dealers, investment companies, and registered investment advisers (collectively, “covered institutions”) and are designed to modernize and enhance the protection of consumer financial information. Regulation S-P continues to require covered institutions to implement written policies and procedures to safeguard customer records and information (the “safeguards rule”), to properly dispose of consumer information to protect against unauthorized use (the “disposal rule”), and to provide a privacy policy notice containing an opt-out option. Registered investment advisers with over $1.5 billion in assets under management will have until November 16, 2025 (18 months) to comply; those with less will have until May 16, 2026 (24 months) to comply.

Incident Response Program

Covered institutions will have to add an Incident Response Program (the “Program”) to their written policies and procedures if they have not already done so. The Program must be designed to detect, respond to, and recover from unauthorized access to or use of customer information. The nature and scope of any incident must be documented, with further steps taken to prevent additional unauthorized access or use. Covered institutions will also be responsible for adopting procedures for the oversight of third-party service providers that receive, maintain, process, or access their customers’ data. The safeguards rule and disposal rule require covered institutions to treat nonpublic personal information received from a third party about that third party’s customers the same as information about their own customers.

Customer Notification Requirement

The amendments require covered institutions to notify affected individuals whose sensitive customer information was, or is reasonably likely to have been, accessed or used without authorization. A covered institution must provide the notice as soon as practicable, but not later than 30 days after becoming aware that unauthorized access to or use of customer information has occurred or is reasonably likely to have occurred. The notices must include details about the incident, the breached data, and how affected individuals can respond to the breach to protect themselves. A covered institution is not required to provide the notification if it determines that the sensitive customer information has not been, and is not reasonably likely to be, used in a manner that would result in substantial harm or inconvenience. Where a covered institution has notification obligations under both the final amendments and a similar state law, it may be able to satisfy both with a single notice, provided the notice includes all information required under each, which may reduce the number of notices an individual receives.

Recordkeeping

Covered institutions will have to make and maintain the following in their books and records:

  • Written policies and procedures required to be adopted and implemented pursuant to the Safeguards Rule, including the incident response program;
  • Written documentation of any detected unauthorized access to or use of customer information, as well as any response to and recovery from such unauthorized access to or use of customer information required by the incident response program;
  • Written documentation of any investigation and determination made regarding whether notification to customers is required, including the basis for any determination made and any written documentation from the United States Attorney General related to a delay in notice, as well as a copy of any notice transmitted following such determination;
  • Written policies and procedures required as part of service provider oversight;
  • Written documentation of any contract entered into pursuant to the service provider oversight requirements; and
  • Written policies and procedures required to be adopted and implemented for the Disposal Rule.

Registered investment advisers will be required to preserve these records for five years, the first two in an easily accessible place.

On July 1, 2024, Texas May Have the Strongest Consumer Data Privacy Law in the United States

It’s Bigger. But is it Better?

They say everything is bigger in Texas, and that now includes privacy protection. After the Texas Senate approved HB 4, the Texas Data Privacy and Security Act (“TDPSA”), on June 18, 2023, Texas became the eleventh state to enact comprehensive privacy legislation.[1]

Like many state consumer data privacy laws enacted this year, TDPSA is largely modeled after the Virginia Consumer Data Protection Act.[2] However, the law contains several unique differences and drew significant pieces from recently enacted consumer data privacy laws in Colorado and Connecticut, which generally include “stronger” provisions than the more “business-friendly” laws passed in states like Utah and Iowa.

Some of the more notable provisions of the bill are described below:

More Scope Than You Can Shake a Stick At!

  • The TDPSA applies much more broadly than any other pending or effective state consumer data privacy act, pulling in individuals as well as businesses regardless of their revenues or the number of individuals whose personal data is processed or sold.
  • The TDPSA applies to any individual or business that meets all of the following criteria:
    • conducts business in Texas (or produces goods or services consumed in Texas); and
    • processes or sells personal data:
      • The “processing or sale of personal data” further expands the applicability of the TDPSA to include individuals and businesses that engage in any operations involving personal data, such as the “collection, use, storage, disclosure, analysis, deletion, or modification of personal data.”
      • In short, collecting, storing or otherwise handling the personal data of any resident of Texas, or transferring that data for any consideration, will likely meet this standard.
  • Uniquely, the carveout for “small businesses” excludes from coverage those entities that meet the definition of “a small business as defined by the United States Small Business Administration.”[3]
  • The law requires all businesses, including small businesses, to obtain opt-in consent before processing sensitive personal data.
  • Similar to other state comprehensive privacy laws, TDPSA excludes state agencies or political subdivisions of Texas, financial institutions subject to Title V of the Gramm-Leach-Bliley Act, covered entities and business associates governed by HIPAA, nonprofit organizations, and institutions of higher education. But, TDPSA uniquely excludes electric utilities, power generation companies, and retail electric providers, as defined under Section 31.002 of the Texas Utilities Code.
  • Certain categories of information are also excluded, including health information protected by HIPAA or used in connection with human clinical trials, and information covered by the Fair Credit Reporting Act, the Driver’s Privacy Protection Act, the Family Educational Rights and Privacy Act of 1974, the Farm Credit Act of 1971, emergency contact information used for emergency contact purposes, and data necessary to administer benefits.

Don’t Mess with Texas Consumers

Texas’s longstanding libertarian roots are evidenced in the TDPSA’s strong menu of individual consumer privacy rights, including the right to:

  • Confirm whether a controller is processing the consumer’s personal data and access that data;
  • Correct inaccuracies in the consumer’s personal data, considering the nature of the data and the purposes of the processing;
  • Delete personal data provided by or obtained about the consumer;
  • Obtain a copy of the consumer’s personal data that the consumer previously provided to a controller in a portable and readily usable format, if the data is available digitally and it is technically feasible; and
  • Opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of a decision that produces legal or similarly significant effects concerning the consumer.

Data controllers are required to respond to consumer requests within 45 days, which may be extended by 45 days when reasonably necessary. The bill would also give consumers a right to appeal a controller’s refusal to respond to a request.

Controller Hospitality

The Texas bill imposes a number of obligations on data controllers, most of which are similar to other state consumer data privacy laws:

  • Data Minimization – Controllers should limit data collection to what is “adequate, relevant, and reasonably necessary” to achieve the purposes of collection that have been disclosed to a consumer. Consent is required before processing information in ways that are not reasonably necessary or not compatible with the purposes disclosed to a consumer.
  • Nondiscrimination – Controllers may not discriminate against a consumer for exercising individual rights under the TDPSA, including by denying goods or services, charging different rates, or providing different levels of quality.
  • Sensitive Data – Consent is required before processing sensitive data, which includes personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, citizenship or immigration status, genetic or biometric data processed for purposes of uniquely identifying an individual, personal data collected from a child known to be under the age of 13, and precise geolocation data.
    • The Senate version of the bill excludes data revealing “sexual orientation” from the categories of sensitive information, which differs from all other state consumer data privacy laws.
  • Privacy Notice – Controllers must post a privacy notice (e.g. website policy) that includes (1) the categories of personal data processed by the controller (including any sensitive data), (2) the purposes for the processing, (3) how consumers may exercise their individual rights under the Act, including the right of appeal, (4) any categories of personal data that the controller shares with third parties and the categories of those third parties, and (5) a description of the methods available to consumers to exercise their rights (e.g., website form or email address).
  • Targeted Advertising – A controller that sells personal data to third parties for purposes of targeted advertising must clearly and conspicuously disclose to consumers their right to opt-out.

Assessing the Privacy of Texans

Unlike some of the “business-friendly” privacy laws in Utah and Iowa, the Texas bill requires controllers to conduct data protection assessments (“Data Privacy Protection Assessments” or “DPPAs”) for certain types of processing that pose heightened risks to consumers. The assessments must identify and weigh the benefits of the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the consumer as mitigated by any safeguards that could reduce those risks. In Texas, the categories that require assessments are identical to those required by Connecticut’s consumer data privacy law and include:

  • Processing personal data for targeted advertising;
  • The sale of personal data;
  • Processing personal data for profiling consumers, if such profiling presents a reasonably foreseeable risk to consumers of unfair or deceptive treatment, disparate impact, financial, physical or reputational injury, physical or other intrusion upon seclusion of private affairs, or “other substantial injury;”
  • Processing of sensitive data; and
  • Any processing activities involving personal data that present a “heightened risk of harm to consumers.”

Opting Out and About

Businesses are required to recognize a universal opt-out mechanism for consumers (e.g., the Global Privacy Control signal), similar to provisions in Colorado, Connecticut, California, and Montana, but the law also gives businesses more leeway to ignore those signals if they cannot verify the consumer’s identity or lack the technical ability to receive the signal.
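For readers wondering what “recognizing” a universal opt-out signal can look like in practice, below is a minimal, hypothetical browser-side sketch, not drawn from the TDPSA or any company’s implementation. It assumes the published Global Privacy Control convention, under which the signal is exposed to page scripts as navigator.globalPrivacyControl and sent to servers in the Sec-GPC request header; the handler function names are placeholders.

```typescript
// Minimal sketch, assuming the Global Privacy Control (GPC) convention:
// browsers with GPC enabled expose navigator.globalPrivacyControl to scripts
// and send a "Sec-GPC: 1" header with requests. The two handlers below are
// hypothetical placeholders for a site's own tag-management logic.

interface NavigatorWithGPC extends Navigator {
  globalPrivacyControl?: boolean;
}

function suppressSaleAndTargetedAdTags(): void {
  // Treat the visitor as having opted out of "sale" / targeted advertising.
  console.log("GPC opt-out signal detected: suppressing data-sale and targeted-ad tags");
}

function loadMarketingTagsNormally(): void {
  console.log("No universal opt-out signal detected: loading tags normally");
}

const nav = navigator as NavigatorWithGPC;

if (nav.globalPrivacyControl === true) {
  suppressSaleAndTargetedAdTags();
} else {
  loadMarketingTagsNormally();
}
```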

Show Me Some Swagger!

The Texas Attorney General has the exclusive right to enforce the law, and violations are punishable by civil penalties of up to $7,500 per violation. Businesses have a 30-day right to cure violations upon written notice from the Attorney General. Unlike several other laws, the right to cure has no sunset provision and remains a permanent part of the law. The law does not include a private right of action.

Next Steps for TDPSA Compliance

For businesses that have already developed a state privacy compliance program, especially those modeled around Colorado and Connecticut, making room for TDPSA will be a streamlined exercise. However, businesses that are starting from ground zero, especially “small businesses” defined in the law, need to get moving.

If TDPSA is your first ride in a state consumer privacy compliance rodeo, some first steps we recommend are:

  1. Update your website privacy policy for facial compliance with the law and make sure that notice is being given at or before the time of collection.
  2. Put procedures in place to respond to consumer privacy requests and ask for consent before processing sensitive information.
  3. Gather necessary information to complete data protection assessments.
  4. Identify vendor contracts that should be updated with mandatory data protection terms.

Footnotes

[1] As of the date of publication, there are now 17 states that have passed state consumer data privacy laws (California, Colorado, Connecticut, Delaware, Florida, Indiana, Iowa, Kentucky, Maryland, Massachusetts, Montana, New Jersey, New Hampshire, Tennessee, Texas, Utah, Virginia) and two (Vermont and Minnesota) that are pending.

[2] See Code of Virginia, Chapter 53 (Consumer Data Protection Act).

[3] This is notably broader than other state privacy laws, which establish threshold requirements based on revenues or the amount of personal data that a business processes. It will also make it more difficult to know what businesses are covered because SBA definitions vary significantly from one industry vertical to another. As a quick rule of thumb, under the current SBA size standards, a U.S. business with annual average receipts of less than $2.25 million and fewer than 100 employees will likely be small, and therefore exempt from the TDPSA’s primary requirements.

Mid-Year Recap: Think Beyond US State Laws!

Much of the focus on US privacy has been US state laws, and the potential of a federal privacy law. This focus can lead one to forget, however, that US privacy and data security law follows a patchwork approach both at a state level and a federal level. “Comprehensive” privacy laws are thus only one piece of the puzzle. There are federal and state privacy and security laws that apply based on a company’s (1) industry (financial services, health care, telecommunications, gaming, etc.), (2) activity (making calls, sending emails, collecting information at point of purchase, etc.), and (3) the type of individual from whom information is being collected (children, students, employees, etc.). There have been developments this year in each of these areas.

On the industry side, there has been activity focused on data brokers, those in the health space, and those that sell motor vehicles. The FTC has focused on the activities of data brokers this year, beginning the year with a settlement with lead-generation company Response Tree. It also settled with X-Mode Social over the company’s collection and use of sensitive information. There has also been ongoing regulation and scrutiny of companies in the health space, including HHS’s new AI transparency rule. Finally, Utah enacted a new Motor Vehicle Data Protection Act applicable to data systems used by car dealers to house consumer information.

On the activity side, there has been less news, although in this area the “activity” of protecting information (or failing to do so) has continued to receive regulatory focus. This includes the SEC’s new cybersecurity reporting obligations for public companies, as well as minor modifications to Utah’s data breach notification law.

Finally, there have been new laws directed at particular categories of individuals, in particular laws intended to protect children. These include social media laws in Florida and Utah, effective January 1, 2025 and October 1, 2024, respectively. These are similar to attempts to regulate social media’s collection of information from children in Arkansas, California, Ohio and Texas, but the drafters hope they are sufficiently different to survive the challenges those laws currently face. The FTC is also exploring updates to its decades-old Children’s Online Privacy Protection Act rule.

Putting It Into Practice: As we approach the mid-point of the year, now is a good time to look back at privacy developments over the past six months. There have been many developments in the privacy patchwork, and companies may want to take the time now to ensure that their privacy programs have incorporated and addressed those laws’ obligations.

White House Publishes Steps to Protect Workers from the Risks of AI

Last year, the White House issued an executive order weighing in on the use of artificial intelligence (AI) in businesses.

Since the executive order, several government entities including the Department of Labor have released guidance on the use of AI.

And now the White House published principles to protect workers when AI is used in the workplace.

The principles apply to both the development and deployment of AI systems. These principles include:

  • Awareness – Workers should be informed of and have input in the design, development, testing, training, and use of AI systems in the workplace.
  • Ethical development – AI systems should be designed, developed, and trained in a way to protect workers.
  • Governance and Oversight – Organizations should have clear governance systems and oversight for AI systems.
  • Transparency – Employers should be transparent with workers and job seekers about AI systems being used.
  • Compliance with existing workplace laws – AI systems should not violate or undermine workers’ rights, including the right to organize, health and safety rights, and other worker protections.
  • Enabling – AI systems should assist and improve workers’ job quality.
  • Supportive during transition – Employers should support workers during job transitions related to AI.
  • Privacy and Security of Data – Workers’ data collected, used, or created by AI systems should be limited in scope and used to support legitimate business aims.

UNDER SURVEILLANCE: Police Commander and City of Pittsburgh Face Wiretap Lawsuit

Hi CIPAWorld! The Baroness here, and I have an interesting filing that just came in the other day.

This one involves alleged violations of the Pennsylvania Wiretapping and Electronic Surveillance Act, 18 Pa.C.S.A. § 5703, et seq., and the Federal Wiretap Act, 18 U.S.C. § 2511, et seq.

Pursuant to the Pennsylvania Wiretapping and Electronic Surveillance Act, 18 Pa.C.S.A. § 5703, et seq., a person is guilty of a felony of the third degree if he:

(1) intentionally intercepts, endeavors to intercept, or procures any other person to intercept or endeavor to intercept any wire, electronic or oral communication;

(2) intentionally discloses or endeavors to disclose to any other person the contents of any wire, electronic or oral communication, or evidence derived therefrom, knowing or having reason to know that the information was obtained through the interception of a wire, electronic or oral communication; or

(3) intentionally uses or endeavors to use the contents of any wire, electronic or oral communication, or evidence derived therefrom, knowing or having reason to know, that the information was obtained through the interception of a wire, electronic or oral communication.

Seven police officers employed by the City of Pittsburgh Bureau of Police have teamed up to sue Matthew Lackner (Commander) and the City of Pittsburgh.

Plaintiffs Colleen Jumba Baker, Brittany Mercer, Matthew O’Brien, Jonathan Sharp, Matthew Zuccher, Christopher Sedlak and Devlyn Valencic Keller allege that from September 27, 2023 through October 4, 2023, Matthew Lackner used body worn cameras to video and audio record Plaintiffs, along with utilizing the GPS component of the body worn cameras to track them.

Yes. To track them.

Plaintiffs allege they were unaware that Lackner was utilizing a body worn camera to video and audio record them, or that he was utilizing the GPS function of the body worn camera. Nor did they consent to have their conversations audio recorded by Lackner and/or the City of Pittsburgh.

Interestingly, Lackner was already charged in a criminal proceeding with four (4) counts of Illegal Use of Wire or Oral Communication under the Pennsylvania Wiretapping and Electronic Surveillance Act, 18 Pa.C.S.A. § 5703(1).

So now Plaintiffs seek compensatory damages, including actual damages or statutory damages, as well as punitive damages and reasonable attorneys’ fees.

This case was just filed, so it will be interesting to see how it progresses. But it is an important reminder that many states have their own privacy laws, and that companies should take these laws seriously to avoid lawsuits like this one.

Case No.: 2:24-cv-00461

FCC Updated Data Breach Notification Rules Go into Effect Despite Challenges

On March 13, 2024, the Federal Communications Commission’s updates to the FCC data breach notification rules (the “Rules”) went into effect. They were adopted in December 2023 pursuant to an FCC Report and Order (the “Order”).

The Rules went into effect despite challenges brought in the United States Court of Appeals for the Sixth Circuit. Two trade groups, the Ohio Telecom Association and the Texas Association of Business, petitioned the United States Courts of Appeals for the Sixth Circuit and the Fifth Circuit, respectively, to vacate the FCC’s Order modifying the Rules. The Order was published in the Federal Register on February 12, 2024, and the petitions were filed shortly thereafter. The challenges, which the United States Judicial Panel on Multidistrict Litigation consolidated in the Sixth Circuit, argue that the Rules exceed the FCC’s authority and are arbitrary and capricious. The Order addresses the argument that the Rules are “substantially the same” as breach rules nullified by Congress in 2017. The challenges, however, have not progressed since the Rules went into effect.

Read our previous blog post to learn more about the Rules.

U.S. House of Representatives Passes Bill to Ban TikTok Unless Divested from ByteDance

Yesterday, with broad bipartisan support, the U.S. House of Representatives voted overwhelmingly (352-65) to support the Protecting Americans from Foreign Adversary Controlled Applications Act, designed to begin the process of banning TikTok’s use in the United States. This is music to my ears. See a previous blog post on this subject.

The Act would penalize app stores and web hosting services that host TikTok while it is owned by China-based ByteDance. However, if the app is divested from ByteDance, the Act will allow use of TikTok in the U.S.

National security experts have warned legislators and the public that downloading and using TikTok poses a national security threat. This threat arises because the owner of ByteDance is required by Chinese law to share users’ data with the Chinese Communist government. When the app is downloaded, TikTok obtains access to users’ microphones, cameras, and location services, which essentially places spyware on over 170 million Americans’ every move (dance or not).

Lawmakers are concerned about the detailed sharing of Americans’ data with one of its top adversaries and the ability of TikTok’s algorithms to influence and launch disinformation campaigns against the American people. The Act will make its way through the Senate, and if passed, President Biden has indicated that he will sign it. This is a big win for privacy and national security.

CNN, BREAKING NEWS: CNN Targeted In Massive CIPA Case Involving A NEW Theory Under Section 638.51!

CNN is now facing a massive CIPA class action for allegedly violating CIPA Section 638.51 by installing “Trackers” on its website. In Lesh v. Cable News Network, Inc., filed in the Superior Court of the State of California by Bursor & Fisher, the plaintiff accuses the multinational news network of installing three tracking technologies to invade users’ privacy and track their browsing habits in violation of Section 638.51.

More on that in a bit…

As CIPAworld readers know, we predicted the 2023 privacy litigation trends for you.

We warned you of the risky CIPA Chat Box cases.

We broke the news on the evolution of CIPA Web Session recording cases.

We notified you of major CIPA class action lawsuits against some of the world’s largest brands facing millions of dollars in potential exposure.

Now – we are reporting on a lesser-known facet of CIPA – but one that might be even more dangerous for companies using new Internet technologies.

This new focus for plaintiffs’ attorneys appears to rely on the theory that website analytics tools are “pen registers” or “trap and trace” devices under CIPA § 638.51. These allegations also come with a massive $5,000-per-violation penalty.

First, let’s delve into the background.

The Evolution of California Invasion of Privacy Act:

We know the California Invasion of Privacy Act is this weird little statute that was enacted decades ago and was designed to prevent eavesdropping and wiretapping because, of course, back then law enforcement was listening in on folks’ phone calls to find communists.

Section 638.51 in particular was originally enacted back in the 1980s, and traditionally “pen-traps” were employed by law enforcement to record outgoing and/or incoming telephone numbers on a telephone line.

For the last two years, plaintiffs have been using these decades-old statutes against companies, claiming that the use of internet technologies such as website chat boxes, web session recording tools, JavaScript, pixels, cookies and other newfangled technologies constitutes “wiretapping” or “eavesdropping” on website users.

And California courts, which love to take old statutes and apply them to new technologies, have basically said internet communications are protected from being eavesdropped on.

Now California courts will have to address whether these newfangled technologies are also “pen-trap” “devices or processes” under 638.51. These new 638.51 cases involve technologies such as cookies, web beacons, JavaScript, and pixels used to obtain information about users and their devices as they browse websites and/or mobile applications. That information is then analyzed by the website operator or a third-party vendor to glean relevant information about users’ online activities.

Section 638.51:

Section 638.51 prohibits the installation or use, without first obtaining a court order, of “pen registers” (devices or processes that record or decode dialing, routing, addressing, or signaling information, commonly known as DRAS) and “trap and trace” devices (together, “pen-traps”), which were traditionally used by law enforcement to record all numbers dialed on outgoing calls or the numbers identifying incoming calls.

Unlike CIPA Section 631, which prohibits wiretapping (the real-time interception of the contents of communications without consent), CIPA Section 638.51 prohibits the collection of DRAS.

Section 638.51 has limited exceptions, including where a service provider’s customer consents to the device’s use or where the device is used to protect the service provider’s rights or property.

Breaking Down the Terminology:

The term “pen register” means a device or process that records or decodes DRAS “transmitted by an instrument or facility from which a wire or electronic communication is transmitted, but not the contents of a communication.” §638.50(b).

The term “trap and trace” focuses on incoming, rather than outgoing numbers, and means a “device or process that captures the incoming electronic or other impulses that identify the originating number or other dialing, routing, addressing, or signaling information reasonably likely to identify the source of a wire or electronic communication, but not the contents of a communication.” §638.50(c).

Lesh v. Cable News Network, Inc. (“CNN”) and its precedent:

This new wave of CIPA litigation stems from a single recent decision, Greenley v. Kochava, where the California court allowed a “pen register” claim to move past the motion to dismiss stage. In Kochava, the plaintiff challenged the use of these new internet technologies, asserting that the defendant data broker’s software was able to collect a variety of data such as geolocation, search terms, purchase decisions, and spending habits. Applying the plain meaning of the word “process,” the Kochava court concluded that “software that identifies consumers, gathers data, and correlates that data through unique ‘fingerprinting’ is a process that falls within CIPA’s pen register definition.”

The Kochava court noted that no other court had interpreted Section 638.51, and that while pen registers were traditionally physical machines used by law enforcement to record outbound calls from a telephone, “[t]oday pen registers take the form of software.” Accordingly, the court held that the plaintiff adequately alleged that the software could collect DRAS and was a “pen register.”

Kochava paved the way for 638.51 litigation, with hundreds of complaints filed since. The majority of these cases are being filed in Los Angeles County Superior Court by Pacific Trial Attorneys in Newport Beach.

In Lesh v. Cable News Network, Inc., the plaintiff accuses the multinational news network of installing three tracking technologies to invade users’ privacy and track their browsing habits in violation of CIPA Section 638.51(a), which proscribes any “person” from “install[ing] or us[ing] a pen register or a trap and trace device without first obtaining a court order.”

Plaintiff alleges CNN uses three “Trackers” (PubMatic, Magnite, and Aniview) on its website that constitute “pen registers.” The complaint alleges that, to make CNN’s website load in a user’s browser, the browser sends an “HTTP request” or “GET” request to CNN’s servers, where the data is stored. In response to the request, CNN’s server sends an “HTTP response” back to the browser with a set of instructions on how to properly display the website, i.e., what images to load, what text should appear, or what music should play.

These instructions cause the Trackers to be installed on a user’s browser, which then causes the browser to send identifying information, including the user’s IP address, to the Trackers so they can analyze data, create and analyze the performance of marketing campaigns, and target specific users for advertisements. Accordingly, the Trackers are “pen registers,” or so the complaint alleges.
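To make the alleged mechanics concrete, here is a minimal, hypothetical sketch (not CNN’s, PubMatic’s, Magnite’s, or Aniview’s actual code) of what a third-party “tracker” endpoint sees when a page instructs the browser to fetch its script. The request itself hands the endpoint addressing-style information such as the visitor’s IP address and browser details, without any page “contents,” which is the kind of data these complaints analogize to pen-register output.

```typescript
// Hypothetical third-party "tracker" endpoint, sketched with Node's built-in
// http module. When a publisher's HTML tells the browser to fetch a script
// from this host, the incoming request alone exposes addressing/signaling
// information about the visitor; no page "contents" are needed.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  const visitorIp = req.socket.remoteAddress;          // addressing information
  const requestedPath = req.url ?? "/";                // which tag/page triggered the load
  const userAgent = req.headers["user-agent"] ?? "";   // device/browser details
  const referer = req.headers["referer"] ?? "";        // page the visitor was reading

  // A real vendor would feed this into analytics or ad-targeting systems;
  // here we simply log it to illustrate what is collected.
  console.log(`tracker hit: ip=${visitorIp} path=${requestedPath} referer=${referer} ua=${userAgent}`);

  // Return an empty script so the publisher's page keeps rendering normally.
  res.writeHead(200, { "Content-Type": "application/javascript" });
  res.end("/* tracker payload would go here */");
});

server.listen(8080, () => console.log("hypothetical tracker listening on :8080"));
```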

On this basis, the Plaintiff is asking the court for an order certifying the class, along with statutory damages and attorneys’ fees. The alleged class is as follows:

“Pursuant to Cal. Code Civ. Proc. § 382, Plaintiff seeks to represent a class defined as all California residents who accessed the Website in California and had their IP address collected by the Trackers (the “Class”).

The following people are excluded from the Class: (i) any Judge presiding over this action and members of her or her family; (ii) Defendant, Defendant’s subsidiaries, parents, successors, predecessors, and any entity in which Defendant or their parents have a controlling interest (including current and former employees, officers, or directors); (iii) persons who properly execute and file a timely request for exclusion from the Class; (iv) persons whose claims in this matter have been finally adjudicated on the merits or otherwise released; (v) Plaintiff’s counsel and Defendant’s counsel; and (vi) the legal representatives, successors, and assigns of any such excluded persons.”

Under this expansive definition of “pen-register,” plaintiffs are alleging that almost any device that can track a user’s web session activity falls within the definition of a pen-register.

We’ll keep an eye on this one, but until more helpful case law develops, the Kochava decision will keep open the floodgates for these new CIPA suits. Companies should keep in mind that unlike the other CIPA cases under Sections 631 and 632.7, Section 638.51 allows for a cause of action even where no “contents” are being “recorded,” making 638.51 easier to allege.

Additionally, companies should be mindful of CIPA’s consent exceptions and ensure they are obtaining consent to any technologies that may trigger CIPA.