In the U.S., 20 states now have comprehensive privacy laws requiring the protection of customer and employee data. New York State is not among them. California, Colorado, Connecticut, Delaware, Florida, Indiana, Iowa, Tennessee, Utah, and Virginia were joined in 2024 by Kentucky, Maryland, Minnesota, Montana, Nebraska, New Hampshire, New Jersey, Oregon, Rhode Island, and Texas.
Most of these laws apply to companies located anywhere that do business in the state, based upon three criteria: whether the business controls data or processes data on behalf of others; the number of state residents whose personal data the business holds; and/or the revenue the company derives from data business activity in the state. Generally speaking, small businesses are excluded, with three exceptions: the Texas and Nebraska laws apply to businesses of all sizes, and the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), applies equally to consumers, employees, and business-to-business commercial contacts.
Generally, businesses must know what personal data they have and take care to store it securely. They must employ good data-minimization practices, know what data they pass on to third parties, and obtain contractual assurances about third-party handling of that data. Two new and growing requirements are that covered businesses must obtain permission to use personal information, especially for anything not essential and necessary, and must obtain parental opt-in before targeting minors or selling their personal data (minors being persons under 16 to 18 years of age, depending on the state).
Personal data is generally defined broadly as any information that is reasonably linked to the identity of a single person or individual household. Sensitive categories include ID and financial account numbers; demographic, biometric, neural, and biological data; purchasing and browsing habits; and precise geolocation data.
It excludes permanently de-identified (anonymized) data and any information that is publicly available. Increasingly, “pseudonymous data,” i.e., information that has identifiers stripped but can still be linked back to a person using an ID code, must be stored separately from the key that allows re-identification, as sketched below.
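For illustration only, here is a minimal TypeScript sketch of pseudonymization: direct identifiers are swapped for a random ID code, and the table that maps codes back to people is kept in a separate store. The record shapes and the in-memory store are hypothetical, not drawn from any statute.

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical customer record containing direct identifiers.
interface CustomerRecord {
  name: string;
  email: string;
  purchases: string[];
}

// The pseudonymous record keeps only an opaque ID code plus
// non-identifying attributes.
interface PseudonymousRecord {
  idCode: string;
  purchases: string[];
}

// Stored separately (in practice, a different system with stricter
// access controls): the key linking ID codes back to individuals.
const reidentificationKey = new Map<string, { name: string; email: string }>();

function pseudonymize(record: CustomerRecord): PseudonymousRecord {
  const idCode = randomUUID();
  reidentificationKey.set(idCode, { name: record.name, email: record.email });
  return { idCode, purchases: record.purchases };
}

// Usage: the analytics store sees only the pseudonymous record.
const stored = pseudonymize({
  name: "Jane Doe",
  email: "jane@example.com",
  purchases: ["order-1001"],
});
console.log(stored); // { idCode: "...", purchases: ["order-1001"] }
```

The point of the separation is that a breach of the pseudonymous store alone does not expose identities; only someone with access to both stores can re-link the data.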
State privacy laws also regulate the “sale” of personal data, defined broadly to mean an exchange for monetary or other valuable consideration.
Following the lead of California and the European Union’s General Data Protection Regulation (GDPR), all of the new U.S. state privacy laws of 2024 provide consumers with a number of rights regarding their data, including:
- The right to know who is processing their data, and the rights to access it, correct it, and delete it.
- Portability, or the right to obtain a copy of their data.
- The right to a clear and conspicuous method to opt out of the sale of their personal information and of its processing for targeted advertising, including the requirement that businesses honor universal preference signals like the Global Privacy Control (GPC); see the sketch after this list.
- The right to non-discrimination in services and products regardless of their privacy preferences.
- The right to opt out of automated decision-making or profiling that affects the consumer’s legal rights.
- The right to appeal the denial of any data right.
- The continued right to specific and clear contractual provisions.
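As a purely illustrative sketch of honoring a GPC signal, the following assumes a Node.js server using the Express framework. Browsers that support GPC send the `Sec-GPC: 1` request header (and expose `navigator.globalPrivacyControl` client-side); the route and the `gpcOptOut` flag here are hypothetical.

```typescript
// Minimal Express-style middleware sketch: detect the Global Privacy
// Control signal, which user agents send as the "Sec-GPC: 1" header.
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Flag the request when the browser asserts a GPC opt-out preference.
function detectGpc(req: Request, res: Response, next: NextFunction) {
  // Per the GPC spec, a value of "1" signals an opt-out preference.
  (req as any).gpcOptOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(detectGpc);

app.get("/", (req: Request, res: Response) => {
  if ((req as any).gpcOptOut) {
    // Treat the signal as a valid opt-out: suppress sale/targeted-ad
    // processing for this visitor (the exact handling is business-specific).
    res.send("GPC received: personal data will not be sold or shared.");
  } else {
    res.send("No GPC signal detected.");
  }
});

app.listen(3000);
```

The legal effect of the signal varies by state; the sketch only shows where in a web stack the signal surfaces.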
Businesses must conduct a data privacy impact assessment (DPIA) before engaging in targeted advertising, data sales, profiling, or sensitive-data processing.
Businesses must continue to provide privacy policies that contain certain detailed information.
Businesses must establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.
Dark patterns (user interfaces designed to subvert or impair consumer choice) were expressly prohibited by each new law, echoing the Federal Trade Commission (FTC)’s prohibition on using dark patterns to obtain consumer consent.
State attorneys general will enforce these laws, most of which provide varying mandatory cure periods. The laws do not include a private right of action.
Minnesota’s new law requires businesses to maintain a data inventory, or “record of processing,” of the sensitive data they hold, and to protect its confidentiality, integrity, and accessibility (a hypothetical entry is sketched below). Data inventories, or records of processing, are required under international data protection regimes like the GDPR, but this is the first time a U.S. state has included the requirement in its privacy laws.
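For illustration, a single record-of-processing entry, loosely modeled on the GDPR Article 30 records many multinationals already keep, might capture fields like these. The field names are hypothetical, not mandated by the Minnesota statute.

```typescript
// Hypothetical shape for one entry in a data inventory /
// "record of processing" (loosely modeled on GDPR Article 30).
interface ProcessingRecord {
  activity: string;           // e.g., "Email marketing"
  purpose: string;            // why the data is processed
  dataCategories: string[];   // e.g., ["email address", "purchase history"]
  dataSubjects: string[];     // e.g., ["customers"]
  recipients: string[];       // third parties the data is shared with
  retentionPeriod: string;    // how long the data is kept
  securityMeasures: string[]; // safeguards protecting the data
}

const inventory: ProcessingRecord[] = [
  {
    activity: "Email marketing",
    purpose: "Send promotional offers to opted-in customers",
    dataCategories: ["name", "email address"],
    dataSubjects: ["customers"],
    recipients: ["email service provider"],
    retentionPeriod: "Until consent is withdrawn",
    securityMeasures: ["encryption at rest", "role-based access controls"],
  },
];

console.log(`${inventory.length} processing activity recorded`);
```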
Maryland’s new law will impose strict data minimization obligations that allow collection of personal information only in connection with products or services specifically requested by a consumer. This standard is much narrower than the existing requirement that collection be limited to “adequate, relevant, and reasonably necessary” purposes disclosed at the time of collection.
NEW CYBERSECURITY LAWS
Continued cyber threats drove the expansion of data security and breach notification laws in 2024.
The FTC, U.S. Securities and Exchange Commission (SEC), and Federal Communications Commission (FCC) all expanded regulations that formerly applied only to companies within specialized sectors or with respect to particular types of data in their jurisdiction.
The FTC’s sweeping new breach notification requirements apply to nonbank financial institutions.
The FTC’s Health Breach Notification Rule applies to companies handling health records not otherwise regulated by HIPAA.
The SEC revised Regulation S-P to include enhanced security and notice obligations for broker-dealers and investment advisers.
The FCC updated and expanded breach notification obligations.
Draft regulations under the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (CIRCIA) were released and are expected to be finalized late next year. They would mandate reporting of security breaches and ransomware payments by a broad group of businesses, including even retailers and software providers that may not have previously considered themselves “critical infrastructure.”
Data breach notification laws were revised across the country in 2024, becoming more complex and onerous. Pennsylvania added a new requirement that companies provide free credit reports to certain individuals affected by a breach.
The FTC and SEC brought actions in 2024 alleging companies were not transparent enough in their public statements following breach incidents.
NEW ARTIFICIAL INTELLIGENCE LAWS
In 2024, countries around the world, including Brazil, China, and the UK, significantly expanded AI privacy legislation. Here in the U.S. and in the E.U., new AI laws shaped how companies can use AI and their responsibilities in handling personal information, and regulators became more active in policing AI and privacy issues.
At the federal level, 120 AI-related bills were introduced in Congress in 2024, but most failed to make it out of committee. At the state level, 45 states introduced almost 700 AI-related bills in 2024, and 31 states passed laws on various AI issues, addressing disclosure of AI use, private-sector use, government use, and deepfake generation.
New York had 91 AI-related bills, almost all of which are currently pending. California, Colorado, Utah, and Illinois had better success.
New California laws require developers of generative AI systems to mark and disclose AI-generated content, make available a free AI detection tool, and post information on their website about the training data used for models released after 2021. The California Consumer Privacy Act (CCPA) was amended to include AI formats in its definition of personal information.
Colorado passed the nation’s first “comprehensive” AI law, regulating the creation and use of high-risk AI systems, i.e., those used to make consequential decisions about the provision or denial of material services such as employment, lending, healthcare, and housing.
Utah’s AI law requires certain services to disclose to consumers that they are interacting with an AI system, like a chatbot or AI phone assistant.
Illinois passed a new AI law targeting employers that use AI for employment decisions, including recruiting, hiring, discipline, and setting terms of employment. Such employers must notify employees that AI is being used for employment decision-making and may not use AI to discriminate based on protected classes, for example by relying on zip codes as a proxy for race.
Both state and federal regulators pursued companies using AI for unfair or deceptive practices. The FTC pursued deceptive AI claims and schemes including an AI tool to create fake reviews, a company selling an “AI Lawyer,” and multiple AI companies selling online storefronts.
The FTC forced Avast and X-Mode to destroy their AI models and algorithms after determining they were built on improperly collected data.
The SEC fined a pair of investment advisers for making false and misleading statements about their use of AI.
The Texas Attorney General settled claims against a company for making false and misleading statements about the accuracy and safety of an AI tool it offered to healthcare providers for summarizing patient records.
On Oct. 30, the White House published its progress to date on the implementation of AI accountability measures and guidance, standards, and best practices by federal agencies.
The EU has the world’s first and most comprehensive AI legislation. The EU AI Act lists prohibited AI uses that will be banned starting in February, such as conducting social scoring, exploiting vulnerabilities based on weaknesses such as age or disability, and creating facial recognition databases through untargeted scraping.
Providers of “General Purpose AI Models” (GPAI), i.e., models trained on large amounts of data using self-supervision, will have to provide technical documentation, a compliance policy, a summary of training content, and testing and security measures, with enhanced safety testing and security for models implicating areas such as public safety, security, and public health.
The EU AI Act imposes significant transparency obligations, such as disclosing the use of AI or marking AI-generated content, on AI systems that interact with a consumer, generate content, recognize emotions, or categorize biometric data.
Those who use, create, import, or export so-called “High-Risk AI,” used in areas such as education, employee management, and certain biometric activities, or in products regulated by EU product safety laws, will face a variety of new requirements, including registering in an EU database, creating a risk management system, and performing incident monitoring.
Cookies, cookie trackers, and “Accept All” cookie banners are a current focus of privacy regulators. Stay tuned.