Last month, the Siskinds Privacy, Cyber and Data Governance team introduced our Privacy Pulse series, focused on providing businesses and professionals with monthly updates on the world of technology, privacy, and artificial intelligence laws in both the U.S. and Canada.
April has been busy, with far more updates coming from the U.S. than from Canada, and the artificial intelligence arms race continues among the technology giants.
The “Luring Test”, the AI arms race, and “Prompt Shields” as a responsible AI capability
The “Luring Test”: In the U.S., the FTC Act, in part, prohibits unfair or deceptive practices. Michael Atleson, an attorney with the FTC, highlights in a blog post a key concern that businesses may use generative AI to “steer people unfairly or deceptively into harmful decisions in areas such as finances, health, education, housing, and employment.” For example, a generative AI could prioritize an advertiser’s content when answering a consumer’s queries. Businesses should ensure that their generative AI output, among other things, “distinguish[es] between what is organic and what is paid. People should know if an AI product’s response is steering them to a particular . . . product because of a commercial relationship.”
March 28, 2024: Microsoft announced a range of safety tools designed to make generative AI more secure and trustworthy, including “Prompt Shields”, which protect its generative artificial intelligence (AI) models from outputting harmful content in response to user inputs.
April 1, 2024: The AI arms race heats up. Alex Hern at The Guardian discusses the recent releases of new versions of AI models from OpenAI, Google, and Mistral. Meta will soon be releasing its own model, Llama. Additionally, Apple is rumoured to be adding AI features to iOS 18 that would run on the device itself (in contrast to running on servers “in the cloud”). Depending on how such AI is implemented, it could enhance privacy because the AI would not necessarily transmit your personal information to Apple’s servers.
Operational technology security, the EDPB on “consent or pay”, and Google’s third-party cookie delay
April 4, 2024: The Hacker News published an article detailing considerations for cybersecurity professionals on securing operational technology (OT). Recall that in 2013, hackers were able to intrude into Target’s systems through credentials stolen from its HVAC vendor.
April 17, 2024: The European Data Protection Board published its non-binding opinion that large online platforms like Facebook cannot, in most cases, “comply with the requirements for valid consent if they confront users only with a binary choice between consenting to processing of personal data for behavioural advertising purposes and paying a fee.”
April 23, 2024: Google is again delaying its plan to phase out third-party cookies and says that it is working closely with the United Kingdom’s Competition and Markets Authority (CMA) to address the CMA’s concerns.
Canada: AI investments, OPC in privacy group, and misaddressed mail reminder
April 7, 2024: Canada announced a $2.4 billion CDN package to invest in its domestic AI technology sector.
April 9, 2024: The Office of the Privacy Commissioner of Canada (OPC) joined the Global Cooperation Arrangement for Privacy Enforcement (Global CAPE).
April 16, 2024: The Nunavut Information and Privacy Commissioner published a report criticizing the Nunavut Departments of Health and Finance for not making “reasonable security arrangements to reduce the risk of misaddressed mail” after Canada Post changed the mailing addresses of Iqaluit residents. This serves as a reminder for businesses to confirm the recipient’s mailing or email address before sending out sensitive information, especially if the recipient has not updated that address in some time.
United States: Congress acts on AI, privacy, and national security
March 29, 2024: Axios reported that the U.S. House of Representatives has imposed a strict ban on congressional staffers’ use of Microsoft’s Copilot generative AI.
April 7, 2024: The American Privacy Rights Act (APRA), the latest bipartisan draft of federal privacy legislation, was announced in Congress. Importantly, it would pre-empt and eliminate the patchwork of state laws by setting one national consumer privacy law, and it creates a small business exception. The APRA also includes a private right of action. In an April 16 letter to Congress, the California Privacy Protection Agency (CPPA) criticized the APRA, most prominently the proposed pre-emption, and urged that the APRA be set as a baseline that allows states to develop stronger privacy protections.
April 20, 2024: Section 702 of the Foreign Intelligence Surveillance Act (FISA) was reauthorized with certain new restrictions on electronic surveillance. Generally, Section 702 permits the National Security Agency to intercept the communications of non-U.S. persons located outside the U.S. for foreign intelligence purposes. The provision has generated significant controversy in the privacy world and has made data transfers from Europe to the U.S. more difficult to effectuate (see, e.g., the Schrems II decision).
April 24, 2024: Congress passed the “Protecting Americans’ Data from Foreign Adversaries Act” (PADFAA), which generally prohibits data brokers from selling, licensing, renting, trading, transferring, releasing, disclosing, providing access to, or otherwise making available sensitive personal information of a U.S. individual to any “foreign adversary country” or “any entity that is controlled by a foreign adversary.” The definition of “sensitive data” (which I’ve called “sensitive personal information”) closely resembles the definition of “Sensitive data” in the APRA.
April 24, 2024: Congress also passed the “Protecting Americans from Foreign Adversary Controlled Applications Act” (PAFACAA). PAFACAA is the much-publicized law that gives ByteDance, Ltd. a choice: divest TikTok or have it banned in the U.S.
April 26, 2024: The Department of Health and Human Services (HHS) issued a final rule amending the Privacy Rule of the Health Insurance Portability and Accountability Act (HIPAA) to support privacy in reproductive health care. The new rule comes into effect on June 25, 2024, and largely limits the permitted uses and disclosures of an individual’s protected health information (PHI) about reproductive health care for certain non-health-care-related purposes.
FTC files actions: Misleading users about confidentiality and deceptive subscription tactics; amends Health Breach Notification Rule
April 11, 2024: The FTC has filed a proposed settlement against Monument, Inc., in which it alleges that Monument “claimed on its website and/or in other communications with consumers that users’ personal information [most importantly, the personal health information Monument collected about addiction] would be ‘100% confidential’ and that the company would not disclose such data to third parties without users’ consent.” According to the FTC, from 2020 to 2022, Monument “allegedly disclosed users’ personal information, including their health information, to numerous third-party advertising platforms via tracking technologies, known as pixels and application programming interfaces (APIs), which Monument integrated into its website.” The proposed order, among other things, imposes a $2.5 million civil penalty (which was suspended due to Monument’s inability to pay).
See also the FTC’s similar proposed settlement against Cerebral, Inc., a telehealth provider, which alleged that Cerebral represented to the public that it offered “safe, secure, and discreet” services, yet buried in its privacy notices the fact that it shared its users’ personal health information with third parties for advertising purposes. The FTC also alleged that Cerebral violated the Restore Online Shoppers’ Confidence Act by imposing a multi-step, multi-day process that users had to navigate in order to cancel its paid subscription service.
April 25, 2024: The FTC is taking action against Doxo, Inc. and its two co-founders for violating the FTC Act, the Gramm-Leach-Bliley Act, and the Restore Online Shoppers’ Confidence Act. Doxo is a third-party payment platform that permits users to pay bills sent by other companies. The FTC alleges that Doxo engages in deceptive tactics by purchasing search engine ads that “intercept consumers attempting to reach their billers directly and styles the headlines of ads and other weblinks—often featuring only the biller’s name, not Doxo’s—so that they appear to be the biller’s own page”, and that such pages then “[dupe] consumers into using its [paid] service . . .” The FTC alleges that Doxo’s deception has cost consumers millions in junk fees, among other harms.
April 26, 2024: The FTC amended its Health Breach Notification Rule, which generally requires vendors of personal health records and certain other entities (that are not covered by HIPAA) to notify individuals, the FTC, and in some cases, the media of a breach of unsecured personally identifiable health data. See the link for a summary of the amendments, which include clarifying the definition of a “breach of security” to cover “an unauthorized acquisition of unsecured [personal health record] identifiable health information in a personal health record that occurs as a result of a data breach or an unauthorized disclosure.”
FCC fines wireless carriers for sharing customer data
April 29, 2024: The Federal Communications Commission (FCC) has fined AT&T, Inc. $57 million, Sprint Corporation $12 million, T-Mobile USA, Inc. $80 million, and Verizon Communications $47 million over allegations that each of the four carriers “sold access to its customers’ location information to ‘aggregators,’ who then resold access to such information to third-party location-based service providers. In doing so, each carrier attempted to offload its obligations to obtain customer consent onto downstream recipients of location information, which in many instances meant that no valid customer consent was obtained. This initial failure was compounded when, after becoming aware that their safeguards were ineffective, the carriers continued to sell access to location information without taking reasonable measures to protect it from unauthorized access.”
As we noted in our March blog post, these allegations align with the trend of treating location data as sensitive personal information, and businesses that collect location information should pay close attention to how they collect, use, and disclose such information.
Maryland: Next state to pass a comprehensive privacy law
April 6, 2024: Maryland passed the Online Data Privacy Act (MODPA), which is currently awaiting Governor Wes Moore’s signature. MODPA stands out among recently enacted state privacy laws in that, like the California and Colorado statutes, it requires data controllers to limit the collection of personal data to what is “reasonably necessary and proportionate to provide or maintain a specific product or service requested by the consumer to whom the data pertains.” Section 14-4607(B)(1)(I).
Colorado: Definition of “Sensitive Data” expanded to include “Neural Data”
April 17, 2024: Colorado amended the Colorado Privacy Act to expand its definition of “Sensitive Data” to include “Biological Data”, which in turn includes “Neural Data”. Neural Data is defined as “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous system and that can be processed by or with the assistance of a device.”
Siskinds LLP Privacy Concierge Program: Your path to compliance starts here
Establishing a comprehensive privacy program is essential for businesses. At Siskinds, our Privacy Concierge program offers custom subscription-based compliance support.
To discover how Siskinds can assist you in meeting your privacy compliance needs, or if you have any questions related to this blog post, contact me, Savvas Daginis, at [email protected], or a lawyer on our Privacy, Cyber & Data Governance team.