
Privacy pulse: New Ontario OIPC guidance, privilege in data breach investigations and further developments in state privacy law

The Siskinds Privacy, Cyber and Data Governance team is focused on providing businesses and professionals with monthly updates on technology, privacy, and artificial intelligence (A.I.) laws in both the U.S. and Canada.

Before diving into this month’s update, I’m pleased to mention that I will be attending the International Association of Privacy Professionals (IAPP) Canadian Privacy Symposium in Toronto on June 10-11. Staying current on Canadian and non-Canadian privacy laws and regulations is critical in the ever-changing privacy landscape, and the symposium promises to bring together Canadian data protection leaders for training and thought-provoking discussions. I’m looking forward to it, and if you’re planning to attend, send me an email or a message on LinkedIn so we can connect at the event.

Hardware privacy, Telegram’s “Find People Nearby” and hallucinations on Google’s AI Overviews feature

According to Forbes, police in the United States are testing a new device that can scan moving vehicles for anything that “emits a signal, including phones, smartwatches, cat and dog tracking chips and even library books [(i.e., through the implanted RFID chips)].” The technology can even “identify specific models of devices like iPhones and Bose headphones . . .[,] unique signals emitted by pet chips, Wi-Fi and Bluetooth devices, wearable tech like fitness trackers, in-car infotainment systems and tire pressure sensors, and can even . . . [identify] the RFID of a library book . . . For law enforcement, all that data can be linked to a car’s license plate number, becoming a unique ‘fingerprint.’”

We have often focused on privacy within software and on demanding that software be made transparent. It’s time we also focus on hardware privacy: hardware built in a transparent manner and designed to limit the identifying signals it emits.

May 9, 2024: Please note that a program entitled Close-Circuit Telegram Vision (which can easily be found online) allows “anyone to search a specific set of coordinates for Telegram users that have a certain setting enabled, and then plot their approximate physical location on a map.” The program allegedly uses the Telegram feature “Find People Nearby.” If you use Telegram, go into your settings and, out of an abundance of caution, ensure this feature is turned off.

May 15, 2024: The United Kingdom announced two codes of practice to enhance cybersecurity in A.I. and software. Both codes will “set out requirements for developers to make their products resilient against tampering, hacking, and sabotage.” The currently drafted code of practice for the cyber security of A.I. and the Code of Practice for Software Vendors can be found here.

May 23, 2024: The Verge published an article about how Google’s AI Overviews feature is hallucinating. For example, in a query about how to keep the cheese from falling off the pizza, Google answered, “Add some glue . . . Mix about 1/8 cup of Elmer’s glue in with the sauce. Non-toxic glue will work.”

May 29, 2024: The U.S. Department of Justice (DOJ) announced that it dismantled the 911 S5 botnet, likely the world’s largest, which “infected computers in nearly 200 countries and facilitated a whole host of computer-enabled crimes, including financial frauds” and identity theft. FYI, a “botnet” is “a network of private computers infected with malicious software and controlled as a group without the owners’ knowledge, e.g., to send spam messages.” See Google’s definition.

May 30, 2024: Microsoft’s Threat Intelligence Team reminds businesses of the “critical need to protect internet-exposed Operational Technology [OT] devices.” OT is generally concerned with controlling physical devices and processes, whereas Information Technology (IT) is about managing networks, databases, hardware, and software for the processing of information.

Canada: Privilege in Data Breach Investigations, Quebec’s regulation respecting anonymization of personal information and newly released OIPC guidance

April 30, 2024: In LifeLabs LP v. Information and Privacy Commr. (Ontario), 2024 ONSC 2194, the Ontario Superior Court held that LifeLabs could not assert solicitor-client or litigation privilege over five sets of disputed documents and the information within them: the investigation report prepared by a cybersecurity firm; the email correspondence between the cyber intelligence firm and the cyber attackers; an internal data analysis prepared by LifeLabs describing which personal health information (PHI) had been affected; a submission from LifeLabs to the Commissioners, communicated through legal counsel, in response to certain specific questions; and the report of Kevvie Fowler of Deloitte LLP, which was prepared as part of LifeLabs’ representations and submitted to the Commissioners for that purpose.

The Court held that Ontario’s Office of the Information and Privacy Commissioner (OIPC) had a duty to inquire into the data breach, that LifeLabs had a duty to respond, and that “privilege does not protect information that would otherwise have to be disclosed.”

May 6, 2024: The Office of the Privacy Commissioner (OPC) (Canada) published a survey that explored the privacy views and practices of businesses across Canada: for example, only 56% have designated a privacy officer; 53% have procedures to deal with complaints; 50% have internal privacy policies; 50% have procedures to deal with access requests; and 33% train their staff. These five actions are low-hanging fruit; businesses should, at a minimum, aim to meet them, and those that do will be well on the road to compliance with privacy law.

May 13, 2024: Ontario introduced Bill 194, An Act to enact the Enhancing Digital Security and Trust Act, and to amend the Freedom of Information and Protection of Privacy Act (FIPPA). The first part of the Act addresses cybersecurity and A.I. systems at public sector entities. The second part amends FIPPA to create reporting requirements for institutions regarding data breaches, add requirements to undertake privacy impact assessments, and give Ontario’s Information and Privacy Commissioner the power to review the information practices of institutions.

May 14, 2024: In Liberal Party of Canada v. the Complainants, 2024 BCSC 814, the British Columbia Supreme Court held that the B.C. Personal Information Protection Act, S.B.C. 2003, c. 63 applies to federal political parties. As noted by the Court at para 6, “[t]his is the first time that a Canadian Superior Court has considered the constitutional applicability of provincial privacy legislation to [federal political parties].”

May 15, 2024: The Regulation respecting the anonymization of personal information has become law in Quebec. If, as part of your business model, you anonymize personal information collected from Quebec, you should ensure your anonymization practices comply with Quebec law.

May 22, 2024: British Columbia’s Information and Privacy Commissioner told CBC that “it’s important to recognize that they should be collecting the minimum amount of information necessary to do their work, and retaining the minimum amount of information necessary to do their work.” This is the principle of data minimization.

May 24, 2024: The OPC has “launched a new online breach reporting form for federal institutions subject to the Privacy Act as well as updated its online breach reporting form for businesses subject to the Personal Information Protection and Electronic Documents Act (PIPEDA).” See the announcement.

May 29, 2024: Ontario’s OIPC released guidance on the sharing of information without consent in situations involving intimate partner violence.

May 30, 2024: Ontario’s OIPC released guidance providing advice for public sector institutions contracting with third-party service providers. If you are a privacy officer at an institution covered by FIPPA or MFIPPA, I would highly recommend reading this guidance to understand the process of engaging service providers.

United States: Rise in sophisticated phishing attacks using A.I. and FCC proposal on AI-generated content in political ads

May 8, 2024: The Federal Bureau of Investigation (San Francisco division) is warning businesses that cyber criminals are increasingly using A.I. to conduct “sophisticated phishing / social engineering attacks and voice / video cloning scams.” This warning reminds me of a news event from a couple of months ago, when an employee at a large company was tricked into paying $25,000,000 after a video conference call with a “deepfaked” version of the company’s chief financial officer.

May 13, 2024: The recent amendments to the Standards for Safeguarding Customer Information Rule (the “Safeguards Rule”) under the Gramm-Leach-Bliley Act (GLBA) are now in effect. The Rule regulates financial institutions (note the broad and expansive definition of “financial institution”).

May 13, 2024: The New York State Department of Financial Services (DFS) released its Cybersecurity Program Template pursuant to its cybersecurity regulation, 23 NYCRR 500. The template is intended to help individually owned businesses develop the cybersecurity program required by the regulation.

May 14, 2024: The National Institute of Standards and Technology (NIST) updated its guidelines for protecting sensitive data known as “controlled unclassified information” (CUI). “These guidelines require organizations to safeguard CUI such as intellectual property and employee health information. Systems that process, store and transmit CUI often support government programs involving critical assets, such as weapons systems and communications systems, which are potential targets for adversaries.”

May 15, 2024: The Department of Commerce is planning to issue proposed rules on connected vehicles from China (i.e., vehicles with onboard integrated network hardware that allows internet access, enabling them to share data with devices both inside and outside the vehicle). The concern is that these vehicles have thousands of sensors that collect “where the driver goes, what the driving patterns are, and what you’re saying in your car”, and that they send that data right back to China.

May 22, 2024: The head of the Federal Communications Commission (FCC) shared a new proposal that, if adopted, would look into “whether the [FCC] should require disclosure when there is AI-generated content in political ads on radio and TV.”

Illinois legislature amends the Biometric Information Privacy Act (BIPA)

May 16, 2024: The amendment, found in SB2979, clarifies that a person whose biometric identifier or biometric information is collected in contravention of BIPA by a private entity more than once using the same method of collection may recover damages for only one violation. The amendment responds to the Illinois Supreme Court’s decision in Cothron v. White Castle System, Inc., 2023 IL 128004, which held that a BIPA claim accrued with every scan or transmission made without prior informed consent. The Illinois Supreme Court had also requested that the legislature review and clarify its intent regarding the assessment of damages under the Act.

May 21, 2024: A class action lawsuit was filed against Marriott for allegedly violating BIPA. The complaint alleges that Marriott required its hourly workers to clock in and out of shifts and breaks with a fingerprint scanner connected to Marriott’s timekeeping and payroll system. In addition to collecting fingerprints, the scanner stored them on the servers of Marriott’s timekeeping vendor, Kronos Inc. The plaintiffs allege that Marriott did not explain the biometrics system to its workers, did not explain how it used the data or how long it kept the data, and did not disclose that the data was shared with a third-party vendor. The plaintiffs allege that they did not consent to this collection, use, and disclosure.

Colorado is the first state to enact an A.I. law

May 17, 2024: The Colorado A.I. Act (Consumer Protections for Artificial Intelligence) was signed into law and will come into effect on February 1, 2026. The Act focuses on establishing obligations for developers and deployers of “high-risk artificial intelligence systems”, defined to include any A.I. “system that, when deployed, makes, or is a substantial factor in making, a consequential decision.”

Consequential decision is defined to mean “a decision that has a material legal or similarly significant effect on the provision or denial to any consumer of, or the cost or terms of: (a) education enrollment or an education opportunity; (b) employment or an employment opportunity; (c) a financial or lending service; (d) an essential government service; (e) health-care services; (f) housing; (g) insurance; or (h) a legal service.” The Colorado A.I. Act also addresses the concept of “algorithmic discrimination”, and imposes obligations on both developers and deployers to use reasonable care to avoid it.

Vermont and Minnesota pass the next comprehensive state privacy laws

May 8, 2024: Vermont’s legislature passed an act relating to enhancing consumer privacy and the age-appropriate design code (H.121), and the Act is awaiting the governor’s approval. The Act has an innovative private right of action that targets data brokers and large data holders and deals largely with violations relating to sensitive data (see § 2427(d)(1)). The Act’s data minimization standards also resemble those in Maryland’s recent comprehensive privacy law, which requires limiting the collection of personal data “to what is reasonably necessary and proportionate to provide or maintain a specific product or service requested by the consumer to whom the data pertains.” (See § 2419(a)(1)).

May 24, 2024: Minnesota enacted its own comprehensive privacy law: the Minnesota Consumer Data Privacy Act (MCDPA), which takes effect on July 31, 2025. The MCDPA’s main provisions are consistent with existing state privacy laws, with a few differences. For example, consumers who are profiled in furtherance of a decision may request to be informed of the reason the profiling resulted in that decision and of what actions they might have taken to secure a different decision. As another example, the MCDPA aligns its definition of “small business” with the definition created by the U.S. Small Business Administration.

Putting privacy first: Your path to compliance starts here.

It is crucial for businesses to establish a comprehensive privacy program. At Siskinds, our Privacy Concierge program offers custom subscription plans.

To discover how Siskinds can assist you in meeting your privacy compliance needs, or if you have any questions related to this blog post, contact me, Savvas Daginis, at savvas.daginis@siskinds.com, or a lawyer on Siskinds’ Privacy, Cyber & Data Governance team.
