
EverythingPeople This Week!

EverythingPeople gives valuable insight into developments both inside and outside the HR profession.


Be Careful Using AI for Note Taking in Meetings

Whether it’s Teams, Zoom, Slack, or another tool with an AI note-taking feature, using that feature could lead to liability.  In Brewer v. Otter.ai, a lawsuit filed in August 2025 in the federal Northern District of California, Brewer alleges that Otter.ai recorded, accessed, and used the contents of private conversations without obtaining proper consent.

Specifically, Otter.ai, Inc. (“Otter”) has developed and provides to the public an artificial intelligence-powered meeting assistant called Otter Notetaker, which transcribes Google Meet, Zoom, and Microsoft Teams meetings in real time for Otter accountholders and other users.  The complaint alleges that Otter.ai records meeting participants’ conversations even if they are not Otter accountholders, which Brewer was not.  It also alleges that Otter.ai uses the recordings to train Otter’s automatic speech recognition (ASR) and machine learning models.  Finally, it alleges that Otter.ai attempts to avoid liability by providing little or no notice to non-accountholders and by shifting the burden of obtaining permission onto its accountholders.

Brewer sued under both federal and California law.  The federal claims allege violations of the Electronic Communications Privacy Act (ECPA) and the Computer Fraud and Abuse Act (CFAA).  The California claims allege violations of the California Invasion of Privacy Act (CIPA), the Comprehensive Computer Data Access and Fraud Act, common law intrusion upon seclusion and conversion, and the Unfair Competition Law (UCL).

Under normal circumstances, Otter.ai might not face any liability.  However, if the complaint’s allegations are true and Otter.ai recorded conversations for its own purposes, liability may arise: participants do not expect a third-party vendor not only to have access to potentially confidential discussions and information but also to use those discussions for machine learning, and there is no understanding of what safeguards are in place.  Otter.ai may be able to mitigate the issue by safeguarding all recorded conversations, but that is unclear at this stage of the lawsuit.

It is also unknown whether the company using the tool understood that the recordings were being used for anything beyond record keeping and reference.  And if the company did know, did it have a policy and practice in place to inform outsiders that any conversation might be used for machine learning purposes?

This raises questions for HR: Does your organization use an AI note-taking tool in meetings with people outside the organization?  Are the recorded conversations being used for machine learning, or do you not know?  Regardless, do employees give notice that the tool is being used and offer participants the option to opt out?

Although Michigan is a one-party consent state, in this situation, it might be advisable to inform all participants and give them the option to opt out. 

If the recordings are used for machine learning, be sure to know whether the data is de-identified.  Even then, the complaint notes that de-identification is imperfect, particularly with voice data and conversational context.

If HR does not have a policy for using AI tools for recording and note taking, it should develop one.  Questions that need to be answered include: How does the vendor store, use, and share transcription data?  Who has access to it?  Does it contain personal information that must be safeguarded?  Can recordings be deleted upon request?  Are they used for training or product development?  Does the use of third-party AI notetakers risk waiving attorney-client privilege or exposing trade secrets?
Finally, some vendors offer features that allow any participant to pause or prevent recording – an important safeguard.  Organizations should evaluate, in conjunction with legal counsel, whether their chosen tool provides these protections.


Source: Jackson Lewis 8/29/25


Related Events

Webinar: Applying AI in Recruiting – Tools, Risks, and Best Practices

05/28/2026 09:00 AM - 05/28/2026 10:00 AM

AI is rapidly changing how organizations attract, engage, and evaluate talent, but many HR teams are still figuring out how to use it effectively without creating risk. From automated screening tools to AI-generated outreach, the opportunities are significant, but so are the potential legal and operational pitfalls.

In this session, we’ll explore how AI is being used in recruiting today and what it means for HR teams. You’ll gain a practical perspective on integrating AI into your workflow, including which tools are worth considering, where they can drive meaningful impact, and where they can introduce risk. We’ll also cover key legal considerations such as bias and compliance obligations.

Whether you’re evaluating AI for the first time or refining your current approach, you’ll leave with a clearer understanding of how to think about AI in recruiting – balancing efficiency, oversight, and risk.


ASE Members: Complimentary

Non-Members: $49

