USING AI TO RECORD MEETINGS: DATA PRIVACY RISKS PEOS NEED TO KNOW

BY KRISTEN FRADIANI

Content Marketing Manager
BLR

August 2025

 

Artificial intelligence (AI) tools are changing how PEOs work. One growing trend is the use of AI notetakers—tools that can record, transcribe, and summarize meetings. They’re helpful because they let people stay focused on the conversation instead of scrambling to take notes. And if someone misses a meeting, they can still catch up on what was said.

But before using AI to capture meetings, it's critical to think through the risks. Privacy laws and data protection must be top of mind. Because PEOs support both businesses and employees, there's added responsibility around how data is handled.

In this article, we’ll walk through the biggest issues to consider.

RECORDING MEETINGS: KNOW YOUR STATE’S CONSENT LAWS

Meeting recording laws vary from state to state. In some states, only one party on the call needs to consent to a recording. In others, everyone must give permission. If a recording is made without the legally required consent, the organization could face legal action, and so could its clients.

Here’s what you should do:

  • Confirm consent requirements in each state where your organization or your clients operate.
  • Create a clear consent process for internal and external meetings.
  • Train employees and managers on these consent rules.

It’s also wise to communicate with clients about whether and how recordings are made—especially in joint meetings or collaborative spaces.

SENSITIVE DATA AND LEGAL RESPONSIBILITIES

AI notetakers can transcribe everything—names, payroll figures, strategies, performance data, and more. If any of that information counts as personally identifiable information (PII) or confidential business data, you must treat it carefully.

There are both federal and state laws that govern how employee and consumer information can be used, stored, and disposed of. These may include:

  • Consent and notification requirements.
  • Limits on how long data can be retained.
  • Laws specifically protecting employee data (such as Social Security numbers or disciplinary records).

Some state laws are strict about handling employment data, even if they’re framed as “consumer data privacy” laws.

The regulatory environment is also expanding quickly. In 2024 alone, U.S. federal agencies introduced 59 AI-related regulations—more than double the number from 2023—and those regulations came from twice as many agencies.

This rapid pace of regulation makes it even more important for PEOs and their clients to stay on top of AI-related compliance—and to consult legal counsel to understand which laws apply based on their footprint and the types of data they manage.

EMPLOYEE USE OF AI: SET YOUR POLICY

It’s not enough to think about AI tools used by the company. Employees may also use AI to help with their work. That could include feeding information into public AI systems without realizing the risks.

Employers should decide their stance on employee use of AI. It could be:

  • Open use (encouraged with guardrails)
  • Limited use (allowed in some cases, with approvals)
  • Prohibited use (banned completely for certain tasks or systems)

Whatever the stance, be sure to:

  • Train employees on what's considered confidential or proprietary.
  • Prohibit entering sensitive data into public tools.
  • Monitor use for compliance.
  • Provide clear examples of what is acceptable and what is not.

ACCURACY, BIAS, AND QUALITY CONTROL

AI transcription tools aren’t perfect. They may misinterpret accents, jargon, or industry-specific terms. They may generate summaries that are incomplete or misleading. In some cases, bias can sneak into outputs, even when demographic information isn’t included in the input.

PEOs should:

  • Review all outputs from AI tools before relying on them.
  • Use de-identification processes to scrub sensitive data from records.
  • Set up a quality control process to catch and fix errors.
  • Ensure tools are tested for fairness and accuracy.

Errors in meeting notes or summaries can lead to serious misunderstandings—especially if decisions are being made based on inaccurate AI-generated content.
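To make the de-identification step concrete, here is a minimal sketch of pattern-based scrubbing for a transcript. The patterns and placeholder labels are illustrative assumptions, not a complete solution; production de-identification typically relies on dedicated tooling plus human review, since regexes alone will miss names, context clues, and unusual formats.

```python
import re

# Illustrative patterns only: real transcripts contain PII these will miss.
REDACTION_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def scrub_transcript(text: str) -> str:
    """Replace each pattern match with a labeled placeholder."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(scrub_transcript("Reach Dana at dana@example.com; SSN 123-45-6789."))
```

A scrub like this is best run before a transcript is shared, summarized, or archived, with a reviewer spot-checking the output as part of the quality control process described above.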

DATA STORAGE, ACCESS, AND OWNERSHIP

Once a meeting is recorded, the data doesn’t just disappear. You need to decide:

  • How long will recordings or transcripts be stored?
  • Who has access to them?
  • What happens if a third party (such as a client or vendor) was in the meeting?

Depending on the content, there may be legal requirements around retention, destruction, or even access to logs. Loop in IT and legal teams to make sure data is being handled correctly.

Also, pay attention to who owns AI-generated content. Some AI vendors claim ownership rights, or at least shared rights. This could create legal problems down the line—especially if the output includes client or employee information.

NEXT STEPS FOR PEO LEADERS

If your organization or your clients are thinking about—or already using—AI notetakers, consider these steps.

Get clear on recording laws by state. Some states require all parties to consent to being recorded; others require only one. Make sure your team knows the difference, and help your clients understand it too. It's the kind of detail that seems minor until a complaint is filed or a lawsuit lands.

Educate clients on the risks of "open" AI tools. Platforms like Claude or ChatGPT might seem safe, but if someone enters private employee or client information, that data may be retained by the provider and, depending on the vendor's settings, used for model training. That opens the door to serious privacy issues. If their team is using AI, they need clear guidance on what should be shared and what should not.

Help them build a policy. Most small businesses don’t have the time or know-how to write an AI use policy—but you can help with that. Even something basic is better than nothing: what tools they can use, what data they’re allowed to share, and who has access to AI-generated content.

Push for training. Ensure that both your employees and your clients’ teams know what “confidential” really looks like in day-to-day work. That includes not putting performance issues, payroll info, or business plans into AI tools that might store or share that data.

Talk about data storage and access. If meetings are being recorded, you need to know exactly where those recordings are stored, who has access to them, and how long they’re kept. This becomes especially crucial when the conversation includes sensitive or legal topics—because how and where that data is stored carries legal risk.

Loop in IT and legal support. Involve IT and legal early in the process. Many clients don’t have those resources in-house, so they may not know what to look out for.

AI tools can absolutely make work easier, but they need boundaries. As their PEO, you're in the perfect spot to guide clients: set expectations, share best practices, and even help them draft the policies they don't know they need yet.

It’s one more way you show up as a partner, not just a provider. That kind of support builds trust and keeps clients with you for the long haul.
