AI meeting notes (AI-powered call notes) automate the documentation of meetings, customer calls, and sales appointments. For companies in Germany, Austria, and Switzerland, this raises key data protection questions: What is the legal basis under the GDPR, when do criminal consequences under Section 201 StGB arise, and what technical requirements must an AI meeting tool meet? This guide explains the five most important checkpoints for GDPR-compliant AI meeting notes and shows what sales managers, data protection officers, and IT decision makers should look for when choosing a tool.
AI meeting notes (automatic AI call notes) are software solutions that convert spoken content from meetings into text and derive structured summaries, to-dos, and CRM updates from it. The spectrum ranges from simple real-time transcription (live transcript) to conversation intelligence (conversation analysis with trend and sentiment recognition).
AI meeting notes are relevant under data protection law because they process personal data: names, voices, conversation content, and potentially biometric features such as voice profiles. The decisive question is not whether AI-based meeting documentation can be GDPR-compliant, but under which conditions. According to specialist lawyer Cornelius Matutis in IMPULSE magazine, the entrepreneur who uses an AI meeting tool is always the controller under data protection law.
Especially in B2B sales, the need for automated meeting notes is growing: according to industry estimates, sales managers and account managers spend 15 to 30 minutes per appointment on manual documentation. AI tools promise to save this time while improving CRM data quality. But without legal safeguards, companies risk fines of up to 4% of annual turnover.
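A back-of-the-envelope calculation makes the scale of this time cost concrete, using the 15-to-30-minute range cited above. The appointments-per-week, working-weeks, and team-size figures are illustrative assumptions, not data from the article:

```python
# Rough estimate of annual manual documentation time for a sales team.
# Only the per-appointment minutes come from the industry estimate above;
# the other constants are illustrative assumptions.
APPOINTMENTS_PER_WEEK = 12   # assumed per account manager
WORK_WEEKS_PER_YEAR = 46     # assumed
TEAM_SIZE = 8                # assumed sales team size

def team_hours_per_year(minutes_per_appointment: int) -> float:
    """Total manual documentation time for the whole team, in hours."""
    return (minutes_per_appointment * APPOINTMENTS_PER_WEEK
            * WORK_WEEKS_PER_YEAR * TEAM_SIZE) / 60

print(team_hours_per_year(15))  # low estimate
print(team_hours_per_year(30))  # high estimate
```

With these assumptions, the low estimate alone already exceeds a thousand team hours per year, which is why the documentation burden is a recurring sales argument for AI meeting tools.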
The following five points determine whether the use of AI meeting notes in the DACH region is legally compliant.
Any processing of personal data requires a legal basis in accordance with Art. 6 GDPR. In practice, there are two options for AI meeting notes: consent (Art. 6 para. 1 lit. a) and legitimate interest (Art. 6 para. 1 lit. f).
Consent sounds obvious at first but is problematic in practice. The employment relationship often lacks the necessary voluntariness because employees are in a position of dependency vis-à-vis the employer. According to a legal analysis by IMSCHWEILER-LEGAL, this perceived dependence between employee and employer can render consent ineffective. Consent is also fragile in customer contact, as it can be withdrawn at any time.
Legitimate interest under Art. 6 para. 1 lit. f GDPR provides a more stable basis. The commercial law firm LUTZ | ABEL concludes in its analysis that AI transcription can be based on legitimate interest under certain technical conditions. These include forgoing permanent audio storage and speaker recognition.
The Baumgartner Baumann law firm assesses legitimate interest as the more flexible alternative, since it does not depend on consent that can be withdrawn at any time. Important: the transparency obligation under Art. 13 GDPR applies even when relying on legitimate interest. All conversation participants must be informed in advance. Datenschutzkanzlei.de recommends including this information in the meeting invitation.
In addition to the GDPR, AI meeting notes are also subject to criminal law. Section 201 StGB protects the confidentiality of the non-publicly spoken word. The central question: Is an AI transcription a "recording" within the meaning of the law?
The specialist portal unternehmensstrafrecht.de, referring to a Federal Constitutional Court decision of July 2025, clarifies: pure real-time transcription in volatile working memory (RAM) does not constitute a recording within the meaning of Section 201 StGB and is not relevant under criminal law. Only when a tool permanently stores or caches audio does criminal liability come into play.
As the Dr. Datenschutz portal from intersoft consulting explains, AI transcription software that caches audio data even briefly can be considered a recording within the meaning of Section 201 StGB. The distinction between tools that store audio and those that only generate text in real time is therefore crucial.
The IT security institute SIDIT points out that a purely content-based AI summary without prior audio recording may not fall under Section 201 StGB. The decisive factor is whether the spoken word is cached during the technical process.
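The technical distinction the sources draw can be sketched in a few lines: audio chunks are held only transiently in memory and discarded after transcription, with nothing written to disk or a cache. This is a minimal illustration, not any vendor's actual implementation; `transcribe_chunk` is a hypothetical stand-in for a real speech-to-text engine:

```python
# Sketch of RAM-only processing: each audio chunk exists only transiently
# in working memory and is released after transcription. No file write,
# no cache -- only the text output persists.
from typing import Iterable, List

def transcribe_chunk(chunk: bytes) -> str:
    # Placeholder: a real speech-to-text engine would run here.
    return f"<{len(chunk)} bytes transcribed>"

def ram_only_transcription(audio_stream: Iterable[bytes]) -> List[str]:
    transcript = []
    for chunk in audio_stream:
        transcript.append(transcribe_chunk(chunk))
        del chunk  # audio is released immediately; nothing is stored
    return transcript
```

A tool that instead buffered chunks to a temporary file before transcribing them would, by the reasoning above, already be caching audio and thus potentially fall within Section 201 StGB.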
Anyone who uses an external AI meeting tool commissions a service provider to process personal data. This requires a data processing agreement (AV contract) in accordance with Art. 28 GDPR. Without an AV contract, use is unlawful, regardless of how well the tool works technically.
The server location is particularly critical. With AI meeting tools based in the USA or operating US servers, a third-country transfer takes place within the meaning of Art. 44 et seq. GDPR. This requires additional safeguards such as standard contractual clauses or an adequacy decision. Tools with an EU server location avoid this risk from the outset.
The PCS CAMPUS practice guide additionally recommends communicating the use of AI transcription transparently as early as the meeting invitation and giving participants an active option to object.
The data protection impact assessment (DSFA) in accordance with Art. 35 GDPR is a structured risk analysis that must be carried out before high-risk data processing is introduced. When using AI meeting tools, a DSFA is often mandatory.
The Data Protection Conference (DSK) states in its May 2024 guidance that a DSFA is often required when using AI applications with personal data. The data protection consultancy ISICO confirms: when using AI tools, at least two criteria of the threshold analysis are usually met, making the DSFA mandatory.
The Federal Commissioner for Data Protection and Freedom of Information (BfDI) refers to the mandatory lists of the data protection supervisory authorities, which list processing operations for which a DSFA must be carried out in any case. AI-supported processing is often included.
Another risk is shadow AI: employees install AI meeting tools on their own, without the knowledge of the IT department or the data protection officer. Without a DSFA and an AV contract, such use is unlawful. Companies should therefore set clear guidelines for AI use and provide approved tools centrally.
The DSFA itself documents the planned data processing, assesses the risks to the rights and freedoms of data subjects, and defines technical and organizational measures (TOM) to minimize risks. For AI meeting tools, these include encryption, access controls, automatic deletion periods, and contractual safeguards with the provider.
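One of the technical measures named above, an automatic deletion period, can be sketched as a simple retention check. The 90-day period and the record layout are assumptions chosen for illustration; the actual retention period must come from the company's own DSFA:

```python
# Illustrative sketch of an automatic deletion period (a TOM from the
# DSFA). The 90-day retention window and the transcript record layout
# are assumptions for illustration only.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed retention period

def purge_expired(transcripts, now):
    """Return only the transcripts still within the retention period."""
    return [t for t in transcripts if now - t["created_at"] < RETENTION]
```

In practice such a purge would run as a scheduled job against the tool's storage backend; the point here is only that the deletion period is enforced automatically rather than left to manual housekeeping.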
The EU AI Act (AI Regulation) supplements the GDPR with AI-specific requirements. According to a practical guide from the Hamburg Chamber of Commerce, from August 2026 AI systems with limited risk, potentially including AI transcription tools, will be subject to special transparency obligations under Article 50 of the AI Act.
For AI meeting notes, this means concretely: users and conversation participants must be informed that an AI is involved in the conversation. Emotion recognition (sentiment analysis) in meetings is classified as a prohibited practice under the EU AI Act. The obligation to ensure AI literacy among employees has applied since February 2025.
The sanctions are significant: up to 35 million euros or 7% of global annual turnover. Companies should therefore pay attention to EU AI Act compliance when selecting tools today.
Carolin Loy, Head of Digital Economy and AI Legal Issues at the Bavarian State Office for Data Protection Supervision, warns against automatically activating all available additional functions in AI transcription tools. Each function must be examined separately for its necessity.
The bliro AI Notetaker addresses the five central GDPR requirements through a special technical architecture. According to the manufacturer, bliro works with a proprietary real-time transcription engine from TU Munich that creates no audio or video recordings. Data is processed exclusively on EU servers in Frankfurt am Main.
Because bliro makes no audio recordings, the issue of Section 201 StGB does not arise: pure real-time transcription in volatile working memory does not constitute a recording within the meaning of criminal law. According to bliro's manufacturer information, the platform operates on the basis of legitimate interest in accordance with Art. 6 para. 1 lit. f GDPR and does not require the express consent of conversation partners.
This is particularly relevant for sales teams in the DACH region: the bliro AI Notetaker works not only in online meetings (Zoom, Teams, Google Meet) but also for on-site appointments and telephone calls. bliro thus covers all typical meeting formats in B2B sales without customers having to see a bot or agree to a recording.
Other privacy features of Bliro AI Notetaker at a glance:
According to its own statements, bliro is used by over 1,500 companies and originated from a research project at TU Munich, funded by the Federal Ministry for Economic Affairs and Climate Action and the EU Commission.
Consent is not necessarily required to use AI for transcribing meetings. Under certain technical conditions, AI meeting transcription can be based on legitimate interest in accordance with Art. 6 para. 1 lit. f GDPR. The prerequisites include that no permanent audio recordings are made and that the transparency obligation under Art. 13 GDPR is met. The commercial law firm LUTZ | ABEL confirms this approach in its legal analysis.
AI transcription is not automatically a criminal offence. The technical implementation is decisive: according to the specialist portal unternehmensstrafrecht.de, pure real-time transcription without audio storage does not fall under Section 201 StGB. Only when a tool caches audio data permanently or temporarily can criminal liability arise. Before selecting a tool, companies should check whether it stores audio or only generates text.
In most cases, a data protection impact assessment (DSFA) should be carried out before introducing an AI meeting tool. The DSK guidance notes that AI applications processing personal data often require a DSFA in accordance with Art. 35 GDPR. The data protection consultancy ISICO confirms that at least two threshold-analysis criteria are regularly met when using AI. The DSFA should be completed before the tool goes into productive use.
So-called shadow AI is a growing compliance risk. If employees install AI meeting tools without the IT department's knowledge, the AV contract, DSFA, and transparency notices are missing, and the use is therefore unlawful. GDPR violations can result in fines of up to 4% of annual turnover. Companies should define clear guidelines for AI use and provide approved tools centrally.
According to the manufacturer, the bliro AI Notetaker works without audio/video recordings and without a visible bot in the meeting. bliro uses a proprietary real-time transcription engine from TU Munich, processes data on EU servers in Frankfurt am Main, and is ISO 27001 certified. The platform relies on legitimate interest under Art. 6 para. 1 lit. f GDPR as its legal basis.