GDPR & AI: A Practical Compliance Guide for Law Firms
Using AI to process client documents triggers GDPR obligations most firms haven't considered. Here's what you need to know before your next AI deployment.
The GDPR Problem with AI Tools
Most law firms understand GDPR at a basic level: you need consent or a legal basis to process personal data, you must secure it, and breaches must be reported. What fewer firms have considered is how deploying AI tools changes the compliance picture—significantly.
When you upload a client document to an AI service, you are engaging a third-party data processor. Under GDPR, this triggers a specific set of obligations that standard enterprise agreements almost never satisfy.
Five GDPR Obligations Triggered by AI Document Processing
1. Article 28 — Data Processing Agreement (DPA) Required
If you use an external AI service to process documents containing personal data, you must have a written DPA with that provider. This isn’t optional. The DPA must specify:
- The subject matter and duration of the processing
- The nature and purpose of the processing
- The type of personal data and categories of data subjects
- That the processor only acts on documented instructions
The problem: Many AI providers offer enterprise DPAs only at higher tiers, and the terms often contain carve-outs that undermine their usefulness—particularly around model training and subprocessor access.
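Firms reviewing several providers may find it useful to track the Article 28(3) terms above as a simple screening aid. The sketch below is illustrative only, not legal advice; the field names and the example agreement record are hypothetical.

```python
# Illustrative sketch: screening a provider's DPA record against the
# Article 28(3) terms listed above. Field names are hypothetical.

REQUIRED_DPA_TERMS = {
    "subject_matter_and_duration": "Subject matter and duration of processing",
    "nature_and_purpose": "Nature and purpose of processing",
    "data_types_and_subjects": "Types of personal data and categories of data subjects",
    "documented_instructions_only": "Processor acts only on documented instructions",
}

def missing_dpa_terms(dpa: dict) -> list[str]:
    """Return descriptions of required Article 28(3) terms the DPA lacks."""
    return [desc for key, desc in REQUIRED_DPA_TERMS.items() if not dpa.get(key)]

# Example: an agreement covering everything except documented-instructions-only
# processing (e.g. because it reserves a right to train on customer data).
example_dpa = {
    "subject_matter_and_duration": True,
    "nature_and_purpose": True,
    "data_types_and_subjects": True,
    "documented_instructions_only": False,
}
print(missing_dpa_terms(example_dpa))
# → ['Processor acts only on documented instructions']
```

A record like this makes the training/improvement carve-out visible at a glance, which is exactly where most enterprise DPAs fail.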
2. Article 13/14 — Transparency Obligations
You must inform data subjects (your clients, in most cases) that their personal data will be processed by an AI system. This means updating your privacy notice to explain:
- That you use AI for document processing
- Who the AI provider is (or the categories of providers)
- The purpose and legal basis for the processing
Most law firms have not updated their privacy notices to reflect AI usage.
3. Article 5(1)(b) — Purpose Limitation
Data collected for legal representation cannot be used for other purposes without a new legal basis. This creates tension with AI providers that use customer data to train or improve their models.
Even if the provider claims data is “anonymized” before training, purpose limitation applies at the point of processing, not the point of storage. Uploading client files to a provider that may use them for model improvement is almost certainly a purpose limitation violation.
4. Article 35 — Data Protection Impact Assessment (DPIA)
A DPIA is likely required when deploying AI for document processing because:
- AI involves new technology (EDPB guidance explicitly flags this)
- Processing may occur at scale
- Documents may contain special category data (health records, immigration status, criminal proceedings)
Many firms have deployed AI tools without conducting a DPIA—a significant compliance gap.
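The three criteria above can serve as a rough pre-deployment screen. The sketch below is a simplification for illustration; it flags a DPIA whenever any single criterion applies, while Article 35 and EDPB guidance remain the authoritative tests.

```python
# Illustrative sketch: a pre-deployment DPIA screen using the three
# high-risk criteria listed above. Simplified; not legal advice.

def dpia_triggers(new_technology: bool,
                  large_scale: bool,
                  special_category_data: bool) -> list[str]:
    """Return the high-risk criteria that apply to this deployment."""
    triggers = []
    if new_technology:
        triggers.append("new technology (flagged in EDPB guidance)")
    if large_scale:
        triggers.append("processing at scale")
    if special_category_data:
        triggers.append("special category data (Article 9)")
    return triggers

def dpia_likely_required(*criteria: bool) -> bool:
    """Any single trigger is enough to make a DPIA advisable."""
    return any(criteria)

# A typical AI document-processing deployment hits all three criteria.
print(dpia_triggers(True, True, True))
```

Either way the screen comes out, document the result: a recorded "DPIA not required" decision is itself evidence of accountability under Article 5(2).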
5. Article 5(1)(f) — Integrity and Confidentiality
GDPR requires appropriate security measures for all personal data processing. When you use a cloud AI service, you are delegating security decisions to a third party. “Appropriate” security must be assessed against the risk—and for privileged legal documents, that bar is high.
Why Standard Enterprise Agreements Don’t Satisfy GDPR
Enterprise agreements from major AI providers are written primarily for US legal frameworks. When scrutinized against GDPR requirements, they typically fail in several areas:
| GDPR Requirement | Standard Enterprise Agreement |
|---|---|
| Art. 28 DPA with specific terms | Often a short addendum with broad carve-outs |
| Art. 28(3)(b) — data only processed on instructions | Usually includes training/improvement clauses |
| Art. 28(4) — subprocessor restrictions | Long, changeable lists of subprocessors |
| Art. 28(3)(g) — deletion after service ends | Often retains data for undefined periods |
| Art. 46 — transfer safeguards | Relies on SCCs, which have been challenged repeatedly |
The EDPB has been clear: a DPA that doesn’t actually restrict processing is not a valid DPA.
The GDPR + CLOUD Act Double Bind
Law firms using US-based AI providers face a specific double bind described in our CLOUD Act guide:
- The CLOUD Act can compel the US-based provider to disclose your client data to US law enforcement
- GDPR Article 48 provides that such an order is recognisable or enforceable only if based on an international agreement, such as an MLAT; absent one, complying with the order is not a lawful basis for the transfer
This means that if a US authority serves a CLOUD Act warrant on your AI provider:
- The provider must comply under US law
- That compliance may itself violate GDPR
- Your firm may bear responsibility for engaging a processor subject to this conflict
This is not a theoretical risk. It is the current legal state of using any US-headquartered AI service for EU personal data processing.
What a Compliant AI Setup Looks Like
The cleanest solution to the third-party processor problem is to eliminate the third party.
When AI runs on-premises—on hardware you control—there is no Article 28 relationship to establish. You are both the controller and the processor. The CLOUD Act doesn’t apply because there’s no US company holding your data. Subprocessor clauses are irrelevant because there are no subprocessors.
This is the architecture behind the Tacitus Cortex: AI infrastructure that processes your documents locally, under your jurisdiction, with no data ever transmitted to a third party.
For organizations not ready for on-premises deployment, the Tacitus Cloud Bridge offers a GDPR-compliant alternative: EU-headquartered, single-tenant, with a genuine DPA that restricts processing to your instructions only.
Practical Checklist: 5 Questions Before Deploying AI
Before your next AI deployment, ask:
1. Do we have a valid Article 28 DPA with this provider? Not just a checkbox in their portal—a genuine DPA with the specific terms Article 28(3) requires. Read it carefully for training and subprocessor clauses.
2. Have we updated our privacy notices? Clients must be informed that AI will process their documents. Generic “we use technology to improve our services” language doesn’t satisfy Article 13/14 specificity requirements.
3. Have we conducted a DPIA? If you’re processing personal data at scale, with new technology, or involving special category data, a DPIA is likely required. Document it either way.
4. Where is the parent company headquartered? An EU data center ≠ EU jurisdiction. If the parent company is American, your data is subject to CLOUD Act demands regardless of server location.
5. What happens to our data if we terminate? GDPR requires data to be deleted or returned after processing ends. Confirm the timeline and mechanism—and get it in writing.
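The five questions above can be treated as a simple go/no-go gate before any deployment. The sketch below mirrors the checklist; the keys, wording, and pass/fail logic are illustrative assumptions, not legal advice.

```python
# Illustrative sketch: the five-question checklist as a pre-deployment gate.
# Keys and descriptions mirror the list above; logic is an assumption.

CHECKLIST = {
    "valid_art28_dpa": "Valid Article 28 DPA with the specific required terms",
    "privacy_notices_updated": "Privacy notices updated for AI processing",
    "dpia_conducted": "DPIA conducted (or non-requirement documented)",
    "non_us_parent": "Parent company outside CLOUD Act jurisdiction",
    "deletion_confirmed_in_writing": "Post-termination deletion/return confirmed in writing",
}

def deployment_gaps(answers: dict) -> list[str]:
    """Return descriptions of checklist items not yet satisfied."""
    return [desc for key, desc in CHECKLIST.items() if not answers.get(key)]

def ready_to_deploy(answers: dict) -> bool:
    """Deploy only when every checklist item is satisfied."""
    return not deployment_gaps(answers)

# Example: a firm with a DPA and DPIA in place but two open gaps.
example_answers = {
    "valid_art28_dpa": True,
    "privacy_notices_updated": False,
    "dpia_conducted": True,
    "non_us_parent": True,
    "deletion_confirmed_in_writing": False,
}
print(deployment_gaps(example_answers))
```

Keeping the answers as a dated record also gives the firm an audit trail if a supervisory authority later asks how the deployment decision was made.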
Using AI without proper GDPR compliance isn’t just a regulatory risk—it’s a client trust issue. If your firm is evaluating AI deployment, request a briefing to discuss compliant infrastructure options.