StudioMeyer

AI GDPR Consulting

Compliance without lawyer fees. Practical guidance instead of walls of legalese.

We tell you what you actually need to do when deploying AI. Which data goes in which tool, what the data processing agreement should look like, when a DPIA is due. Not formal legal advice — solid practical orientation. For formal review we have contacts with specialised law firms.

What's in the compliance check

What you actually need — and what not

Many SMBs fear the GDPR and the EU AI Act and over-engineer their measures, while others underestimate the real risks. We do the realistic check.

Data flow map
We map which data flows into which AI tool: who is the controller, who is the processor, and where the data is stored. A map you can show to employees.
DPA check per tool
We go through your AI tools (ChatGPT, Claude, Make, n8n, etc.) and check whether you have the right data processing agreement. With OpenAI/Anthropic this is often an extra step many miss.
Data taboo list
Clear list: which data does NOT go into which tools. Customer data, employee data, client data, contract data. With reasoning why the line is there.
DPIA assessment
Do you need a data protection impact assessment for your AI deployment? We rate the risk level and give a clear answer. If yes, we help structure it.
EU AI Act classification
Which risk class does your AI deployment fall into per EU AI Act? Minimal, limited, high, prohibited. With consequences per level and dates when which obligations apply.
Practical recommendations
Three to five concrete measures you can implement in the coming weeks. Tool switches, contracts to renegotiate, employee briefings, privacy notice updates.

Who this is for

Typical GDPR stress-test situations

Law firm with client data

Lawyer or tax consultant wants to use AI for research or draft writing. Sensitive client data must not go anywhere uncontrolled. We build the taboo list and recommend suitable setups.

Practice or clinic (special data)

Health data is a special category under Art. 9 GDPR. In practice: standard ChatGPT is off-limits; custom solutions or local models are possible. We show where the limits are.

AI-assisted applicant screening

The EU AI Act classifies AI-assisted recruiting as high-risk (obligations apply from August 2026). We check whether your planned deployment falls under it and which obligations then apply.

Marketing agency with customer data

You work with customer data in AI tools for personalisation. We clarify the processor chain: end customer → your agency → AI tool. Plus DPA status per tool.

E-commerce with AI recommendations

You use AI for product recommendations or dynamic pricing. EU AI Act + GDPR intersection: profiling rules, transparency duties, right to object. We check the stack.

How it works

From tool inventory to recommendation

  1. Tool inventory

     You fill out a short pre-checklist: which AI tools do you use or plan to use? What data goes in? Who has access? 30-45 minutes of effort.

  2. Data flow workshop

     A 60-90 minute call with you and your data protection officer, if you have one. We go through the tool list and identify risks and open questions.

  3. Compliance report

     A 10-15 page PDF with data flow map, DPA status per tool, DPIA assessment, taboo list and practical recommendations, with citations to the relevant GDPR articles and EU AI Act sections.

  4. Handover

     A 60-minute handover call: we walk through the recommendations, you ask questions, we refine. If a formal legal review is needed, we recommend specialised firms from our network.

Pricing

From 150 EUR/h or 990 EUR compliance check

Hourly rate for individual questions: 150 EUR/h. Standard compliance check (tool inventory of up to 8 tools, data flow workshop, report, handover): 990 EUR. Extended check with DPIA outline and EU AI Act classification: 1,890 EUR. Monthly GDPR office hours: 299 EUR/mo (1 h plus Slack). Intro call (30 min): free.

See pricing and packages

FAQ

Common questions

Is this legal advice?

No, not legal advice in the formal sense. We give practical orientation based on our own experience with GDPR and AI tools. For legally binding statements or contract review we recommend specialised law firms from our network. This separation is also professionally important.

What exactly is the EU AI Act and when does it apply?

The EU AI Act entered into force in August 2024. Prohibited practices have applied since February 2025; most obligations for high-risk applications apply from August 2026. It classifies AI systems into risk classes with different obligations. We help you work out where your deployment stands.

We use ChatGPT — what do I need to do?

Three minimum steps: (1) sign a DPA with OpenAI (available in the settings under Compliance), (2) create a data taboo list for employees, (3) update your privacy notice to cover AI usage. For sensitive sectors (law, medical) this is often not enough; you then need a different setup.

What if I self-host on Hetzner / AI Server?

Self-hosting in Germany makes GDPR compliance significantly easier (Hetzner Frankfurt/Falkenstein, EU). But LLM API calls (Claude, GPT) still go to US providers, and that still requires a DPA. We show which paths carry which risks.

Do we need a data protection officer?

In Germany, a DPO is required from 20 people regularly engaged in automated data processing, or with large-scale processing of special-category data (health, criminal records, etc.). If your AI deployment triggers a DPIA, a DPO is mandatory regardless of headcount. We assess whether you need a DPO and whether internal or external makes sense.

What does external lawyer review cost when you recommend it?

We work with two or three specialised firms. Typical prices: compliance statement 1,500-3,500 EUR, DPA negotiation 800-1,500 EUR per contract. We make the referral and can arrange an introductory call if helpful.

Next step

30-minute intro call, free.

We assess whether a compliance check makes sense for your AI plans or whether you should go straight to a law firm. Honest assessment, no sales pressure.