Top 5 Things the AI Act Changes for Small Companies
🧠 Introduction: AI Regulation Isn’t Just for Big Tech Anymore
The EU Artificial Intelligence Act (AI Act) is the world’s first comprehensive law regulating AI across all sectors. While large tech platforms dominate the headlines, the law also affects small businesses.
If you run:
- A SaaS platform
- An HR tool
- An online shop using smart recommendations
- A marketing agency using AI-generated content
…then this law applies to you.
Even if you use third-party tools like OpenAI, Claude, or Gemini, the way you use those tools could trigger legal obligations.
⚖️ 1. You Might Be Using a High-Risk AI System
The AI Act defines four categories of risk:
❌ Unacceptable risk → Banned completely
⚠️ High risk → Strict controls and documentation required
🟡 Limited risk → Transparency obligations apply
✅ Minimal risk → No legal duties beyond basic safety
You’re likely using high-risk AI if your system:
- Scores CVs or filters job candidates
- Evaluates credit scores or eligibility
- Manages biometric access
- Automates decisions in healthcare, education, or finance
Even using third-party software like a CRM or ATS plugin can make your business responsible, depending on how the tool is used.
🔍 Key takeaway: Risk is based on the context — not the vendor.
🛠️ 2. You May Be a “Provider” — Not Just a User
The AI Act separates roles. Each comes with different legal responsibilities.
You’re a provider if you:
- Build your own AI model
- Retrain or fine-tune someone else’s model
- Embed or repackage general-purpose AI into your product
You’re a deployer if you:
- Use the AI in internal workflows
- Offer it to users as part of your services
Sometimes, you’re both.
If you’re a provider, you’re legally required to:
- Perform conformity assessments
- Maintain detailed technical documentation
- Monitor and report system performance after release
💡 Even retraining a chatbot or substantially modifying how a model behaves can qualify you as a provider under the law.
🧾 3. FRIAs Are the New DPIAs (And SMEs Must Do Them)
A Fundamental Rights Impact Assessment (FRIA) is required before using high-risk AI in:
- Education
- Employment
- Credit scoring
- Healthcare
- Public services
Even if you’re using third-party AI, you’re expected to evaluate its risks to people’s rights.
FRIA focuses on:
- Risk of bias or discrimination
- Loss of transparency
- Restricted access to opportunities
- Misuse of profiling or categorization
🧩 Pro tip: Align your FRIA with existing DPIAs for GDPR compliance. The overlap is significant.
🧮 4. Transparency Rules Apply to Most AI Use Cases
Even if your AI use is not “high-risk,” you still have to follow transparency rules. These include:
- Letting people know when they interact with AI (e.g. a chatbot)
- Explaining how AI-generated results are created
- Labeling AI-generated content (text, video, audio)
- Disclosing whether user data was used to train the AI
This applies to:
- Marketing chatbots
- Dynamic pricing tools
- Predictive analytics
- AI-generated emails or proposals
🧠 Create a reusable “AI Use Notice” and include it in your privacy policy or onboarding flows.
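In practice, a reusable notice can be as simple as a single template rendered wherever AI touches a user. Here is a minimal sketch (the wording and `render_notice` helper are illustrative, not language prescribed by the Act):

```python
# One reusable "AI Use Notice" template; fill in the tool name per touchpoint
# (chatbot widget, email footer, onboarding screen, privacy policy).
AI_USE_NOTICE = (
    "You are interacting with an AI system ({tool}). "
    "Responses are generated automatically and may be reviewed by a human."
)

def render_notice(tool: str) -> str:
    """Render the shared notice for a specific AI touchpoint."""
    return AI_USE_NOTICE.format(tool=tool)

print(render_notice("support chatbot"))
```

Keeping the wording in one place means a legal review updates every touchpoint at once.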
📊 5. You Must Monitor and Document AI Use — Continuously
Even small teams are expected to keep records of how they use AI.
Your documentation should include:
- A simple AI system register
- Logs of usage, retraining, or API modifications
- Known risks and how they’re addressed
- Any incidents, complaints, or biases discovered
✅ Use a Notion board, Google Sheet, or Excel file.
You don’t need fancy software — you just need proof that you’re paying attention.
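If your team prefers a script over a spreadsheet, an append-only CSV log is enough. A minimal sketch, assuming a flat file is acceptable evidence for a small team (the field names and example entry are illustrative):

```python
import csv
import datetime
import io

# Columns for an append-only log of AI usage, retraining, and incidents.
LOG_FIELDS = ["timestamp", "tool", "event", "notes"]

def log_event(fh, tool: str, event: str, notes: str = "") -> None:
    """Append one log entry with a UTC timestamp."""
    writer = csv.DictWriter(fh, fieldnames=LOG_FIELDS)
    writer.writerow({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tool": tool,
        "event": event,
        "notes": notes,
    })

# In production this would be an open file; StringIO keeps the sketch self-contained.
buf = io.StringIO()
log_event(buf, "GPT-4 via API", "prompt template changed", "CV screening prompt v2")
```

The same pattern covers usage, retraining, and incident records with one shared format.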
🤖 What If You Use ChatGPT, Claude, or Gemini?
That’s where most SMEs land — and yes, you still need to comply.
You must:
- Identify whether the system is general-purpose AI (GPAI)
- Check if the provider has done their EU paperwork
- Keep documentation on how you use it and for what purpose
💬 Example: If you prompt GPT to screen CVs and make hiring suggestions, you’re deploying a high-risk system — and may even count as a provider.
🧰 SME Action Plan
Here’s how to get compliant, step-by-step:
✅ 1. List all AI tools
Include marketing, HR, analytics, customer service, etc.
✅ 2. Define your role
Are you a user, integrator, modifier, or full developer?
✅ 3. Check the risk
Use Annex III of the AI Act to classify your use.
✅ 4. Document it
Create a simple AI register with:
- Tool name
- Purpose
- Data used
- Risk category
- Compliance status
✅ 5. Prepare a FRIA (if needed)
Use our upcoming template to evaluate risks in a legally sound way.
📣 SME Support Coming Soon
We’re building the first AI compliance toolkit designed for small businesses.
🛠️ Inside the AI Act Toolkit:
- ✅ FRIA Templates
- ✅ AI System Registers (Excel + Notion)
- ✅ “AI Use Notice” builder
- ✅ Compliance checklists for OpenAI, Meta, and Google tools
🎯 Launching soon at privacy-docs.com
📬 Join the waitlist now to receive it the moment it drops.
🔗 Keep Exploring:
👉 What is GDPR+? A Simple Guide for SMEs