What is EUAIA (EU AI Act)?
The EU's regulatory framework for artificial intelligence systems (Regulation (EU) 2024/1689), in force since August 1, 2024, with applicability to high-risk Annex III systems from August 2, 2026. Penalties: up to €35M or 7% of global annual turnover, whichever is higher, for prohibited-practice violations; lower tiers apply to other breaches.
Full explanation
The EU AI Act classifies AI systems by risk: prohibited practices (Article 5), high-risk systems (Annex III: biometrics / employment / education / credit / law enforcement / migration / critical infrastructure), limited-risk systems (transparency obligations, Article 50), and minimal-risk systems (no requirements). High-risk providers must satisfy Article 11 (technical documentation per Annex IV), Article 9 (risk management), Article 14 (human oversight), and Article 72 (post-market monitoring), and must choose a conformity assessment route (internal control per Annex VI or Notified Body assessment per Annex VII). Obligations for Annex III high-risk systems apply from August 2, 2026.
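The four-tier classification above can be sketched as a simple lookup. This is an illustrative simplification, not a legal determination: the category labels below are shorthand I have chosen, the prohibited and limited-risk lists are incomplete, and real classification turns on the full wording of Article 5 and Annex III.

```python
# Illustrative sketch of the EU AI Act's four risk tiers (not legal advice).
# Category labels are shorthand; the lists are simplified and incomplete.

PROHIBITED = {"social-scoring", "subliminal-manipulation"}               # Article 5
HIGH_RISK = {"biometrics", "employment", "education", "credit",
             "law-enforcement", "migration", "critical-infrastructure"}  # Annex III
LIMITED_RISK = {"chatbot", "deepfake-generator"}                         # Article 50

def risk_tier(use_case: str) -> str:
    """Map a coarse use-case label to its EU AI Act risk tier."""
    if use_case in PROHIBITED:
        return "prohibited"
    if use_case in HIGH_RISK:
        return "high-risk"
    if use_case in LIMITED_RISK:
        return "limited-risk"
    return "minimal-risk"

print(risk_tier("credit"))  # high-risk
```

Note that a single product can trigger more than one tier at once, e.g. a high-risk system whose chatbot interface also carries Article 50 transparency duties; a real classifier would return a set of obligations, not one label.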
Example
A startup builds a credit-scoring AI for retail loans (Annex III, point 5(b)). It must: (1) draft Article 11 technical documentation, (2) produce a machine-readable model and dataset inventory, e.g. a CycloneDX AIBOM, to support that documentation, (3) document risk management per Article 9, (4) design human oversight per Article 14, (5) establish post-market monitoring per Article 72, (6) self-assess under Annex VI (internal control) and affix the CE marking before placing the system on the EU market.
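Step (2) of the example could be implemented as a small script that emits the BOM as JSON. The field names below follow the CycloneDX spec (1.5+ added the `machine-learning-model` and `data` component types and the `modelCard` object), but treat the structure as a sketch to validate against the spec; the model and dataset names are hypothetical.

```python
import json

# Minimal CycloneDX-style AIBOM sketch for the credit-scoring example.
# Model and dataset names are hypothetical; verify fields against the
# CycloneDX 1.6 schema before relying on this shape.
aibom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.6",
    "version": 1,
    "components": [
        {
            "type": "machine-learning-model",
            "name": "retail-credit-scorer",       # hypothetical model name
            "version": "2.1.0",
            "modelCard": {
                "modelParameters": {"task": "classification"},
            },
        },
        {
            "type": "data",
            "name": "loan-applications-2023",     # hypothetical training dataset
        },
    ],
}

print(json.dumps(aibom, indent=2))
```

An AIBOM like this is not itself mandated by the Act; it is one practical way to evidence the data and model inventory that Annex IV documentation asks for.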
FAQ
Is my SaaS in scope?
Your SaaS falls under the high-risk regime only if it performs an Annex III function: biometrics, credit, employment, education, law enforcement, migration, critical infrastructure, or administration of justice and democratic processes. Most B2B SaaS is NOT high-risk, though limited-risk transparency duties (Article 50) can still apply to chatbots and AI-generated content. The artificialintelligenceact.eu site has a high-level summary you can match against your specific use case.
What if I'm a non-EU provider?
If your system is placed on the EU market, or its output is used in the EU, you are in scope. Appoint an EU authorised representative (Article 22) and comply with Article 11 and the Annex IV documentation requirements.