EU’s New AI Law Bans ‘High-Risk’ Systems: Here’s What That Means

Agencies Ghacks
Feb 3, 2025
Development

The European Union has officially begun enforcing its AI Act, with the first compliance deadline falling on February 2. The regulation allows EU authorities to ban AI systems deemed an "unacceptable risk" to individuals or society.

The AI Act, approved in March 2024 and in effect since August, categorizes AI risks into four levels: minimal, limited, high, and unacceptable. While minimal and limited-risk AI (such as spam filters and chatbots) face little oversight, high-risk systems, like those used in healthcare, are heavily regulated. The most severe category, unacceptable-risk AI, is now prohibited across the EU.

Some of the banned AI applications include social scoring, predictive policing based on appearance, real-time biometric surveillance in public spaces, emotion detection in workplaces or schools, and AI systems that manipulate or deceive individuals. Companies violating these rules could face fines of up to €35 million (~$36 million) or 7% of their annual revenue, whichever is greater.

However, certain exceptions exist. Law enforcement agencies can use biometric AI for targeted searches, such as finding missing persons or preventing imminent threats, provided they receive proper authorization. Similarly, AI systems for workplace or educational emotion analysis may be allowed if justified for medical or safety reasons.

While over 100 companies, including Amazon, Google, and OpenAI, have voluntarily pledged to comply early through the EU AI Pact, major players like Apple, Meta, and Mistral AI have not joined. Despite this, all companies operating in the EU will eventually have to adhere to the new rules.

The next major compliance deadline is in August, when enforcement mechanisms and penalties will officially take effect. As regulatory authorities continue to refine guidelines, businesses must navigate how the AI Act intersects with existing laws like GDPR, raising further compliance challenges.


