Who does the AI Act apply to?

The legislative framework will apply to both public and private entities inside and outside the #EU if the AI system is placed on the Union market or its use affects persons located in the #EU.

It can apply both to providers (e.g. the developer of a resume screening tool) and to deployers of high-risk AI systems (e.g. a bank that purchased a resume screening tool). Importers of AI systems must also ensure that the foreign supplier has already carried out the relevant conformity assessment procedure and that the AI system bears the European Conformity (CE) marking and is accompanied by the necessary documentation and instructions for use.

In addition, certain obligations are foreseen for providers of general-purpose AI models, including large generative #AI models.

Providers of free and open-source models are exempt from most of these obligations. This exemption does not, however, extend to providers of general-purpose AI models with systemic risks.

The obligations also do not apply to pre-market research, development and prototyping activities. Nor does the regulation apply to #AI systems used exclusively for military and defence purposes or for purposes of national security, regardless of the type of entity carrying out those activities.

Which risks will be covered by the new AI rules?

The deployment of #AI systems has great potential to deliver societal benefits, drive economic growth and boost #EU innovation and global competitiveness. In some cases, however, the specific characteristics of certain AI systems may create new risks for consumer safety and fundamental rights. Some powerful and widely used AI models could even pose systemic #risks.
This creates legal uncertainty for companies and risks slowing the uptake of AI technologies among businesses and citizens due to a lack of trust. An uncoordinated regulatory response by national authorities would risk fragmenting the internal market.