The AI Act should not contain exemptions for ‘national security’



It is not rare for the EU Council to be out of touch with public opinion. It has happened once again with the EU AI Act: the Council has exempted AI systems developed or used for ‘national security purposes’ from oversight and controls. We look to the EU Commission and the Parliament to remedy this.

‘National security’ means different things to different people and lacks a strict, agreed definition. Any exemption on its grounds is therefore vague from the outset and open to abuse. It depends heavily on national governments’ own classifications, which they could use to label citizens or interest groups (e.g. climate protestors) as ‘extremist’ or ‘terrorist’.

In the context of powerful AI tools, such as facial recognition for surveillance, a ‘national security’ exemption can harm our fundamental rights and freedoms. Surveillance already has a chilling effect on the right to protest and freedom of speech, and is harmful to free and open democracies.

European citizens recognise this threat. In a survey of representative samples of citizens across 12 European countries, respondents expressed largely uniform views on the use of AI by public institutions, especially in the context of national security.

Over half of adults were concerned about the use of AI in national security or defence, and a clear majority (over 70 percent) thought that governments should always respect the rights of all individuals and groups in this context, meaning without exemptions.

Moreover, nearly two-thirds would feel concerned if another EU country they were travelling to had weaker protections for their rights and freedoms when it comes to the use of AI by secret services, an easily imaginable scenario if this exemption is included.

It seems that EU Member States are out of touch with what security means to their citizens. Under the proposed EU AI Act, any AI system posing a high risk to people’s rights (such as AI used in policing and border control) would need to be developed and used under increased scrutiny and assessment.

The Commission’s original proposal did not include specific exemptions for AI developed or used for national security purposes, leaving the matter to be clarified under the rules on the division of competence between the EU and Member States.

The Council, however, indicated from its first compromise proposal that Member States want to remove the rules for their use of AI in defence and national security. The Council even expanded that blanket exemption in the latest leaked compromise text to cover not only AI systems developed or used for national security purposes in the EU, but also AI systems whose outputs are used in the Union for such purposes, regardless of whether the actors are public or private bodies.

The Parliament, on the other hand, is likely to side with the people and is gearing up to fight the exemptions, which clearly undermine the prohibitions and safeguards built into the EU AI Act.

Banning the most intrusive biometric identification in public spaces has little meaning if vague exemptions claimed by national governments can apply in a patchwork of 27 different legal regimes across the EU. How Brussels diplomats will explain 27 regulatory frameworks for national security AI systems to a private sector already grumbling about fragmented rules remains to be seen. The extension of this exemption might also have an important impact outside the EU, as governments across the globe look to European regulation as a blueprint.

In our latest legal analysis, ECNL debunked the narrative of Member States’ “untouchable national competence” regarding national security issues, favoured by Brussels diplomats.


In fact, CJEU rulings show that there is no inherent conflict between EU internal market regulations and Member States’ national security interests. As the EU AI Act will be an internal market regulation, there is no justification for completely excluding systems developed for this purpose from its scope.

The Commission and the Parliament must act to stop this exemption from passing into law.

Source: euobserver.com
