🔥 Civil society’s urgent warning: No to a GPAI Code without fundamental rights protection!
On Tuesday, March 27, EU lawmakers sent a letter to Commissioner Virkkunen calling for the General Purpose AI Code of Practice to faithfully implement the EU AI Act.
👉 The risk management system at the heart of the Code SHOULD address risks to fundamental rights. This is not an option. The AI Act makes it mandatory.
👉 Any attempt to reduce the scope of the Code's risk management system by narrowing the definition of GPAI models with systemic risk is contrary to the EU AI Act.
In a nutshell: any other interpretation would be a usurpation of legislative power.
🔥🔥🔥 In a letter sent today to Commissioner Virkkunen, her cabinet, the cabinet of Ursula von der Leyen, the AI Office, and the Chairs and Vice-Chairs of the Code's Working Group 2 in charge of these issues, civil society organisations and other independent experts participating in the negotiations of the Code urge them to amend the Code to bring it into conformity with the EU AI Act.
Lawmakers asked Commissioner Virkkunen “to reject the Code” if it does not comply with the EU AI Act.
❌ Digihumanism will not endorse the Code, and neither will other participants, if the final draft betrays the EU AI Act.