Nijta and AFNOR: Crafting comprehensive privacy and confidentiality standards

Written By Anjali Sunil Kumar - Marketing Manager, Nijta



A major milestone in the regulation, development, and implementation of Artificial Intelligence, the AI Act has come to the forefront after being published in the Official Journal of the EU on 12 July 2024. Initially proposed in 2021, the AI Act is a comprehensive legal framework that aims to ensure that AI systems used within the EU are deployed ethically and responsibly, to foster innovation in AI, and to strengthen public trust in its use. Protecting people's personal data and securing organisations' confidential information are key requirements for achieving these objectives. How exactly can such an ambitious regulation be put into practice?

Standards are the answer.


Standards can be set at international and national levels to ensure the practical implementation of the AI Act's principles.

Understanding the Problem - Why privacy and confidentiality standards matter

Today, data is more vulnerable than ever. Biometric data, which includes fingerprints, facial recognition, and voiceprints, is uniquely sensitive as it is directly tied to an individual's identity. Personal data such as name, age, location, and profession, as well as all kinds of confidential organisational data such as internal documents, meetings, client feedback, or financial records, need similar protection. The misuse of such data can lead to severe privacy breaches, identity theft, loss of intellectual property and competitiveness, or even security threats. To tackle these risks, the AI Act classifies AI systems into four categories according to their intended purpose and potential impact on health, safety, or fundamental rights: minimal risk, limited risk, high risk, and unacceptable risk. Most of the legal uncertainties around the AI Act concern the high-risk category, as the law often does not specify actionable compliance criteria or the measures needed to meet them. This is where standards come into the picture.