
Soft law

Netherlands, IAMA (FRAIA)

Country
Netherlands
Full name of the document
Dutch: Impact Assessment Mensenrechten en Algoritmes (IAMA); English: Fundamental Rights and Algorithms Impact Assessment (FRAIA)
Body / organism enacting the document
Dutch Ministry of the Interior and Kingdom Relations
Nature of the body / organism
Public Authority
Type of document
Guidelines
Date
Project area
AI and public administration
Law area
Procedural law
Privacy / data protection
Fundamental rights protection

Summary of the document

The document aims to map the human rights-related risks of using algorithms and to identify measures to address them. It started out as a discussion document meant to answer questions and establish good decision-making practices for government organisations that consider developing, delegating the development of, buying, adjusting and/or using an algorithm. It identifies three stages through which a decision-making process concerning the development and implementation of an algorithm passes: 1) preparation, 2) input and throughput, 3) output, implementation and supervision. For each phase it provides a checklist of questions to be answered, with a focus on the expertise or role required to answer them (for example, commissioning client, project leader, domain expert, possibly a citizen panel, possibly an interest group representative). The authors hope that discussing the issues addressed in the document will contribute to algorithms being used in a careful, thoughtful, and well-embedded manner. With regard to human rights, it contains a roadmap of questions that the governmental organisation should address:

1. Fundamental right: does the algorithm affect (or threaten to affect) a fundamental right?
2. Specific legislation: does specific legislation apply with respect to the fundamental right that needs to be considered?
3. Defining seriousness: how seriously is this fundamental right infringed?
4. Objectives: what social, political, or administrative objectives are aimed at by using the algorithm?
5. Suitability: is using this specific algorithm a suitable tool to achieve these objectives?
6. Necessity and subsidiarity: is using this specific algorithm necessary to achieve this objective, and are there no other or mitigating measures available to do so?
7. Balancing and proportionality: at the end of the day, are the objectives sufficiently weighty to justify affecting fundamental rights?

Type of addressees
Government organizations
Territorial scope
National
Situations involved

Wherever government organizations are considering the use of algorithms or are already using them

AI system(s) involved
  • All types of AI systems
Fundamental rights involved
  • Freedom of expression
  • Freedom of information
  • Right to access to justice, to a fair trial and to jury trial
  • Right to an effective remedy
  • Right to data protection
  • Right to good administration
  • Right to privacy
  • Right to non-discrimination
Principles expressly addressed
  • Accountability
  • Due process
  • Effective (judicial) protection
  • Equality
  • Explainability
  • Non-discrimination
  • Proportionality
  • Reasonableness
  • Rule of law
  • Transparency
Connection with hard law

Connected mainly to soft law and other policies:

●    Integrated impact assessment framework for policy and legislation
●    Algorithm assessment framework of the Netherlands Court of Audit (2021)
●    Ethics Guidelines for Trustworthy Artificial Intelligence (2019)
●    Guidelines for Algorithm Application by Governments and public education on data analyses
●    An audit framework for algorithms (2020)
●    Good Digital Governance Code
●    Guideline for public education on data analyses (2021)
●    Data Protection Impact Assessment (DPIA) 
●    Non-discrimination by design guideline (2021)
●    Government information security baseline (BIO)
●    The FAIR and FACT principles

Possible legal force and impact on national/supranational legal system

FRAIA found its way into the EU AI Act and is likely to become hard law, possibly with some modifications, once the AI Act enters into force (paragraph 5 of Article 29a of the joint text of the AI Act).

Case author
Tobias Nowak; Olivia Davidson; Sara Molinari
Rijksuniversiteit Groningen