
Artificial intelligence


Artificial intelligence and algorithms are used for automated decision-making. That doesn’t always go right. And if it goes wrong, the consequences for citizens can be far-reaching. We believe that whenever technology is used, fundamental rights should be protected, decisions should always be transparent and verifiable and there should be close and effective supervision.

What's going on?

Everywhere we go, we are questioned. From the baby clinic to the funeral home, and every step in between, organizations want access to all kinds of personal data. Our information society is gripped by a data collection mania. But what do all those organizations need those data for? And what do they do with those data?

Many organizations are trying out new data techniques such as big data applications, algorithms, artificial intelligence (AI) and machine learning. While local authorities are still finding their way around this new technical playground, the Dutch Tax Authorities have already wronged thousands of people by using discriminatory algorithms. Although plenty still goes wrong, it is impossible to imagine today’s society without algorithms and artificial intelligence.

What does Bits of Freedom think?

Automated decisions
Algorithms and artificial intelligence are catch-all terms that cover a multitude of sins, from a simple pancake recipe to predictive policing. Bits of Freedom’s focus lies on (partly) automated decisions. People have the right to know the basis for decisions that affect them. Decisions should always be explainable, and both the formation and the lawfulness of decisions should be verifiable.

The story of data
Where automated decisions are concerned, it is of course important that the decision-makers think carefully about the desired outcome. But it is just as important that the data to be used are examined in detail. Data on varying situations and events are collected, processed, interpreted and presented in statistics. Those statistics are seen as factual, objective and neutral. But statistics do not tell the story of how those data came about. They are limited to the “what” question and disregard the “why” question. If the data show that women often earn less than men, they neglect to say that women are systematically discriminated against in the labour market. Where the data indicate that certain groups of migrants appear more often in crime statistics, they disregard the police’s ethnic profiling. Data on the educational levels of children fail to mention that children from lower social classes, or children with a migration background, often receive secondary school referrals at levels too low for them.

All those data and statistics tell a story of deep-rooted social problems and injustice. If we enter those data into algorithms and artificial intelligence applications without considering the interpretation of those data, it gets even harder to tackle the social issues in our society.

What’s more, data are not always true reflections of society and the world we live in. White heterosexual men are usually well represented in data, but any group, and especially any individual, that differs from them is not. Still, data are considered “neutral”, and that is how the “standard”, and anything that “deviates” from it, becomes institutionalized. And through decision trees, human errors that cause inequalities are standardized in algorithms as well.

What does Bits of Freedom do?

  • We give solicited and unsolicited advice at a national and European level to create laws and policies that offer the best possible protection of human rights.
  • We appeal for close supervision by a regulator with adequate resources to take effective action.
  • We update you on AI issues in plain language. For instance, how do you arm yourself against political parties that try to win your vote on social media? How do the police predict that you might be a member of a suspect group that could commit a criminal offense? And all those cameras downtown, do they already recognize your face? Are there other subjects you’d like to know more about? Please tell us!
  • We know that the use of technology in decision-making comes with many risks: discrimination, lack of verifiability and unlawful decisions, to mention a few. Is it still possible to turn the tide, or to use technology to create a fairer society? This is what we aim to investigate with others.


Help us and support us


    Want to know more about donating to us? Read all about it here.