Unprecedented experiment on welfare surveillance algorithm reveals discrimination.

Suspicion Machines Methodology

Graphic Design

Imagine a government-owned system that scores you based on your mental health history and your relationships. It may sound like the plot of a Black Mirror episode, but it is happening in reality. An investigation by Lighthouse Reports and WIRED reveals how.

Every year, hundreds of people on welfare in a major European city find themselves under investigation because an automated system flagged them as fraud risks. What few of them realise is that they have been surveilled by an automated system that scores their lives: from their mental health history to their relationships to the languages they speak. They have been placed under investigation by a machine that finds vulnerability suspicious.

The cover image of the methodology depicts a network of intricate lines symbolising a data network. Each line of data connects to an individual, representing how the algorithm uses this information to sort people within the welfare system, as outlined in the investigation methodology.

Creator: Fanis Kollias
For: Lighthouse Reports
