Algorithmic Register

OASI 1st Annual Report

After months of work, we are thrilled to present the first annual report of our Observatory of Algorithms with Social Impact (OASI). We want to share its key findings and invite you to take a look at it to get a deeper understanding of this sample of the algorithmic landscape.

Observatory of Algorithms with Social Impact

In October 2021, Eticas launched OASI in response to the lack of oversight of the growing number of algorithms that make crucial decisions in our lives and have a clear potential to harm vulnerable sectors of the population. The registry started with 57 entries and has been continuously updated ever since, growing to over 110 entries in its first year.

In the report, you can find an exhaustive analysis of the registered algorithms and the trends in their domain, impact, and aim, among other dimensions. Here are some highlights:

Key Findings
  • Almost 64% of the registered algorithms are used in public administration.
  • Only 12% of the algorithms registered so far have been properly audited.
  • The domain with the most entries is policing and security.
  • The three aims with the highest number of entries are complementary to each other: profiling and ranking people, predicting human behavior, and evaluating human behavior.

Discrimination against particular social groups is by far the most frequently identified impact, with socioeconomic discrimination being the most predominant.


The most frequently identified impacts in the Register:

  • 52 algorithms linked to socioeconomic discrimination
  • 39 linked to racial discrimination
  • 35 posing a threat to privacy
  • 19 linked to gender discrimination

What is OASI?

The OASI Register stems from the need to monitor and catalog which algorithms are being developed and implemented, by whom, where, and since when. Its primary purpose is to allow the general public and algorithm experts to get informed, and to serve as a first step and a reference for further research in the algorithmic field.

Eticas would like OASI to become a shared international effort: a global database of algorithms managed by an international consortium of diverse entities representing every world region. We are therefore open to teaming up with other organizations.

This registry of social-impact algorithms hopes to empower citizens to find out more about how crucial decisions about their lives and opportunities are being made and help civil society organizations map how and where algorithms are impacting their constituents. 

The ultimate goal of OASI is to push governments and regulatory bodies around the world to keep comprehensive, transparent registries of the algorithms impacting citizens and rights.

The Social Impact of the algorithms registered in OASI

The OASI Register lists algorithms that may cause or increase the risk of gender, racial, religious, or other types of discrimination; social polarisation or radicalisation; state surveillance; threats to privacy; addiction; manipulation or behavioural change; disinformation; and the weakening of democratic practices.

We are entertained by YouTube, Spotify, and TikTok, whose algorithms suggest what content we should like, narrowing the range of what we discover and are open to.

The same happens with social media when we use it to stay informed. It has been shown that Facebook and similar social media companies show us content that is highly personalized to our profile, which in the long run fosters confirmation bias and, as a result, social polarization.

This polarization is a clear threat to democracies, but what happens when these systems are used by public administrations to make crucial decisions about our security?

The most common aims of the algorithms registered in OASI

The Register also distinguishes between the different possible aims of algorithms: compiling personal data, evaluating and predicting human behaviour, recognising facial features, identifying images of faces, profiling and ranking people, simulating human speech, recognising images, generating automated translations, generating online search results, recognising sounds, carrying out language analysis, making personalised recommendations, ranking bids and other content submissions, and automating tasks.

We found that the aims with the highest number of entries are quite complementary to each other:

  1. “profiling and ranking people” (60 algorithms)
  2. “predicting human behavior” (44 algorithms)
  3. “evaluating human behavior” (32 algorithms)
  4. “automating tasks” (31 algorithms)

Between the lines

After a year of tracking algorithms across multiple sectors, these are our findings:

  • The domains or sectors where we found the most algorithms in use are:
    • “policing and security”, with 30 algorithms
    • “social services”, with 25 algorithms
    • “labour and employment”, with 17 algorithms
    • “communication and media”, with 15 algorithms
    • “business and commerce”, with 14 algorithms

  • Only 12% of the algorithms registered so far have been properly audited. This is clearly insufficient, especially if we bear in mind that audited algorithms are over-represented in our sample.

  • As algorithmic technologies evolve and spread from low-risk to high-risk domains, so do their risks and impacts. We are witnessing a transition of algorithmic systems from leisure and entertainment domains (like social media and marketing) into high-risk areas such as health, social services, work, education, and security, where the stakes are much higher.

  • Carrying out external audits of algorithms helps push algorithmic accountability onto the public agenda.
  • Given that many algorithms are biased against gender, racial, and other minorities and disadvantaged groups, it is concerning that policing and security and social services are the two domains where we find nearly half of the algorithms in the OASI Register.
  • Social media algorithms have most probably contributed to the spread of disinformation and to social and political polarization.
  • OASI and the Register are necessarily a work in progress.

  • Mandatory registration of all social impact algorithms is long overdue. OASI proves that this is feasible and useful. Having such transparency and accountability efforts led by public institutions is urgent.

The OASI Register contains information that is in the public interest; it should be easily accessible and in the public domain. Eticas is open to teaming up and partnering with other organisations so that we can work together to produce and maintain a global database of algorithms.