Seminar on digital democracy

Exploring the challenges for democracy in the digital age.

Session #3 • Algorithms and filter bubbles

27 October

> see slides or download .pdf


Introductory notes by Jonathan Piron

Algorithms

If we take a quick look at the news, “algorithms” seem to be one of those new buzzwords (not so new, but still…), served up in all sorts of ways and presented as both the cause and the consequence of all our problems.

Fine. But what exactly are “algorithms”? Why and how did they become so omnipresent in our digital lives (and thus IRL as well)? And, maybe more interestingly, what role do they play in the political and democratic life of our societies? Let’s dive in!

First and foremost, let’s get the “what exactly are we talking about?” question out of the way.

WTF is an algorithm?!


Basically, an algorithm is just a set of instructions: a bunch of rules, written by an engineer, to be followed by a computer in order to execute tasks and output a specific result. Algorithms power every search engine, every social media app, and more, processing the data we feed them. Algorithms can also “learn”, which leads to what we call “artificial intelligence” (AI). Examples include the sorting algorithms behind our social media news feeds, the recommendation algorithm on YouTube, or even the Google Search algorithm. These algorithms have drawn more and more attention as we have realized how much they shape our ever more “technologized” society.
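To make this concrete, here is a minimal sketch of what such a feed-sorting algorithm might look like. Everything in it (the post fields, the weights, the freshness penalty) is invented for illustration; real platforms use many more signals and keep their exact formulas secret.

```python
# A toy news-feed sorting algorithm: nothing more than a set of rules
# applied to data. All fields and weights here are invented for
# illustration; no real platform's formula is public.

posts = [
    {"id": 1, "likes": 120, "shares": 4,  "hours_old": 2},
    {"id": 2, "likes": 30,  "shares": 25, "hours_old": 1},
    {"id": 3, "likes": 500, "shares": 2,  "hours_old": 48},
]

def score(post):
    """Rule of thumb: engagement counts, but older posts are penalized."""
    engagement = post["likes"] + 10 * post["shares"]
    return engagement / (1 + post["hours_old"])

# The whole "algorithm": compute a score, sort by it, show the top.
for post in sorted(posts, key=score, reverse=True):
    print(post["id"], round(score(post), 1))
```

Change one weight (say, counting a share as 2 likes instead of 10) and the order millions of people see first gets reshuffled, which is exactly why these design choices matter politically.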

Code is law?

“Code is law”, law professor and activist Lawrence Lessig once said. Even back then, he could not have been more right.

These algorithms seem to pull the strings of most of our online interactions, and the big corporations behind them have gradually imposed themselves as the lawmakers of the Internet. And since the web is the interface to just about everything in our everyday lives, these laws of course do not only affect the virtual world. Since “no one is supposed to be ignorant of the law”, everyone should understand the lines of code that rule the world, or, in other words, how these algorithms work. If only we could know what’s inside them.

As a starting point, I’d like to share this food for thought from Seth Godin, questioning the (lack of) transparency in the creation of the algorithms that rule our digital world (and thus, again, our everyday lives!), and reminding us that algorithms are not really obscure computer things: they are the creations of human beings. He finishes with a direct address to Google, Facebook and the like:

“Just because it makes the stockholders happy in the short run doesn’t mean it’s the right choice for the people who trust you.”

Therefore, as far as governance and democratic principles are concerned, we have to ask ourselves a few questions about those sacrosanct algorithms:

  • Who controls the algorithms?
  • What happens in these black boxes?
  • What data comes in?
  • How do we use the results?

The problems with algorithms

From these questions, we can draw a few quick reflections:

  • The lack of transparency in the design of the algorithms that govern our digital lives is a true democratic issue.
  • Anyone building an algorithm brings their own biases into the code itself, and these algorithmic biases can lead to even more discrimination (racism, sexism).
  • Filter bubbles, resulting from the way algorithms sort information online, create more division and misunderstanding between groups of people.

With great power comes great responsibility. So, maybe we should go from governance by algorithms to governance of algorithms. But in order to achieve this, perhaps we need to start by changing the democracy algorithm itself.

Go further

More food for thought

  • Even in the “democratic innovations” field, algorithms are being developed and must be questioned to make sure they are “fair” and transparent (see the Sortition Foundation algorithm and stratification app, as well as this article in Nature)

Collective notes

“Tour de table” (going around the room)

What have people learned?

  • What can we do to raise awareness of algorithms?
  • What we see on YouTube is what we want to see
  • Study on TikTok: racist content is pushed more by the algorithms
  • Algorithms are complex to understand; seeing certain content can make users more conservative
  • Parents have taken Facebook to court; Facebook selects much of the information we see, and because of the filter bubble we stay within that kind of information
  • Algorithms increase the polarization of opinions → they also increase connectivity
  • China uses algorithms to control its workers, and “Reconquête” tries to manipulate French elections with algorithms, but algorithms are also used by the police to prevent child sexual abuse → a tool that can be used for the best as well as for the worst
  • The existence of the filter bubble → a really big phenomenon
  • Traditional media have filters (the same goes for TV); social media have fewer filters
  • Twitter contributes to filter bubbles and has to do something about it; once the algorithms are created there is no going back, and they end up overtaking their creators
  • Some algorithms are based on 60 criteria
  • Some social media are politically biased: on Twitter, right-wing content is put forward more than left-wing. Conservative messages are amplified for two reasons: the right can count on a more hierarchical structure to amplify its messages on social media, and its ideas are more based on fear and provocation, which trigger the emotions that algorithms favour (whereas the left focuses more on struggles, which are less variable and harder to summarize)
  • Algorithms are not neutral
  • Definitions of filter bubbles and algorithms; policies can be adopted to address some of the problems with algorithms and filter bubbles
  • Political algorithms aren’t really regulated.
  • Filter bubbles influence what we see online; echo chambers are a way to … like-minded people; for democracies we need a good flow of information, and newspapers were critical because they acted as gatekeepers. Nowadays we need another gatekeeper to control the flow of information (because journalists are not gatekeepers anymore)
  • Algorithms change all the time (if we try to fix one, someone can hack it)

Questions

  • Is it really algorithms that push this content?
  • Is FB really responsible for the content that we see? Can we blame Facebook for a suicide?
  • How do we get out of the filter bubble?
  • Do different platforms like YouTube or Netflix work in the same way?
  • Is it mandatory to bring back traditional filters in social media?
  • How come the creators of algorithms cannot manage them anymore?
  • How did the Nazis use Bernays’ methods of propaganda?
  • How can the content be so precise, especially on platforms like TikTok?
  • Do filters lead to dangerous radicalisation?
  • How can the left react to the far right’s hegemony on social media?
  • Is it possible for the creator to change the functioning of the algorithm?
  • Is it possible to apply these policies in the future?
  • Why aren’t political algorithms regulated by states?
  • Who could be a gatekeeper now to control the flow of information?
  • Even if we have control, would it be a good thing?

Key concepts

Filter bubbles

  • A space where our previous online behaviour (search history, likes, shares, and shopping habits) influences what we see online and on social media, and in what order
  • Within a filter bubble you can still see things you disagree with; the filter bubble is more individual: your own space, based on what you consume (a toy sketch of this loop follows below)
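As a rough illustration of that loop, here is a toy simulation (all numbers made up): a recommender that favors whatever you engaged with before ends up showing you more and more of the same.

```python
import random

# Toy filter-bubble feedback loop (all numbers invented for illustration).
# The recommender favors topics you engaged with before, so every
# recommendation narrows what you are shown next.

topics = ["topic_A", "topic_B"]
interest = {"topic_A": 0.5, "topic_B": 0.5}  # start perfectly neutral

random.seed(1)
for _ in range(50):
    # Show a topic in proportion to estimated interest...
    shown = random.choices(topics, weights=[interest[t] for t in topics])[0]
    # ...and treat the view as engagement, reinforcing that interest.
    interest[shown] += 0.1

total = sum(interest.values())
print({t: round(interest[t] / total, 2) for t in topics})
# Typically one topic ends up dominating the feed: the bubble has formed.
```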

Echo chambers

  • You only see content from people who think like you
  • A space with different bubbles inside, containing different people who all think like us; the echo chamber is more collective

Note: these concepts are mostly amplified by social media and Google, but they are not new; they already existed with mass media, …

Algorithms

  • Code that repeats itself, feeding on its final output and increasing its information
  • A set of instructions, like a recipe: you take data and run it through the instructions
  • The description of a sequence of steps
  • An algorithm is the description of a sequence of steps to obtain a result from input elements. For example, a recipe is an algorithm for obtaining a dish from its ingredients!

Polarization

  • Division within society between different groups whose ideas grow more and more distant from each other
  • Judgement
  • Polarization as a process: because of echo chambers, your opinion becomes more polarized because people support you (a toy sketch of this feedback loop follows below)
  • Note: psychological effect of positive reinforcement (by the algorithms)
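A toy model of that reinforcement effect, with made-up parameters: an opinion that only ever receives supportive, slightly more extreme feedback drifts steadily away from the center.

```python
# Toy model of polarization through positive reinforcement
# (all parameters invented for illustration).
# An opinion runs from -1 (one extreme) to +1 (the other).

opinion = 0.2  # a mild initial leaning

for _ in range(30):
    # In an echo chamber you mostly hear voices like your own,
    # only slightly more extreme...
    echo = opinion + 0.1 if opinion >= 0 else opinion - 0.1
    # ...and their support pulls your opinion toward theirs.
    opinion += 0.2 * (echo - opinion)
    opinion = max(-1.0, min(1.0, opinion))

print(round(opinion, 2))  # 0.8: far from the mild 0.2 it started at
```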

Main issues

  • Bias linked to skin colour, political orientation, and this kind of discourse based on feelings
  • We blame algorithms, but people created them: AI that produces pictures shows fewer people of colour; companies build products only for themselves, and that has consequences → bias
  • Lack of transparency: some people are not aware of algorithms, filter bubbles, … and a solution could be … (not deep studies, because that kind of content will not be useful for making the majority aware of this)
  • When you publish pictures on Twitter and Twitter crops them, the faces of white people and of men are favoured the most → linked to the fact that engineers are mostly white men
  • Regulation: these issues happened before with mass media, and mass media were regulated; it takes time, but we need to push to get it on the agenda
  • How we consume media and news articles: when we see an article we often read only the title and click on it; it is important to read the whole article (to avoid other issues)

Ideas to go further

  • Pop the filter bubble
  • Regulation of private companies (FB and Twitter are not really public spaces; they are provided by private companies, which is good for freedom of speech)
  • Authorities generally ask companies to deal with it themselves, but that is not the best way to fix it → taxes could be a solution (taxing algorithms) to push companies to use them less or in a better way
  • Regulation of fake news: a partnership of journalists who can check whether information is true or not; an independent structure that should not be linked to the state, and whose goal should not be to make money, because that produces bias