Whether in the news feed, on streaming services or in online shops – we encounter algorithms everywhere on the Internet. They help decide what we see, read and are recommended – and what we aren't. But what exactly is behind them? How do algorithms work? How do they influence our perception? And what can we do to deal with them more consciously?
You often hear sentences like "The algorithm decided that" or "The algorithm showed me this". It almost sounds as if we were talking about a thinking, invisible force.
What are algorithms?
An algorithm is a fixed sequence of instructions programmed by humans; these instructions must be clear, unambiguous and complete. In the digital world, algorithms are used in many places. In search engines, online shops, streaming services and on social platforms, they control the technical processes in the background. The actual source code of these algorithms remains a trade secret of the platforms. Under the European Digital Services Act, however, platforms must explain how their algorithms work and offer users control options.
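To make the definition concrete, here is a deliberately simple toy algorithm (not related to any real platform): a fixed, unambiguous and complete sequence of steps that finds the largest number in a list.

```python
def find_largest(numbers):
    """A minimal algorithm: a fixed, unambiguous, complete sequence of steps.

    Step 1: take the first number as the current largest.
    Step 2: compare every remaining number against it.
    Step 3: return the largest number found.
    """
    largest = numbers[0]
    for n in numbers[1:]:
        if n > largest:
            largest = n
    return largest

print(find_largest([3, 41, 12, 9, 74, 15]))  # -> 74
```

Every step is defined in advance: given the same input, the algorithm always produces the same output. The algorithms behind search engines and social feeds are vastly more complex, but they follow the same principle.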
Especially on digital platforms, algorithms help to sort and filter content. Without them, we would be overwhelmed by an unmanageable flood of information. They analyze what we like, which videos we watch and what we swipe away. Based on that, they determine which content is preferentially shown to us.
How exactly algorithms filter data and what they search for is not up to them, explains Tim Schmitz, a system architect working in the company's Innovation Lab.
Where does it become problematic?
The feeling that the algorithm has a personality of its own often arises because its mechanisms are not transparent to us. Sometimes we see nothing but cat videos for days, then suddenly political content or dance trends.
Algorithms do not act freely. Take social media as an example: the algorithms of YouTube, TikTok, Facebook and the like are programmed to serve their operators' goal of keeping us on the platform for as long as possible. Behind this lies an economic interest: the longer content holds our attention, the more behavioral data can be collected, advertising placed, and revenue generated.
Attention is the business model, and algorithms are the tool. That is why they are programmed to prefer content that touches us emotionally, makes us curious or encourages us to keep scrolling. What is profitable for platform operators can be problematic for users. Disinformation, conspiracy myths and hate speech often provoke many reactions because they polarize, provoke or shock. Since such content triggers a lot of interaction, it is displayed more often and thus spreads further. This becomes even more problematic during elections, social tensions or crises.
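The real ranking systems are trade secrets, but the engagement logic described above can be sketched in a few lines. The following is a purely hypothetical illustration – the function names, weights and data are invented – showing why interaction-heavy content ends up at the top of a feed.

```python
# Hypothetical sketch only: real feed-ranking algorithms are trade secrets
# and far more complex. This illustrates the basic idea from the text:
# content that triggers more interaction is scored higher and shown first.

def engagement_score(post):
    # Invented weights for illustration: comments and shares often signal
    # stronger engagement than a passive like, so they count for more.
    return post["likes"] * 1.0 + post["comments"] * 3.0 + post["shares"] * 5.0

def rank_feed(posts):
    # Sort so the most interaction-provoking content appears first --
    # regardless of whether it is accurate, calm, or polarizing.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Calm explainer", "likes": 120, "comments": 4, "shares": 2},
    {"title": "Outrage bait", "likes": 80, "comments": 60, "shares": 40},
]
for post in rank_feed(posts):
    print(post["title"])  # "Outrage bait" is ranked above "Calm explainer"
```

Note that the outrage post wins despite having fewer likes: its comments and shares dominate the score. This is the mechanism by which polarizing content can be amplified even when fewer people actually like it.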
This is how you consciously deal with algorithms
- Develop understanding: The selection of content that is displayed to you on social networks or search engines is based on your usage behavior. Be aware of this, and it will be easier to deal with it.
- Question recommendations critically: Not everything that is automatically suggested actually matches your own interests. It is worth consciously checking recommendations and deliberately choosing other content as well.
- Seek out other perspectives: Regularly using media that offer different political or cultural perspectives helps you get out of your own filter bubble.
- Clean up your own feed: Delete your search and activity history in the settings regularly. You can also completely restart your feeds, forcing the algorithm to get to know you again.
- Include other sources in your opinion-forming: Expand your information mix to include other media such as radio, TV or newspapers.
Algorithms are tools programmed by humans to pursue specific goals. This is exactly where our responsibility lies: it is we ourselves who feed the algorithm with every click, every like and every swipe. So, it's worth questioning not only what we're being shown, but also why.
Preview
In the next article in our "Click by Click" series, we'll look at what the "Illusory Truth Effect" is and why repetition makes content believable, even if it's been refuted.
Against hate online: For respectful and democratic coexistence

Since 2020, Telekom has been committed to a digital world in which everyone can live together according to democratic principles. The company stands for diversity and participation and is resolutely opposed to opinion manipulation, exclusion and hate on the internet. This commitment is part of Deutsche Telekom's social responsibility. Together with strong partners, Deutsche Telekom empowers and sensitizes society to respectful interaction in the digital world. The company also promotes digital skills with numerous initiatives and offerings, such as Teachtoday.
All information on Telekom's social commitment can be found at
https://www.teachtoday.de/en/