Algorithms usually give you what you ask for

One complaint I’ve seen a lot of recently is that technology algorithms are bad for us and are dividing our society. The Twitter “Trending” tab is blamed for all manner of political polarization, TikTok is said to deliberately show us inflammatory videos, and YouTube was long bemoaned as a haven for unhinged videos that could send people down a rabbit hole of radicalization if they clicked the wrong thing. All of this is supposedly because an evil algorithm is controlling these platforms and making them show us these evil things. But is the algorithm evil, or are viewers just addicted to evil content?

That’s not a blithe or pithy question. What you have to understand is that 99.9% of the time the algorithm is simply giving you whatever it predicts you will engage with. When you first start out on Twitter or YouTube, the algorithm doesn’t know anything about you or your account, so it starts by giving you content that has been highly engaging to other users. Eventually you start clicking around, engaging with some of the content and ignoring the rest, and the algorithm tracks those clicks to learn what you specifically will engage with. So when you find the algorithm handing you nothing but inflammatory political videos, it’s very likely because you and others have spent a lot of time and clicks watching exactly that. Remember, too, that hate-watching is still watching: if you watched a bunch of far-right or far-left content for the sole purpose of being angry and commenting on it, the algorithm has learned that this is the best way to farm your clicks and your time, and it will keep serving you more of it.
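None of these companies publish their ranking code, so to be clear, what follows is not anyone’s actual system. It’s a minimal sketch of the feedback loop I’m describing, with every item name and weight invented for illustration. The only thing it’s meant to show is that a recommender which cold-starts on global popularity and then reweights toward whatever you spend time on will, by construction, keep feeding an angry viewer more of what made them angry.

```python
from collections import defaultdict

# Toy engagement-driven recommender. Every name and number here is invented
# for illustration; real platforms use far more signals and far bigger models.

GLOBAL_POPULARITY = {          # engagement each item has earned across all users
    "cooking_video": 0.4,
    "cat_video": 0.6,
    "outrage_politics": 0.9,   # inflammatory content tends to score well on raw engagement
}

class ToyFeed:
    def __init__(self):
        # per-item affinity learned from this one user's behavior
        self.user_affinity = defaultdict(float)

    def record_watch(self, item, seconds_watched):
        # The only signal is time spent; hate-watching and love-watching look identical.
        self.user_affinity[item] += seconds_watched / 600.0

    def recommend(self):
        if not self.user_affinity:
            # Cold start: no history yet, so lean entirely on what everyone else clicks.
            return max(GLOBAL_POPULARITY, key=GLOBAL_POPULARITY.get)

        def score(item):
            # Otherwise blend global popularity with this user's own engagement history.
            return 0.3 * GLOBAL_POPULARITY[item] + 0.7 * self.user_affinity[item]

        return max(GLOBAL_POPULARITY, key=score)

feed = ToyFeed()
print(feed.recommend())                      # cold start: whatever is globally popular
feed.record_watch("outrage_politics", 540)   # nine angry minutes of hate-watching
print(feed.recommend())                      # ...so you get served more of the same
```

Notice that the toy model records nothing but watch time. It has no way to distinguish a rage-click from genuine interest, and neither does it need one to keep you watching.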

That’s not to say the algorithms are perfectly impartial by any means; there is always some amount of “secret sauce” in each of them that is controlled by the company. The simplest and most obvious example is that content from advertisers gets shown to you no matter what you click, and it’s likely that most of these sites also run some version of payola that lets certain content pay its way into heavy promotion by the algorithm (a toy version of that kind of boost is sketched at the end of this post). But at the end of the day the algorithm is a click farmer, and if people didn’t click on what it served them, the company that built it (Google, Meta, whoever) would quickly find itself with less engaged users, less ad revenue, and a need for a new algorithm. So in conclusion: the algorithm itself isn’t evil; people are addicted to evil content.
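For concreteness, here is how that kind of paid placement might bolt onto the toy scorer above. The sponsored inventory, the 2.5x boost, and the ad slot pinned to the top are all assumptions on my part, a guess at the shape of the mechanism rather than anything a platform has disclosed.

```python
# Bolting a "secret sauce" layer onto the toy scorer above. The sponsored
# inventory and the 2.5x boost are assumptions for illustration, not anything
# any real platform has disclosed.

SPONSORED = {"energy_drink_ad"}               # advertiser content shown regardless of your clicks
PROMOTION_BOOST = {"outrage_politics": 2.5}   # hypothetical paid ("payola") promotion

def final_ranking(organic_scores):
    """Take engagement-based scores and apply the platform's own thumb on the scale."""
    ranked = {item: score * PROMOTION_BOOST.get(item, 1.0)
              for item, score in organic_scores.items()}
    # Ads get injected no matter what the engagement model thinks of them.
    for ad in SPONSORED:
        ranked[ad] = float("inf")
    return sorted(ranked, key=ranked.get, reverse=True)

print(final_ranking({"cooking_video": 0.4, "cat_video": 0.6, "outrage_politics": 0.9}))
# ['energy_drink_ad', 'outrage_politics', 'cat_video', 'cooking_video']
```

Even with that thumb on the scale, the organic scores underneath still come from what people actually click and watch.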