you ever think about how even sorting algorithms can, at a wide enough scale, be genuinely dangerous? the tip of the iceberg is that small experiment duckduckgo did showing that more people picked alternative search engines when google wasn't the very first option on the list.
on a more serious scale, twitter could very easily rank controversial replies supporting bigotry at the top of a thread, above the repetitive replies rightfully calling that bigotry out. youtube can rank biased far-right news above factual information in its news results.
idk, mostly the point is that even something as mundane as choosing which information gets shown first can feed misinformation and bias, and even end up recruiting or convincing people into bigoted stances.
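to make that concrete, here's a toy sketch in python (completely made up fields and weights, not twitter's real code) of how an engagement-weighted "sort by top" ends up rewarding the most inflammatory reply in a thread, just because outrage generates more raw interactions:

```python
# toy sketch of an engagement-weighted "sort by top": everything here
# (fields, weights) is hypothetical, but any ranking that rewards raw
# interaction volume has the same failure mode
from dataclasses import dataclass

@dataclass
class Reply:
    text: str
    likes: int
    replies: int        # heated arguments under a reply count here
    quote_tweets: int   # dunks and "can you believe this" QTs count here

def engagement_score(r: Reply) -> float:
    # made-up weights; the exact numbers don't matter much
    return 1.0 * r.likes + 2.0 * r.replies + 3.0 * r.quote_tweets

replies = [
    Reply("calm, factual correction", likes=40, replies=2, quote_tweets=1),
    Reply("inflammatory bigoted take", likes=25, replies=60, quote_tweets=30),
]

# highest engagement first, so the inflammatory reply lands on top
for r in sorted(replies, key=engagement_score, reverse=True):
    print(int(engagement_score(r)), "-", r.text)
```

nobody has to intend anything hateful for this to happen, the sort just rewards whatever people react to the most.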
and in a more general sense, this is why everyone says not to quote-retweet a known bigot saying something dumb just to dunk on or own them or whatever: you're giving them more screen space. sorting bigoted content to the top ends up doing the same thing.
the reason i wrote this thread is twitter's whole "copypasta" thing. i'm worried that filtering out repetitive tweets could end up catching a lot more than just repeated memes.
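for reference, here's a toy sketch (purely hypothetical, not twitter's actual system) of what a repetition filter looks like: normalize the text a bit and flag anything that shows up too many times. the filter only sees repetition, so a copy-pasted meme and a copy-pasted call-out look identical to it.

```python
# toy "copypasta" filter: group near-identical tweets after light
# normalization and flag anything repeated past a threshold. it can't
# tell a meme from a coordinated call-out; both are just "repetitive"
import re
from collections import Counter

def normalize(text: str) -> str:
    # strip urls, @mentions, punctuation, and case so small edits still match
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    return re.sub(r"[^a-z0-9 ]+", "", text).strip()

def flag_copypasta(tweets: list[str], threshold: int = 3) -> set[str]:
    counts = Counter(normalize(t) for t in tweets)
    return {t for t in tweets if counts[normalize(t)] >= threshold}

tweets = [
    "lol this meme again",
    "lol this meme again",
    "lol this meme again!!",
    "this account is spreading bigotry, report it",
    "This account is spreading bigotry, report it",
    "this account is spreading bigotry, report it @support",
]
print(flag_copypasta(tweets))  # the meme AND the call-out both get flagged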