@edgarblythe,
Some of this is the nature of how social media itself works. Popular works tend to become more popular. Unpopular works are buried. This is baked right into the algorithms, and it isn't intended to suppress truth.
Rather, let's say I share a story on Facebook about brown cats being smarter than white ones. Forget whether it's true or the source is balanced or I have an agenda or whatever. It's just content that I share.
My friends all see this piece of content. Some find it interesting. Others, not so much. Still others might find it fascinating, but they get such a tsunami of data flying past their news feeds that they never see it. For example, if they only have 40 minutes (the average amount of time people spend on Facebook, according to Bloomberg News) to play on Facebook, but there are 80 minutes' worth of stuff for them to read, then they will miss half of what's flying across their own personal transom.
If edgar likes my cat story and shares it with his friends (the average number of Facebook friends is 338, according to Pew Research), then even more people are served this piece of content. Now, edgar and I probably have some overlapping friends, so the audience grows from my 338 pals (I actually have somewhere around 2,000, but let's keep me at the average to make things a little easier to follow) by the maybe 212 friends he has that we don't have in common. So that piece of content, with just two shares, is being served to 550 people.
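The overlap arithmetic above can be sketched with sets. A minimal sketch, assuming the averages cited (338 friends each, with a 126-person overlap implied by edgar's 212 friends that we don't share); the numeric IDs are purely illustrative:

```python
# Model each person's friend list as a set of made-up user IDs.
my_friends = set(range(0, 338))                 # my 338 friends: IDs 0..337
shared = set(range(0, 126))                     # the 126 friends we have in common
edgars_friends = shared | set(range(338, 550))  # 126 shared + 212 unique = 338 total

# After both of us share, the content is served to the union of our lists.
reach = my_friends | edgars_friends
print(len(reach))  # 550: my 338 plus edgar's 212 non-overlapping friends
```

The set union does the de-duplication automatically, which is why the total is 550 rather than 338 + 338.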
He doesn't even have to share it to serve this many users. He just has to comment on it or like it. If a few more people engage with the content, it will rapidly hit 1,000 people served, and even 10,000, and that's just within our circle of friends.
Facebook does this across millions of users every single day. So do other platforms, like Twitter and YouTube.
In the meantime, because there is so much data, we humans have to parse it all. We can't be on 24/7, so we make choices all the time. Some choices are made by the algorithms, but we also make our own. People might choose not to follow the source of my content, or they might unfriend or block me or edgar (sorry, edgar!). The algorithms eventually learn a lot about you from all of the social signals you provide.
And this doesn't even get into virality, the kind of content that spreads like wildfire. This is just regular old Facebook stuff.
Why do some stories do better than others? Interest, engagement, compelling headlines (don't underestimate clickbait headlines; they exist because the tactic works), and subtle endorsements from trusted friends all play a part. In some ways, it's like the ultimate game of Telephone, where a message is subtly altered as it's passed along. That isn't necessarily deliberate or a sign of evil; it's just how messages degrade as they travel.
Recognize, too, that initial stories always do better than retractions, so even if a story was posted and later found to be exaggerated or otherwise untrue, the retraction might never have gotten around.