Do you feel like you agree with most of the posts you see on your Facebook newsfeed? Some people worry that the internet acts as an echo chamber, where our own opinions are merely mirrored back to us.
So many algorithms shape the information we glimpse online that we start to wonder whether we are missing out on much more, since we are served only the kind of information we already agree with.
For example, Google and Facebook are always at the ready to shower you with new things and trends based on your previous searches, likes and interests. In the end, we see only what we like or want to see, rather than being drawn into a broader discourse in the online world.
However, Facebook’s data experts beg to differ, according to a new peer-reviewed study published in the journal Science. For the first time, Facebook got its scientists to sit down, do the math and measure how much its algorithm filters out divergent opinions when it serves new posts to a user’s newsfeed.
After a thorough analysis, they concluded that Facebook’s algorithms aren’t the ones to blame here – it’s us. User choice and computer algorithms shouldn’t be pitted against each other so neatly, however, since users can only like or engage with posts that the algorithm has already filtered and fed to them.
Starting in July 2014, Facebook researchers collected and analyzed six months of anonymized data from 10.1 million US accounts. Unsurprisingly, they found that our newsfeeds’ content comes from friends who reflect our own ideological preferences.
The ranking of a link in your newsfeed – whether it appears at the top of the page or lower down – is also determined by an algorithm interpreting your preferences, and this ranking was shown to affect the chance that a user clicks the link. In other words, the finding was that Facebook’s algorithm screens out opposing content less often than users’ own choices do.
Choosing whether or not to click on a story in your newsfeed is not the same thing, however, as the algorithm hiding that piece of content from you. In one case we see the information but choose not to click on it; in the other, we never know the information existed in the first place.
Last year, another of Facebook’s research initiatives was not favorably received by the public: the social network tried to determine the effects of positive and negative posts by altering the way they showed up in users’ newsfeeds.
Many people were outraged that Facebook would manipulate people’s moods, and the research was brought to an end.
However, if the newest data analysis is correct, we are the ones who can do something about the echo chambers we have comfortably built around ourselves. And even if Facebook has its share in driving us apart, we are the ones who choose to go along for the ride or hop off.
Image Source: The Verge