Monday, 25 May 2015

Don’t (just) blame Facebook: We build our own bubbles

Science Focus


We’ve all heard (or expressed) the concern that the Internet lets us choose only the sources that agree with our ideology. The same “echo chamber” worry applies to social media, with an added twist: platforms like Facebook filter the content we’re shown based on what an algorithm thinks we’ll want to see. Will Facebook quietly spare us the articles shared by the few friends who might challenge our views?

Because every action on Facebook leaves a data trail, this question is logistically easier to answer than most. Several researchers at Facebook, led by Eytan Bakshy and Solomon Messing, dug into that data to investigate.

They had plenty to work with. They limited the study to US users over 18 who listed a political affiliation on their profile, logged in at least four times a week over the latter half of 2014, and clicked on at least one news or politics link. Even after all that filtering, they were left with a bit over 10 million people. (Names were stripped from the data, and in case you’re wondering, this is the kind of research covered by the data policy you agree to when you sign up. Unlike the controversial “mood” study last year, no content on Facebook was manipulated for this study.)
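To make those inclusion criteria concrete, here is a minimal sketch in Python of how such a cohort filter might look. The User record, its field names, and the exact thresholds are invented for illustration; the paper's actual data schema is not public.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical user record; these fields are assumptions, not Facebook's schema.
@dataclass
class User:
    country: str
    age: int
    political_affiliation: Optional[str]  # self-reported on profile, may be absent
    logins_per_week: float                # average over the second half of 2014
    news_link_clicks: int                 # clicks on news/politics links

def eligible(u: User) -> bool:
    """Apply the study's stated inclusion criteria."""
    return (
        u.country == "US"
        and u.age > 18
        and u.political_affiliation is not None
        and u.logins_per_week >= 4
        and u.news_link_clicks >= 1
    )

# A few toy records to show which users the filter would keep.
sample = [
    User("US", 34, "liberal", 6.5, 12),       # passes every criterion
    User("US", 17, "conservative", 7.0, 3),   # excluded: not over 18
    User("CA", 29, "moderate", 5.0, 8),       # excluded: not in the US
    User("US", 52, None, 4.5, 2),             # excluded: no affiliation listed
]
cohort = [u for u in sample if eligible(u)]
print(len(cohort))  # -> 1
```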

See original post: http://feeds.arstechnica.com/~r/arstechnica/science/~3/f8Lb_xI56kQ/
