{"id":4713,"date":"2016-12-05T14:48:55","date_gmt":"2016-12-05T13:48:55","guid":{"rendered":"https:\/\/ounews.co\/?p=4713"},"modified":"2016-12-05T14:48:55","modified_gmt":"2016-12-05T13:48:55","slug":"filter-bubble-isnt-just-facebooks-fault","status":"publish","type":"post","link":"https:\/\/www.open.ac.uk\/blogs\/news\/education-languages-health\/languages\/filter-bubble-isnt-just-facebooks-fault\/","title":{"rendered":"The filter bubble isn&#8217;t just Facebook&#8217;s fault &#8212; it&#8217;s yours"},"content":{"rendered":"<p>Following the shock results of Brexit and the Trump victory, <a href=\"http:\/\/www.theguardian.com\/technology\/2016\/nov\/10\/facebook-fake-news-election-conspiracy-theories\">a lot of attention<\/a> has focused on the role that Facebook might have played in creating online political ghettos in which false news can easily spread. Facebook now has serious political influence thanks to its development from a social networking tool into a primary source of news and opinions. <a href=\"http:\/\/www.nytimes.com\/2016\/11\/14\/technology\/facebook-is-said-to-question-its-influence-in-election.html\">And for many<\/a>, the way it manages this influence is in need of greater scrutiny. But to put the blame solely on the company is to overlook how people use the site, and how they themselves create a filter bubble effect through their actions.<\/p>\n<p>Much of this debate has focused on the design of Facebook itself. The site\u2019s personalisation algorithm, which is programmed to create a positive user experience, feeds people what they want. This creates what the CEO of viral content site Upworthy, Eli Pariser, calls \u201c<a href=\"http:\/\/www.ted.com\/talks\/eli_pariser_beware_online_filter_bubbles\">filter bubbles<\/a>\u201d, which supposedly shield users from views they disagree with. 
People are increasingly turning to Facebook for their news \u2013 <a href=\"http:\/\/www.journalism.org\/2016\/05\/26\/news-use-across-social-media-platforms-2016\/\">44% of US adults<\/a> now report getting news from the site \u2013 and fake news is not editorially weeded out. This means that misinformation can spread easily and quickly, hampering people\u2019s ability to make informed decisions.<\/p>\n<p>Over the last few weeks, there have been <a href=\"http:\/\/www.niemanlab.org\/2016\/11\/the-forces-that-drove-this-elections-media-failure-are-likely-to-get-worse\">frequent calls<\/a> for Facebook to address this issue. President Obama himself has <a href=\"http:\/\/www.theguardian.com\/media\/2016\/nov\/17\/barack-obama-fake-news-facebook-social-media\">weighed in<\/a>, warning of the perils that rampant misinformation poses to the democratic process.<\/p>\n<p>Much of the debate around this, however, has had an element of technological determinism to it, suggesting that users of Facebook are at the mercy of the algorithm. In fact, our research shows that the actions of users themselves are still a very important element in the way that Facebook gets used.<\/p>\n<p><a href=\"http:\/\/www.palgrave.com\/us\/book\/9781137029300\">Our research<\/a> has been looking specifically at how people\u2019s actions create the context of the space in which they communicate. Just as important as the algorithm is how people use the site and shape it around their own communications. 
We\u2019ve found an overwhelming view among users that Facebook is not ideally suited to political debate, and that posts and interactions should be kept trivial and light-hearted.<\/p>\n<figure class=\"align-center \"><img decoding=\"async\" src=\"https:\/\/62e528761d0685343e1c-f3d1b99a743ffa4142d9d7f1978d9686.ssl.cf2.rackcdn.com\/files\/148625\/width754\/image-20161205-19414-1g0buo4.jpg\" alt=\"\" \/><figcaption><span class=\"caption\">Shutting down conversation.\u00a0<\/span><span class=\"attribution\"><span class=\"source\">Shutterstock<\/span><\/span><\/figcaption><\/figure>\n<p>This isn\u2019t to say that people don\u2019t express political opinions on Facebook. But for many people there\u2019s a reluctance to engage in discussion, and a sense that anything that might be contentious is better handled in face-to-face conversation. People report that they fear the online context will lead to misunderstandings because written communication lacks some of the non-linguistic cues of spoken communication, such as tone of voice and facial expressions.<\/p>\n<p>There\u2019s strong evidence in our research that people are actually exposed to a great deal of diversity through Facebook. This is because their network includes people from all parts of their life, a finding that <a href=\"http:\/\/poq.oxfordjournals.org\/content\/early\/2016\/03\/21\/poq.nfw006.short\">echoes other research<\/a>. In this respect, the algorithm doesn\u2019t have a marked influence on the creation of filter bubbles. But because they often want to avoid conflict, people report ignoring or blocking posts, or even unfriending people, when confronted with views with which they strongly disagree.<\/p>\n<p>They also report taking care over what they say so as not to antagonise people such as family members or work colleagues whose views differ from theirs, but whose friendship they wish to maintain. 
And finally, they talk of making a particular effort to put forward a positive persona on social media, which again stops them from engaging in debate that might lead to argument.<\/p>\n<h2>Not so easy to fix<\/h2>\n<p>The idea that algorithms are responsible for filter bubbles suggests the problem should be easy to fix (by getting rid of the algorithms), which <a href=\"http:\/\/www.scientificamerican.com\/article\/facebook-s-problem-is-more-complicated-than-fake-news\/\">makes it an appealing explanation<\/a>. But this perspective ignores the part played by users themselves, who effectively create their own filter bubbles by withdrawing from political discussions and hiding opinions they disagree with.<\/p>\n<p>This isn\u2019t done with the intention of sifting out diversity but is instead due to a complex mix of factors. These include the perceived purpose of Facebook, how users want to present themselves in an effectively public forum, and how responsible they feel for the diverse ties that make up their online network.<\/p>\n<p>The fact that manipulation by the algorithm isn\u2019t the only issue here means that other solutions, for example raising people\u2019s awareness of the possible consequences of their online actions, can help encourage debate. 
We have to recognise that the impact of technology comes not just from the innovations themselves but also from how we use them, and that solutions have to come from us as well.<\/p>\n<p><a href=\"https:\/\/theconversation.com\/profiles\/philip-seargeant-317748\">Philip Seargeant<\/a>, Senior Lecturer in Applied Linguistics, <em><a href=\"http:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em> and <a href=\"https:\/\/theconversation.com\/profiles\/caroline-tagg-317557\">Caroline Tagg<\/a>, Lecturer in Applied Linguistics and English Language, <em><a href=\"http:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em><\/p>\n<p>This article was originally published on <a href=\"http:\/\/theconversation.com\">The Conversation<\/a>. Read the <a href=\"https:\/\/theconversation.com\/the-filter-bubble-isnt-just-facebooks-fault-its-yours-69664\">original article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Following the shock results of Brexit and the Trump victory, a lot of attention has focused on the role that Facebook might have played in creating online political ghettos in which false news can easily spread. 
Facebook now has serious political influence thanks to its development from a social networking tool into a primary source [&hellip;]<\/p>\n","protected":false},"author":19,"featured_media":4714,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[10],"tags":[855,866,890,1326,2055],"class_list":["post-4713","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-languages","tag-facebook","tag-false-news","tag-filtering","tag-linguistics","tag-social-media"],"_links":{"self":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts\/4713","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/users\/19"}],"replies":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/comments?post=4713"}],"version-history":[{"count":0,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts\/4713\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/media\/4714"}],"wp:attachment":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/media?parent=4713"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/categories?post=4713"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/tags?post=4713"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}