{"id":21134,"date":"2022-05-15T12:33:20","date_gmt":"2022-05-15T11:33:20","guid":{"rendered":"https:\/\/ounews.co\/?p=21134"},"modified":"2022-05-15T12:33:20","modified_gmt":"2022-05-15T11:33:20","slug":"elon-musk-could-roll-back-social-media-moderation-just-as-were-learning-how-it-can-stop-misinformation","status":"publish","type":"post","link":"https:\/\/www.open.ac.uk\/blogs\/news\/arts-social-sciences\/society-politics\/elon-musk-could-roll-back-social-media-moderation-just-as-were-learning-how-it-can-stop-misinformation\/","title":{"rendered":"Elon Musk could roll back social media moderation &#8211; just as we&#8217;re learning how it can stop misinformation"},"content":{"rendered":"<p><span><a href=\"https:\/\/theconversation.com\/profiles\/harith-alani-893886\">Harith Alani<\/a>, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em>; <a href=\"https:\/\/theconversation.com\/profiles\/gregoire-burel-1343260\">Gr\u00e9goire Burel<\/a>, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em>, and <a href=\"https:\/\/theconversation.com\/profiles\/tracie-farrell-1138685\">Tracie Farrell<\/a>, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em><\/span><\/p>\n<p>The US$44 billion (\u00a336 billion) purchase of Twitter by \u201c<a href=\"https:\/\/www.theguardian.com\/technology\/2022\/apr\/14\/how-free-speech-absolutist-elon-musk-would-transform-twitter\">free speech absolutist<\/a>\u201d <a href=\"https:\/\/www.bbc.co.uk\/news\/business-61399483?at_medium=RSS&amp;at_campaign=KARANGA\">Elon Musk<\/a> has many people worried. 
The concern is the site will start <a href=\"https:\/\/www.washingtonpost.com\/technology\/2022\/05\/06\/twitter-harassment\">moderating content less<\/a> and <a href=\"https:\/\/www.reuters.com\/technology\/exclusive-twitter-set-accept-musks-best-final-offer-sources-2022-04-25\">spreading misinformation<\/a> more, especially after his announcement that he would reverse the former US president <a href=\"https:\/\/www.bbc.co.uk\/news\/business-61399483?at_medium=RSS&amp;at_campaign=KARANGA\">Donald Trump<\/a>\u2019s ban. <\/p>\n<p>There\u2019s good reason for the concern. Research shows the sharing of unreliable information can negatively affect the <a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/1369118X.2021.1874038\">civility of conversations<\/a>, perceptions of <a href=\"https:\/\/www.nature.com\/articles\/s44159-021-00006-y\">key social and political issues<\/a>, and people\u2019s <a href=\"https:\/\/www.apa.org\/pubs\/journals\/releases\/xap-xap0000371.pdf\">behaviour<\/a>.<\/p>\n<p><a href=\"https:\/\/firstdraftnews.org\/articles\/the-psychology-of-misinformation-why-its-so-hard-to-correct\/\">Research<\/a> also suggests that simply publishing accurate information to counter the false stuff in the hope that the truth will win out isn\u2019t enough. Other types of moderation are also needed. For example, <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0306457321002144\">our work<\/a> on social media misinformation during COVID showed it spread much more effectively than related fact-check articles. <\/p>\n<p>This implies some sort of moderation is always going to be needed to boost the spread of accurate information and enable factual content to prevail. 
And while moderation is hugely challenging and not always successful at stopping misinformation, we\u2019re learning more about what works as social media firms increase their efforts.<\/p>\n<p>During the pandemic, huge amounts of <a href=\"https:\/\/www.oecd-forum.org\/posts\/66752-managing-the-infodemic-a-critical-condition-for-an-effective-global-response-to-the-covid-19-pandemic\">misinformation<\/a> were shared, and false messages were <a href=\"https:\/\/www.who.int\/news-room\/spotlight\/let-s-flatten-the-infodemic-curve\">amplified<\/a> across all major platforms. The role of <a href=\"https:\/\/www.tandfonline.com\/doi\/pdf\/10.1080\/21645515.2021.1950504\">vaccine-related misinformation<\/a> in driving vaccine hesitancy, in particular, intensified the pressure on <a href=\"https:\/\/carnegieendowment.org\/2021\/10\/19\/disinformation-is-not-simply-content-moderation-issue-pub-85514\">social media companies<\/a> to do more moderation.<\/p>\n<p><a href=\"https:\/\/www.facebook.com\/journalismproject\/programs\/third-party-fact-checking\">Facebook<\/a>-owner Meta worked with factcheckers from more than 80 organisations during the pandemic to verify and report misinformation, before removing or reducing the distribution of posts. Meta claims to have <a href=\"https:\/\/about.fb.com\/news\/2021\/08\/taking-action-against-vaccine-misinformation-superspreaders\/\">removed<\/a> more than 3,000 accounts, pages and groups, and 20 million pieces of content, for breaking rules about COVID-19 and vaccine-related misinformation. <\/p>\n<p>Removal tends to be reserved for content that violates certain platform rules, such as showing prisoners of war or sharing fake and dangerous content. Labelling is for drawing attention to potentially unreliable content. 
The rules platforms follow in each case are not set in stone, nor are they very transparent.<\/p>\n<p>Twitter has published policies to highlight its approach to reducing misinformation, for example with regard to <a href=\"https:\/\/help.twitter.com\/en\/rules-and-policies\/medical-misinformation-policy\">COVID<\/a> or <a href=\"https:\/\/help.twitter.com\/en\/rules-and-policies\/manipulated-media\">manipulated media<\/a>. However, when such policies are enforced, and how strongly, is <a href=\"https:\/\/www.bloomberg.com\/news\/newsletters\/2022-03-03\/why-facebook-twitter-youtube-acted-so-fast-after-russia-s-ukraine-invasion\">difficult to determine<\/a> and seems to vary significantly from one context to another.<\/p>\n<h2>Why moderation is so hard<\/h2>\n<p>But clearly, if the goal of moderating misinformation was to reduce the spread of false claims, social media companies\u2019 efforts were not entirely effective in <a href=\"https:\/\/canucklaw.ca\/wp-content\/uploads\/2020\/09\/covid.related.misinformation.on_.youtube.pdf\">reducing<\/a> the amount of misinformation about COVID-19. <\/p>\n<p>At the <a href=\"http:\/\/kmi.open.ac.uk\/\">Knowledge Media Institute<\/a> at <a href=\"http:\/\/open.ac.uk\">the Open University<\/a>, we have been studying how both misinformation and corresponding fact checks spread on Twitter since 2016. Our research <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0306457321002144\">on COVID<\/a> found that during the pandemic, fact checks appeared relatively quickly after the misinformation they addressed. But the relationship between the appearance of fact checks and the spread of misinformation in the study was less clear. <\/p>\n<p>The study indicated that misinformation was twice as prevalent as the corresponding fact checks. 
In addition, misinformation about conspiracy theories was persistent, which meshes with previous research arguing that truthfulness is only one reason <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S1364661321000516\">why people share information<\/a> online and that fact checks are <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3269541\">not always convincing<\/a>.<\/p>\n<p>So how can we improve moderation? Social media sites face numerous challenges. Users banned from one platform can still come back with a new account, or resurrect their profile on another platform. Spreaders of misinformation use tactics to avoid detection, for example by <a href=\"https:\/\/www.nbcnews.com\/tech\/tech-news\/anti-vaccine-groups-changing-dance-parties-facebook-avoid-detection-rcna1480\">using euphemisms<\/a> or <a href=\"https:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC7548543\/\">visuals<\/a>. <\/p>\n<p>Automated approaches using machine learning and artificial intelligence are <a href=\"https:\/\/direct.mit.edu\/tacl\/article\/doi\/10.1162\/tacl_a_00454\/109469\/A-Survey-on-Automated-Fact-Checking\">not sophisticated enough<\/a> to detect misinformation very accurately. They often suffer from biases, a lack of appropriate training data, over-reliance on the English language, and difficulty handling misinformation in images, video or audio.<\/p>\n<h2>Different approaches<\/h2>\n<p>But we also know some techniques can be effective. For example, research has shown that using simple prompts to encourage users to <a href=\"https:\/\/www.nature.com\/articles\/s41586-021-03344-2\">think about accuracy<\/a> before sharing can reduce people\u2019s intention to share misinformation online (in laboratory settings, at least). 
Twitter has previously said it has found that labelling content as misleading or fabricated can <a href=\"https:\/\/www.wsj.com\/articles\/twitter-says-labels-and-warnings-slowed-spread-of-false-election-claims-11605214925\">slow the spread<\/a> of some misinformation.<\/p>\n<p>More recently, Twitter announced <a href=\"https:\/\/blog.twitter.com\/en_us\/topics\/company\/2022\/our-ongoing-approach-to-the-war-in-ukraine\">a new approach<\/a>, introducing measures to address misinformation related to the Russian invasion of Ukraine. These include adding labels to tweets sharing links to Russian state-affiliated media websites. It also reduced the circulation of this content, as well as improving its vigilance against hacked accounts.<\/p>\n<p>Twitter is employing people as curators to write notes giving context on war-related Twitter trends, explaining why topics are trending. Twitter <a href=\"https:\/\/blog.twitter.com\/en_us\/topics\/company\/2022\/our-ongoing-approach-to-the-war-in-ukraine\">claims<\/a> to have removed 100,000 accounts since the Ukraine war started that were in \u201cviolation of its platform manipulation strategy\u201d. It also says it has labelled or removed 50,000 pieces of Ukraine war-related content.<\/p>\n<p>In some as-yet unpublished research, we performed the same analysis we did for COVID-19, this time on over 3,400 claims about the Russian invasion of Ukraine, then monitored tweets related to that misinformation, along with tweets that had fact checks attached. We started to observe different patterns. <\/p>\n<p>We did notice a change in the spread of misinformation, in that the false claims appear not to be spreading as widely, and are being removed more quickly, compared to previous scenarios. 
It\u2019s early days but one possible explanation is that the latest measures have had some effect.<\/p>\n<p>If Twitter has found a useful set of interventions, becoming bolder and more effective in curating and labelling content, this could serve as a model for other social media platforms. It could at least offer a glimpse into the type of actions needed to boost fact-checking and curb misinformation. But it also makes Musk\u2019s purchase of the site and the implication that he will reduce moderation even more worrying.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/counter.theconversation.com\/content\/182261\/count.gif?distributor=republish-lightbox-basic\" alt=\"The Conversation\" width=\"1\" height=\"1\" style=\"border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important\" \/><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. 
More info: https:\/\/theconversation.com\/republishing-guidelines --><\/p>\n<p><span><a href=\"https:\/\/theconversation.com\/profiles\/harith-alani-893886\">Harith Alani<\/a>, Professor, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em>; <a href=\"https:\/\/theconversation.com\/profiles\/gregoire-burel-1343260\">Gr\u00e9goire Burel<\/a>, Research associate, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em>, and <a href=\"https:\/\/theconversation.com\/profiles\/tracie-farrell-1138685\">Tracie Farrell<\/a>, Research Associate &#8211; Web Science, <em><a href=\"https:\/\/theconversation.com\/institutions\/the-open-university-748\">The Open University<\/a><\/em><\/span><\/p>\n<p>This article is republished from <a href=\"https:\/\/theconversation.com\">The Conversation<\/a> under a Creative Commons license. Read the <a href=\"https:\/\/theconversation.com\/elon-musk-could-roll-back-social-media-moderation-just-as-were-learning-how-it-can-stop-misinformation-182261\">original article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Harith Alani, The Open University; Gr\u00e9goire Burel, The Open University, and Tracie Farrell, The Open University The US$44 billion (\u00a336 billion) purchase of Twitter by \u201cfree speech absolutist\u201d Elon Musk has many people worried. 
The concern is the site will start moderating content less and spreading misinformation more, especially after his announcement that he would [&hellip;]<\/p>\n","protected":false},"author":19,"featured_media":21136,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[15],"tags":[1525,1640],"class_list":["post-21134","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-society-politics","tag-news-home","tag-ou-home"],"_links":{"self":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts\/21134","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/users\/19"}],"replies":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/comments?post=21134"}],"version-history":[{"count":0,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/posts\/21134\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/media\/21136"}],"wp:attachment":[{"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/media?parent=21134"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/categories?post=21134"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/news\/wp-json\/wp\/v2\/tags?post=21134"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}