{"id":2567,"date":"2021-07-15T18:15:37","date_gmt":"2021-07-15T18:15:37","guid":{"rendered":"https:\/\/www.open.ac.uk\/blogs\/opentel\/?p=2567"},"modified":"2021-07-16T18:25:31","modified_gmt":"2021-07-16T18:25:31","slug":"can-computers-detect-social-bias","status":"publish","type":"post","link":"https:\/\/www.open.ac.uk\/blogs\/opentel\/can-computers-detect-social-bias\/","title":{"rendered":"Can computers detect social bias?"},"content":{"rendered":"<p>The first 2021 seminar of the Special Interest Group focused on Artificial Intelligence in Education (openAIED SIG) was led by <strong>Josmario Albuquerque<\/strong>, a second-year PhD student at the OU Institute of Educational Technology, on Wednesday 7<sup>th<\/sup> July.<\/p>\n<p>With a background in Computer Science, Josmario has been involved in IT projects related to Artificial Intelligence in Education, Learning Analytics, and<strong>\u00a0the use of Computer Science to address social issues<\/strong>. His current research builds on these projects: he is studying group biases in online learning settings. During the seminar, Josmario suggested that <strong>human biases<\/strong> and <strong>stereotypes<\/strong> <strong>are still present in educational settings<\/strong>, undermining several aspects of learning.<!--more--><\/p>\n<p>It is necessary to detect the social-justice issues that <strong>can jeopardise<\/strong>\u00a0<strong>students&#8217; academic performance<\/strong>, confidence,\u00a0<strong>mental health\u00a0<\/strong>and engagement when learning online. Hence, he is investigating technologies that can help uncover potential group biases within online education.<\/p>\n<p>He referred to\u00a0<strong>group bias<\/strong>\u00a0as\u00a0<strong>any inclination or favouritism<\/strong>\u00a0towards people belonging to particular groups (Greenwald &amp; Krieger, 2006). 
In online learning, external and internal bias can occur when designing a course or giving feedback to students. However, such favouritism needs to be identified before it can be addressed appropriately.<\/p>\n<p>Numerous approaches have been used to detect group bias in other fields, such as self-reports, psychological experiments and discourse analysis techniques. <strong>In Josmario&#8217;s study<\/strong>, <strong>computational bias-detection approaches<\/strong> were <strong>selected<\/strong>, <strong>replicated<\/strong> and <strong>tested<\/strong> to <strong>uncover potential group biases in learning materials<\/strong> presented on an online learning platform.<\/p>\n<div id=\"attachment_2568\" style=\"width: 1086px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2568\" class=\"wp-image-2568 size-full\" src=\"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture.png\" alt=\"\" width=\"1076\" height=\"603\" srcset=\"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture.png 1076w, https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture-300x168.png 300w, https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture-768x430.png 768w, https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture-1024x574.png 1024w, https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-content\/uploads\/2021\/07\/josmario-blog-picture-500x280.png 500w\" sizes=\"auto, (max-width: 1076px) 100vw, 1076px\" \/><\/a><p id=\"caption-attachment-2568\" class=\"wp-caption-text\">Josmario&#8217;s Study Overview<\/p><\/div>\n<p>After conducting a thorough literature review that included 84 studies, he identified <strong>two 
computational approaches<\/strong> that could potentially detect biases in text. The former approach focused on\u00a0<strong>gender bias<\/strong>\u00a0and the latter on\u00a0<strong>subjective language<\/strong>. These two bias-detection algorithms are typically applied to job advertisements and to Wikipedia articles, respectively.<\/p>\n<p>He then used the two algorithms to classify 2,024 sentences from 41 online courses across eight disciplines. While both approaches flagged potential biases within the modules analysed, the extent to which those biases are relevant in an educational setting is questionable.<\/p>\n<p>One reason for questioning the outcomes of the two approaches was the lack of context, which made it hard to judge the relevance of each flagged instance. Josmario said:<\/p>\n<blockquote><p><em><strong>Context matters<\/strong>. Regarding the gender approach, for example, I don\u2019t think its inferences are accurate as it was designed for job advertisements. Anyway, I think that without context, it will be hard to tell which approach is doing well or not.<\/em><\/p><\/blockquote>\n<p>He also pointed out that <strong>automatic identification of group bias in educational texts is scarce<\/strong>. Similarly, none of the approaches reviewed in the literature focused on <strong>racial discrimination<\/strong>. He concluded his presentation by stating:<\/p>\n<blockquote><p><em>The use of technology for identifying potential biases in online learning seems promising for fairer educational settings. Future work may benefit from accounting for context-sensitive biases and focusing on underrepresented populations, particularly individuals from black and minority ethnic groups.<\/em><\/p><\/blockquote>\n<p>Josmario&#8217;s presentation led to <strong>a thought-provoking discussion<\/strong> afterwards. Attendees also reflected on their own practice at work. One of them said: &#8220;<em>There is a lot of <strong>power in the words<\/strong> and the connotations of those words we use when communicating a message on a daily basis&#8221;.<\/em> Therefore, it is essential to acknowledge and address those potential biases to improve not only the development of learning materials but also the performance and well-being of students learning online.<\/p>\n<p>If you would like to keep up to date with the openAIED SIG agenda, don&#8217;t hesitate to get in touch with us to add your name to the openAIED mailing list.<\/p>\n<p>Greenwald, A. G., &amp; Krieger, L. H. (2006). 
Implicit bias: Scientific foundations.\u00a0<em>California law review<\/em>,\u00a0<em>94<\/em>(4), 945-967.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The first 2021 seminar of the Special Interest Group focused on Artificial Intelligence in Education (openAIED SIG) was led by Josmario Albuquerque, a second-year PhD student at the OU Institute [&hellip;]<\/p>\n","protected":false},"author":14,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_EventAllDay":false,"_EventTimezone":"","_EventStartDate":"","_EventEndDate":"","_EventStartDateUTC":"","_EventEndDateUTC":"","_EventShowMap":false,"_EventShowMapLink":false,"_EventURL":"","_EventCost":"","_EventCostDescription":"","_EventCurrencySymbol":"","_EventCurrencyCode":"","_EventCurrencyPosition":"","_EventDateTimeSeparator":"","_EventTimeRangeSeparator":"","_EventOrganizerID":[],"_EventVenueID":0,"_OrganizerEmail":"","_OrganizerPhone":"","_OrganizerWebsite":"","_VenueAddress":"","_VenueCity":"","_VenueCountry":"","_VenueProvince":"","_VenueZip":"","_VenuePhone":"","_VenueURL":"","_VenueStateProvince":"","_VenueLat":"","_VenueLng":"","footnotes":""},"categories":[7,1,12],"tags":[],"class_list":["post-2567","post","type-post","status-publish","format-standard","hentry","category-events","category-generic","category-openaied"],"_links":{"self":[{"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/posts\/2567","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/users\/14"}],"replies":[{"embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/comments?post=2567"}],"version-history":[{"count":7,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel
\/wp-json\/wp\/v2\/posts\/2567\/revisions"}],"predecessor-version":[{"id":2575,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/posts\/2567\/revisions\/2575"}],"wp:attachment":[{"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/media?parent=2567"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/categories?post=2567"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.open.ac.uk\/blogs\/opentel\/wp-json\/wp\/v2\/tags?post=2567"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}