<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Bias on AI and Society Course</title><link>https://msucerl.org/cmse101/tags/bias/</link><description>Recent content in Bias on AI and Society Course</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Tue, 05 May 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://msucerl.org/cmse101/tags/bias/index.xml" rel="self" type="application/rss+xml"/><item><title>Week 3 Assignment: Bias Case Study Analysis</title><link>https://msucerl.org/cmse101/assignments/week-03/</link><pubDate>Tue, 05 May 2026 00:00:00 +0000</pubDate><guid>https://msucerl.org/cmse101/assignments/week-03/</guid><description>&lt;h2 id="week-3-assignment-bias-case-study-analysis"&gt;Week 3 Assignment: Bias Case Study Analysis&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Due:&lt;/strong&gt; End of Week 3 | &lt;strong&gt;Format:&lt;/strong&gt; Written analysis | &lt;strong&gt;Length:&lt;/strong&gt; 600-800 words&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="-assignment-overview"&gt;📝 Assignment Overview&lt;/h2&gt;
&lt;p&gt;Select a real-world case of algorithmic bias and conduct an in-depth analysis of what went wrong, why it happened, and what could have been done differently.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="-instructions"&gt;📋 Instructions&lt;/h2&gt;
&lt;h3 id="case-study-selection"&gt;Case Study Selection&lt;/h3&gt;
&lt;p&gt;Choose one of these cases or propose your own:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Amazon Hiring Algorithm&lt;/strong&gt; — Gender discrimination in recruitment&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;COMPAS Recidivism Tool&lt;/strong&gt; — Bias in criminal risk assessment&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Facial Recognition Bias&lt;/strong&gt; — Misidentification by race/gender&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Apple Card Algorithm&lt;/strong&gt; — Gender discrimination in credit decisions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Automated Resume Screening&lt;/strong&gt; — Discrimination by name/background&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;A case of your choosing&lt;/strong&gt; — Requires instructor approval&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="analysis-600-800-words"&gt;Analysis (600-800 words)&lt;/h3&gt;
&lt;p&gt;Address each section:&lt;/p&gt;</description></item><item><title>Week 3: Ethics &amp; Bias</title><link>https://msucerl.org/cmse101/readings/week-03/</link><pubDate>Tue, 05 May 2026 00:00:00 +0000</pubDate><guid>https://msucerl.org/cmse101/readings/week-03/</guid><description>&lt;h2 id="week-3-ethics--bias"&gt;Week 3: Ethics &amp;amp; Bias&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Focus:&lt;/strong&gt; Algorithmic bias and fairness&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="-required-readings"&gt;📚 Required Readings&lt;/h2&gt;
&lt;h3 id="primary-readings"&gt;Primary Readings&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;&amp;ldquo;Weapons of Math Destruction&amp;rdquo; — Chapter 1&lt;/strong&gt; (30 min)&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;How algorithms can amplify inequality&lt;/li&gt;
&lt;li&gt;Case studies of biased AI systems&lt;/li&gt;
&lt;li&gt;The impact on vulnerable communities&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;&amp;ldquo;Defining and Detecting Algorithmic Bias&amp;rdquo;&lt;/strong&gt; (25 min)&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Statistical definitions of fairness&lt;/li&gt;
&lt;li&gt;Types of bias: historical, measurement, representation, evaluation&lt;/li&gt;
&lt;li&gt;Methods for detecting and mitigating bias&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
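&lt;p&gt;&lt;em&gt;Optional illustration (toy data, not taken from the readings):&lt;/em&gt; one of the statistical fairness definitions the second reading covers, demographic parity, can be computed in a few lines. The groups and decisions below are invented for illustration only.&lt;/p&gt;

```python
# Sketch of the demographic parity gap, one statistical definition of
# fairness. All data here is made up for illustration.

def selection_rate(decisions):
    """Fraction of individuals who received the positive outcome (1)."""
    return sum(decisions) / len(decisions)

# Toy hiring-style decisions for two demographic groups.
group_a = [1, 1, 0, 1, 0, 1, 1, 0]   # 5 of 8 selected
group_b = [0, 1, 0, 0, 1, 0, 0, 0]   # 2 of 8 selected

rate_a = selection_rate(group_a)
rate_b = selection_rate(group_b)

# Demographic parity asks that these rates be (nearly) equal;
# a large gap is one signal of potential disparate impact.
gap = abs(rate_a - rate_b)
print(f"selection rate, group A: {rate_a:.3f}")   # 0.625
print(f"selection rate, group B: {rate_b:.3f}")   # 0.250
print(f"demographic parity gap:  {gap:.3f}")      # 0.375
```

&lt;p&gt;Demographic parity is only one of several competing fairness definitions discussed in the readings; satisfying one definition often means violating another.&lt;/p&gt;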
&lt;h3 id="supplementary-resources"&gt;Supplementary Resources&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;ProPublica: &amp;ldquo;Machine Bias&amp;rdquo; investigation — Interactive article&lt;/li&gt;
&lt;li&gt;&amp;ldquo;The Ethics of Artificial Intelligence&amp;rdquo; — Stanford Encyclopedia excerpt&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h2 id="-discussion-prompts"&gt;💭 Discussion Prompts&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;How can an AI system be mathematically &amp;ldquo;accurate&amp;rdquo; yet fundamentally unfair?&lt;/li&gt;
&lt;li&gt;When is some level of bias acceptable in AI systems?&lt;/li&gt;
&lt;li&gt;Who should be responsible for the harms caused by biased algorithms?&lt;/li&gt;
&lt;/ol&gt;
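&lt;p&gt;&lt;em&gt;A toy sketch for prompt 1 (invented numbers):&lt;/em&gt; a model can report the same overall accuracy for two groups while distributing its errors very differently between them — one way a mathematically &amp;ldquo;accurate&amp;rdquo; system can still be unfair.&lt;/p&gt;

```python
# Two groups, identical overall accuracy, very different error types.
# Counts are invented for illustration.

def rates(tp, fp, tn, fn):
    """Return (accuracy, false positive rate) from confusion counts."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    false_positive_rate = fp / (fp + tn)
    return accuracy, false_positive_rate

# Group A: errors are mostly missed positives (false negatives).
acc_a, fpr_a = rates(tp=40, fp=1, tn=49, fn=10)
# Group B: errors are mostly wrongful flags (false positives).
acc_b, fpr_b = rates(tp=49, fp=10, tn=40, fn=1)

print(f"group A: accuracy {acc_a:.2f}, FPR {fpr_a:.2f}")   # 0.89, 0.02
print(f"group B: accuracy {acc_b:.2f}, FPR {fpr_b:.2f}")   # 0.89, 0.20
```

&lt;p&gt;Both groups see 89% accuracy, yet group B is wrongly flagged ten times as often — the pattern at the center of the ProPublica &amp;ldquo;Machine Bias&amp;rdquo; investigation of COMPAS.&lt;/p&gt;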
&lt;hr&gt;
&lt;h2 id="-preparation-for-class"&gt;📝 Preparation for Class&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Read about a real-world case of algorithmic bias (e.g., Amazon hiring, COMPAS, facial recognition)&lt;/li&gt;
&lt;li&gt;Prepare a 2-minute summary: What went wrong? Who was harmed?&lt;/li&gt;
&lt;li&gt;Brainstorm: How could that system have been designed more fairly?&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h2 id="-related-assignment"&gt;🔗 Related Assignment&lt;/h2&gt;
&lt;p&gt;See &lt;a href="https://msucerl.org/cmse101/assignments/week-03/"&gt;Week 3 Assignment&lt;/a&gt; for this week&amp;rsquo;s task.&lt;/p&gt;</description></item></channel></rss>