Lesson 5

We will continue to look into the second group of subtopic questions: Should our society allow AI to influence human decision-making or make decisions on its own? Why or why not? We will read an article from ProPublica that questions the reliability of machine learning in decision-making.

Lesson Goals

  • Can I analyze an article about the negative effects of machine learning?

Texts

Core

  • Unit Reader
    • “Machine Bias,” Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, ProPublica Inc., 2016

Optional

  • Digital Access
    • “Are Criminal Risk Assessment Scores Racist?,” Jennifer L. Doleac and Megan Stevenson, Brookings Institution, 2016
    • “COMPAS Risk Scales: Demonstrating Accuracy, Equity and Predictive Parity,” William Dieterich, Christina Mendoza, and Tim Brennan, Northpointe Inc. Research Department, 2016
    • “Ethics for Powerful Algorithms,” Abe Gong, Open Data Science, 2017
    • “False Positives, False Negatives, and False Analyses,” Anthony W. Flores, Kristen Bechtel, and Christopher Lowenkamp, Federal Probation Journal, 2016
    • “ProPublica Is Wrong in Charging Racial Bias in an Algorithm,” Chuck Dinerstein, American Council on Science and Health, 2018

Materials

Tools

  • Question Sets (editable Google Docs)

Activity 1: Read – Discuss

We will discuss our annotations and thoughts from reading “Machine Bias.”

With a partner, compare your annotations and interpretations of the authors’ purpose and claims. Look at the “about” page on the ProPublica website.

Discuss whether anything about working for ProPublica might influence the journalists’ perspective or ideas.

Activity 2: Read – Discuss – Write

We will analyze the article “Machine Bias.”

With a partner, consider the following questions. Then use the Attending to Details Tool to write down your response to the guiding question your teacher designates for you and your partner.

  1. The authors state that risk assessments "are increasingly common in courtrooms across the nation." What are risk assessments, and how are they used?

  2. What were some of the findings of ProPublica’s study of risk scores? What was the sample size?

  3. The article describes how the company Northpointe creates the risk assessment score. What are the details of this process?

  4. The authors state that forecasting criminal risk has made a comeback. Why?

  5. What evidence do the authors present to support their opening claim: "There’s software used across the country to predict future criminals. And it’s biased against Blacks"?

Join with another partner group. Compare your answers.

Share your findings with the whole class.

Activity 3: Read

We will read counterarguments in response to “Machine Bias.”

With a partner, skim each of the five online articles that respond to ProPublica’s "Machine Bias" study, noting the headings, subheadings, and bolded words.

Identify one article out of the five that you will read closely and analyze.

Activity 4: Read – Write

For homework, we will read our chosen articles.

For homework, read your chosen article, focusing on the following guiding questions:

  1. How does the article respond to and dispute "Machine Bias"?

  2. What are the counterarguments it presents?

  3. What kind of evidence does it use?

  4. Research the authors and publishers. What are their qualifications? Is there anything to suggest the authors might have a specific agenda or bias?

Write new or interesting words you encounter in your Vocabulary Journal.