Text reads, “AI Isn’t Gender Neutral: A Women’s History Month explainer.” The image shows a database with a series of 1s and 0s next to “approved” or “denied” decisions for files with redacted names. The last entry has a 2 and a cursor where a decision should be made. Around the database are checkboxes with green checkmarks, and one checkbox in the bottom-right corner with a red question mark. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.
Text reads, “A system mismatch can block access on the spot. That’s not neutral and can put someone in a situation where they go without needed care.” The image above is a four-quadrant grid: a phone with a sale ad, a benefits card in jagged pieces as if cut up, a computer, and a photo with a black bar over it that says “CENSORED.” In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.
Text reads, “If you don’t match what the system expects, you may not receive what you need to survive.” The image above shows a required gender field on a form. There are only two options, “Male” and “Female,” with no other choices. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.
Text reads, “Across healthcare, education, housing, benefits, and employment, AI is used to make decisions people often can’t see, understand, or easily appeal.” The image above shows a black box in the center with a blue question mark. There are five dotted lines branching out on different sides of the black box, leading to a stethoscope, a backpack, a home, a benefits card, and a briefcase. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.
AI isn’t gender neutral. It shows up when benefits systems block people over “mismatches,” when automation flags “fraud,” and when hiring tools inherit old assumptions about who counts. Check out TTJ’s Tips for Identifying AI Use: www.techtonicjustice.org/resources