
Posts by TechTonic Justice

Text reads, “North Star Data Center Policy Toolkit | Policy in Practice Training Series AI Now | April 22, 1-2 PM ET | The AI Industry, Data Center Buildout, and How to Take Power Back. | Dwaign Tyndal, Executive Director, Alternatives for Community and Environment | Andrea Reyes, Senior Organizing Strategist, TechTonic Justice | Ashley Nicole Leitka, Director, Department of Sovereignty and Self Determination, Honor The Earth”

Join the AI Now Institute's virtual Data Center Policy in Practice training series on 4/22, 1-2 PM ET, ft. "The AI Industry, Data Center Buildout, and How to Take Power Back." TTJ’s Senior Organizing Strategist Andrea Reyes will be there. Register: https://bit.ly/policy-in-practice-training-series

1 day ago 2 0 0 0
Flyer for 'Rebuilding Tomorrow: A Blueprint to Resist AI Injustice' event featuring speakers Kevin De Liban and Kristen Hessler at UAlbany on April 22, 2026. There is a QR code to register.

Event flyer for AI Roundtable on public values, equity, and accountability, hosted by Center for Technology in Government at University at Albany.

Join TechTonic Justice’s founder Kevin De Liban and Philosophy Professor Kristen Hessler on Wednesday, April 22nd, from 10 AM - 11 AM in the Rockefeller College Levitt Room (Milne 120) to learn how AI is being used to inflict injustice and what you can do to fight back. Register today using the QR code.

6 days ago 1 0 0 0
The text reads “TechTonic x Upturn | Toward Justice in Technology Present: The Impending AI-Driven Medicaid Cuts: What to Know and How to Prepare. Join us on Saturday, April 18th, at the Take Back Tech Conference”

Join TTJ and Upturn at the Take Back Tech Conference on Saturday, April 18, for our session on “The Impending AI-Driven Medicaid Cuts: What to Know and How to Prepare.” We’ll break down what’s coming, what’s at stake, and how you can take action.

1 week ago 0 2 1 0
Post image

Street Cred Education Consultants, Inc., @techtonicjustice.bsky.social, The American Dream Housing Counseling Agency, The Beloved Foundation/ Beloved Couture Bridal, and Three Valleys Community Foundation!

1 week ago 1 1 0 0
A large phone shines light on symbols of crime and protest, with text stating censorship extends beyond screens.

Censorship extends beyond our screens. In 2026, AI, surveillance, and our data are being used to suppress dissent in real life: protests monitored, campus organizing investigated, people flagged and targeted. Speaking up in public should be protected, not punished.

2 weeks ago 1 1 0 0
Post image

On #TDOV, TTJ calls for trans futures built on care, safety, healthcare, and freedom from AI that denies, flags, surveils, or erases them. Trans joy and dignity are required for a better future for everyone.

3 weeks ago 4 2 0 0
Post image

AI-based cuts harm people seeking Medicaid home care by delaying support and increasing denials. You and your loved ones pay the price when disabled people and elders lose care, and women are forced to fill the gap. People deserve fully funded care.

3 weeks ago 7 3 0 1
Two people facing in opposite directions behind text that reads, “People deserve systems that do not punish them by gender.”

1 month ago 0 0 0 0
Text reads, “Facial recognition: AI misidentifies women, especially women of color, at higher rates. Dr. Joy Buolamwini, a Black woman, found that some systems failed to detect her face until she wore a white mask.”

Text reads, “Facial recognition: NIST, the National Institute of Standards and Technology, also found higher false-positive rates among women, meaning they were more likely than men to be wrongly matched to someone else.”

Text reads, “Voice Recognition: AI tools often perform worse for women, making them more likely to be misheard by voice assistants, transcription services, voice authentication systems, and call centers.”

Text reads, “Disability Justice: A 2017 study found that YouTube auto-captions were less accurate for women than men.”

1 month ago 0 0 1 0
Two people facing in opposite directions behind text that reads, “Algorithms aren’t neutral, women are paying the price.”

Text reads, “Healthcare: AI can further a system that already downplays women’s pain and symptoms, leading to delayed diagnoses and worse treatment. These AI tools often lead to worse health outcomes for Black and brown women.”

Text reads, “Career advancement: In a 2025 Brookings study, women’s names were favored in resume screenings only 11% of the time, compared with 51.9% for men.”

Text reads, “Credit: Until 1974, women didn’t have access to credit. The gaps in data from women lead to women facing higher rates of being denied credit or receiving lower credit limits.”

Want a dose of alternate reality? Look no further than the gender divide. AI tools use biased data that makes it harder for you to have equal access in hiring, healthcare, credit, face and voice recognition systems, and other systems that shape everyday life. Swipe through, read the stats, & share.

1 month ago 4 2 1 0
Text reads, “People aren’t switches. Our bodies are not 0/1. Real life doesn’t fit neatly into a form.” The text above shows a series of 0s and 1s slowly being covered by a bunch of small, green, glitchy blocks. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

Text reads, “Women feel it. Disabled people feel it. Trans and gender-nonconforming people feel it. And the people pushed to the margins feel it first and hardest.” The image above shows a woman, a nonbinary person, and a disabled person with an elbow crutch facing the viewer and smiling. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

Text reads, “A system can call itself ‘objective’ and still enforce old barriers. TechTonic Justice pushes back on systems that treat identity like an error message. Check out our resource, Tips for Identifying AI Use.” The image above shows a hand in the top left corner reaching down for a hand reaching out from a cracked phone at the bottom center of the image. In the bottom-right corner of the graphic is the TechTonic Justice logo.

1 month ago 0 0 0 0
Text reads, “AI Isn’t Gender Neutral: A Women’s History Month explainer.” The image shows a database with a series of 1s and 0s next to “approved” or “denied” decisions for redacted people files. The last entry has a 2 and a cursor where a decision should be made. Around the database are checkboxes with green checkmarks, and one checkbox in the bottom-right corner with a red question mark. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

Text reads, “A system mismatch can block access on the spot. That’s not neutral and can put someone in a situation where they go without needed care.” The image above is a quadrant of four images: a phone with a sale ad, a benefits card in jagged pieces as if cut up, a computer, and a photo with a black bar over it that says “CENSORED.” In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

Text reads, “If you don’t match what the system expects, you won’t get to receive what you need to survive.” The image above shows a gender form section, and it’s a required field. There are two options, “Male” and “Female,” with no other options. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

Text reads, “Across healthcare, education, housing, benefits, and employment, AI is used to make decisions people often can’t see, understand, or easily appeal.” The image above shows a black box in the center with a blue question mark. There are five dotted lines branching out on different sides of the black box, leading to a stethoscope, a backpack, a home, a benefits card, and a briefcase. In the bottom-right corner of the graphic are a green arrow and the TechTonic Justice logo.

AI isn’t gender neutral. It shows up when benefits systems block people over “mismatches,” when automation flags “fraud,” and when hiring tools inherit old assumptions about who counts. Check out TTJ’s Tips for Identifying AI Use: www.techtonicjustice.org/resources

1 month ago 3 2 1 0
Post image

Clarence Okoh (TTJ) breaks down what AI weapons detection can truly do in schools: more surveillance, false alarms, and harm for students. We all want safe schools, but not at that cost. undark.org/2026/02/13/a...

1 month ago 2 1 0 0
Attending a Protest For quick reference, we've created a handy guide designed to be printed, folded, and carried in your pocket (PDF download). Now, more than ever, citizens must be able to hold those in power acco...

For more on digital safety, check out this guide from the Electronic Frontier Foundation: ssd.eff.org/module/atten...

1 month ago 0 0 0 0
Post image

Freedom isn’t just a vote. It’s being able to show up, organize, and live without your phone turning into a tracking device. ICE is scaling surveillance through private vendors, widening the net to target citizens and non-citizens, including at protests.

1 month ago 0 0 1 0
Post image

The “New Jim Code” (Ruha Benjamin) names how “neutral” tech can reinforce racial inequity under the banner of efficiency. So the question is: Who benefits, and who pays the price? Head to our website to learn more.

1 month ago 0 0 0 0
Post image

At TechTonic Justice, we’re uplifting the legacy of Rev. Jesse Jackson, who passed away earlier this week. A lion of justice movements in the U.S. and abroad, his life reminds us that systems do not change on their own. It’s up to us to make the systems better.

2 months ago 2 0 0 0
Black Tech Town Hall: AI, Algorithms, And The Costs To Our Future · Color Of Change Color Of Change invites you to join our Black Tech Town Hall: AI, Algorithms, And The Costs To Our Future on Wednesday, February 25, 2026, at 8:00 PM ET via Zoom. Join us this Black history ...

As AI gets woven into daily life, Black communities are paying the price—sign up to learn more.
www.mobilize.us/colorofchang...

2 months ago 1 0 0 0
Post image

We’re honored to have TTJ’s Civil Rights and Technology Attorney, Clarence Okoh, join Color Of Change’s Black Tech Town Hall: AI, Algorithms, and the Costs to Our Future on Wednesday, Feb. 25, 2026, at 8:00 PM ET on Zoom.

2 months ago 2 1 1 0
Post image

We’re looking for new additions to our team of Do-gooders this year as we expand and focus our work on key states in 2026. There are two positions in California and one in Arkansas. Come join us! More details below 🧵

Link to apply in bio. Deadline to apply is March 1st.

2 months ago 2 1 0 0
Picture of three warehouse workers. Overlaying the image are three semi-transparent red hearts with white sparkles. At the top of the image in white text with blue shadow are words that state, "employers love AI"

New workplace metrics just dropped: clicks, pauses, mouse wiggles, break time

A slow day becomes a suspicion. A quiet hour becomes a warning.

Because nothing says “we value our team” like tracking your every movement. That’s not management; it’s a threat to keep you in line... 💔

2 months ago 1 1 0 0

This relationship is toxic. Break up with the tools that are fueling authoritarianism in our communities. 🙅

2 months ago 0 0 0 0

Authoritarians love AI because AI makes it easy to hunt down and punish people fighting for democracy. It fuels disinformation to inundate, isolate, and confuse you. It crushes dissent. And it allows them to rule without explanation or accountability.

2 months ago 0 0 1 0
Picture of a man wearing a President Trump mask. Overlaying it are three semi-transparent red hearts and two white sparkles. At the top, in white text with a blue shadow, it states "Authoritarians Love AI"

This tax season, your dollars are doing the most 💸 Not for childcare. Not for healthcare. Not for the safety net or your retirement. They’re using your dollars to build undemocratic systems that track dissent, give bonuses to kidnappers, and throw immigrants into inhumane warehouses.

2 months ago 2 1 1 0
A man wearing glasses and a pink hoodie drinks from a cup with a straw.

If they loved you, they’d approve your care as fast as they approve their bonuses.

2 months ago 0 0 0 0

The lack of transparency in decision-making + over-reliance on auto-denials = record profits for insurance companies and delayed medical care for you. Your meds don’t arrive. Your procedure gets bumped. Your care hours get cut. Your bills stack up. Your body carries the stress.

2 months ago 0 0 1 0
Black-and-white image of a sick child in a hospital. Overlaying it are three red shaded hearts with white sparkles. At the top of the image is white text with a blue shadow stating "insurance companies love AI"

Coverage for them. Consequences for you. They cash the checks. The AI says “denied.” You cover the fallout.

Their AI prior authorization systems don’t help you be healthier. They overrule your own doctors in deciding what you need. 💔

2 months ago 0 0 1 0
A person is speaking at a podium next to a banner displaying "Georgetown University Walsh School of Foreign Service." The podium has a microphone and a glass of water. The setting appears to be a formal lecture or presentation.

A person stands at a podium giving a presentation titled "Rebuilding Tomorrow" in a wood-paneled room. Attendees sit facing the presenter. Visible signage includes "Mortara Center for International Studies" and "SFS."

A person standing and speaking passionately in a wood-paneled room with seated attendees.

A group of people seated in a wood-paneled room, attentively listening.

How can we resist AI injustice? This week, the STIA program hosted the annual Loewy Lecture featuring Kevin De Liban, founder & president of @techtonicjustice.bsky.social. He discussed how AI weakens public systems & harms marginalized communities, urging for accountability & collective action.

2 months ago 1 1 0 0

Step one: hype.
Step two: harm.

AI isn’t neutral; it’s a smoke machine. It scales decisions that squeeze workers, renters, and patients, then makes the decision-maker disappear. No rules you can read. No score you can question. No person to answer for it. ❌

2 months ago 1 0 0 0
A picture of Jeff Bezos, Elon Musk, and Sundar Pichai. On top of the image are transparent hearts and sparkles with a caption in blue shadow and white text stating "Tech Billionaires Love AI"

Move fast, break wages, bad bots do it well ❣️ The algorithm catches the blame. The billionaires keep the bonus.

2 months ago 1 1 1 0