Still more exciting @aera-motsig.bsky.social programming to come today and tomorrow! Check it out at the link below #AERA2026
Posts by Adam K. Dubé
Just a few years ago, Sal Khan was predicting that AI was poised to revolutionize education. But his experience launching an AI-powered tutor, Khanmigo, has been sobering, he says.
The hope that it would quickly become a super-tutor still seems a long way off.
www.chalkbeat.org/2026/04/09/s...
Working on a test to measure spatial thinking in middle school kids #CDS2026
Here's some of what @mileslab.bsky.social members will be up to Friday (tomorrow!) & Saturday at #AERA2026 --see you there! And don't miss @sanheeta.bsky.social presenting on Belonging Opportunity Structures bright & early on Friday: tinyurl.com/32p8bxxj
We’re hiring at McGill (ECP) — Assistant Professor in Human Development
Come work with us 🙂
Genuinely great group, collaborative culture, amazing students, and Canada!
The focus is on early childhood, but the search is open. Candidates must study development.
Happy to chat if you’re considering applying or know someone great.
Agreed! Here 👇, I argue that the AI tools students use are untested, broken products designed to maximize use, not learning. To me, thinking of AI as a product, to be marketed and sold, helps us see why these systems are not helping students.
Using cognitive science to design tools that promote learning, not offloading performance. Basing the responses on expert knowledge and pedagogical content knowledge. Learning from work on intelligent tutoring systems. All promising ways to design effective #GenAI
wapo.st/4ds9lND
This happened with educational games. Most games on the market are not well designed and do not work. But, researchers only study the good games and the ones we design. This created a disconnect between the promise we researchers see and the reality students and teachers live.
Researchers should build better tools for students, ones based on learning/cognitive sciences. But, we must equally critique and push for better versions of the products in their hands. This means addressing the reality that the tools we make are not the products students use.
Making LLMs useful will require identifying heuristics and fine-tuning; intelligent tutoring systems research shows us this is hard, requires deep expertise, and is hard to scale. All of this is antithetical to the value proposition general-purpose LLMs pose to companies.
To me, companies are disincentivized to spend the time and money to build better products. Historically, they promise personalized systems but build practice drills (see Alpha School).
God! Y’all need unions. The set rate is $11,800 per class for a course lecturer at McGill. That’s $8,450 USD.
“When it comes to the tutor side of things, critically, there’s very little to almost no research on the efficacy of these tools for elementary and high school students,”
Great piece. Here I similarly argue that EdTech is a product and not a neutral tool. Always interesting to see similar ideas come out at the same time.
Education technology is never neutral by @daisychristo.bsky.social open.substack.com/pub/daisychr...
A quick thread on a pair of articles in The New York Times that illustrate how important it is to ask better questions about technology than whether it is good or bad for people, students, etc. The first is by Ezra Klein and it focuses on how technology can change us without our realizing it. (1/n)
There are not "math people" and "not math people."
School math represents a sliver of mathematics as a discipline, and there is a lot more in there that the so-called "not math people" could really dig into and be good at.
Great thread! Also, one key aspect of EdTech is
Tech students use = commercial products from companies seeking to maximize use and not learning. See Grammarly’s AI-text humanizer for students. Market forces drive EdTech design and must be countered by policy. Tech is not a neutral tool.
People do not learn better when taught according to their “learning style”. Instead, learning improves when content is covered in a diversity of ways.
6/ We can demand better products or train people to compensate for bad ones
Right now, we’re doing the latter
5/ We’ve seen this before with social media
delay regulation → long-term harm
4/ Research often studies ideal systems
but students use commercial ones
3/ That’s why we see:
– offloading of thinking
– high error rates
– superficial outputs that look like learning
2/ These products are designed for engagement
not accuracy, not pedagogy, not learning
Most research on GenAI in education studies idealized tools.
Students interact with commercial products.
That gap is where the real problem is.
I unpack this here 👇
tlclab.owlstown.net/posts/5365
Check out this piece on GenAI in education and let me know what you think.
“In Education: GenAI is a Product first and a Tool Second”
Every GenAI system learners use is a product, and no tech company wins by designing their product to lower engagement.
tlclab.owlstown.net/posts/5365
For AI literacy day:
As researchers and educational leaders, we must change how we talk and think about GenAI in education. Like social media, GenAI is a consumer product intentionally designed to maximize use.
bsky.app/profile/edte...
We are repeating the same mistakes made with every new EdTech: brief studies, novelty effects, poor measurement, and a lack of methodological detail. What’s new? LLMs are positioned as general-purpose tutors but need heuristics to be useful. Heuristic AI tutors don’t scale = low market value.