Congratulations to our amazing PhD student Cathy Fang on being selected as an Apple Scholar!
Learn more about the award and Fang’s work: mitmedialab.info/43P8EZM or link in profile.
Posts by Fluid Interfaces
MIT Media Lab Research Highlight: Introducing "Human-AI Co-Dancing" by Cultural Computing: "Cyber Subin" demonstrates how traditional Thai dance knowledge can evolve through collaboration with generative virtual characters: www.media.mit.edu/projects/cyb...
@mitmedialab #AHAMediaLab
Crazy fact: our sleep enhancement study is currently collecting a year's worth of sleep data every ~2 days.
www.media.mit.edu/projects/sle...
The fact that we can run a sleep intervention study at that scale is pretty cool, and a testament to the very talented engineers who put this together.
On April 10, 2025, tune in to the livestream of the inaugural AHA Symposium @ahamedialab.bsky.social , where leading researchers and thinkers from industry, academia, and the nonprofit sector will ask the question: “Can we design AI to support human flourishing?”
www.media.mit.edu/events/aha-s...
@cbsnews.com talked to our PhD student Cathy Fang about our work @medialab.bsky.social together with the OpenAI team on a series of studies investigating how users’ wellbeing may be affected by uses of AI that involve emotional engagement.
Watch the interview: youtu.be/EaDD75YUguM?...
Researchers from the @mitmedialab and @OpenAI have conducted a series of studies to investigate how users’ wellbeing may be affected by uses of AI that involve emotional engagement.
Read more: www.media.mit.edu/posts/openai... or link in profile.
#AHAMediaLab @mitmedialab
New paper presented at Augmented Humans 2025!
“Resonance: Drawing from Memories to Imagine Positive Futures Through AI-Augmented Journaling” by Wazeer Zulfikar, Treyden Chiaravalloti, Jocelyn Shen, Rosalind Picard and Pattie Maes @medialab.bsky.social
#AHs2025 #mbzuai
Imagine having an AI assistant that helps you become more rational by detecting when arguments lack evidence. Our research team has developed a system that does exactly that! Follow our research on how AI can support human flourishing at: www.media.mit.edu/groups/aha/o...
#AHAMediaLab
We're thrilled to welcome Jaron Lanier, Prime Unifying Scientist at @Microsoft, to the AHA Symposium! In-person attendance is by invitation only, but join us via livestream at www.media.mit.edu/events/aha-s...
Connect with us on LinkedIn, Instagram, and X!
#AHAMediaLab
We are delighted to have Sherry Turkle @STurkle at the AHA Symposium on AI and Human Flourishing.
In-person attendance is by invitation only, but join us via livestream at www.media.mit.edu/events/aha-s...
Connect with us on LinkedIn, Instagram, and X!
#AHAMediaLab @medialab
The MIT Media Lab is excited to launch the multi-faculty research program "Advancing Humans with AI" (AHA) @medialab.bsky.social #AHA #MITMediaLab
www.media.mit.edu/groups/aha/o...
Congratulations! Pattie Maes, professor at @medialab and co-director of @AHA_MediaLab, receives the ACM SIGCHI @chi.acm.org Lifetime Research Award for pioneering the concept of Software Agents, which forms the foundation of AI Agents today.
@mitofficial.bsky.social #MITMediaLabAHA
@medialab.bsky.social announces a new research program on Advancing Humans with AI (AHA) @ahamedialab.bsky.social Learn more about the program and the upcoming symposium here: aha.media.mit.edu
In the cover story for 3D Printing and Additive Manufacturing, @medialab.bsky.social researchers and collaborators, including our PhD student Valdemar Danry, present Depthfusion, a model that transforms text or a 2D image into detailed 3D models.
www.media.mit.edu/projects/phy...
Image of all SIGCHI awardees for 2025
🎉 We're delighted to announce the recipients of the 2025 ACM SIGCHI Awards! Congratulations to these incredible awardees!
If you're interested in trying out the app, you can sign up here! mit.co1.qualtrics.com/jfe/form/SV_...
Our new study aims to test whether these improvements in sleep physiology translate to meaningful improvements in people’s sleep quality, mood, and cognitive sharpness. To do that, we’re aiming to recruit 1000 people to use the app on their own Galaxy Watch devices.
Excited to start testing our new watch app designed to boost slow waves and improve brain function! You can try it if you are at least 18, live in the US, and have a Galaxy Watch. You’ll be paid $20 after completing the study and can continue to use the app!
mit.co1.qualtrics.com/jfe/form/SV_...
Congratulations to all of the recipients of the 2025 ACM SIGCHI Awards, including our very own team lead Pattie Maes! Prof. Maes is one of two recipients of the ACM SIGCHI Lifetime Research Award, and she is also a member of the ACM SIGCHI Academy Class of 2025! @medialab.bsky.social
Also if you would like to participate in our experiment on using smart watches to improve sleep quality, you can sign up here! mit.co1.qualtrics.com/jfe/form/SV_...
@nathanww.bsky.social
Check out this interview with our research scientist Nathan Whitmore, and learn more about his work to boost sleep quality and memory using regular smartwatches.
alumcommunity.mit.edu/topics/23419...
Check out the latest @wsj.com article featuring our Future You system, co-led by @patpat.bsky.social @medialab.bsky.social: an interactive AI platform that lets users create a virtual older self, a chatbot that looks like an aged version of themselves: www.wsj.com/tech/ai/ai-t...
Paper title: Superficial Alignment, Subtle Divergence, and Nudge Sensitivity in LLM Decision-Making; Authors: Manuel Cherep*, Nikhil Singh*, and Pattie Maes
Excited to present our new paper on nudging LLMs (👉🤖) as a spotlight talk at the NeurIPS Behavioral ML Workshop! @neuripsconf.bsky.social
w/ Nikhil Singh* (@nikhilsinghmus.bsky.social) and Pattie Maes
🔗 openreview.net/forum?id=chb...
🧵 1/3
Congratulations to our PhD student @joaleong.bsky.social on successfully passing her PhD proposal critique on “Cultivating Learning Attitudes Using AI & AR” @medialab.bsky.social
Watch: Meet MIT’s remarkable community of hands-on problem-solvers. We combine knowledge in science, engineering, technology, humanities, and social sciences to tackle the most pressing problems facing the world today.
www.youtube.com/watch?v=1b7t...
Four dancers—three human, one computer-generated—on a dark stage, washed in golden light.
A dancer in a dramatic pose, his arms raised, lit by a bright spotlight that casts his shadow against the screen behind him. It displays several panels of computer code related to the choreography.
A human dancer, his left foot planted, his right foot raised, looks up at a large screen on which a computer-generated dancer strikes a similar pose.
Four human dancers move through a choreography. A computer-generated figure on the screen behind them dances with them, surrounded by menus that display options such as "Circle & Curve" and "Synchronic Limbs."
Cyber Subin, a project from @fluidinterfaces researcher Dr. Pat Pataranutaporn, explores ways of applying emerging technology to cultural preservation. fastcompany.com/91194605/heres…
Interested in applying to the Media Lab? Applications to the Program in Media Arts and Sciences (MAS) are open through Dec. 15, 2024! All MAS students begin at the master's level and can then apply to the PhD program during their second year of study. www.media.mit.edu/graduate-pro...
Media Lab Professor Pattie Maes, wearing a black top, stands in front of a gray background with a black border across the bottom.
Prof. Pattie Maes, head of the Media Lab’s Fluid Interfaces group, talks to MIT ILP about innovation, AI, and prioritizing people: “The Media Lab tends to emphasize the human and societal impact of emerging technologies.” ilp.mit.edu/read/Maes