
Posts by Evan Ackerman

Video Friday: Digit Learns to Deadlift

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA
RSS 2026: 13–17 July 2026, SYDNEY
Summer School on Multi-Robot Systems: 29 July–4 August 2026, PRAGUE

Enjoy today’s videos!

Training a policy for Digit to perform a deadlift isn’t just about Digit impressing colleagues–it lets us push the limits of our hardware and training methodologies. The heavier the object (in this case 65 pounds [29.5 kg]), the more whole-body coordination we need in our controller, and the more resilience Digit’s actuators and joints require. By including whatever object we want Digit to lift in simulation as we train a new policy, we’re able to account for load distribution, grip forces, and changes to Digit’s center of mass–the result is a policy that translates to a dynamically balanced lift in the real world.

New robot, you say...?

[ Agility ]

Gatlin Robotics is proud to unveil our first commercial, showcasing our robots in action for our debut Robot-as-a-Service (RaaS) contract!

[ Gatlin Robotics ]

Thanks, Erika!

At Dexterity, we build robots designed for precision, adaptability, and real-world problem solving. But every now and then, we like to remind ourselves (and everyone else) that motion intelligence isn’t just about efficiency—it can be expressive, fluid, even a little playful.

[ Dexterity ]

Harvard researchers built a swarm of simple ant-like robots (RAnts) that can collectively excavate and construct structures without central control. By tuning just two parameters—cooperation strength and material deposition rate—the same swarm can switch between building new structures and dismantling existing ones. Adaptive group behavior can emerge from the interaction between many simple agents and their environment, with potential applications in many fields.

[ Harvard University ]

I really appreciate companies who give their robots the ability to entertain themselves.

[ Generalist ]

“Spark of Color.” Manvi Saxena, Yihao Geng, Jason Brown, Daniel Newman, Cameron Aubin. A tiny controlled explosion inflates the soft membrane of a microcombustion actuator, sending colorful, carefully arranged water droplets skyward. The actuator measures just 8 mm in diameter, while the high-speed sequence captures only 3 milliseconds of motion. The work challenges the assumption that soft actuators must be slow or gentle, showing instead how softness can also be fast, forceful, and explosive.

[ Michigan Robotics ]

With the physique of an ordinary person, running at a world champion’s speed! I am questioning whether it knows how to stop.

[ Unitree Robotics ]

Aww

[ Boston Dynamics ]

In this episode of Innovator Story, the FotoBot team from The University of Hong Kong made an appearance and conducted on-site tests with their AI photography robot at Shenzhen Bay Talent Park. Relying on TRON 1, it easily handles complex terrains such as grasslands, slopes, and stairs, unlocking a brand-new “Robot + Photography” experience for the public.

[ LimX Dynamics ]

The objective of this game is to cover up as much of the hole as possible, right?

[ Kinetic Intelligent Machine Lab ]

MagicLab Robotics just deployed a massive swarm of robot dogs and humanoids at the Jiangsu Super League opening ceremony. Beyond a stunning spectacle, this is live proof of Embodied AI at scale. Coordinating a cross-category fleet in a complex, open-air environment proves our multi-agent control systems are ready for real-world deployment.

[ MagicLab ]

A swarm of drones being launched out of the back of a Chinook would be terrifying except that from this angle, it looks like the drones are being puked out by an astonished frog.

[ Boeing ]

Welcome to Robot Talk, from IHMC Robotics!

[ IHMC Robotics ]

Third-year Michigan Engineering undergrad Yulei Fu sits down with Professor Jessy Grizzle to talk about what it’s actually like to major in Robotics at the University of Michigan. What makes it different from CS or ME? Where do graduates end up? Are the courses brutal? And what makes the department feel like a community instead of a competition?

[ Michigan Robotics ]

This CMU RI Yata Memorial Lecture is by Boris Sofman, on “Journeys from Research to Commercialization: Lessons from Anki, Waymo, and Bedrock Robotics.” In this lecture, Boris will share an honest account of that journey and its lessons, including the energizing wins, the wrong turns and painful surprises, and the moments where an earlier experience turned out to matter more than expected. Closing with a deeper look at Bedrock, he will share why he believes autonomous construction is one of the most important problems robotics can tackle right now, driven by a unique convergence of maturing technology and critical industry need. For students at the beginning of their own paths, this is a talk about how a career in robotics and entrepreneurship might actually unfold, the many variables one navigates in the journey, and why the connections you cannot yet see may end up being the most valuable ones.

[ Carnegie Mellon University Robotics Institute ]

Video Friday: Digit Learns to Deadlift https://spectrum.ieee.org/robot-learning

20 hours ago
Boston Dynamics and Google DeepMind Teach Spot to Reason

The amazing and frustrating thing about robots is that they can do almost anything you want them to do, as long as you know how to ask properly. In the not-so-distant past, asking properly meant writing code, and while we’ve thankfully moved beyond that brittle constraint, there’s still an irritatingly inverse correlation between ease of use and complexity of task. AI has promised to change that. The idea is that when AI is embodied within robots—giving AI software a physical presence in the world—those robots will be imbued with reasoning and understanding. This is cutting-edge stuff, though, and while we’ve seen plenty of examples of embodied AI in a research context, finding applications where reasoning robots can provide reliable commercial value has not been easy.

Boston Dynamics is one of the few companies to commercially deploy legged robots at any appreciable scale; there are now several thousand hard at work. Today the company is announcing that its quadruped robot Spot is now equipped with Google DeepMind’s Gemini Robotics-ER 1.6, a high-level embodied reasoning model that brings usability and intelligence to complex tasks.

Although this video shows Spot in a home context, the focus of this partnership is on one of the very few applications where legged robots have proven themselves to be commercially viable: inspection. That is, wandering around industrial facilities, checking to make sure that nothing is imminently exploding. With the new AI onboard, Spot is now able to autonomously look for dangerous debris or spills, read complex gauges and sight glasses, and call on tools like vision-language-action models when it needs help understanding what’s going on in the environment around it.

“Advances like Gemini Robotics-ER 1.6 mark an important step toward robots that can better understand and operate in the physical world,” Marco da Silva, Vice President and General Manager of Spot at Boston Dynamics, says in a press release. “Capabilities like instrument reading and more reliable task reasoning will enable Spot to see, understand, and react to real-world challenges completely autonomously.”

Understanding Robot Understanding

The words “reasoning” and “understanding” are being increasingly applied to AI and robotics, but as Toyota Research Institute’s Gill Pratt recently pointed out, what those words actually mean for robots in practice isn’t always clear. “The benchmark we measure ourselves against when it comes to understanding is that the system should answer the way a human would,” Carolina Parada, Head of Robotics at Google DeepMind, explained in an interview.

For robots to reliably and safely perform tasks, this connection between how robots understand the world and how humans do is critical. Otherwise, there may be a disconnect between the instructions that a human gives a robot and how the robot decides to carry out that task. Boston Dynamics’ video above is a potentially messy example of this. One of the instructions to Spot was to “recycle any cans in the living room.” It has no problem completing the task, as the video shows, but in doing so it grips the can sideways, which is not going to end well for cans that have leftover liquid in them. We humans would avoid this because we can draw on a lifetime of experience to know how cans should be held, but robots don’t (yet) have that kind of world knowledge.

Parada says that Gemini Robotics-ER 1.6 approaches situations like this from a safety perspective. “If you ask the robot to bring you a cup of water, it will reason not to place it on the edge of a table where it could fall. We track this using our ASIMOV benchmark, which includes a whole lot of natural language examples of things the robot should not do.” The current version of Spot doesn’t use these semantic safety models for manipulation, but the plan is to make future versions reason about holding objects in ways that are safe.

There does still seem to be a disconnect between Gemini Robotics-ER 1.6 as a high-level reasoning model for a robot, and the robot itself as an interface with the physical world. One of the new features of 1.6 is success detection, which combines multiple camera angles to tell more reliably when Spot has successfully grasped an object. This is great if you’re relying entirely on vision for your object interaction, but robots have all kinds of other well-established ways to detect a successful grasp, including touch sensors and force sensors, that 1.6 is not using. The reason speaks to a fundamental problem that the robotics field is still trying to figure out: how to train models when you need physical data. “At the moment, these models are strictly vision only,” Parada explains. “There is lots of [visual] information on the web about how to pick up a pen. If we had enough data with touch information, we could easily learn it, but there is not a lot of data with touch sensing on the internet.” Customers who use these new capabilities for inspection with Spot will be required to share their data with Boston Dynamics, which is where some of this data will come from.

Real-World Robots That Are Useful

The fact that Boston Dynamics has customers makes them something of an anomaly when it comes to legged robots that rely on AI in commercial deployments. And those customers will have to be able to trust the robot—always a problem when AI is involved. “We take this very seriously,” da Silva said in an interview. “We roll out new DeepMind capabilities through beta programs to a smaller set of customers to understand what to anticipate, and we only actively advertise features we are confident will work.”

There’s a threshold of usefulness that robots like Spot need to reach, and fortunately, the real world doesn’t demand perfection. “Most critical infrastructure in a facility will be instrumented to tell you whether something is wrong,” da Silva says. “But there is a lot of stuff that is not instrumented that can still cause a problem if you aren’t paying attention to it. We’ve found that somewhere north of 80 percent is the threshold where it’s not annoying. Below that, basically the robot is crying wolf, and the operators will start ignoring it.”

Both da Silva and Parada agree that there’s still plenty of room for improvement in robotic inspection. As Parada points out, Spot’s rarefied status as a scalable commercial platform provides a valuable opportunity to learn how models like Gemini Robotics-ER 1.6 can be the most useful, and then apply that knowledge to other embodied AI platforms, including Boston Dynamics’ Atlas. Does that mean that Atlas is going to be the next industrial inspection robot? Probably not. But if this real-world experience can get us closer to safe and reliable robots that can pick up laundry, take a dog for a walk, and clear away soda cans without making a mess, that’s something we can all get excited about.
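The article doesn’t say how success detection combines camera angles, but one plausible mechanism is simple agreement voting across views, which sharply reduces single-view false positives. A toy sketch of that idea (the 2-of-3 rule and the 10 percent per-view error rate are illustrative assumptions, not details of DeepMind’s actual system):

```python
# Toy sketch of multi-view "success detection" by agreement voting.
# The 2-of-3 rule and 10% per-view error rate are illustrative
# assumptions, not details from Google DeepMind's system.

def grasp_succeeded(per_view_votes, min_agreement=2):
    """Declare a grasp successful only if enough camera views agree."""
    return sum(per_view_votes) >= min_agreement

# If each of 3 independent views falsely reports success 10% of the
# time, requiring 2-of-3 agreement shrinks the combined false-positive
# rate: P(exactly 2 wrong) + P(all 3 wrong).
p = 0.10
false_positive = 3 * p**2 * (1 - p) + p**3

print(grasp_succeeded([True, True, False]))  # True
print(round(false_positive, 3))              # 0.028
```

Under these (assumed) numbers, voting cuts the false-positive rate from 10 percent per view to under 3 percent overall, which is one way a vision-only system could approach the “north of 80 percent” usefulness threshold da Silva describes.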

Boston Dynamics and Google DeepMind Teach Spot to Reason spectrum.ieee.org/boston-dynamics-spot-goo...

3 days ago
Video Friday: This Floor Lamp Will Do Your Chores

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA
RSS 2026: 13–17 July 2026, SYDNEY
Summer School on Multi-Robot Systems: 29 July–4 August 2026, PRAGUE

Enjoy today’s videos!

Lume is a sculptural floor lamp designed to feel at home the moment you place it. It’s crafted from anodized aluminum and high-gloss finishes, shaped into a slender, balanced form that quietly conceals its complexity. Every surface is refined to feel smooth, precise, and enduring. When it moves, it’s quiet and deliberate. When it’s still, it holds its place with ease.

Apparently, and let me stress that “apparently,” Lume can make the bed, fold laundry, and do other chores involving soft materials. I’m intensely skeptical, because it feels like that video has more footage of people staring out of windows and dancing for no reason than of the robot actually doing anything. And when you do see the robot working at a task, it’s cut up into lots of different pieces of footage in a way that is typically used to distract from either plodding speed, frequent failures, or both. So, yeah. There may be a lot to like about the philosophy here, but even at a suspiciously cheap US $2,500 for a pair of these robots, more detail is certainly called for before they’ve earned your preorder.

[ Syncere ]

In Science Robotics, researchers from MIT Media Lab and collaborators from Politecnico di Bari present Electrofluidic Fiber Muscles, a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps—slender tubes that move liquid using electric fields to generate pressure with no moving parts—with fluidic fiber actuators. The muscles are driven by electric fields and operate silently, with no external pumps or reservoirs.

[ MIT ]

We first saw this thing at ICRA@40 a few years ago, but the paper is out now.

[ Nature Communications ] via [ LASA ]

I do like tea, and I suppose there could be worse applications for a robot than this one, since it leverages both payload and complex terrain mobility.

[ DEEP Robotics ]

We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99% on tasks where previous models achieve 64%, completes tasks roughly 3x faster than state of the art, and requires only 1 hour of robot data for each of these results. GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step towards our mission of creating generalist intelligence for the physical world.

[ Generalist ]

Legged manipulators offer high mobility and versatile manipulation. However, robust interaction with heterogeneous articulated objects, such as doors, drawers, and cabinets, remains challenging because of the diverse articulation types of the objects and the complex dynamics of the legged robot. In this paper, we propose a robust and sample-efficient framework for opening heterogeneous articulated objects with a legged manipulator.

[ OpenHEART ]

By deeply coupling real-time depth perception with reinforcement learning motion control, Adam achieves a natural, human-like stair-stepping gait, showing outstanding dynamic stability and environmental adaptability.

[ PNDbotics ]

The way these robots deliver packages will never not be amusing to me.

[ DEEP Robotics ]

Tether performs autonomous real-world functional play involving structured, task-directed interactions. We introduce a policy that performs trajectory warping anchored by keypoint correspondences, which is extremely data efficient and robust to significant spatial and semantic environment variation. Running the policy within a VLM-guided multi-task loop, we generate a stream of play data that consistently improves downstream policy learning over time.

[ Tether ]

What happens when your walls begin to move? This paper explores the design of human-robot interaction for architectural-scale, shape-changing environments.

[ Interactive Structures Lab ]

I will admit to being somewhat disappointed about the reality of the Unreal Robotics Lab.

[ URLab ]

We’re not done yet! Illinois is back in the Final Four for the first time since 2005, and we’re cheering all the way to the championship. This video features teleoperated G1 and AI Worker robots.

[ KIMLAB ]

Fighting robots are cool. Destroying expensive electronics while fighting robots is not cool. We make robots out of plastic so our electronics survive.

[ Weaponized Plastic Fighting League ]

Video Friday: This Floor Lamp Will Do Your Chores https://spectrum.ieee.org/video-friday-robot-lamp

1 week ago
Video Friday: Digit Learns to Dance—Virtually Overnight

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA
RSS 2026: 13–17 July 2026, SYDNEY
Summer School on Multi-Robot Systems: 29 July–4 August 2026, PRAGUE

Enjoy today’s videos!

Getting Digit to dance takes more than putting on some fancy shoes–our AI Team can teach Digit new whole-body control capabilities overnight. Using raw motion data from mocap, animation, and teleop methods, Digit gets new skills through sim-to-real reinforcement training.

[ Agility ]

We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99% on tasks where previous models achieve 64%, completes tasks roughly 3x faster than state of the art, and requires only 1 hour of robot data for each of these results. GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step towards our mission of creating generalist intelligence for the physical world.

[ Generalist ]

Unitree open-sources the UnifoLM-WBT-Dataset—a high-quality real-world humanoid robot whole-body teleoperation (WBT) dataset for open environments. Publicly available since March 5, 2026, the dataset will continue to receive high-frequency rolling updates. It aims to establish the most comprehensive real-world humanoid robot dataset in terms of scenario coverage, task complexity, and manipulation diversity.

[ Hugging Face ]

Autonomous mobile robots operating in human-shared indoor environments often require paths that reflect human spatial intentions, such as avoiding interference with pedestrian flow or maintaining comfortable clearance. This paper presents MRReP, a Mixed Reality-based interface that enables users to draw a Hand-drawn Reference Path (HRP) directly on the physical floor using hand gestures.

[ MRReP ]

Thanks, Masato!

Eye contact, even momentarily between strangers, plays a pivotal role in fostering human connection, promoting happiness, and enhancing belonging. Through autonomous navigation and adaptive mirror control, Mirrorbot facilitates serendipitous, non-verbal interactions by dynamically transitioning reflections from self-focused to mutual recognition, sparking eye contact, shared awareness, and playful engagement.

[ ARL ] via [ Cornell University ]

Experience PAL Robotics’ new teleoperation system for TIAGo Pro, the AI-ready mobile manipulator designed for advanced research. This real-time VR teleoperation setup allows precise control of TIAGo Pro’s dual arms in Cartesian space, ideal for remote manipulation, AI data collection, and robot learning.

[ PAL Robotics ]

Utter brilliance from Robust AI. No notes.

[ Robust AI ]

Come along with our Senior Test Engineer, Nick L., as he takes us on a tour of the Home Test Labs inside the iRobot HQ.

[ iRobot ]

By automating the final “magic 5%” of production—the precise trimming of swim goggles’ silicone gaskets based on individual face scans—UR cobots allow THEMAGIC5 to deliver affordable, custom-fit goggles, enabling the company to scale from a Kickstarter sensation to selling over 400,000 goggles worldwide.

[ Universal Robots ]

Sanctuary AI has once again demonstrated its industry-leading approach to training dexterous manipulation policies for its advanced hydraulic hands. In this video, their proprietary hydraulic hand autonomously manipulates a lettered cube, continuously reorienting it to match a specified goal (displayed in the bottom-left corner of the video).

[ Sanctuary AI ]

China’s Yuxing 3-06 commercial experimental satellite, the first of its kind to be equipped with a flexible robotic arm, has recently completed an in-orbit refueling test and verification of key technologies. The test paves the way for Yuxing 3-06, dubbed a “space refueling station,” to refuel other satellites in orbit, manage space debris, and provide other in-orbit services.

[ Sanyuan Aerospace ] via [ Space News ]

This is a demonstration of natural walking, whole-body teleoperation, and motion tracking with our custom-built humanoid robot. The control policies are trained using large-scale parallel reinforcement learning (RL). By deploying robust policies learned in a physics simulator onto the real hardware, we achieve dynamic and stable whole-body motions.

[ Tokyo Robotics ]

Faced with aging railway infrastructure, a shrinking workforce, and rising construction costs, Japan Railway West asked construction innovator Serendix to replace an old wooden building at its Hatsushima railway station using its 3D printing technology. An ABB robot enabled the company to assemble the new building in a single night, ready for the first train service the next day.

[ ABB ]

Humanoid, SAP, and Martur Fompak team up to test humanoid robots in automotive manufacturing logistics. This joint proof of concept explores how robots can streamline operations, improve efficiency, and shape the future of smart factories.

[ Humanoid ]

This MIT Robotics Seminar is from Dario Floreano at EPFL, on “Avian Inspired Drones.”

[ MIT ]

This MIT Robotics Seminar is from Ken Goldberg at UC Berkeley, on “Good Old-Fashioned Engineering Can Close the 100,000 Year ‘Data Gap’ in Robotics.”

[ MIT ]

Video Friday: Digit Learns to Dance—Virtually Overnight https://spectrum.ieee.org/video-humanoid-dancing

2 weeks ago
Gill Pratt Says Humanoid Robots’ Moment Is Finally Here

In 2012, the U.S. Defense Advanced Research Projects Agency announced the DARPA Robotics Challenge (DRC). The multi-year, multi-million-dollar competition for disaster robotics resulted in Boston Dynamics’ Atlas, some absolutely incredible moments from one of the very first generations of useful humanoid robots, and a blooper video that will live on forever. Gill Pratt, the architect of the competition, had a very clear understanding of what the DRC was going to do for robotics. “The reason [for the DARPA Robotics Challenge] is actually to push the field forward and make this capability a reality,” Pratt told IEEE Spectrum in 2012. At the time, he pointed out that before the DARPA Grand Challenge in 2004 and the DARPA Urban Challenge in 2007, driverless cars for complex environments essentially did not exist. He saw the DRC doing the same thing for robotics.

It’s been about a decade since the conclusion of the DARPA Robotics Challenge, and many in the industry believe humanoid robots are about to have the transformative moment that Pratt predicted. But as is common in robotics, things tend to be far more difficult than it seems they should be. Spectrum checked in with Pratt, now the CEO of the Toyota Research Institute (TRI), to find out what’s holding humanoid robotics back, what he thinks these robots should be doing (or not doing), and how to navigate the humanoid hype bubble.

What do you think about this robotics moment that we’re in?

Gill Pratt: What has changed is actually not about humanoids. Many people have been building research robots in the humanoid form for a long time. What’s different now isn’t the body, but the brain. We have always had this disparity in the robotics field where the mechanisms we were building were incredibly capable, but we didn’t really have the means for making the utility of the robot match that potential. Now we actually do, and that’s because of the AI revolution that has happened over the last few years.

It’s very tempting to look back ten years and directly credit the DRC with a lot of what is now happening with commercial humanoids. Is there any reason not to do that?

Gill Pratt poses with an early version of NASA’s Valkyrie DRC robot. Photo: Gill Pratt

Pratt: No, but I want to be humble about it. The DRC was focused on half autonomy and half teleoperation in real time. There was remote supervision, and then semi-autonomy to amplify that supervision to handle tasks in real time while the remote person was telling the robot what to do. That was all before the breakthroughs that have happened in AI recently. What has changed now is that we have a way to essentially teach robots what to do, and make them competent in a way that doesn’t require writing code; you can just demonstrate the task to the robot instead. With a sufficient amount of that data and new AI methods, robots can be far more performant than ever before.

But that data is a bottleneck, right? How do we know what it should consist of, and what a sufficient amount is to get a robot to do something reliably?

Pratt: This mirrors exactly the debate going on in large language models [LLMs]. You have certain people who believe that if you take LLMs—which are auto-regressive predictors that guess what the next word should be based on past words—and patch them up with a variety of methods to solve their hallucinations, we’ll eventually get to a point where we can trust the AI system. And then there are other people, and I think Yann LeCun is the most well-known of them, who say that’s nonsense, and we need something else. His view, and I agree, is that we need world models. We need some way for the AI system to imagine, try things out, and truly reason. And I know that we’re applying words like ‘reason’ to what are essentially pattern-matching systems. Saying that there’s ‘reasoning’ is just a sticker we put on whatever we’ve built; it’s not true reasoning.

Data Bottlenecks in Robot Learning

This is an example of “system one” versus “system two” thinking, right?

Pratt: Yes. System one is the fast, reflexive thinking we have, which is the kind of pattern matching that current LLMs do. System two is the slow reasoning that involves imagination and world models. That’s what we have not done yet. Progress on system one has been extraordinary, but we still don’t have system two. These attempts to patch system one to make it system two are like trying to squeeze a balloon filled with water; you squeeze it on one side and the water bulges out on the other side. You keep getting surprised that you fix one thing and something else breaks, and the performance overall doesn’t really get that much better.

How have you been approaching this problem at TRI?

Pratt: Two years ago, we came up with diffusion policy, and then we came up with what I call large behavior models (LBMs). That involves having one model trained on many tasks, and showing that as you add each task, it actually helps with the other tasks and cuts down on the amount of training data needed to reach a given level of performance. These have been incredible system one advances. The breakthrough happened when we realized that diffusion could be applied to robot behavior. We discovered that operating in the behavior space, from vision in, to action out, worked incredibly well. That kicked off the whole field, and since then, I think every robotics demonstration that we’ve seen is using some form of diffusion policy to do what it’s doing. But again, this is system one pattern matching: ‘If I see the world like this, I act on the world like that.’ The robot’s not imagining, thinking, and planning the way traditional robotics with hand coding used to do. It’s just reacting.
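The “vision in, action out” inference Pratt describes can be caricatured in a few lines: a diffusion policy starts from pure noise and iteratively denoises it into an action, conditioned on the observation. This toy sketch stands in a hypothetical linear “denoiser” for the learned network from the actual diffusion-policy work, so it shows only the shape of the loop, not the method itself:

```python
import numpy as np

# Toy sketch of diffusion-policy-style inference ("vision in, action out").
# A real system trains a denoising network on demonstrations; the linear
# toy_denoiser below is a hypothetical stand-in, not TRI's model.

rng = np.random.default_rng(0)

def toy_denoiser(obs, noisy_action, t):
    """Stand-in for a learned denoiser: nudge the action toward a target
    implied by the observation (here, pretend demos map obs -> 0.5*obs)."""
    target = 0.5 * obs
    return noisy_action + 0.3 * (target - noisy_action)

def infer_action(obs, steps=50):
    """Start from pure noise and iteratively denoise into an action."""
    action = rng.normal(size=obs.shape)
    for t in reversed(range(steps)):
        action = toy_denoiser(obs, action, t)
    return action

obs = np.array([1.0, -2.0])   # a stand-in "image embedding"
action = infer_action(obs)    # converges toward 0.5 * obs
```

The point of the sketch is Pratt’s: nothing here imagines or plans; the loop only pattern-matches observations to actions it was shaped to produce.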
System one’s pattern matching often breaks down in the real world, though, as we’ve seen with autonomous driving’s struggles.

Pratt: Ten years ago, when TRI first started, almost everybody was saying that automated driving was right around the corner. Ten years later, I do think we are now there, and the remaining questions are business ones: How much does the hardware cost, the insurance, the support, does it economically make sense? We haven’t necessarily solved automated driving, but our solutions are good enough, because we use humans for backup. When an automated vehicle gets stuck at a double-parked car, it calls home and asks a person for a system two decision. I think other robots could do that also. Most of the time they do their work on their own, and every once in a while, they raise their hand for help.

If we’ve just barely managed to get autonomous cars right, why are we devoting so much attention to the legged humanoid form factor?

Pratt: We’ve built the world with physical affordances for our bodies. If the robot is to do well in that world, it should have something that takes advantage of those affordances. It’s also easier for imitation learning to work because we have the same form. And legs are good for certain environments; you can step over obstacles, and you can catch your balance by stepping faster than you can roll to a new point of support with wheels. Having said all that, legs are not always the most practical thing. It’s very weird to see so much focus on legged robots in factories, which are flat environments perfectly suited for wheels.

Managing the Humanoid Robotics Hype

Do you think that the amount of money being poured into legged humanoids is a good thing for robotics?

Pratt: It has both advantages and dangers. It’s wonderful to see so many resources flowing into the robotics field, and I do think that something special has occurred. Things are not the way they were before, and there are so many possibilities when you think about people teaching robots how to do things.
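Pratt’s earlier example of an automated vehicle that mostly drives itself but “calls home” at a double-parked car is a supervisory-control pattern that is simple to state in code. A minimal sketch, where the confidence threshold and all names are hypothetical illustrations rather than anything TRI has described:

```python
# Minimal sketch of human-in-the-loop supervision: act autonomously when
# confident, escalate to a remote person otherwise. The 0.8 threshold
# and all names are hypothetical, chosen only to illustrate the pattern.

def supervise(task, confidence, threshold=0.8):
    """Route a task to the robot or to a remote human supervisor."""
    if confidence >= threshold:
        return ("autonomous", task)
    return ("ask_human", task)   # the robot "raises its hand" for help

print(supervise("drive to depot", 0.95))        # ('autonomous', 'drive to depot')
print(supervise("pass double-parked car", 0.4)) # ('ask_human', 'pass double-parked car')
```

The design question Pratt raises is exactly where that threshold sits: too low and the robot acts when it shouldn’t; too high and the human is interrupted constantly.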
Gill Pratt admires a robot on the roof of the Ghibli Museum in Tokyo. Gill Pratt What kinds of things should humans be teaching robots to do? Pratt: For ten years at TRI, we’ve been thinking about society and aging . It’s not just about physical disability; it’s about loneliness and loss of purpose, which are far more prevalent (and far worse) problems. And so the question is, what can we do technologically to help people feel that they’re younger? At TRI, we’re exploring “care-receiving robots”—robots that receive teaching from a human. We have evolved to be creatures that love giving and love helping. When you program a machine by demonstration, and that machine goes on to help someone else, you feel a sense of purpose. We think robots can be bi-directional things to improve quality of life psychologically, not only physically. When you started TRI ten years ago, I asked you what you would be focusing on, your answer really stuck with me: you said elder care, because ‘we don’t have a choice.’ Pratt: Yes. The statistics in Japan and the U.S. are only getting worse, and we don’t have a choice. It’s important to remember that an aging society has a huge impact on young people. This is because of the dependency ratio, which is how many young people in the workforce are supporting both people that are too young to work, and also people that are too old to work. Those numbers keep getting worse and worse. How do we solve this? Pratt: We’ve had some incredible breakthroughs with system one, but it doesn’t mean the robots are going to be doing all that much, unless somebody makes a system two breakthrough also. Or, where we have a system where humans provide some level of system two supervisory control. That kind of human supervisory control takes us right back to the DRC, doesn’t it? Pratt: [Laughs] That’s exactly right! 
Look, I’m not going to tell you not to praise the DRC… There was someone who called it the ‘Woodstock of Robots,’ which just warmed my heart; that was so cool! So, ten years later, how do you feel about the amount of hype in humanoid robotics right now? Pratt: We are approaching what (I hope!) is a peak of inflated expectations for humanoids. And that’s because nobody’s thinking deeply enough about the system one versus system two thing. Right now, our physical AI systems are just pattern matching. They’re incredibly capable, and it’s astonishing how good these things are—we are so proud of it. And we do believe that aggregating learning from many tasks through large behavior models will be incredibly effective. But it’s still not system two. There’s a lot of overpromising going on, and it’s very sad because it’s setting us up for a fall. What I’m worried about is the trough of disillusionment that will follow. How do we avoid that crash in robotics when the humanoid hype bubble bursts? Pratt: For now, we need damping. In control systems, you stabilize an unstable system by adding damping. The press and the academic world can add lead compensation by reminding everyone that what we’re seeing in humanoids now isn’t really reasoning. We should also remember that the automated driving field went through a bubble burst also, and just a few companies survived it by keeping the hype down and being persistent. I think we should do that here, too.
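Pratt’s damping metaphor comes straight from control theory. A minimal sketch (illustrative parameters, not anything from TRI) of a second-order system x'' + c·x' + k·x = 0 shows the effect he means: without damping the oscillation persists; with a damping term it dies out:

```python
# Illustrative sketch of Pratt's control-theory metaphor: adding a
# damping term c*x' to an oscillator x'' + k*x = 0 makes its swings
# decay. All parameters here are hypothetical.

def simulate(c: float, k: float = 1.0, x0: float = 1.0,
             steps: int = 20000, dt: float = 0.001) -> float:
    """Integrate x'' = -c*x' - k*x with semi-implicit Euler and
    return the peak |x| over the last quarter of the run."""
    x, v = x0, 0.0
    history = []
    for _ in range(steps):
        a = -c * v - k * x   # damping force -c*v, restoring force -k*x
        v += a * dt          # update velocity first (semi-implicit Euler)
        x += v * dt
        history.append(abs(x))
    return max(history[-steps // 4:])

undamped = simulate(c=0.0)   # oscillation persists near full amplitude
damped = simulate(c=0.5)     # oscillation has largely died out
```

The “lead compensation” Pratt assigns to the press and academia is the frequency-domain cousin of the same idea: adding phase lead (anticipation) to keep an otherwise overshooting loop stable.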

Gill Pratt Says Humanoid Robots’ Moment Is Finally Here spectrum.ieee.org/humanoid-robots-gill-pra...

2 weeks ago
Video Friday: Beep! Beep! Roadrunner Bipedal Bot Breaks the Mold Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA RSS 2026 : 13–17 July 2026, SYDNEY Summer School on Multi-Robot Systems : 29 July–4 August 2026, PRAGUE Enjoy today’s videos! “Roadrunner” is a new bipedal wheeled robot prototype designed for multi-modal locomotion. It weighs around 15 kg (33 lb) and can seamlessly switch between its side-by-side and in-line wheel modes and stepping configurations depending on what is required for navigating its environment. The robot’s legs are entirely symmetric, allowing it to point its knees forward or backward, which can be used to avoid obstacles or manage specific movements. A single control policy was trained to handle both side-by-side and in-line driving. Several behaviors, including standing up from various ground configurations and balancing on one wheel, were successfully deployed zero-shot on the hardware. [ Robotics and AI Institute ] Incredibly (INCREDIBLY!) NASA says that this is actually happening. NASA’s SkyFall mission will build on the success of the Ingenuity Mars helicopter, which achieved the first powered, controlled flight on another planet. Using a daring mid-air deployment, SkyFall will deliver a team of next-gen Mars helicopters to scout human landing sites and map subsurface water ice. [ NASA ] NASA’s MoonFall mission will blaze a path for future Artemis missions by sending four highly mobile drones to survey the lunar surface around the Moon’s South Pole ahead of astronauts’ arrival there. MoonFall is built on the legacy of NASA’s Ingenuity Mars Helicopter. The drones will be launched together and released during descent to the surface. 
They will land and operate independently over the course of a lunar day (14 Earth days) and will be able to explore hard-to-reach areas, including permanently shadowed regions (PSRs), surveying terrain with high-definition optical cameras and other potential instruments. For what it’s worth, Moon landings have a success rate well under 50%. So let’s send some robots there to land over and over! [ NASA ] In Science Robotics, researchers from the Tangible Media group led by Professor Hiroshi Ishii, together with colleagues from Politecnico di Bari, present Electrofluidic Fiber Muscles: a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps — slender tubes that move liquid using electric fields to generate pressure silently, with no moving parts — with fluid-filled fiber actuators. These artificial muscles could enable more agile untethered robots, as well as wearable assistive systems with compact actuation integrated directly into textiles. [ MIT Media Lab ] In this study, we developed MEVIUS2, an open-source quadruped robot. It is comparable in size to Boston Dynamics Spot, equipped with two LiDARs and a C1 camera, and can freely climb stairs and steep slopes! All hardware, software, and learning environments are released as open source. [ MEVIUS2 ] Thanks, Kento! What goes into preparing for a live performance? Arun highlights the reliability testing that goes into trying a new behavior for Spot. [ Boston Dynamics ] In this work, a multi-robot planning and control framework is presented and demonstrated with a team of 40 indoor robots, including both ground and aerial robots. That soundtrack though. [ GitHub ] Thanks, Keisuke! Quadrupedal robots can navigate cluttered environments like their animal counterparts, but their floating-base configuration makes them vulnerable to real-world uncertainties. 
Controllers that rely only on proprioception (body sensing) must physically collide with obstacles to detect them. Those that add exteroception (vision) need precisely modeled terrain maps that are hard to maintain in the wild. DreamWaQ++ bridges this gap by fusing both modalities through a resilient multi-modal reinforcement learning framework. The result: a single controller that handles rough terrains, steep slopes, and high-rise stairs—while gracefully recovering from sensor failures and situations it has never seen before. That cliff behavior is slightly uncanny. [ DreamWaQ++ ] I take issue with this from iRobot: While the pyramid exploration that iRobot did was very cool, they did it with a custom-made robot designed for a very specific environment. Cleaning your floors is way, way harder. Here’s a bit more detail on the pyramids thing: [ iRobot ] More robots in the circus, please! [ Daniel Simu ] MIT engineers have designed a wristband that lets wearers control a robotic hand with their own movements. By moving their hands and fingers, users can direct a robot to perform specific tasks, or they can manipulate objects in a virtual environment with high-dexterity control. [ MIT ] At NVIDIA GTC 2026, we showcased how AI is moving into the physical world. Visitors interacted with robots using voice commands, watching them interpret intent and act in real time — powered by our KinetIQ AI brain. [ Humanoid ] Props to Sony for their continued support and updates for Aibo! [ Aibo ] This robot looks like it could be a little curvier than normal? [ LimX Dynamics ] Developed by Zhejiang Humanoid Robot Innovation Center Co., Ltd., the Naviai Robot is an intelligent cooking device. It can autonomously process ingredients, perform cooking tasks with high accuracy, adjust smart kitchen equipment in real time, and complete post-cooking cleaning. Equipped with multi-modal perception technology, it adapts to daily kitchen environments and ensures safe and stable operation. 
That 7x is doing some heavy lifting. [ Zhejiang Lab ] This CMU RI Seminar is by Hadas Kress-Gazit from Cornell, on “Formal Methods for Robotics in the Age of Big Data.” Formal methods – mathematical techniques for describing systems, capturing requirements, and providing guarantees – have been used to synthesize robot control from high-level specification, and to verify robot behavior. Given the recent advances in robot learning and data-driven models, what role can, and should, formal methods play in advancing robotics? In this talk I will give a few examples for what we can do with formal methods, discuss their promise and challenges, and describe the synergies I see with data-driven approaches. [ Carnegie Mellon University Robotics Institute ]

Video Friday: Beep! Beep! Roadrunner Bipedal Bot Breaks the Mold https://spectrum.ieee.org/roadrunner-bipedal-robot

3 weeks ago
Video Friday: Humanoid Learns Tennis Skills Playing Humans Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Summer School on Multi-Robot Systems : 29 July–4 August 2026, PRAGUE Enjoy today’s videos! Human athletes demonstrate versatile and highly dynamic tennis skills to successfully conduct competitive rallies with a high-speed tennis ball. However, reproducing such behaviors on humanoid robots is difficult, partially due to the lack of perfect humanoid action data or human kinematic motion data in tennis scenarios as reference. In this work, we propose LATENT, a system that Learns Athletic humanoid TEnnis skills from imperfect human motioN daTa. [ LATENT ] A beautifully designed robot inspired by Strandbeests. [ Cranfield University ] We believe we’re the first robotics company to demonstrate a robot peeling an apple with dual dexterous human-like hands. This breakthrough closes a key gap in robotics, achieving bimanual, contact-rich manipulation and moving far beyond the limits of simple grippers. Today’s AI models (VLMs) are excellent at perception but struggle with action. Controlling high-degree-of-freedom hands for tasks like this is incredibly complex, and precise finger-level teleoperation is nearly impossible for humans. Our first step was a shared-autonomy system: rather than controlling every finger, the operator triggers pre-learned skills like a “rotate apple or tennis ball” primitive via a keyboard press or pedal. This makes scalable data collection and RL training possible. How does the AI manage this? We created “MoDE-VLA ” (Mixture of Dexterous Experts). It fuses vision, language, force, and touch data by using a team of specialist “experts,” making control in high-dimensional spaces stable and effective. 
The combination of these two innovations allows for seamless, contact-rich manipulation. The human provides high-level guidance, and the robot executes the complex in-hand coordination required. [ Sharpa ] Thanks, Alex! It was great to see our name amongst the other “AI Native” companies during the NVIDIA GTC keynote. NVIDIA Isaac Lab helps us train reinforcement learning policies that enable the UMV to drive, jump, flip, and hop like a pro. [ Robotics and AI Institute ] This Finger-Tip Changer technology was jointly researched and developed through a collaboration between Tesollo and RoCogMan LaB at Hanyang University ERICA. The project integrates Tesollo’s practical robotic hand development experience with the lab’s expertise in robotic manipulation and gripper design. I don’t know why more robots don’t do this. Also, those pointy fingertips are terrifying. [ RoCogMan LaB ] Here’s an upcoming ICRA paper from the Fluent Robotics Lab at the University of Michigan featuring an operational PR2! With functional batteries!!! [ Fluent Robotics Lab ] This video showcases the field tests and interaction capabilities of KAIST Humanoid v0.7, developed at the DRCD Lab featuring in-house actuators. The control policy was trained through deep reinforcement learning leveraging human demonstrations. [ KAIST DRCD Lab ] This needs to come in adult size. [ DEEP Robotics ] I did not know this, but apparently shoeboxes are really annoying to manipulate because if you grab them by the lid, they just open, so specialized hardware is required. [ Nomagic ] Thanks, Gilmarie! This paper presents a method to recover quadrotor Unmanned Air Vehicles (UAVs) from a throw, when no control parameters are known before the throw. [ MAVLab ] Uh oh, robots can see glass doors now. We’re in trouble. [ LimX Dynamics ] This drone hugs trees. [ Stanford BDML ] Electronic waste is one of the fastest-growing environmental problems in the world. 
As robotics and electronic systems become more widespread, their environmental footprint continues to increase. In this research, scientists developed a fully biodegradable soft robotic system that integrates electronic devices, sensors, and actuators, yet completely decomposes after use. [ Nature ] We developed a distributed algorithm that enables multiple aerial robots to flock together safely in complex environments, without explicit communication or prior knowledge of the surroundings, using only on-board sensors and computation. Our approach ensures collision avoidance, maintains proximity between robots, and handles uncertainties (tracking errors and sensor noise). Tested in simulations and real-world experiments with up to four drones in a dense forest, it proved robust and reliable. [ RBL ] The University of Pennsylvania’s 2025 President’s Sustainability Prize winner Piotr Lazarek has developed a system that uses satellite data to pinpoint inefficiencies in farmers’ fields, conducts real-time soil analysis with autonomous drones to understand why they occur, and generates precise fertilizer application maps. His startup Nirby aims to increase productivity in farm areas that are underperforming and reduce fertilizer in high-performing ones. [ University of Pennsylvania ] The production version of Atlas is a departure from the typical humanoid form factor, favoring industrial utility over human likeness. Intended for purposeful work in an industrial setting, Atlas has a form factor that signals its role as a machine rather than a companion or friendly assistant. Join two lead hardware engineers and our head of industrial design for a technical discussion of how key product requirements, ranging from passive thermal management to a modular architecture, dictated a bold new vision for a humanoid. [ Boston Dynamics ] Dr. Christian Hubicki gives a talk exploring the common themes of modern robotics research and his time on the reality competition show, Survivor. 
[ Optimal Robotics Lab ]

Video Friday: Humanoid Learns Tennis Skills Playing Humans https://spectrum.ieee.org/tennis-playing-robot

3 weeks ago
Video Friday: These Robots Were Born to Run Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! All legged robots deployed “in the wild” to date were given a body plan that was predefined by human designers and could not be redefined in situ. The manual and permanent nature of this process has resulted in very few species of agile terrestrial robots beyond familiar four-limbed forms. Here, we introduce highly athletic modular building blocks and show how they enable the automatic design and rapid assembly of novel agile robots that can “hit the ground running” in unstructured outdoor environments. [ Northwestern University Center for Robotics and Biosystems ] [ Paper ] via [ Gizmodo ] If you were going to develop the ideal urban delivery robot more or less from scratch, it would be this. [ RIVR ] Don’t get me wrong, there are some clever things going on here, but I’m still having a lot of trouble seeing where the unique, sustainable value is for a humanoid robot performing these sorts of tasks. [ Figure ] One of those things that you don’t really think about as a human, but is actually pretty important. [ Paper ] via [ ETH Zurich ] We propose TRIP-Bag (Teleoperation, Recording, Intelligence in a Portable Bag), a portable, puppeteer-style teleoperation system fully contained within a commercial suitcase, as a practical solution for collecting high-fidelity manipulation data across varied settings. [ KIMLAB ] We propose an open-vocabulary semantic exploration system that enables robots to maintain consistent maps and efficiently locate (unseen) objects in semi-static real-world environments using LLM-guided reasoning. [ TUM ] That’s it, folks, we have no need for real pandas anymore—if we ever did in the first place. 
Be honest, what has a panda done for you lately? [ MagicLab ] RoboGuard is a general-purpose guardrail for ensuring the safety of LLM-enabled robots. RoboGuard is configured offline with high-level safety rules and a robot description, reasons about how these safety rules are best applied in the robot’s context, then synthesizes a plan that maximally follows user preferences while ensuring safety. [ RoboGuard ] In this demonstration, a small team responds to a (simulated) radiation contamination leak at a real nuclear reactor facility. The team deploys their reconfigurable robot to accompany them through the facility. As the station is suddenly plunged into darkness, the robot’s camera is hot-swapped to thermal so that it can continue on. Upon reaching the approximate location of the contamination, the team installs a Compton gamma-ray camera and pan-tilt illuminating device. The robot autonomously steps forward, locates the radiation source, and points it out with the illuminator. [ Paper ] On March 6th, 2025, the Robomechanics Lab at CMU was flooded with 4 feet of black water (i.e., mixed with sewage). We lost most of the robots in the lab, and as a tribute my students put together this “In Memoriam” video. It includes some previously unreleased robots and video clips. [ Carnegie Mellon University Robomechanics Lab ] There haven’t been a lot of successful education robots, but here’s one of them. [ Sphero ] The opening keynote from the 2025 Silicon Valley Humanoids Summit: “Insights Into Disney’s Robotic Character Platform,” by Moritz Baecher, Director, Zurich Lab, Disney Research. [ Humanoids Summit ]

Video Friday: These Robots Were Born to Run https://spectrum.ieee.org/legged-modular-robot

1 month ago
Video Friday: A Robot Hand With Artificial Muscles and Tendons Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! The functional replication and actuation of complex structures inspired by nature is a longstanding goal for humanity. Creating such complex structures combining soft and rigid features and actuating them with artificial muscles would further our understanding of natural kinematic structures. We printed a biomimetic hand in a single print process comprising a rigid skeleton, soft joint capsules, tendons, and printed touch sensors. [ Paper ] via [ SRL ] Two Boston Dynamics product managers talk about their favorite classic BD robots, and then I talk about mine. And this is Boston Dynamics’ LittleDog, doing legged locomotion research 16 or so years ago in what I’m pretty sure is Katie Byl’s lab at UCSB. [ Boston Dynamics ] This is our latest work on the trajectory planning method for floating-based articulated robots, enabling global path searching in complex, cluttered environments. [ DRAGON Lab ] Thanks, Moju! OmniPlanner is a unified solution for exploration and inspection path planning (as well as target reach) across aerial, ground, and underwater robots. It has been verified through extensive simulations and a multitude of field tests, including in underground mines, ballast water tanks, forests, university buildings, and submarine bunkers. [ NTNU ] Thanks, Kostas! In the ARISE project, the FZI Research Center for Information Technology and its international partners ETH Zurich, University of Zurich, University of Bern, and University of Basel took a major step toward future lunar missions by testing cooperative autonomous multi-robot teams under outdoor conditions. 
[ FZI ] Welcome to the future, where there are no other humans. [ Zhejiang Humanoid ] This is our latest work on robotic fish, and is also the first underwater robot of DRAGON Lab. [ DRAGON Lab ] Thanks, Moju! Watch this one simple trick to make humanoid robots cheaper and safer! [ Zhejiang Humanoid ] ‘Gugusse and the Automaton’ is an 1897 French film by Georges Méliès featuring a humanoid robot nearly as realistically as some of the humanoid promo videos we’ve seen lately. [ Library of Congress ] via [ Gizmodo ] At Agility, we create automated solutions for the hardest work. We’re incredibly proud of how far we’ve come, and can’t wait to show you what’s next. [ Agility ] Kamel Saidi, Robotics Program Manager at the National Institute of Standards and Technology (NIST), on How Performance Standards Can Pave the Way for Humanoid Adoption. [ Humanoids Summit ] Anca Dragan is no stranger to Waymo. She worked with us for six years while also at UC Berkeley and now, Google DeepMind. Her focus on making AI safer helped Waymo as it launched commercially. In this final episode of our season, Anca describes how her work enables AI agents to work fluently with people, based on human goals and values. [ Waymo Podcast ] This UPenn GRASP SFI Seminar is by Junyao Shi, on “Unlocking Generalist Robots with Human Data and Foundation Models.” Building general-purpose robots remains fundamentally constrained by data scarcity and labor-intensive engineering. Unlike vision and language, robotics lacks large, diverse datasets spanning tasks, environments, and embodiments, limiting both scalability and generalization. This talk explores how human data and foundation models trained at scale can help overcome these bottlenecks. [ UPenn ]

Video Friday: A Robot Hand With Artificial Muscles and Tendons spectrum.ieee.org/video-friday-robot-hand-...

1 month ago
Video Friday: Robot Dogs Haul Produce From the Field Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! Our Lynx M20 robots help transport harvested crops in mountainous farmland—tackling the rural “last mile” logistics challenge. [ DEEP Robotics ] Once again, I would point out that now that we are reaching peak humanoid robots doing humanoid things, we are inevitably about to see humanoid robots doing non-humanoid things. [ Unitree ] In a study, a team of researchers from the Max Planck Institute for Intelligent Systems, the University of Michigan, and Cornell University show that groups of magnetic microrobots can generate fluidic forces strong enough to rotate objects in different directions without touching them. These microrobot swarms can turn gear systems, rotate objects much larger than the robots themselves, assemble structures on their own, and even pull in or push away many small objects. [ Science ] via [ Max Planck Institute ] Bipedal—or two-legged—autonomous robots can be quite agile. This makes them useful for performing tasks on uneven terrain, such as carrying equipment through outdoor environments or performing maintenance on an ocean-going ship. However, unstable or unpredictable conditions also increase the possibility of a robot wipeout. Until now, there’s been a significant lack of research into how a robot recovers when its direction shifts—for example, a robot losing balance when a truck makes a quick turn. The team aims to fill this research gap. [ Georgia Tech ] Robotics is about controlling energy, motion, and uncertainty in the real world. [ Carnegie Mellon University ] Delicious dinner cooked by our robot Robody. We’ve asked our investors to speak about why they’re along for the ride. 
[ Devanthro ] Tilt-rotor aerial robots enable omnidirectional maneuvering through thrust vectoring, but introduce significant control challenges due to the strong coupling between joint and rotor dynamics. This work investigates reinforcement learning for omnidirectional aerial motion control on over-actuated tiltable quadrotors that prioritizes robustness and agility. [ DRAGON Lab ] At the CMU Robotic Innovation Center’s 75,000-gallon water tank, members of the TartanAUV student group worked to further develop their autonomous underwater vehicle (AUV) called Osprey. The team, which takes part in the annual RoboSub competition sponsored by the U.S. Office of Naval Research, is composed primarily of undergraduate engineering and robotics students. [ Carnegie Mellon University ] Sure seems like the only person who would want a robot dog is a person who does not in fact want a dog. Compact size, industrial capability. Maximum torque of 90 N·m, over 4 hours of no-load runtime, IP54 rainproof design. With a 15 kg payload, range exceeds 13 km. Open secondary development, empowering industry applications. [ Unitree ] If your robot video includes tasty baked goods, it WILL be included in Video Friday. [ QB Robotics ] Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of industrial robots, giving students the necessary skills for future work. [ Astorino by Kawasaki ] We need more autonomous driving datasets that accurately reflect how sucky driving can be a lot of the time. 
[ ASRL ] This Carnegie Mellon University Robotics Institute Seminar is by CMU’s own Victoria Webster-Wood, on “Robots as Models for Biology and Biology as Materials for Robots.” In the last century, it was common to envision robots as shining metal structures with rigid and halting motion. This imagery is in contrast to the fluid and organic motion of living organisms that inhabit our natural world. The adaptability, complex control, and advanced learning capabilities observed in animals are not yet fully understood, and therefore have not been fully captured by current robotic systems. Furthermore, many of the mechanical properties and control capabilities seen in animals have yet to be achieved in robotic platforms. In this talk, I will share an interdisciplinary research vision for robots as models for neuroscience and biology as materials for robots. [ CMU RI ]

Video Friday: Robot Dogs Haul Produce From the Field https://spectrum.ieee.org/quadruped-farming-robots

1 month ago
Video Friday: Humanoid Robots Celebrate Spring Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! So, humanoid robots are nearing peak human performance. I would point out, though, that this is likely very far from peak robot performance , which has yet to be effectively exploited, because it requires more than just copying humans. [ Unitree ] “The Street Dance of China” Turning lightness into gravity, and rhythm into impact. This is a head-on collision between metal and beats. This Chinese New Year, watch PNDbotics Adam bring the heat with a difference. [ PNDbotics ] You had me at robot pandas. [ MagicLab ] NASA’s Perseverance rover can now precisely determine its own location on Mars without waiting for human help from Earth. This is possible thanks to a new technology called Mars Global Localization. This technology rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. It’s done with an algorithm that runs on the rover’s Helicopter Base Station processor, which was originally used to communicate with the Ingenuity Mars Helicopter. In a few minutes, the algorithm can pinpoint Perseverance’s position to within about 10 inches (25 centimeters). The technology will help the rover drive farther autonomously and keep exploring. [ NASA Jet Propulsion Laboratory ] Legs? Where we’re going, we don’t need legs! [ Paper ] This is a bit of a tangent to robotics, but it gets a pass because of the cute jumping spider footage. [ Berkeley Lab ] Corvus One for Cold Chain is engineered to live and operate in freezer environments permanently, down to -20 °F (-29 °C), while maintaining full flight and barcode scanning performance. 
I am sure there is an excellent reason for putting a cold storage facility in the Mojave desert. [ Corvus Robotics ] The video documents the current progress made in the picking rate of the Shiva robot when picking strawberries. It first shows the previous status, then the further development, and finally the field test. [ DFKI ] Data powers an organization’s digital transformation, and ST Engineering MRAS is leveraging Spot to get a full view of critical equipment and facility. Working autonomously, Spot collects information about machine health - and now, thanks to an integration of the Leica BLK ARC for reality capture, detailed and accurate point cloud data for their digital twin. [ Boston Dynamics ] The title of this video is “Get out and have fun!” Is that mostly what humanoid robots are good for right now, pretty much...? [ Engine AI ] ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it. [ Kawasaki ] Can I get this in my living room? [ Yaskawa ] What does it mean to build a humanoid robot in seven months, and the next one in just five? This documentary takes you behind the scenes at Humanoid, a UK-based AI and robotics company building reliable, safe, and helpful humanoid robots. You’ll hear directly from our engineering, hardware, product, and other teams as they share their perspectives on the journey of turning physical AI into reality. [ Humanoid ] This IROS 2025 keynote is from Tim Chung who is now at Microsoft, on “Catalyzing the Future of Human, Robot, and AI Agent Teams in the Physical World.” The convergence of technologies—from foundation AI models to diverse sensors and actuators to ubiquitous connectivity—is transforming the nature of interactions in the physical and digital world. 
People have accelerated their collaborative connections and productivity through digital and immersive technologies, no longer limited by geography or language or access. Humans have also leveraged and interacted with AI in many different forms, with the advent of hyperscale AI models (i.e., large language models) forever changing (and at an ever-astonishing pace) the nature of human-AI teams, realized in this era of the AI “copilot.” Similarly, robotics and automation technologies now afford greater opportunities to work with and/or near humans, allowing for increasingly collaborative physical robots to dramatically impact real-world activities. It is the compounding effect of enabling all three capabilities, each complementary to the others in valuable ways; we envision the triad formed by human-robot-AI teams as revolutionizing the future of society, the economy, and technology. [ IROS 2025 ] This GRASP SFI talk is by Chris Paxton at Agility Robotics, on “How Close Are We To Generalist Humanoid Robots?” With billions of dollars of funding pouring into robotics, general-purpose humanoid robots seem closer than ever. And certainly it feels like the pace of robotics is faster than ever, with multiple companies beginning large-scale deployments of humanoid robots. In this talk, I’ll go over the challenges still facing scaling robot learning, looking at insights from a year of discussions with researchers all over the world. [ University of Pennsylvania GRASP Laboratory ] This week’s CMU RI Seminar is from Jitendra Malik at UC Berkeley, on “Robot Learning, With Inspiration From Child Development.” For intelligent robots to become ubiquitous, we need to “solve” locomotion, navigation and manipulation at sufficient reliability in widely varying environments. In locomotion, we now have demonstrations of humanoid walking in a variety of challenging environments. 
In navigation, we pursued the task of “Go to Any Thing” – a robot, on entering a newly rented Airbnb, should be able to find objects such as TV sets or potted plants. RL in simulation and sim-to-real have been workhorse technologies for us, assisted by a few technical innovations. I will sketch promising directions for future work. [ Carnegie Mellon University Robotics Institute ]

Video Friday: Humanoid Robots Celebrate Spring https://spectrum.ieee.org/robot-martial-arts

Video Friday: Robot Collective Stays Alive Even When Parts Die Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! No system is immune to failure. The compromise between reducing failures and improving adaptability is a recurring problem in robotics. Modular robots exemplify this tradeoff, because the number of modules dictates both the possible functions and the odds of failure. We reverse this trend, improving reliability with an increased number of modules by exploiting redundant resources and sharing them locally. [ Science ] via [ RRL ] Now that the Atlas enterprise platform is getting to work, the research version gets one last run in the sun. Our engineers made one final push to test the limits of full-body control and mobility, with help from the RAI Institute. [ RAI ] via [ Boston Dynamics ] Announcing Isaac 0: the laundry folding robot we’re shipping to homes, starting in February 2026 in the Bay Area. [ Weave Robotics ] In a paper published in Science, researchers at the Max Planck Institute for Intelligent Systems, the Humboldt University of Berlin, and the University of Stuttgart have discovered that the secret to the elephant’s amazing sense of touch is in its unusual whiskers. The interdisciplinary team analyzed elephant trunk whiskers using advanced microscopy methods that revealed a form of material intelligence more sophisticated than the well-studied whiskers of rats and mice. This research has the potential to inspire new physically intelligent robotic sensing approaches that resemble the unusual whiskers that cover the elephant trunk. [ MPI ] Got an interest in autonomous mobile robots, ROS2 , and a mere $150 lying around? Try this. [ Maker's Pet ] Thanks, Ilia! 
We’re giving humanoid robots swords now. [ Robotera ] A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music. [ Waterloo ] FastUMI Pro is a multimodal, model-agnostic data acquisition system designed to power a truly end-to-end closed loop for embodied intelligence — transforming real-world data into genuine robotic capability. [ Lumos Robotics ] We usually take fingernails for granted, but they’re vital for fine-motor control and feeling textures. Our students have been doing some great work looking into the mechanics behind this. [ Paper ] This is a 550-lb all-electric coaxial unmanned rotorcraft developed by Texas A&M University’s Advanced Vertical Flight Laboratory and Harmony Aeronautics as a technology demonstrator for our quiet-rotor technology. The payload capacity is 200 lb (gross weight = 750 lb). The noise level measured was around 74 dBA in hover at 50 ft, making this probably the quietest rotorcraft at this scale. [ Harmony Aeronautics ] Harvard scientists have created an advanced 3D printing method for developing soft robotics. This technique, called rotational multimaterial 3D printing, enables the fabrication of complex shapes and tubular structures with dissolvable internal channels. This innovation could someday accelerate the production of components for surgical robotics and assistive devices, advancing medical technology. [ Harvard ] The Lynx M20 wheeled-legged robot steps onto the ice and snow, taking on challenges inspired by four winter sports scenarios. Who says robots can’t enjoy winter sports? [ Deep Robotics ] NGL right now I find this more satisfying to watch than a humanoid doing just about anything. [ Fanuc ] At Mentee Robotics, we design and build humanoid robots from the ground up with one goal: reliable, scalable deployment in real-world industrial environments.
Our robots are powered by deep vertical integration across hardware, embedded software, and AI, all developed in-house to close the Sim2Real gap and enable continuous, around-the-clock operation. [ Mentee Robotics ] You don’t need to watch this whole video, but the idea of little submarines that hitch rides on bigger boats and recharge themselves is kind of cool. [ Lockheed Martin ] Learn about the work of Dr. Roland Siegwart, Dr. Anibal Ollero, Dr. Dario Floreano, and Dr. Margarita Chli on flying robots and some of the challenges they are still trying to tackle in this video, based on their presentations at ICRA@40, the 40th anniversary celebration of the IEEE International Conference on Robotics and Automation. [ ICRA@40 ]

Video Friday: Robot Collective Stays Alive Even When Parts Die https://spectrum.ieee.org/video-friday-robot-collective

Video Friday: Autonomous Robots Learn By Doing in This Factory Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! To train the next generation of autonomous robots, scientists at Toyota Research Institute are working with Toyota Manufacturing to deploy them on the factory floor. [ Toyota Research Institute ] Thanks, Erin! This is just one story (of many) about how we tried, failed, and learned how to improve our drone delivery system. Okay but like you didn’t show the really cool bit...? [ Zipline ] We’re introducing KinetIQ, an AI framework developed by Humanoid, for end-to-end orchestration of humanoid robot fleets. KinetIQ coordinates wheeled and bipedal robots within a single system, managing both fleet-level operations and individual robot behaviour across multiple environments. The framework operates across four cognitive layers, from task allocation and workflow optimization to VLA-based task execution and reinforcement-learning-trained whole-body control, and is shown here running across our wheeled industrial robots and bipedal R&D platform. [ Humanoid ] What if a robot gets damaged during operation? Can it still perform its mission without immediate repair? Inspired by self-embodied resilience strategies of stick insects, we developed a decentralized adaptive resilient neural control system (DARCON). This system allows legged robots to autonomously adapt to limb loss, ensuring mission success despite mechanical failure. This innovative approach leads to a future of truly resilient, self-recovering robotics. [ VISTEC ] Thanks, Poramate! This animation shows Perseverance’s point of view during a drive of 807 feet (246 meters) along the rim of Jezero Crater on Dec.
10, 2025, the 1,709th Martian day, or sol, of the mission. Captured over two hours and 35 minutes, 53 Navigation Camera (Navcam) image pairs were combined with rover data on orientation, wheel speed, and steering angle, as well as data from Perseverance’s Inertial Measurement Unit, and placed into a 3D virtual environment. The result is this reconstruction with virtual frames inserted about every 4 inches (0.1 meters) of drive progress. [ NASA Jet Propulsion Lab ] −47.4°C, 130,000 steps, 89.75°E, 47.21°N… On the extremely cold snowfields of Altay, the birthplace of human skiing, Unitree’s humanoid robot G1 left behind a unique set of marks. [ Unitree ] Representing and understanding 3D environments in a structured manner is crucial for autonomous agents to navigate and reason about their surroundings. In this work, we propose an enhanced hierarchical 3D scene graph that integrates open-vocabulary features across multiple abstraction levels and supports object-relational reasoning. Our approach leverages a Vision Language Model (VLM) to infer semantic relationships. Notably, we introduce a task reasoning module that combines Large Language Models (LLM) and a VLM to interpret the scene graph’s semantic and relational information, enabling agents to reason about tasks and interact with their environment more intelligently. We validate our method by deploying it on a quadruped robot in multiple environments and tasks, highlighting its ability to reason about them. [ Norwegian University of Science & Technology, Autonomous Robots Lab ] Thanks, Kostas! We present HoLoArm, a quadrotor with compliant arms inspired by the nodus structure of dragonfly wings. This design provides natural flexibility and resilience while preserving flight stability, which is further reinforced by the integration of a Reinforcement Learning (RL) control policy that enhances both recovery and hovering performance. 
[ HO Lab via IEEE Robotics and Automation Letters ] In this work, we present SkyDreamer, to the best of our knowledge, the first end-to-end vision-based autonomous drone racing policy that maps directly from pixel-level representations to motor commands. [ MAVLab ] This video showcases AI WORKER equipped with five-finger hands performing dexterous object manipulation across diverse environments. Through teleoperation, the robot demonstrates precise, human-like hand control in a variety of manipulation tasks. [ Robotis ] Autonomous following, 45° slope climbing, and reliable payload transport in extreme winter conditions — built to support operations where environments push the limits. [ DEEP Robotics ] Living architectures, from plants to beehives, adapt continuously to their environments through self-organization. In this work, we introduce the concept of architectural swarms: systems that integrate swarm robotics into modular architectural façades. The Swarm Garden exemplifies how architectural swarms can transform the built environment, enabling “living-like” architecture for functional and creative applications. [ SSR Lab via Science Robotics ] Here are a couple of IROS 2025 keynotes, featuring Bram Vanderborght and Kyu Jin Cho. [ IROS 2025 ]

Video Friday: Autonomous Robots Learn By Doing in This Factory https://spectrum.ieee.org/autonomous-warehouse-robots

Video Friday: Multitasking Robots Smoothly Do the Things Together Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy this week’s videos! Westwood Robotics is proud to announce a major update: THEMIS Gen2.5, the world’s first commercial full-size humanoid robot capable of manipulation on the move! Now that you mention it, the bit at the end where the robot picks up a can while walking? I haven’t seen a lot of that. [ Westwood Robotics ] Last year, Helix showed that a single neural network could control a humanoid’s upper body from pixels. Today, Helix 02 extends that control to the entire robot—walking, manipulating, and balancing as one continuous system. Why yes, I am a normal human and this is very similar to the default state of my kitchen. [ Figure ] Harry Goldstein, our Editor in Chief, went to meet Sprout from Fauna Robotics. He was skeptical at first, but Sprout won him over with its robotic charm. [ Fauna Robotics ] Kimberly Elenberg is showing how the data collected by robotic responders can save lives in mass casualty events. [ Carnegie Mellon University ] The educational robotics market is tough, but you’ve got to hand it to Sphero—going strong since 2011, which is pretty incredible. [ Sphero ] If you want to fly in crazy conditions, you have to flight test in those conditions. Here’s how and why we do it! [ Zipline ] I want to be impressed more by the idea of 3D printing skin and skeleton at the same time, but come on, animals have been doing that for literally hundreds of millions of years without even trying. [ JSK Lab, University of Tokyo ] If there is a market for small bipedal robots that can both ski and be dinosaurs, LimX has it covered. [ LimX ] How do you remotely control robots that change shape?
We introduce a method for user-guided control of modular robots using reconfigurable joint-space joysticks (JoJo) and real-time optimization. We demonstrate this system on two different robots, Mori3 and Roombots. The video shows examples of these robots performing object manipulation, locomotion, human assistance, and reconfiguration, controlled by our system. [ EPFL Reconfigurable Robotics Lab ] via [ Nature Communications ] Quadrotor Biplane Tailsitter (QBiT) UAVs at four different sizes (4, 12, 25, and 50 lbs) developed at Texas A&M University. QBiT combines the mechanical simplicity of a quadrotor drone with the cruise efficiency of a fixed-wing aircraft. [ Texas A&M University ] There’s a new DARPA challenge for “novel drone designs that can carry payloads more than four times their weight, which would revolutionize the way we use drones across all sectors.” [ DARPA ] Here are a couple of plenary and keynote talks from IROS 2025, from Marco Hutter and Karinne Ramirez Amaro. [ IROS 2025 ]

Video Friday: Multitasking Robots Smoothly Do the Things Together https://spectrum.ieee.org/multitasking-robot


Video Friday: Humans and Robots Team Up in Battlefield Triage Your weekly selection of awesome robot videos Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! One of my favorite parts of robotics is watching research collide with non-roboticists in the real (or real-ish) world. [ DARPA ] Spot will put out fires for you. Eventually. If it feels like it. [ Mechatronic and Robotic Systems Laboratory ] All those robots rising out of their crates is not sinister at all. [ LimX ] The Lynx M20 quadruped robot recently completed an extreme cold-weather field test in Yakeshi, Hulunbuir, operating reliably in temperatures as low as –30°C. [ DEEP Robotics ] This is a teaser video for KIMLAB’s new teleoperation robot. For now, we invite you to enjoy the calm atmosphere, with students walking, gathering, and chatting across the UIUC Main Quad—along with its scenery and ambient sounds, without any technical details. More details will be shared soon. Enjoy the moment. The most incredible part of this video is that they have publicly available power in the middle of their quad. [ KIMLAB ] For the eleventy billionth time: just because you can do a task with a humanoid robot doesn’t mean you should do a task with a humanoid robot. [ UBTECH ] I am less interested in this autonomous urban delivery robot and more interested in whatever that docking station is at the beginning that loads the box into it. [ KAIST ] Ok so figuring out where Spot’s face is just got a lot more complicated. [ Boston Dynamics ] An undergraduate team at HKU’s Tam Wing Fan Innovation Wing developed CLIO, an embodied tour-guide robot, in just months.
Built on LimX Dynamics TRON 1, it uses LLMs for tour planning, computer vision for visitor recognition, and a laser pointer/expressive display for engaging tours. [ CLIO ] The future of work is doing work so that robots can then do the same work except less well. [ AgileX ]

Video Friday: Humans and Robots Team Up in Battlefield Triage https://spectrum.ieee.org/darpa-triage-challenge-robot

Video Friday: Bipedal Robot Stops Itself From Falling Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! This is one of the best things I have ever seen. [ Kinetic Intelligent Machine LAB ] After years of aggressive testing and pushing the envelope with U.S. Army and Marine Corps partners, the Robotic Autonomy in Complex Environments with Resiliency (RACER) program approaches its conclusion. But the impact of RACER will reverberate far beyond the program’s official end date, leaving a legacy of robust autonomous capabilities ready to transform military operations and inspire a new wave of private sector investment. [ DARPA ] Best-looking humanoid yet. [ Kawasaki ] COSA (Cognitive OS of Agents) is a physical-world-native Agentic OS that unifies high-level cognition with whole-body motion control, enabling humanoid robots to think while acting in real environments. Powered by COSA, Oli becomes the first humanoid agent with both advanced loco-manipulation and high-level autonomous cognition. [ LimX Dynamics ] Thanks, Jinyan! The 1X World Model’s latest update is a paradigm shift in robot learning: NEO now uses a physics-grounded video model (World Model) to turn any voice or text prompt into fully autonomous action, even for completely novel tasks and objects NEO has never seen before. By leveraging internet-scale video data fine-tuned on real robot experience, NEO can visualize future actions, predict outcomes, and execute them with human-like understanding, all without prior examples. This marks the critical first step in NEO being able to collect data on its own to master new tasks all by itself. [ 1X ] I’m impressed by the human who was mocapped for this.
[ PNDbotics ] We introduce the GuideData Dataset, a collection of qualitative data focusing on the interactions between guide dog trainers, blind and low-vision (BLV) individuals, and their guide dogs. The dataset captures a variety of real-world scenarios, including navigating sidewalks, climbing stairs, crossing streets, and avoiding obstacles. By providing this comprehensive dataset, the project aims to advance research in areas such as assistive technologies, robotics, and human-robot interaction, ultimately improving the mobility and safety of visually impaired people. [ DARoS Lab ] Fourier’s desktop Care-Bot prototype is gaining much attention at CES 2026! Even though it’s still in the prototype stage, we couldn’t wait to share these adorable and fun interaction features with you. [ Fourier ] Volcanic gas measurements are critical for understanding eruptive activity. However, harsh terrain, hazardous conditions, and logistical constraints make near-surface data collection extremely challenging. In this work, we present an autonomous legged robotic system for volcanic gas monitoring, validated through real-world deployments on Mount Etna. The system combines a quadruped robot equipped with a quadrupole mass spectrometer and a modular autonomy stack, enabling long-distance missions in rough volcanic terrain. [ ETH Zurich RSL ] Humanoid and Siemens successfully completed a POC testing humanoid robots in industrial logistics. This is the first step in the broader partnership between the companies. The POC focused on a tote-to-conveyor destacking task within Siemens’ logistics process. HMND 01 autonomously picked, transported, and placed totes in a live production environment during a two-week on-site deployment at the Siemens Electronics Factory in Erlangen. [ Humanoid ] Four Growers, a category leader in intelligent ag-tech platforms, developed the GR-200 robotic harvesting platform, powered by FANUC’s LR Mate robot.
The system combines AI-driven vision and motion planning to identify and harvest ripe tomatoes with quick precision. [ FANUC ] Columbia Engineers build a robot that, for the first time, is able to learn facial lip motions for tasks such as speech and singing. In a new study published in Science Robotics, the researchers demonstrate how their robot used its abilities to articulate words in a variety of languages, and even sing a song out of its AI-generated debut album “hello world_.” The robot acquired this ability through observational learning rather than via rules. It first learned how to use its 26 facial motors by watching its own reflection in the mirror before learning to imitate human lip motion by watching hours of YouTube videos. [ Columbia ] Roborock has some odd ideas about what lawns are like. [ Roborock ] DEEP Robotics’ quadruped robots demonstrate coordinated multi-module operations under unified command, tackling complex and dynamic firefighting scenarios with agility and precision. [ DEEP Robotics ] Unlike statically stable wheeled platforms, humanoids are dynamically stable, requiring continuous active control to maintain balance and prevent falls. This inherent instability presents a critical challenge for functional safety, particularly in collaborative settings. This presentation will introduce Synapticon’s POSITRON platform, a comprehensive solution engineered to address these safety-critical demands. We will explore how its integrated hardware and software enable robust, certifiable safety functions that meet the highest industrial standards, providing key insights into making the next generation of humanoid robots safe for real-world deployment. [ Synapticon ] The University of California, Berkeley is world-famous for its AI developments, and one big name behind them is Ken Goldberg . 
Longtime professor and lifelong artist, Ken is all about deep learning while staying true to “good, old fashioned engineering.” Hear Ken talk about his approach to vision and touch for robotic surgeries and how robots will evolve across the board. [ Waymo ]

Video Friday: Bipedal Robot Stops Itself From Falling https://spectrum.ieee.org/video-friday-bipedal-robot

Video Friday: Robots Are Everywhere at CES 2026 Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! We’re excited to announce the product version of our Atlas® robot. This enterprise-grade humanoid robot offers impressive strength and range of motion, precise manipulation, and intelligent adaptability—designed to power the new industrial revolution. [ Boston Dynamics ] I appreciate the creativity and technical innovation here, but realistically, if you’ve got more than one floor in your house? Just get a second robot. That single-step sunken living room though.... [ Roborock ] Wow, SwitchBot’s CES 2026 video shows almost as many robots in their fantasy home as I have in my real home. [ SwitchBot ] What is happening in robotics right now that I can derive more satisfaction from watching robotic process automation than I can from watching yet another humanoid video? [ ABB ] Yes, this is definitely a robot I want in close proximity to my life. [ Unitree ] The video below demonstrates a MenteeBot learning, through mentoring, how to replace a battery in another MenteeBot. No teleoperation is used. [ Mentee Robotics ] Personally, I think that we should encourage humanoid robots to fall much more often, just so that we can see whether they can get up again. [ Agility Robotics ] Achieving long-horizon, reliable clothing manipulation in the real world remains one of the most challenging problems in robotics. This live test demonstrates a strong step forward in embodied intelligence, vision-language-action systems, and real-world robotic autonomy. [ HKU MMLab ] Millions of people around the world need assistance with feeding.
Robotic feeding systems offer the potential to enhance autonomy and quality of life for individuals with impairments and reduce caregiver workload. However, their widespread adoption has been limited by technical challenges such as estimating bite timing, the appropriate moment for the robot to transfer food to a user’s mouth. In this work, we introduce WAFFLE: Wearable Approach For Feeding with LEarned Bite Timing, a system that accurately predicts bite timing by leveraging wearable sensor data to be highly reactive to natural user cues such as head movements, chewing, and talking. [ CMU RCHI ] Humanoid robots are now available as platforms, which is a great way of sidestepping the whole practicality question. [ PNDbotics ] We’re introducing Spatially-Enhanced Recurrent Units (SRUs) — a simple yet powerful modification that enables robots to build implicit spatial memories for navigation. Published in the International Journal of Robotics Research (IJRR), this work demonstrates up to +105% improvement over baseline approaches, with robots successfully navigating 70+ meters in the real world using only a single forward-facing camera. [ ETHZ RSL ] Looking forward to the DARPA Triage Challenge this fall! [ DARPA ] Here are a couple of good interviews from the Humanoids Summit 2025. [ Humanoids Summit ]

Video Friday: Robots Are Everywhere at CES 2026 https://spectrum.ieee.org/robots-ces-2026

Video Friday: Watch Scuttle Evolve Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! I always love seeing robots progress from research projects to commercial products. [ Ground Control Robotics ] Well this has to be one of the most “watch a robot do this task entirely through the magic of jump cuts” videos I’ve ever seen. [ UBTECH ] Very satisfying sound on this one. [ Pudu Robotics ] Welcome to the AgileX Robotics Data Collection Facility—where real robots build the foundation for universal embodied intelligence. Our core mission? Enable large-scale data sharing and reuse across dual-arm teleoperation robots of diverse morphologies, breaking down data silos that slow down AI progress. [ AgileX ] I’m not sure how much thought was put into this, but giving a service robot an explicit cat face could be a good way of moderating expectations on its behavior and interactivity. [ Pudu Robotics ] UBTECH says they have built 1000 of their Walker S2 humanoid robots, over 500 of which are “delivered & working.” I would very much like to know what “working” means in this context. [ UBTECH ] Every story has its beginning, and ours started in 2023—a year defined by the unknown. Let technology return to passion; let trials catalyze evolution. Embracing growth, embarking on a new journey. We’ll see you at the next stop. Please, please hire someone to do some HRI (human-robot interface) design. [ PNDbotics ]

Video Friday: Watch Scuttle Evolve https://spectrum.ieee.org/video-friday-robot-farming

Teams of Robots Compete to Save Lives on the Battlefield Last September, the Defense Advanced Research Projects Agency (DARPA) unleashed teams of robots on simulated mass-casualty scenarios, including an airplane crash and a night ambush. The robots’ job was to find victims and estimate the severity of their injuries, with the goal of helping human medics get to the people who need them the most. Kimberly Elenberg is a principal project scientist with the Auton Lab of Carnegie Mellon University’s Robotics Institute. Before joining CMU, Elenberg spent 28 years as an army and U.S. Public Health Service nurse, which included 19 deployments and serving as the principal strategist for incident response at the Pentagon. The final event of the DARPA Triage Challenge will take place in November, and Team Chiron from Carnegie Mellon University will be competing, using a squad of quadruped robots and drones. The team is led by Kimberly Elenberg, whose 28-year career as an army and U.S. Public Health Service nurse took her from combat surgical teams to incident response strategy at the Pentagon. Why do we need robots for triage? Kimberly Elenberg: We simply do not have enough responders for mass-casualty incidents. The drones and ground robots that we’re developing can give us the perspective that we need to identify where people are, assess who’s most at risk, and figure out how responders can get to them most efficiently. When could you have used robots like these? Elenberg: On the way to one of the challenge events, there was a four-car accident on a back road. For me on my own, that was a mass-casualty event. I could hear some people yelling and see others walking around, and so I was able to reason that those people could breathe and move. In the fourth car, I had to crawl inside to reach a gentleman who was slumped over with an occluded airway. I was able to lift his head until I could hear him breathing.
I could see that he was hemorrhaging and feel that he was going into shock because his skin was cold. A robot couldn’t have gotten inside of the car to make those assessments. This challenge involves enabling robots to remotely collect this data—can they detect heart rate from changes in skin color or hear breathing from a distance? If I’d had these capabilities, it would have helped me identify the person at greatest risk and gotten to them first. How do you design tech for triage? Elenberg: The system has to be simple. For example, I can’t have a device that’s going to force a medic to take their hands away from their patient. What we came up with is a vest-mounted Android phone that flips down at chest height to display a map that has the GPS location of all of the casualties on it and their triage priority as colored dots, autonomously populated from the team of robots. Are the robots living up to the hype? Elenberg: From my time in service, I know the only way to understand true capability is to build it, test it, and break it. With this challenge, I’m learning through end-to-end systems integration—sensing, communications, autonomy, and field testing in real environments. This is art and science coming together, and while the technology still has limitations, the pace of progress is extraordinary. What would be a win for you? Elenberg: I already feel like we’ve won. Showing responders exactly where casualties are and estimating who needs attention most—that’s a huge step forward for disaster medicine. The next milestone is recognizing specific injury patterns and the likely life-saving interventions needed, but that will come. This article appears in the January 2026 print issue as “Kimberly Elenberg.”

Teams of Robots Compete to Save Lives on the Battlefield https://spectrum.ieee.org/darpa-triage-challenge-robots

Video Friday: Holiday Robot Helpers Send Season’s Greetings Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! Happy Holidays from Boston Dynamics! I would pay any amount of money for that lamp. [ Boston Dynamics ] What if evolution wasn’t carbon-based — but metal instead? This short film explores an alternative, iron-based evolution through robots, simulation, and real-world machines. Inspired by biological evolution, this Christmas lab film imagines a world where machines evolve instead of organisms. [ ETH Zurich Robotics System Lab ] Happy Holidays from FieldAI! [ FieldAI ] Happy Holidays from the Institute of Robotics and Machine Intelligence at Poznan University of Technology ! [ Poznan University of Technology IRMI ] Happy Holidays from BruBotics! [ AugmentX ] Thanks, Bram! [ Humanoid ] Check out how SCUTTLE tackles the dull, dirty, and dangerous tasks of the pest control industry. [ Ground Control Robotics ] Happy Holidays from LimX Dynamics! [ LimX Dynamics ] Happy (actually maybe not AI?) Holidays from Kawasaki Robotics! [ Kawasaki Robotics ] Happy Holidays from AgileX Robotics [ AgileX Robotics ] Big news: Badminton just got a new training partner. Our humanoid robot can rally with a human in continuous exchanges, combining fast returns with stable movement. Peak return speed reaches 19.1 m/s. [ Phybot ] Well, here’s one way of deploying a legged robot. [ Kepler ] Today, we present the world’s first demo video of a full-size robot taking on the challenging Charleston dance. [ PNDbotics ] The DR02 humanoid robot from DEEP Robotics showcases remarkable versatility and agility. From the graceful flow of Tai Chi to the energetic moves of street dance, DR02 combines precision, strength, and artistry with ease! 
[ Deep Robotics ] Decreasing the Cost of Morphing in Adaptive Morphogenetic Robots: By using kirigami laminar jamming flippers, the Jamming Amphibious Robotic Turtle (JART) can quickly morph its limbs to adapt to changing terrain. This pneumatic layer jamming technology enables multi-environment locomotion on land and water by changing the robot’s flipper shape and stiffness to decrease the cost of transport. [ Paper ] Super Odometry is a resilient sensor-fusion framework that delivers accurate, real-time state estimation in challenging environments by integrating external and inertial sensing. For decades, SLAM has depended on external sensors like cameras and LiDAR. We argue it’s time to reverse this hierarchy: true robustness begins from within. By placing inertial sensing at the core of state estimation, robots gain an inner sense of motion. We believe in systems that not only see, but also feel, learn, and adapt. [ AirLab ]

Video Friday: Holiday Robot Helpers Send Season’s Greetings https://spectrum.ieee.org/holiday-robot-videos

The Top 6 Robotics Stories of 2025 Usually, I start off these annual highlights posts by saying that it was the best year ever for robotics. But this year, I’m not so sure. At the end of 2024, it really seemed like AI and humanoid robots were poised to make a transformative amount of progress towards some sort of practicality. While it’s certainly true that progress has been made, it’s hard to reconcile what’s actually happened in 2025 with the amount of money and hype that has suffused robotics over the course of the year. And for better or worse, humanoids are overshadowing everything else, raising questions about what will happen if the companies building them ultimately do not succeed. We’ll be going into 2026 with both optimism and skepticism, and we’ll keep doing what we always do: talking to the experts, asking as many hard questions as we can, and making sure to share all the cool robots, even (or especially) the ones that you won’t see anywhere else. So thanks for reading, and to all you awesome robotics folks out there, thanks for sharing your work with us! IEEE Spectrum has a bunch of exciting new stuff planned for 2026, and as we close out 2025, here’s a quick look back at some of our best robotics stories of the year. 1. Reality Is Ruining the Humanoid Robot Hype Eddie Guy Humanoid robots are hard, and they’re hard in lots of different ways. For some of those ways, we at least understand the problems and what the solutions will likely involve. But there are other problems that have no clear solutions, and most humanoid companies, especially the well-funded ones, seem quite happy to wave those problems away while continuing to raise extraordinary amounts of money. We’re going to keep calling this out whenever we see it, and expect even more skepticism in 2026. 2.
Exploit Allows for Takeover of Fleets of Unitree Robots CFOTO/Future Publishing/Getty Images Security is one of those pesky little things that is super important in robotics but that early-stage robotics companies typically treat as an afterthought because it doesn’t drive investment. Chinese manufacturer Unitree is really the one company with humanoid robots that are available enough and affordable enough for clever people to perform a security audit on them. And to the surprise of no one, Unitree’s robots had serious vulnerabilities, which as of yet have not all been fixed. 3. Amazon’s Vulcan Robots Now Stow Items Faster Than Humans Amazon The thing I appreciate about the folks at Amazon Robotics is how relentless they are in finding creative solutions for problems at scale. Amazon simply doesn’t have time to mess around, and they’re designing robots to do what robots do best: specific repetitive tasks in structured environments. In the current climate of robotics hype, it’s refreshing, honestly. 4. Large Behavior Models Are Helping Atlas Get to Work Boston Dynamics Did I mention that humanoid robots are hard? Whether or not anyone can deliver on the promises being made about them (and personally, I’m leaning more and more strongly towards not), progress is being made towards humanoids that are much more capable and versatile than they ever have been. The collaboration between Toyota Research and Boston Dynamics on large behavior models is just one example of how far we’ve come, and how far we still have to go. 5. iRobot’s Cofounder Weighs In on Company’s Bankruptcy Lindsey Nicholson/Universal Images Group/Getty Images My least favorite story to write happened right at the end of the year—iRobot filed for bankruptcy. This was not a total surprise; regulators shutting down an acquisition by Amazon in 2024 essentially gutted the company, and it’s been limping along towards the inevitable since then.
Right after the news was announced, we spoke with iRobot co-founder and ex-CEO Colin Angle, who had plenty to share about where things went wrong, and what we can learn from it. 6. How Dairy Robots Are Changing Work for Cows (and Farmers) Evan Ackerman My favorite story of 2025 was as much about cows as it was about robots. I was astonished to learn just how many fully autonomous robots are hard at work on dairy farms around the world , and utterly delighted to also learn that these robots are actively improving the lives of both dairy farmers and the dairy cows themselves. Dairy farming is endless hard work, but thanks to these robots, small family farms are able to keep themselves sustainable (and sane). Everybody wins, thanks to the robots.

The Top 6 Robotics Stories of 2025 https://spectrum.ieee.org/top-robotics-stories-2025

Video Friday: Happy Robot Holidays Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events. Please send us your events for inclusion. ICRA 2026: 1–5 June 2026, VIENNA Enjoy today’s videos! Happy Holidays from FZI Living Lab! [ FZI ] Thanks, Georg! Happy Holidays from Norlab! I should get a poutine... [ Norlab ] Happy Holidays from Fraunhofer IOSB! [ Fraunhofer ] Thanks, Janko! Happy Holidays from HEBI Robotics! [ HEBI Robotics ] Thanks, Trevor! Happy Holidays from the Learning Systems and Robotics Lab! [ Learning Systems and Robotics Lab ] Happy Holidays from Toyota Research Institute! [ Toyota Research Institute ] Happy Holidays from Clearpath Robotics! [ Clearpath Robotics ] Happy AI Holidays from Robotnik! [ Robotnik ] Happy AI Holidays from ABB Robotics! [ ABB Robotics ] With its unique modular configuration, TRON 2 lets you freely configure Dual-arm, Bipedal, or Wheeled setups to fit your mission. [ LimX Dynamics ] Thanks, Jinyan! I love this robot but can someone please explain why what happens at 2:00 makes me physically uncomfortable? [ Paper ] Thanks, Ayato! This robot, REWW-ARM, is a remote wire-driven mobile robot that separates and excludes electronics from the mobile part, so that the mobile robot can operate in harsh environments. A novel transmission mechanism enables efficient, long-distance, electronics-free power transmission, and closed-loop control estimates the state of the distal end from the wires. It demonstrated locomotion and manipulation on land and underwater. [ JSK Lab ] Thanks, Takahiro! DEEP Robotics has deployed China’s first robot dog patrol team for forest fire protection in the West Lake area. Powered by embodied AI, these quadruped robots support early detection, patrol, and risk monitoring—using technology to protect nature and strengthen emergency response.
[ DEEP Robotics ] In this video we show how we trained our robot to fold a towel from start to finish. Folding a towel might seem simple, but for a robot it means solving perception, planning, and dexterous manipulation all at once, especially when dealing with soft, deformable fabric. We walk through how the system sees the towel, identifies key features, and executes each fold autonomously. [ Kinisi Robotics ] This may be the first humanoid app store, but it’s far from the first app store for robots. Problem is, for an app store to gain traction, there needs to be a platform out there that people will buy for its core functionality first. [ Unitree ] You can tell that this isn’t U.S. government funded research because it involves a robot fetching drinks . [ Flexiv ] This video shows the Perseverance Mars Rover’s point of view during a record-breaking drive that occurred June 19, 2025, the 1,540th Martian day, or sol, of the mission. Perseverance rover was traveling northbound and covered 1,350.7 feet (411.7 meters) on that sol, over the course of about 4 hours and 24 minutes. This distance eclipsed its previous record of distance traveled in a single sol: 1,140.7 feet (347.7 meters), which was achieved on April 3, 2023 (Sol 753). [ NASA ] Automation is what’s helped keep lock maker Wilson Bohannan based in America for more than 150 years while all of its competitors relocated overseas. Using two high-speed and high-precision FANUC M-10 series robots, Acme developed a simple but highly sophisticated system that uses innovative end-of-arm tooling to accommodate 18 different styles of padlocks. As a result of Acme’s new system using FANUC robots, Wilson Bohannan production rocketed from 1,500-1,800 locks finished per eight-hour shift to more than 5,000. 
[ Fanuc ] In this conversation, Zack Jackowski, General Manager and Vice President of Atlas, and Alberto Rodriguez, Director of Robot Behavior, sit down to discuss the path to generalist humanoid robots working at scale and how we approach research & development to both push the boundaries of the industry and deliver valuable applications. [ Boston Dynamics ]

Video Friday: Happy Robot Holidays https://spectrum.ieee.org/happy-holidays-robot

iRobot’s Co-founder Weighs in on Company’s Bankruptcy On Sunday evening, legendary robotics company iRobot, manufacturer of the Roomba robotic vacuum, filed for bankruptcy . The company will be handing over all of its assets to its Chinese manufacturing partner, Picea. According to iRobot’s press release, “this agreement represents a critical step toward strengthening iRobot’s financial foundation and positioning the Company for long-term growth and innovation,” which sounds like the sort of thing that you put in a press release when you’re trying your best to put a positive spin on really, really bad news. This whole situation started back in August of 2022, when iRobot announced a US $1.7 billion acquisition by Amazon. Amazon’s interest was obvious—some questionable hardware decisions had left the company struggling to enter the home robotics market. And iRobot was at a point where it needed a new strategy to keep ahead of lower cost (and increasingly innovative) home robots from China. Some folks were skeptical of this acquisition, and admittedly, I was one of them . My primary worry was that iRobot would get swallowed up and effectively cease to exist, which tends to happen with acquisitions like these, but regulators in the United States had much more pointed concerns: Namely, that Amazon would leverage its marketplace power to restrict competition . The European Commission expressed similar objections . By late January 2024, the deal had fallen through , iRobot laid off a third of its staff, suspended research and development, and CEO and co-founder Colin Angle left the company. Since then, iRobot has seemed resigned to its fate, coasting along on a few lackluster product announcements and not much else, and so Sunday’s announcement of bankruptcy was a surprise to no one—perhaps least of all to Angle. 
iRobot’s Bankruptcy and Amazon Deal Collapse “iRobot’s bankruptcy filing was really just a public-facing outcome of the tragedy that happened a year and a half ago,” Angle told IEEE Spectrum on Monday. “Today sucks, but I’ve already mourned. I mourned when the deal with Amazon got blocked for all the wrong reasons.” Angle points out that by the early 2020s, iRobot was no longer monopolizing the robot vacuum market. This was especially true in Europe, where iRobot’s market share was 12 percent and decreasing. But from Angle’s perspective, regulators were more focused on making a point about big tech than on the actual merits and risks of the merger. Co-founder Colin Angle says that iRobot’s bankruptcy filing was unsurprising after a failed acquisition by Amazon a year and a half ago. Charles Krupa/AP “We were roadkilled in a larger agenda,” Angle says. “And this kind of regulation is incredibly destructive to the innovation economy. The whole concept of starting a tech company and having it acquired by a bigger tech company is far and away the most common positive outcome. For that to be taken away is not a good thing.” And for iRobot, it was fatal. A common criticism of iRobot even before the attempted Amazon merger was that the company was simply being out-innovated in the robot vacuum space, and Angle doesn’t necessarily disagree. “By 2020, China had become the largest market in the world for robot vacuums, and Chinese robotics companies with government support were investing two or three times as much as iRobot was in R&D. We simply didn’t have the capital to move as quickly as we wanted to.
In order for iRobot to continue to innovate and lead the industry, we needed to do so as part of a larger entity, and Amazon was very aligned with our vision for the home.” This situation is not unique to iRobot, and there is significant concern in robotics about how companies can effectively compete against the massive advantage that China has in the production of low-cost hardware. In some sense, what happened to iRobot is an early symptom of what Angle (and others) see as a fundamental problem with robotics in the United States: lack of government support. In China, long-term government support for robotics and embodied AI (in the form of both policy and direct investment) can be found across industry and academia, something that neither the United States nor the European Union has been able to match. “Robotics is in a global competition against some very fearsome competitors,” Angle says. “We have to decide whether we want to support our innovation economy. And if the answer is no, then the innovation economy goes elsewhere.” The consequence of companies like iRobot losing this competition can be more than just bankruptcy. In iRobot’s case, a Chinese company now owns iRobot’s intellectual property and app infrastructure, which gives it access to data from millions of highly sensorized autonomous mobile robots in homes across the world. I asked Angle whether or not Roomba owners should be concerned about this. “When I was running the company, we talked a lot about this, and put a lot of effort into privacy and security,” he says. “This was fundamental to Roomba’s design. But now, I don’t know.” While Angle has moved on from iRobot, and has since co-founded a more-mysterious-than-we’d-like company called Familiar Machines and Magic, he still feels strongly that what has happened to iRobot should be a warning to both robotics companies and policymakers. “Make no mistake: China is good at robots. So we need to play this hard.
There’s a lot to learn from what we did at iRobot, and a lot of ways to do it better.” On a personal note, I’m choosing to remember the iRobot that was—not just the company that built a robot vacuum out of nothing and conquered the world with it for nearly two decades, but also the company that built the PackBot to save lives, as well as all of these other crazy robots. I’m not sure there’s ever been a company quite like iRobot, and there may never be again. It will be missed.

iRobot’s Co-founder Weighs in on Company’s Bankruptcy spectrum.ieee.org/irobot-bankruptcy-colin-...

Video Friday: Robot Dog Shows Off Its Muscles Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026: 1–5 June 2026, VIENNA Enjoy today’s videos! Suzumori Endo Lab, Science Tokyo has developed a dog musculoskeletal robot using thin McKibben muscles. This robot mimics the flexible “hammock-like” shoulder structure to investigate the biomechanical functions of dog musculoskeletal systems. [ Suzumori Endo Robotics Laboratory ] HOLEY SNAILBOT!!! [ Freeform Robotics ] We present a system that transforms speech into physical objects using 3D generative AI and discrete robotic assembly. By leveraging natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robotic programming. [ MIT ] Meet the next generation of edge AI. A fully self-contained vision system built for robotics, automation, and real-world intelligence. Watch how OAK 4 brings compute, sensing, and 3D perception together in one device. [ Luxonis ] Thanks, Max! Inspired by vines’ twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach compared to conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed. [ MIT ] The paper introduces an automatic limb attachment system using soft actuated straps and a magnet-hook latch for wearable robots. It enables fast, secure, and comfortable self-donning across various arm sizes, supporting clinical-level loads and precise pressure control. [ Paper ] Thanks, Bram! Autonomous driving is the ultimate challenge for AI in the physical world.
At Waymo, we’re solving it by prioritizing demonstrably safe AI, where safety is central to how we engineer our models and AI ecosystem from the ground up. [ Waymo ] Built by Texas A&M engineering students, this AI-powered robotic dog is reimagining how robots operate in disaster zones. Designed to climb through rubble, avoid hazards and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember and plan its next move like a first responder. [ Texas A&M ] So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects — until now. MIT researchers have demonstrated aerial microrobots that can fly with speed and agility that is comparable to their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips. [ MIT ] In this audio clip generated by data from the SuperCam microphone aboard NASA’s Perseverance, the sound of an electrical discharge can be heard as a Martian dust devil flies over the Mars rover. The recording was collected on Oct. 12, 2024, the 1,296th Martian day, or sol, of Perseverance’s mission on the Red Planet. [ NASA Jet Propulsion Laboratory ] In this episode, we open the archives on host Hannah Fry’s visit to our California robotics lab. Filmed earlier this year, Hannah interacts with a new set of robots—those that don’t just see, but think, plan, and do. Watch as the team goes behind the scenes to test the limits of generalization, challenging robots to handle unseen objects autonomously. [ Google DeepMind ] This GRASP on Robotics Seminar is by Parastoo Abtahi from Princeton University, on “When Robots Disappear – From Haptic Illusions in VR to Object-Oriented Interactions in AR”. 
Advances in audiovisual rendering have led to the commercialization of virtual reality (VR); however, haptic technology has not kept up with these advances. While a variety of robotic systems aim to address this gap by simulating the sensation of touch, many hardware limitations make realistic touch interactions in VR challenging. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can design interactions that not only overcome the current limitations of robotic hardware for VR but also extend our abilities beyond what is possible in the physical world. In the first part of this talk, I will present my work on redirection illusions that leverage the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, such as the position accuracy of drones and the resolution of shape displays. In the second part, I will share how we apply these illusory interactions to physical spaces and use augmented reality (AR) to facilitate situated and bidirectional human-robot communication, bridging users’ mental models and robotic representations. [ University of Pennsylvania GRASP Laboratory ]

Video Friday: Robot Dog Shows Off Its Muscles https://spectrum.ieee.org/musculoskeletal-robot-dog

Ghost Robotics’ Arm Brings Manipulation to Military Quadrupeds Ghost Robotics is today announcing a major upgrade for their Vision 60 quadruped: an arm. Ghost, a company which originated at the GRASP Lab at the University of Pennsylvania, specializes in exceptionally rugged quadrupeds, and while many of its customers use its robots for public safety and disaster relief, it also provides robots to the United States military, which has very specific needs when it comes to keeping humans out of danger. In that context, it’s not unreasonable to assume that Ghost’s robots may sometimes be used to carry weapons, and despite the proliferation of robots in many roles in the Ukraine war, the idea of a legged robot carrying a weapon is not a comfortable one for many people. IEEE Spectrum spoke with Ghost co-founder and current CEO Gavin Kenneally to learn more about the new arm, and to get his perspective on selling robots to the military. The Vision 60’s new arm has six degrees of freedom. Ghost Robotics Robots for the Military Ghost Robotics initially made a name for itself with its very impressive early work with the Minitaur direct-drive quadruped in 2016. The company also made headlines in late 2021, when a now-deleted post on Twitter (now X) went viral because it included a photograph of one of Ghost’s Vision 60 quadrupeds with a rifle mounted on its back. That picture resulted in a very strong reaction, although as IEEE Spectrum reported at the time, robots with guns affixed to them weren’t new: To mention one early example, the U.S. military had already deployed weapons on mobile robots in Iraq in 2007. And while several legged robot companies pledged in 2022 not to weaponize their general purpose robots, the Chinese military in 2024 displayed quadrupeds from Unitree equipped with guns. (Unitree, based in China, was one of the signers of the 2022 pledge.) The issue of weaponized robots goes far beyond Ghost Robotics, and far beyond robots with legs.
We’ve covered both the practical and ethical perspectives on this extensively at IEEE Spectrum , and the intensity of the debates show that there is no easy answer. But to summarize one important point made by some ethicists, some military experts, and Ghost Robotics itself: robots are replaceable, humans are not. “Customers use our robots to keep people out of harm’s way,” Ghost CEO Kenneally tells Spectrum. It’s also worth pointing out that even the companies who signed the pledge not to weaponize their general purpose robots acknowledge that military robots exist, and are accepting of that, provided that such robots are used under existing legal doctrines and operate within those safeguards—and that what constraints should or should not be imposed on these kinds of robots is best decided by policymakers rather than industry. This is essentially Ghost Robotics’ position as well, says Kenneally. “We sell our robots to U.S. and allied governments, and as part of that, the robots are used in defense applications where they will sometimes be weaponized. What’s most critical to us is that the decisions about how to use these robots are happening systematically and ethically at the government policy level.” To some extent, these decisions are already being made within the U.S. government. Department of Defense Directive 3000.09 , ‘Autonomy in Weapon Systems,’ lays out the responsibilities and limitations for how autonomous or human-directed robotics weapons systems should be developed and deployed, including requirements for human use-of-force judgements. At least in the U.S., this directive implies that there are rules and accountability for robotic weapons. Vision 60’s Versatile Arm Capabilities Ghost sees its Vision 60 quadruped as a system that its trusted customers can use as they see fit, and the manipulator enables many additional capabilities. 
“The primary purpose of the robot has been as a sensor platform,” Kenneally says, “but sometimes there are doors in the way, or objects that need to be moved, or you might want the robot to take a sample. So the ability to do all of that mobile manipulation has been hugely valuable for our customers.” As it turns out, arms are good for more than manipulation. “One thing that’s been very interesting is that our customers have been using the arm as a sensor boom, which is something that we hadn’t anticipated,” says Kenneally. Ghost’s robot has plenty of cameras, but they’re mostly at the viewpoint of a moderately-sized dog. The new arm offers a more human-like vantage and a way to peek around corners or over things without exposing the whole robot. Ghost was not particularly interested in building their own arm, and tried off-the-shelf options to get the manipulation bit working. And they did get the manipulation working; what didn’t work were any of those arms after the 50 kilogram robot rolled over on them. “We wanted to make sure that we could build an arm that could stand up to the same intense rigors of our customers’ operations that the rest of the robot can,” says Kenneally. “Morphologically, we actually consider the arm to be a fifth leg, so that the robot operates as a unified system for whole-body control.” The rest of the robot is exceptionally rugged, which is what makes it appealing to customers with unique needs, like special forces teams. Enough battery life for more than three hours of walking (or more than 20 hours on standby) isn’t bad, and the Vision 60 is sealed against sand and dust, and can survive complete submergence in shallow water. It can operate in extreme temperatures ranging from -40 °C to 55 °C, which has been a particular challenge for robots. And if you do manage to put it in a situation where it physically breaks one of its legs, it’s easy to swap in a spare in just a few minutes, even out in the field. 
The Vision 60 can open doors with high-level direction from a human operator. Ghost Robotics Quadruped Robot Competition From China Despite Ghost quietly selling over a thousand quadrupeds to date, Kenneally is cautious about the near future for legged robots, as is anyone who has seriously considered buying one, because it’s impossible to ignore the option of just buying one from a Chinese company at about a tenth the cost of a quadruped from a company based in the U.S. or Europe. “China has identified legged robotics as a lynchpin technology that they are strategically funding,” Kenneally says. “I think it’s an extremely serious threat in the long term, and we have to take these competitors very seriously despite their current shortcomings.” There is a technological moat, for now, but if the market for legged robots follows the same trajectory as the market for drones did, that moat will shrink drastically over the next few years. The United States is poised to ban consumer drone sales from Chinese manufacturer DJI, and banned DJI drone use by federal agencies in 2017. But it may be too late in some sense, as DJI’s global market share is something like 90 percent. Meanwhile, Unitree may have already cornered somewhere around 70 percent of the global market for quadrupeds, despite the recent publication of exploits that allow the robots to send unauthorized data to China. In the United States in particular, private sector robotics funding is unpredictable at the best of times, and Kenneally argues that to compete with Chinese-subsidized robot makers, American companies like Ghost that produce these robots domestically will need sustained U.S. government support, too. That doesn’t mean the government has to pick which companies will be the winners, but that it should find a way to support the U.S. robotics industry as a whole, if it still wants to have a meaningful one. “The quadruped industry isn’t a science project anymore,” says Kenneally.
“It’s matured, and quadruped robots are going to become extremely important in both commercial and government applications. But it’s only through continued innovation that we’ll be able to stay ahead.”

Ghost Robotics’ Arm Brings Manipulation to Military Quadrupeds spectrum.ieee.org/ghost-robotics-quadruped...

Video Friday: Biorobotics Turns Lobster Tails Into Gripper Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026: 1–5 June 2026, VIENNA Enjoy today’s videos! EPFL scientists have integrated discarded crustacean shells into robotic devices, leveraging the strength and flexibility of natural materials for robotic applications. [ EPFL ] Finally, a good humanoid robot demo! Although having said that, I never trust video demos where it works really well once, and then just pretty well every other time. [ LimX Dynamics ] Thanks, Jinyan! I understand how these structures work, I really do. But watching something rigid extrude itself from a flexible reel will always seem a little magical. [ AAAS ] Thanks, Kyujin! I’m not sure what “industrial grade” actually means, but I want robots to be “automotive grade,” where they’ll easily operate for six months or a year without any maintenance at all. [ Pudu Robotics ] Thanks, Mandy! When you start to suspect that your robotic EV charging solution costs more than your car. [ Flexiv ] Yeah uh if the application for this humanoid is actually making robot parts with a hammer and anvil, then I’d be impressed. [ EngineAI ] Researchers at Columbia Engineering have designed a robot that can learn a human-like sense of neatness. The researchers taught the system by showing it millions of examples, not teaching it specific instructions. The result is a model that can look at a cluttered tabletop and rearrange scattered objects in an orderly fashion. [ Paper ] Why haven’t we seen this sort of thing in humanoid robotics videos yet? [ HUCEBOT ] While I definitely appreciate in-the-field testing, it’s also worth asking to what extent your robot is actually being challenged by the in-the-field field that you’ve chosen.
[ DEEP Robotics ] Introducing HMND 01 Alpha Bipedal — autonomous, adaptive, designed for real-world impact. Built in 5 months, walking stably after 48 hours of training. [ Humanoid ] Unitree says that “this is to validate the overall reliability of the robot” but I really have to wonder how useful this kind of reliability validation actually is. [ Unitree ] This University of Pennsylvania GRASP on Robotics Seminar is by Jie Tan from Google DeepMind, on “Gemini Robotics: Bringing AI into the Physical World.” Recent advancements in large multimodal models have led to the emergence of remarkable generalist capabilities in digital domains, yet their translation to physical agents such as robots remains a significant challenge. In this talk, I will present Gemini Robotics, an advanced Vision-Language-Action (VLA) generalist model capable of directly controlling robots. Furthermore, I will discuss the challenges, learnings and future research directions on robot foundation models. [ University of Pennsylvania GRASP Laboratory ]


Video Friday: Disney’s Robotic Olaf Makes His Debut Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. SOSV Robotics Matchup : 1–5 December 2025, ONLINE ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! Step behind the scenes with Walt Disney Imagineering Research & Development and discover how Disney uses robotics, AI, and immersive technology to bring stories to life! From the brand new self-walking Olaf in World of Frozen and BDX Droids to cutting-edge attractions like Millennium Falcon: Smugglers Run, see how magic meets innovation. [ Disney Experiences ] We just released a new demonstration of Mentee’s V3 humanoid robots completing a real-world logistics task together. Over an uninterrupted 18-minute run, the robots autonomously move 32 boxes from eight piles to storage racks of different heights. The video shows steady locomotion, dexterous manipulation, and reliable coordination throughout the entire task. And there’s an uncut 18-minute version of this at the link. [ MenteeBot ] Thanks, Yovav! This video contains graphic depictions of simulated injuries. Viewer discretion is advised. In this immersive overview, guided by the DARPA Triage Challenge program manager, retired Army Col. Jeremy C. Pamplin, M.D., you’ll experience how teams of innovators, engineers, and DARPA are redefining the future of combat casualty care. Be sure to look all around! Check out competition runs, behind-the-scenes of what it takes to put on a DARPA Challenge, and glimpses into the future of lifesaving care. Those couple of minutes starting at 6:50 with the human medic and robot teaming were particularly cool. [ DARPA ] You don’t need to build a humanoid robot if you can just make existing humanoids a lot better. I especially love 0:45 because you know what? 
Humanoids should spend more time sitting down, for all kinds of reasons. And of course, thank you for falling and getting up again, albeit on some of the squishiest grass on the planet. [ Flexion ] “Human-in-the-Loop Gaussian Splatting” wins best paper title of the week. [ Paper ] via [ IEEE Robotics and Automation Letters in IEEE Xplore ] Scratch that, “Extremum Seeking Controlled Wiggling for Tactile Insertion” wins best paper title of the week. [ University of Maryland PRG ] The battery swapping on this thing is... unfortunate. [ LimX Dynamics ] To push the boundaries of robotic capability, researchers in the Department of Mechanical Engineering at Carnegie Mellon University, in collaboration with the University of Washington and Google DeepMind, have developed a new tactile sensing system that enables four-legged robots to carry unsecured, cylindrical objects on their backs. This system, known as LocoTouch, features a network of tactile sensors that spans the robot’s entire back. As an object shifts, the sensors provide real-time feedback on its position, allowing the robot to continuously adjust its posture and movement to keep the object balanced. [ Carnegie Mellon University ] This robot is in more need of googly eyes than any other robot I’ve ever seen. [ Zarrouk Lab ] DPR Construction has deployed Field AI’s autonomy software on a quadruped robot at the company’s job site in Santa Clara, CA, to greatly improve its daily surveying and data collection processes. By automating what has traditionally been a very labor-intensive and time-consuming process, Field AI is helping the DPR team operate more efficiently and effectively, while increasing project quality. [ FieldAI ] In our second episode of AI in Motion, our host, Waymo AI researcher Vincent Vanhoucke, talks with robotics startup founder Sergey Levine, who left a career in academic research to build better robots for the home and workplace. [ Waymo ]


Video Friday: Watch Robots Throw, Catch, and Hit a Baseball Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. SOSV Robotics Matchup : 1–5 December 2025, ONLINE ICRA 2026 : 1–5 June 2026, VIENNA Enjoy today’s videos! Researchers at the RAI Institute have built a low-impedance platform to study dynamic robot manipulation. In this demo, robots play a game of catch and participate in batting practice, both with each other and with skilled humans. The robots are capable of throwing 70 mph [112 kph], approaching the speed of a strong high school pitcher. The robots can catch and bat at short distances (23 feet [7 m]), requiring quick reaction times to catch balls thrown at up to 41 mph [66 kph] and hit balls pitched at up to 30 mph [48 kph]. That’s a nice touch with the custom “RAI” baseball gloves, but what I really want to know is how long a pair of robots can keep themselves entertained. [ RAI Institute ] This week’s best backronym winner is GIRAF: Greatly Increased Reach AnyMAL Function. And if that arm looks like magic, that’s because it is, although with some careful pausing of the video you’ll be able to see how it works. [ Stanford BDML ] DARPA concluded the second year of the DARPA Triage Challenge on October 4, awarding top marks to DART and MSAI in the Systems and Data competitions, respectively. The three-year prize competition aims to revolutionize medical triage in mass casualty incidents where medical resources are limited. [ DARPA ] We propose a robot-agnostic reward function that balances the achievement of a desired end pose with impact minimization and the protection of critical robot parts during reinforcement learning. 
To make the policy robust to a broad range of initial falling conditions and to enable the specification of an arbitrary and unseen end pose at inference time, we introduce a simulation-based sampling strategy of initial and end poses. Through simulated and real-world experiments, our work demonstrates that even bipedal robots can perform controlled, soft falls. [ Moritz Baecher ] Oh look, more humanoid acrobatics. My prediction: once humanoid companies run out of mocapped dance moves, we’ll start seeing some freaky stuff that leverages the degrees of freedom that robots have and humans do not. You heard it here first, folks. [ MagicLab ] I challenge the next company that makes a “lights-out” video to just cut to a totally black screen with a little “Successful Picks” counter in the corner that just goes up and up and up. [ Brightpick ] Thanks, Gilmarie! The terrain stuff is cool and all, but can we just talk about the trailer instead? [ LimX Dynamics ] Presumably very picky German birblets are getting custom nesting boxes manufactured with excessively high precision by robots. [ TUM ] All those UBTECH Walker S2 robots weren’t fake, it turns out. [ UBTECH ] This is more automation than what we’d really think of as robotics at this point, but I could still watch it all day. [ Motoman ] Brad Porter (Cobot) and Alfred Lin (Sequoia Capital) discuss the future of robotics, AI, and automation at the Human[X] Conference, moderated by CNBC’s Kate Rooney. They explore why collaborative robots are accelerating now, how AI is transforming physical systems, the role of humanoids, labor market shifts, and the investment trends shaping the next decade of robotics. [ Cobot ] Humanoid robots have long captured our imagination. Interest has skyrocketed along with the perception that robots are getting closer to taking on a wide range of labor-intensive tasks. 
In this discussion, we reflect on what we’ve learned by observing factory floors, and why we’ve grown convinced that chasing generalization in manipulation—both in hardware and behavior—isn’t just interesting, but necessary. We’ll discuss AI research threads we’re exploring at Boston Dynamics to push this mission forward, and highlight opportunities our field should collectively invest more in to turn the humanoid vision, and the reinvention of manufacturing, into a practical, economically viable product. [ Boston Dynamics ] On November 12, 2025, Tom Williams presented “Degrees of Freedom: On Robotics and Social Justice” as part of the Michigan Robotics Seminar Series. [ Michigan Robotics ] Ask the OSRF Board of Directors anything! Or really, listen to other people ask them anything. [ ROSCon ]
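Those RAI Institute catch numbers are worth a quick back-of-the-envelope check: a 41 mph ball covering 23 feet leaves the robot well under half a second to react. Here's the arithmetic as a minimal sketch, assuming straight-line flight and ignoring drag and arc (my simplification, not anything stated in the video):

```python
# Reaction-time arithmetic for the RAI Institute catch demo:
# a ball thrown at 41 mph over 23 feet, assuming straight-line
# flight with no drag (a simplifying assumption).

MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

speed_fps = 41 * MPH_TO_FPS   # roughly 60 ft/s
flight_time = 23 / speed_fps  # roughly 0.38 s

print(f"Flight time: {flight_time:.2f} s")  # prints "Flight time: 0.38 s"
```

For comparison, a big-league batter facing a 95 mph fastball from the 60.5-foot pitching distance gets about 0.43 seconds by the same arithmetic, so the short-range demo is a genuinely tight reaction window.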

