How can humans teach a robot to be lost, new simulations for exoskeletons, and an autonomous aquarium
plus how many legs on a robot is too many?
Welcome to the Michigan Robotics newsletter, a summary of what’s happening in the University of Michigan Robotics community.
As the new academic year ramps up, we’re excited to welcome new faculty who are key to building our robotics program. This latest cohort includes some exceptional roboticists and collaborators, and we invite you to read about how they will help us define the discipline.
Research
TURTLMap is a novel, low-cost method for real-time localization and mapping in textureless underwater environments. The researchers show that it tracks the robot accurately while constructing a dense map of a low-texture environment in real time, and they evaluate it on real-world data collected in an indoor water tank with a motion capture system and a ground-truth map for reference, with Jingyu Song, Onur Bagoren, Razan Andigani, Advaith Sethuraman, and Katie Skinner.
VLFM: Vision-Language Frontier Maps for zero-shot semantic navigation
Understanding how humans navigate unfamiliar environments and decide where to explore can help robots do the same. Vision-Language Frontier Maps (VLFM) is inspired by this kind of human reasoning, and the team demonstrates navigating to target objects within an office building in the real world without any prior knowledge of the environment. This work earned Naoki Yokoyama, Sehoon Ha, Dhruv Batra, Jiuguang Wang, and Bernadette Bucher the IEEE Best Paper Award for Cognitive Robotics at ICRA.
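For readers curious about the general idea of frontier-based goal selection, here is a minimal illustrative sketch: it finds frontier cells in an occupancy map and picks the one a stand-in vision-language scorer rates most relevant to a text goal. The `score_frontier` placeholder and the toy grid are assumptions for illustration, not VLFM's actual models or pipeline.

```python
# Toy frontier selection: pick the frontier a (stand-in) vision-language scorer likes best.
import numpy as np

def find_frontiers(occupancy: np.ndarray) -> list[tuple[int, int]]:
    """Return free cells (value 0) adjacent to unknown cells (value -1)."""
    frontiers = []
    rows, cols = occupancy.shape
    for r in range(rows):
        for c in range(cols):
            if occupancy[r, c] != 0:
                continue
            neighbors = occupancy[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighbors == -1).any():
                frontiers.append((r, c))
    return frontiers

def score_frontier(frontier: tuple[int, int], goal_text: str) -> float:
    """Placeholder for a vision-language score, e.g. image-text matching between
    the camera view toward this frontier and the goal description (assumption)."""
    rng = np.random.default_rng(hash((frontier, goal_text)) % (2**32))
    return float(rng.random())

def pick_next_waypoint(occupancy: np.ndarray, goal_text: str):
    frontiers = find_frontiers(occupancy)
    if not frontiers:
        return None  # nothing left to explore
    return max(frontiers, key=lambda f: score_frontier(f, goal_text))

# Toy map: 0 = free, 1 = occupied, -1 = unknown
grid = np.array([[0, 0, -1],
                 [0, 1, -1],
                 [0, 0,  0]])
print(pick_next_waypoint(grid, "a potted plant"))
```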
Putting humans back in the loop: An affordance conceptualization of the 4th industrial revolution
The current technology epoch—sometimes called the fourth industrial revolution (4IR)—involves the innovative application of rapidly advancing digital technologies such as artificial intelligence. Nigel P. Melville, Lionel Robert, and Xiao Xiao discuss why keeping humans in the loop with artificial intelligence is essential.
Simulation training improves performance in robotic exoskeletons
A research team led by Hao Su of North Carolina State University has demonstrated a new method that leverages artificial intelligence and computer simulations to train robotic exoskeletons to autonomously help users save energy across activities such as walking, running, and climbing stairs. Elliott Rouse is a co-author on the paper.
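As a rough illustration of training a controller in simulation before putting it on hardware, here is a toy sketch. The single "assistance gain" parameter, the cost model, and the random search are illustrative assumptions, not the paper's learning method or exoskeleton model.

```python
# Toy sketch: choose a controller parameter by minimizing a simulated cost.
import random

def simulated_metabolic_cost(assistance_gain: float) -> float:
    """Stand-in simulator (assumption): cost is lowest at some unknown best gain,
    with noise to mimic variability across simulated strides."""
    best_gain = 0.6
    return (assistance_gain - best_gain) ** 2 + random.gauss(0, 0.01)

def train_in_simulation(num_trials: int = 200) -> float:
    """Random search over the assistance gain using only simulated rollouts."""
    best_gain, best_cost = 0.0, float("inf")
    for _ in range(num_trials):
        gain = random.uniform(0.0, 1.0)
        cost = sum(simulated_metabolic_cost(gain) for _ in range(5)) / 5
        if cost < best_cost:
            best_gain, best_cost = gain, cost
    return best_gain

print(f"gain selected in simulation: {train_in_simulation():.2f}")
```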
A group of roboticists, including Andrea Sipos, launched Queer in Robotics as a new affinity group dedicated to creating a welcoming and safe space for queer roboticists, increasing inclusion for queer people at conferences, and highlighting broader and nonsexualized queer issues in the field of robotics.
CRKD: Enhanced Camera-Radar object detection with cross-modality Knowledge Distillation
The researchers developed a novel knowledge distillation framework that transfers knowledge from a LiDAR-camera teacher to a camera-radar student for enhanced perception in autonomous vehicles, with Lingjun Zhao, Jingyu Song, and Katie Skinner.
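For a feel of the general teacher-student idea CRKD builds on, here is a minimal knowledge distillation sketch. The tiny networks and the feature-matching loss are illustrative assumptions, not the paper's architecture or loss design.

```python
# Toy cross-modality distillation: a frozen "LiDAR-camera" teacher supervises
# a "camera-radar" student by matching feature outputs.
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))  # stand-in teacher
student = nn.Sequential(nn.Linear(48, 96), nn.ReLU(), nn.Linear(96, 32))    # stand-in student

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
distill_loss = nn.MSELoss()

for step in range(100):
    lidar_camera_feats = torch.randn(8, 64)  # placeholder teacher inputs
    camera_radar_feats = torch.randn(8, 48)  # placeholder student inputs

    with torch.no_grad():                    # teacher stays frozen
        target = teacher(lidar_camera_feats)

    prediction = student(camera_radar_feats)
    loss = distill_loss(prediction, target)  # pull student features toward teacher features

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```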
Building an ecosystem for the Open-Source Leg
A new $1M grant from the National Science Foundation Pathways to Enable Open-Source Ecosystems (POSE) program focuses on facilitating the creation and growth of open-source ecosystems around open-source products. For the Open-Source Leg, this includes building out sustainable governance, a cohesive community of developers, and a broad base of users.
Gender and security robot interactions
Security robots are becoming more common in public and private spaces, and concerns about public acceptance are increasing. Recent studies indicate that both human gender and the perceived gender of robots may influence this acceptance. A literature review and AMCIS 2024 Best Paper nominee by Xin Ye, Lionel Robert, and Samia Bhatt found mixed evidence on whether gender affects interactions with security robots.
HOUND, an open-source off-road robot platform, comes with an autonomy stack geared toward aggressive off-road autonomy, giving users a real-world baseline so they can focus on fundamental problems rather than engineering the entire system from the ground up. The research was led by Sidharth Talia of the University of Washington and includes Christoforos Mavrogiannis.
Characterizing the complexity of social robot navigation scenarios
An understanding of the inherent complexity of a social robot navigation scenario could help characterize the limitations of existing navigation algorithms and provide actionable directions for improvement. The authors identify a series of factors contributing to a scenario's complexity, disambiguating between contextual and robot-related ones. Authors Andrew Stratton, Kris Hauser, and Christoforos Mavrogiannis won the Best Paper Award at the Workshop on Unsolved Problems in Social Robot Navigation at RSS 2024.
Watch
Jana Pavlasek highlights work on the problem of scalable, robust robotic perception and planning in challenging environments through distributed probabilistic inference, as well as approaches for teaching robotics as a discipline at the undergraduate level.
Ziyou Wu presents research that enables modeling of complex multi-legged robots, from six to twelve legs, incorporating body velocity and ground contact forces that depend on a robot's shape.
Meet Andrew Seelhof, Katharine Walters, and Nikhil Divekar, who work on powered prosthetics and exoskeletons in Robert Gregg's lab at the University of Michigan. Robert Packard and Zachary Damon of the Ann Arbor Commission on Disability Issues visit Robert Gregg’s Locomotor Control Systems Laboratory and learn about the lab’s ongoing studies in need of volunteers.
Read
Autonomous aquarium: robotic fish swim for social impact
Students Devin Jones, Krystelle Fernandez, and Yatee Balan engineered a robotic school of fish prototype, named “Swarm Fish,” as part of their class project in ROB 203: Robotics Mechanisms, to communicate the effects of plastic pollution on aquatic life.
ROB 204: Introduction to Human-Robot Systems
Faculty who developed this core undergraduate course to support a socially-engaged design process share the technical topics, teaching methodologies, and takeaways from teaching four semesters of the course.
Congrats
Two Robotics faculty awarded NSF CAREER grants
Nima Fazeli and Katie Skinner were both awarded NSF CAREER grants.
Kira Barton elected Fellow of ASME
Her research innovations in modeling, control, and automation have improved diverse manufacturing processes, in particular high-resolution 3D printing. Her development of extensible and reusable digital twin frameworks has demonstrated real-time monitoring, analysis, and decision making. Barton’s research has demonstrated provable performance improvements in robotic and smart manufacturing systems and has pushed the entire research field forward. Her dedication to teaching, mentoring, and outreach has made a lasting mark on students.
ENG 100.850: Robotics Mechanisms earns teaching award
Faculty Derrick Yeo, Kelly Bowker, and Nabilah Khachab along with IAs Thomas Cuddy, Karis Hu, Lani Quatch, Prisha Agnihotri, Prakhar Gupta, Rohan Satapathy, Samantha Staudinger, Shriya Biddala, Sunny Xu, Valerie Moura, and Yi Ling Wu earned the Associate Dean for Undergraduate Education Team Teaching Award.
Voxel51, a faculty startup, raised $30M in funding
Jason Corso and Brian Moore started the company, which offers better management of visual data and more accurate models to power artificial intelligence applications.
Parting shot
