Haptic Intelligence

Department Talks

Data-Driven Needle Puncture Detection for Urgent Medical Care Delivery in Space

PhD Thesis Defense
  • 23 October 2024 • 17:30—18:00
  • Rachael L'Orsa
  • Zoom

Needle decompression (ND) is a surgical procedure that treats one of the most preventable causes of trauma-related death: dangerous accumulations of air between the chest wall and the lungs. However, needle-tip overshoot of the target space can result in the inadvertent puncture of critical structures like the heart. This type of complication is fatal without urgent surgical care, which is not available in resource-poor environments like space. Since ND is done blind, operators rely on tool sensations to identify when the needle has reached its target. Needle instrumentation could enable puncture notifications to help operators limit tool-tip overshoot, but such a solution requires reliable puncture detection from manual (i.e., variable-velocity) needle insertion data streams. Data-driven puncture-detection (DDPD) algorithms are appropriate for this application, but their performance has historically been unacceptably low for use in safety-critical applications. We contribute towards the development of an intelligent device for manual ND assistance by proposing two novel DDPD algorithms. Three data sets are collected that provide needle forces, torques, and displacements during insertions into ex vivo porcine tissue analogs for the human chest, and factors affecting DDPD algorithm performance are analyzed in these data. Puncture event features are examined for each sensor, and the suitability of accelerometer measurements and diffuse reflectance is evaluated for ND. Finally, DDPD ensembles are proposed that yield a 5.1-fold improvement in precision as compared to the traditional force-only DDPD approach. These results lay a foundation for improving the urgent delivery of percutaneous procedures in space and other resource-poor settings.
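The abstract describes detecting puncture events from force data and combining several detectors into an ensemble to raise precision. The following minimal sketch illustrates that general idea: a puncture typically appears as a sudden drop in axial needle force, so each detector thresholds the force derivative and an ensemble keeps only events flagged by a majority. All function names, thresholds, and the voting scheme are illustrative assumptions, not the DDPD algorithms from the thesis.

```python
# Hypothetical sketch of force-based puncture detection.
# A puncture event tends to show up as a sharp drop in axial force,
# i.e., a strongly negative force derivative.

def detect_punctures(force, dt, drop_rate_threshold=-5.0):
    """Return sample indices where the force derivative (N/s) falls
    below drop_rate_threshold (an assumed, illustrative value)."""
    events = []
    for i in range(1, len(force)):
        rate = (force[i] - force[i - 1]) / dt
        if rate < drop_rate_threshold:
            events.append(i)
    return events

def ensemble_detect(force, dt, thresholds=(-3.0, -5.0, -8.0)):
    """Majority vote over several single-threshold detectors; demanding
    agreement suppresses spurious detections and improves precision."""
    votes = {}
    for th in thresholds:
        for i in detect_punctures(force, dt, th):
            votes[i] = votes.get(i, 0) + 1
    return sorted(i for i, v in votes.items() if v >= 2)

# Synthetic insertion trace: force builds steadily, then drops at sample 5.
force = [0.0, 0.5, 1.0, 1.5, 2.0, 0.2, 0.3]
print(ensemble_detect(force, dt=0.01))  # -> [5]
```

In practice a manual, variable-velocity insertion makes fixed thresholds unreliable, which is exactly why the thesis pursues data-driven detectors rather than hand-tuned rules like these.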

Organizers: Katherine Kuchenbecker, Rachael L'Orsa


  • Lorena Velásquez
  • Hybrid - Webex plus in-person attendance in Oxygen (5N18)

Individuals with limb loss often use prosthetic devices to complete activities of daily living (ADLs), as these devices can provide enhanced dexterity and customizable utility. Despite these benefits, abandonment rates remain high due to uncomfortable, cumbersome, and unreliable designs. Moreover, although prostheses restore motor function, dexterous sensorimotor control remains severely impaired due to the absence of haptic feedback. This presentation details the design and evaluation of tendon-actuated mock prostheses with integrated state-based haptic feedback and their anthropomorphic tendon-actuated end effectors.

Organizers: Katherine Kuchenbecker, Uli Bartels


  • Yijie Gong
  • Hybrid - Webex plus in-person attendance in 2P04 (MPI-IS Stuttgart)

Teleoperation allows workers on a construction site to assemble pre-fabricated building components by controlling powerful machines from a safe distance. However, teleoperation's primary reliance on visual feedback limits the operator's efficiency in situations with stiff contact or poor visibility, compromising their situational awareness and thus increasing the difficulty of the task. To bridge this gap, we created AiroTouch, a naturalistic vibrotactile feedback system tailored for use on construction sites but suitable for many other applications of telerobotics. We then evaluated AiroTouch and explored the effects of the naturalistic vibrotactile feedback it delivers in three user studies conducted either in laboratory settings or on a construction site.

Organizers: Yijie Gong, Katherine Kuchenbecker


Designing Mobile Robots for Physical Interaction with Sandy Terrains

IS Colloquium
  • 18 March 2024 • 16:00—17:00
  • Dr. Hannah Stuart
  • Hybrid - Webex plus in-person attendance in Copper (2R04)

One day, robots will widely support exploration and development of unstructured natural environments. Much of the work I will present in this lecture is supported by NASA and is focused on robot design research relevant to accessing the surfaces of the Moon or Mars. Tensile elements appear repeatedly across the wide array of missions envisioned to support human or robotic exploration and habitation of the Moon. With a single secured tether, rovers or astronauts, or both, could belay down into steep lunar craters for the exploration of permanently shadowed regions; the tether prevents catastrophic slipping or falling and mitigates risk while we search for water resources. Like a tent that relies on tensioned ropes to sustain its structure, tensegrity-based antennae, dishes, and habitations can be made large and strong using very lightweight materials. We ask: Where does the tension in these lightweight systems go? Ultimately, these concepts will require anchors that attach cables autonomously, securely, and reliably to the surrounding regolith to react tensile forces. Thus, the development of new autonomous burrowing and anchoring technologies, along with modeling techniques to guide design and adoption, is critical across multiple space-relevant programs. Yet burrowing and anchoring problems are hard for multiple reasons, which remain fundamental areas of discovery. Our goal is to understand how the mechanics of granular and rocky interaction influences the design and control of small-scale robotic systems for such forceful manipulations. I will present new mobility and anchoring strategies that enable small robots to resist, or "tug," massive loads in loose terrain, like regolith. The idea is that multiple tethered agents can work together to perform large-scale manipulations, even where traction and gravity are low. The resulting generalizable methods for rapidly modeling granular interactions also inform new mobility gaits to move over, through, or under loose sand more efficiently.

Organizers: Katherine Kuchenbecker


Creating a Haptic Empathetic Robot Animal That Feels Touch and Emotion

PhD Thesis Defense
  • 23 February 2024 • 13:00—13:45
  • Rachael Bevill Burns
  • Hybrid - Webex plus in-person attendance in N2.025 (MPI-IS Tübingen)

Social touch, such as a hug or a poke on the shoulder, is an essential aspect of everyday interaction. Humans use social touch to gain attention, communicate needs, express emotions, and build social bonds. Despite its importance, touch sensing is very limited in most commercially available robots. By endowing robots with social-touch perception, one can unlock a myriad of new interaction possibilities. In this talk, I will present my work on creating a Haptic Empathetic Robot Animal (HERA), a koala-like robot for children with autism. I will demonstrate the importance of establishing design guidelines based on one’s target audience, which we investigated through interviews with autism specialists. I will share our work on creating full-body tactile sensing for the NAO robot using low-cost, do-it-yourself (DIY) methods, and I will introduce an approach to model long-term robot emotions using second-order dynamics.

Organizers: Katherine Kuchenbecker, Rachael Burns


  • Marie Großmann
  • Hybrid - Webex plus in-person attendance in 5N18

The sensory perception of the world, including seeing and hearing, tasting and smelling, and touching and feeling, comprises necessary social skills for becoming a social counterpart. In this context, the construction of perceptible technology is an intersection where technical artifacts gain the capability to interact with and sense their environment. Sensors as technical artifacts not only measure various (physical) states, with their results influencing perceptions and actions, but also undergo technical and computational processing. Sensors generate differences by capturing and measuring variations in their surroundings. In this talk, I will share insights from my qualitative social research in the lab, using sociological engagement with technology, materiality, and science research as a starting point to sharpen a sociological perspective on the construction of technical perceptions. The focus will lie on how knowledge about perception is implemented in technology and materiality when sensors are constructed.

Organizers: Katherine Kuchenbecker


  • Dr. Janneke Schwaner
  • Hybrid - Webex plus in-person attendance in 5N18

Animals seem to effortlessly navigate complex terrain. This is in stark contrast with even the most advanced robots, illustrating that navigating complex terrain is by no means trivial. Humans' neuromusculoskeletal system is equipped with two key mechanisms that allow us to recover from unexpected perturbations: muscle intrinsic properties and sensory-driven feedback control. We used unique in vivo and in situ approaches to explore how guinea fowl (Numida meleagris) integrate these two mechanisms to maintain robust locomotion. For example, our work showed modular task-level control of leg length and leg angular trajectory while navigating speed perturbations during walking, with different neuromechanical control and perturbation sensitivity in each actuation mode. We also discovered gait-specific control mechanisms in walking and running over obstacles. Additionally, by combining in vivo and in situ experimental approaches, we found that guinea fowl lateral gastrocnemius (LG) muscles do not operate at optimal muscle lengths during force production in walking and running, providing a safety factor against potential unexpected perturbations. Lastly, we will also highlight work showing how kangaroo rats circumvent the mechanical limitations of skeletal muscle to jump, as well as how these animals overcome angular-momentum limitations during aerial reorientation in predator-escape leaps. Elucidating frameworks of function, adaptability, and individual variation across neuromuscular systems will provide a stepping-stone for understanding fundamental muscle mechanics, sensory feedback, and neuromuscular health. Additionally, this research has the potential to reveal the functional significance of individual morphological, physiological, and neuromuscular variation in relation to locomotion. This knowledge can subsequently inform individualized rehabilitation approaches and the treatment of neuromuscular conditions, such as stroke-related motor impairments, as they require an integrated understanding of the dynamic interactions between musculoskeletal mechanics and sensorimotor control. This work also provides foundational knowledge for the development of dynamic assistive devices and robots that can navigate complex terrains.

Organizers: Katherine Kuchenbecker, Andrew Schulz


Project neuroArm: Image-guided Medical Robotics Program

Talk
  • 17 October 2023 • 14:00—15:00
  • Dr. Diego Ospina
  • Hybrid - Webex plus in-person attendance in 5N18

Project neuroArm was established in 2002 with the idea of building the world's first robot for brain surgery and stereotaxy. Since the launch (2007) and integration of the neuroArm robot into the neurosurgical operating room (May 2008), the project has continued to spawn new technological innovations, advance tele-robotics through sensors and AI, and develop intelligent surgical systems that improve the safety of surgery. This talk will provide a high-level overview of two such technologies the team at Project neuroArm is currently developing and deploying: i) neuroArm+HD, a medical-grade sensory immersive workstation designed to enhance learning, performance, and safety in robot-assisted microsurgery and tele-operations; and ii) SmartForceps, sensorized surgical bipolar forceps for real-time recording, display, monitoring, and uploading of tool-tissue interaction forces during surgery.

Organizers: Katherine Kuchenbecker, Rachael L'Orsa


Towards Seamless Handovers with Legged Manipulators

Talk
  • 10 October 2023 • 14:00—15:00
  • Andreea Tulbure
  • Hybrid - Webex plus in-person attendance in 5N18

Deploying perception and control modules for handovers is challenging because they require a high degree of robustness and generalizability to work reliably for a diversity of objects and situations, but also adaptivity to adjust to individual preferences. On legged robots, deployment is particularly challenging because of the limited computational resources and the additional sensing noise resulting from locomotion. In this talk, I will discuss how we tackle some of these challenges, first introducing our perception framework and discussing insights from the first human-robot handover user study with legged manipulators. Furthermore, I will show how we combine imitation and reinforcement learning to achieve some degree of adaptivity during handovers. Finally, I will present our work in which the robot takes into account the collaboration partner's post-handover task when handing over an object. This is beneficial for situations where the human's range of motion is constrained during the handover or time is crucial.

Organizers: Katherine Kuchenbecker


Gesture-Based Nonverbal Interaction for Exercise Robots

PhD Thesis Defense
  • 09 October 2023 • 13:30—14:30
  • Mayumi Mohan
  • Hybrid - Webex plus in-person attendance in N3.022 (MPI-IS Tübingen)

When teaching or coaching, humans augment their words with carefully timed hand gestures, head and body movements, and facial expressions to provide feedback to their students. Robots, however, rarely utilize these nuanced cues. A minimally supervised social robot equipped with these abilities could support people in exercising, physical therapy, and learning new activities. This thesis examines how the intuitive power of human gestures can be harnessed to enhance human-robot interaction. To address this question, this research explores gesture-based interactions to expand the capabilities of a socially assistive robotic exercise coach, investigating the perspectives of both novice users and exercise-therapy experts. This thesis begins by concentrating on the user's engagement with the robot, analyzing the feasibility of minimally supervised gesture-based interactions. This exploration seeks to establish a framework in which robots can interact with users in a more intuitive and responsive manner. The investigation then shifts its focus toward the professionals who are integral to the success of these innovative technologies: the exercise-therapy experts. Roboticists face the challenge of translating the knowledge of these experts into robotic interactions. We address this challenge by developing a teleoperation algorithm that can enable exercise therapists to create customized gesture-based interactions for a robot. Thus, this thesis lays the groundwork for dynamic gesture-based interactions in minimally supervised environments, with implications for not only exercise-coach robots but also broader applications in human-robot interaction.

Organizers: Mayumi Mohan, Katherine Kuchenbecker