Department Talks
  • Dr. Adam Spiers
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

This talk will focus on three topics of my research at Yale University, which centers on themes of human and robotic manipulation and haptic perception. My major research undertaking at Yale has involved running a quantitative study of daily upper-limb prosthesis use in unilateral amputees. This work aims to better understand the techniques employed by long-term users of artificial arms and hands in order to inform future prosthetic device design and therapeutic interventions. While past attempts to quantify prosthesis use have relied on either behavioral questionnaires or observations of specific tasks in structured laboratory settings, our approach involves participants completing many hours of self-selected household chores in their own homes while wearing a head-mounted video camera. I will discuss how we have addressed the processing of such a large and unstructured data set, in addition to our current findings. Complementary to my work in prosthetics, I will also discuss my work on several novel robotic grippers that aim to enhance the grasping, manipulation, and object identification capabilities of robotic systems. These grippers implement underactuated designs, machine learning approaches, or variable friction surfaces to provide low-cost, model-free, and easily reproducible solutions to what have traditionally been considered complex problems in robotic manipulation, namely stable grasp acquisition, fast tactile object recognition, and within-hand object manipulation. Finally, I will present a brief overview of my efforts designing and testing shape-changing haptic interfaces, a largely unexplored feedback modality that I believe has huge potential for discreetly communicating information to people with and without sensory impairments. This technology has been implemented in a pedestrian navigation system and evaluated in a variety of scenarios, including a large-scale immersive theatre production with visually impaired artistic collaborators and almost 100 participants.

Organizers: Katherine Kuchenbecker


  • Prof. Christian Wallraven
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 5H7

Starting at birth, humans integrate information from several sensory modalities in order to form a representation of the environment, such as when a baby explores, manipulates, and interacts with objects. The combination of visual and touch information is one of the most fundamental sensory integration processes, as touch information (such as body-relative size, shape, texture, material, temperature, and weight) can easily be linked to the visual image, thereby providing a grounding for later visual-only recognition. Previous research on such integration processes has mainly focused on low-level object properties (such as curvature or surface granularity), so little is known about how humans actually form a high-level multisensory representation of objects. Here, I will review research from our lab that investigates how the human brain processes shape using input from vision and touch. Using a large variety of novel, 3D-printed shapes, we were able to show that touch is actually equally good at shape processing as vision, suggesting a common, multisensory representation of shape. We next conducted a series of imaging experiments (using anatomical, functional, and white-matter analyses) that chart the brain networks that process this shape representation. I will conclude the talk with a brief medley of other haptics-related research in the lab, including robot learning, braille, and haptic face recognition.

Organizers: Katherine Kuchenbecker


Maternal weight and metabolism related to fetal autonomic nervous system

IS Colloquium
  • 19 January 2018 • 11:00–12:00
  • Haliza Mat Husin
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Organizers: Katherine Kuchenbecker


  • Professor Brent Gillespie
  • MPI-IS Stuttgart, Heisenbergstr. 3, Werner-Köster-Hörsaal 2R4 and broadcast

Relative to most robots and other machines, the human body is soft, its actuators compliant, and its control quite forgiving. But having a body that bends under load seems like a bad set-up for motor dexterity: the brain is faced with controlling more rather than fewer degrees of freedom. Undeniably, though, the soft body approach leads to superior solutions. Robots are putzes by comparison! While de-putzifying robots (perhaps by making them softer) is an endeavor I will discuss to some degree, in this talk I will focus on the design of robots intended to work cooperatively with humans, using physical interaction and haptic feedback in the axis of control. I will propose a backdrivable robot with forgiving control as a teammate for humans, with the aim of meeting pressing needs in rehabilitation robotics and semi-autonomous driving. In short, my lab is working to create alternatives to the domineering robot who wants complete control. Giving up complete control leads to “slacking” and loss of therapeutic benefit in rehabilitation, and to loss of vigilance and potential for disaster in driving. Cooperative or shared control is premised on the idea that two heads, especially two heads with complementary capabilities, are better than one. But the two heads must agree on a goal and a motor plan. How can one agent read the motor intent of another using only physical interaction signals? A few old-school control principles from biology and engineering to the rescue! One key is provided by von Holst and Mittelstaedt’s famous Reafference Principle, published in 1950 to describe how a hierarchically organized neural control system distinguishes what they called reafference from exafference—roughly: expected from unexpected. A second key is provided by Francis and Wonham’s Internal Model Principle, published in 1976 and considered an enabler for the disk drive industry. If we extend the Reafference Principle with model-based control and use the Internal Model Principle to treat predictable exogenous (exafferent) signals, then we arrive at a theory that I will argue puts us into position to extract motor intent and thereby enable effective control sharing between humans and robots. To support my arguments I will present results from a series of experiments in which we asked human participants to move expected and unexpected loads, to track predictable and unpredictable reference signals, to exercise with self-assist and other-assist, and to share control over a simulated car with an automation system.
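As an illustrative aside, the chain of ideas above can be condensed into a short sketch: the robot predicts the interaction force it expects as a consequence of its own command (reafference), an internal model absorbs any predictable exogenous component (exafference), and the remaining unexpected residual is attributed to the human partner's motor intent. The class names, dynamics, and gains below are hypothetical simplifications made for illustration, not the speaker's implementation.

```python
# Illustrative sketch only: hypothetical names, dynamics, and gains.

class ReafferencePredictor:
    """Predicts the interaction force the robot expects from its own command."""

    def __init__(self, admittance_gain: float):
        # Simplified internal model of how the plant responds to commands.
        self.admittance_gain = admittance_gain

    def expected_force(self, commanded_velocity: float) -> float:
        # Reafference: the sensory consequence of the robot's own action.
        return self.admittance_gain * commanded_velocity


class InternalModelFilter:
    """Tracks a predictable exogenous (exafferent) signal, e.g. a slowly varying load."""

    def __init__(self, learning_rate: float = 0.05):
        self.estimate = 0.0
        self.learning_rate = learning_rate

    def update(self, residual: float) -> float:
        # Absorb the predictable part of the residual into the running estimate.
        self.estimate += self.learning_rate * (residual - self.estimate)
        return self.estimate


def estimate_human_intent(measured_force: float,
                          commanded_velocity: float,
                          predictor: ReafferencePredictor,
                          exo_model: InternalModelFilter) -> float:
    """Attribute the unexpected residual force to the human partner's motor intent."""
    residual = measured_force - predictor.expected_force(commanded_velocity)
    predictable_exo = exo_model.update(residual)
    return residual - predictable_exo


if __name__ == "__main__":
    predictor = ReafferencePredictor(admittance_gain=2.0)
    exo_model = InternalModelFilter()
    # Toy interaction step: the robot commands 0.5 m/s and the sensor reads 1.6 N.
    intent = estimate_human_intent(1.6, 0.5, predictor, exo_model)
    print(f"Estimated human intent force: {intent:.2f} N")
```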

Organizers: Katherine Kuchenbecker