Department Talks
  • Preeya Khanna
  • Heisenbergstr. 3, Room 2P4

Actions constitute the way we interact with the world, making motor disabilities such as Parkinson’s disease and stroke devastating. The neurological correlates of the injured brain are challenging to study and correct given the adaptation, redundancy, and distributed nature of our motor system. However, recent studies have used increasingly sophisticated technology to sample from this distributed system, improving our understanding of the neural patterns that support movement in healthy brains or compromise movement in injured brains. One approach to translating these findings into therapies that restore healthy brain patterns is closed-loop brain-machine interfaces (BMIs). While closed-loop BMIs have been discussed primarily as assistive technologies, the underlying techniques may also be useful for rehabilitation.

Organizers: Katherine Kuchenbecker


Machine Learning for Tactile Manipulation

IS Colloquium
  • 13 April 2018 • 11:00–12:00
  • Jan Peters
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Today’s robots have motor abilities and sensors that exceed those of humans in many ways: they move faster and more accurately; their sensors see more and at higher precision; and, in contrast to humans, they can accurately measure even the smallest forces and torques. Robot hands with three, four, or five fingers are commercially available, and so are advanced dexterous arms. Indeed, modern motion-planning methods have rendered grasp trajectory generation a largely solved problem. Still, no robot to date matches the manipulation skills of industrial assembly workers, even though the manipulation of mechanical objects remains essential for the industrial assembly of complex products. So why are current robots still so bad at manipulation, and humans so good?

Organizers: Katherine Kuchenbecker


A New Perspective on Usability Applied to Robotics

Talk
  • 04 April 2018 • 14:00–15:00
  • Dr. Vincent Berenz
  • Stuttgart 2P4

For many service robots, reactivity to changes in their surroundings is a must. However, developing software suitable for dynamic environments is difficult. Existing robotic middleware allows engineers to design behavior graphs by organizing communication between components. But because these graphs are structurally inflexible, they hardly support the development of complex reactive behavior. To address this limitation, we propose Playful, a software platform that applies reactive programming to the specification of robotic behavior. The front-end of Playful is a scripting language which is simple (only five keywords), yet results in the coordinated runtime activation and deactivation of an arbitrary number of higher-level sensory-motor couplings. When using Playful, developers describe actions at various levels of abstraction via behavior trees. During runtime, an underlying engine applies a mixture of logical constructs to obtain the desired behavior. These constructs include conditional rules, dynamic prioritization based on resource management, and finite state machines. Playful has been successfully used to program an upper-torso humanoid manipulator to perform lively interaction with any human approaching it.
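The coordination scheme described above — conditions activating and deactivating sensory-motor couplings, with priorities arbitrating between them — can be sketched in a few lines. This is a minimal illustration of the general idea, not Playful's actual API or keywords; all class and behavior names are invented for the example.

```python
# Illustrative sketch of reactive behavior coordination in the spirit of
# Playful. Names and structure are assumptions, not Playful's real interface.

class Behavior:
    def __init__(self, name, condition, priority=0):
        self.name = name            # label for this sensory-motor coupling
        self.condition = condition  # callable: world state dict -> bool
        self.priority = priority    # arbitrates access to the shared resource
        self.active = False

class ReactiveEngine:
    """Each tick, activate every behavior whose condition holds, then grant
    the shared motor resource to the highest-priority active behavior."""
    def __init__(self, behaviors):
        self.behaviors = behaviors

    def tick(self, world):
        for b in self.behaviors:
            b.active = b.condition(world)
        runnable = [b for b in self.behaviors if b.active]
        return max(runnable, key=lambda b: b.priority).name if runnable else None

# Hypothetical behaviors for a humanoid interacting with approaching humans.
engine = ReactiveEngine([
    Behavior("idle_gaze", lambda w: True, priority=0),
    Behavior("track_face", lambda w: w["face_visible"], priority=5),
    Behavior("wave", lambda w: w["face_visible"] and w["distance"] < 1.0,
             priority=10),
])

print(engine.tick({"face_visible": False, "distance": 3.0}))  # idle_gaze
print(engine.tick({"face_visible": True, "distance": 2.0}))   # track_face
print(engine.tick({"face_visible": True, "distance": 0.8}))   # wave
```

Because activation is re-evaluated every tick, the robot switches behaviors as soon as the world changes — the reactivity that rigid behavior graphs struggle to express.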

Organizers: Katherine Kuchenbecker, Mayumi Mohan, Alexis Block


  • Dr. Adam Spiers
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

This talk will focus on three topics of my research at Yale University, which centers on themes of human and robotic manipulation and haptic perception. My major research undertaking at Yale has involved running a quantitative study of daily upper-limb prosthesis use in unilateral amputees. This work aims to better understand the techniques employed by long-term users of artificial arms and hands in order to inform future prosthetic device design and therapeutic interventions. While past attempts to quantify prosthesis use have implemented either behavioral questionnaires or observations of specific tasks in structured laboratory settings, our approach involves participants completing many hours of self-selected household chores in their own homes while wearing a head-mounted video camera. I will discuss how we have addressed the processing of such a large and unstructured data set, in addition to our current findings. Complementary to my work in prosthetics, I will also discuss my work on several novel robotic grippers which aim to enhance the grasping, manipulation, and object identification capabilities of robotic systems. These grippers implement underactuated designs, machine learning approaches, or variable-friction surfaces to provide low-cost, model-free, and easily reproducible solutions to what have traditionally been considered complex problems in robotic manipulation, i.e., stable grasp acquisition, fast tactile object recognition, and within-hand object manipulation. Finally, I will present a brief overview of my efforts designing and testing shape-changing haptic interfaces, a largely unexplored feedback modality that I believe has huge potential for discreetly communicating information to people with and without sensory impairments.
This technology has been implemented in a pedestrian navigation system and evaluated in a variety of scenarios, including a large-scale immersive theatre production with visually impaired artistic collaborators and almost 100 participants.
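To give a concrete sense of what "model-free tactile object recognition" can mean in its simplest form, here is a hedged sketch: a nearest-neighbour classifier over raw tactile readings. The talk does not specify the learning method used, and the feature vectors and object labels below are invented purely for illustration.

```python
# Minimal sketch (assumed, not the speaker's method): classify a grasped
# object by comparing its tactile reading to labelled training readings.
import math

def classify(sample, training):
    """Return the label of the training sample nearest to `sample`.

    training: list of (feature_vector, label) pairs.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training, key=lambda t: dist(sample, t[0]))[1]

# Invented tactile feature vectors (e.g., normalized pressures on 3 taxels).
training = [
    ([0.9, 0.1, 0.8], "sphere"),    # concentrated contact, as on a ball
    ([0.2, 0.9, 0.3], "box"),       # flat contact spread across taxels
    ([0.5, 0.5, 0.9], "cylinder"),
]

print(classify([0.85, 0.15, 0.75], training))  # sphere
```

Such a scheme is model-free in the sense that it needs no geometric model of the objects, only example contacts — one reason simple learning approaches pair well with underactuated, reproducible hardware.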

Organizers: Katherine Kuchenbecker


  • Prof. Christian Wallraven
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 5H7

Already starting at birth, humans integrate information from several sensory modalities in order to form a representation of the environment - such as when a baby explores, manipulates, and interacts with objects. The combination of visual and touch information is one of the most fundamental sensory integration processes, as touch information (such as body-relative size, shape, texture, material, temperature, and weight) can easily be linked to the visual image, thereby providing a grounding for later visual-only recognition. Previous research on such integration processes has so far mainly focused on low-level object properties (such as curvature or surface granularity), so little is known about how the human brain actually forms a high-level multisensory representation of objects. Here, I will review research from our lab that investigates how the human brain processes shape using input from vision and touch. Using a large variety of novel, 3D-printed shapes, we were able to show that touch is actually equally good at shape processing as vision, suggesting a common, multisensory representation of shape. We next conducted a series of imaging experiments (using anatomical, functional, and white-matter analyses) that chart the brain networks that process this shape representation. I will conclude the talk with a brief medley of other haptics-related research in the lab, including robot learning, braille, and haptic face recognition.

Organizers: Katherine Kuchenbecker


  • Haliza Mat Husin
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Background: Pre-pregnancy obesity and inadequate maternal weight gain during pregnancy can lead to adverse effects in the newborn, but also to metabolic, cardiovascular, and even neurological diseases later in the offspring's life. Heart activity can be used as a proxy for the activity of the autonomic nervous system (ANS). The aim of this study is to evaluate the effect of pre-pregnancy weight, maternal weight gain, and maternal metabolism on the ANS of the fetus in healthy pregnancies.

Organizers: Katherine Kuchenbecker


  • Professor Brent Gillespie
  • MPI-IS Stuttgart, Heisenbergstr. 3, Werner-Köster-Hörsaal 2R4, and broadcast

Relative to most robots and other machines, the human body is soft, its actuators compliant, and its control quite forgiving. But having a body that bends under load seems like a bad set-up for motor dexterity: the brain is faced with controlling more rather than fewer degrees of freedom. Undeniably, though, the soft body approach leads to superior solutions. Robots are putzes by comparison! While de-putzifying robots (perhaps by making them softer) is an endeavor I will discuss to some degree, in this talk I will focus on the design of robots intended to work cooperatively with humans, using physical interaction and haptic feedback in the axis of control. I will propose a backdrivable robot with forgiving control as a teammate for humans, with the aim of meeting pressing needs in rehabilitation robotics and semi-autonomous driving. In short, my lab is working to create alternatives to the domineering robot who wants complete control. Giving up complete control leads to “slacking” and loss of therapeutic benefit in rehabilitation, and to loss of vigilance and potential for disaster in driving. Cooperative or shared control is premised on the idea that two heads, especially two heads with complementary capabilities, are better than one. But the two heads must agree on a goal and a motor plan. How can one agent read the motor intent of another using only physical interaction signals? A few old-school control principles from biology and engineering to the rescue! One key is provided by von Holst and Mittelstaedt’s famous Reafference Principle, published in 1950 to describe how a hierarchically organized neural control system distinguishes what they called reafference from exafference (roughly, expected from unexpected). A second key is provided by Francis and Wonham’s Internal Model Principle, published in 1976 and considered an enabler for the disk drive industry.
If we extend the Reafference Principle with model-based control and use the Internal Model Principle to treat predictable exogenous (exafferent) signals, then we arrive at a theory that I will argue puts us into position to extract motor intent and thereby enable effective control sharing between humans and robots. To support my arguments I will present results from a series of experiments in which we asked human participants to move expected and unexpected loads, to track predictable and unpredictable reference signals, to exercise with self-assist and other-assist, and to share control over a simulated car with an automation system.
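The computational core of the Reafference Principle described above can be illustrated in a few lines: a forward model predicts the sensory consequence of one's own motor command (the reafference), and whatever remains in the actual sensation is attributed to external causes (the exafference). The gain and signal values below are illustrative assumptions, not numbers from the talk.

```python
# Sketch of reafference vs. exafference separation via a forward model.
# Values are invented for illustration.

def forward_model(motor_command, gain=2.0):
    # Efference copy of the motor command -> predicted sensory feedback.
    return gain * motor_command

def split_afference(motor_command, sensed, gain=2.0):
    """Split total sensation into expected (reafference) and
    unexpected (exafference) components."""
    predicted = forward_model(motor_command, gain)  # expected reafference
    exafference = sensed - predicted                # unexplained remainder
    return predicted, exafference

# Self-generated movement only: sensation is fully explained by prediction.
print(split_afference(1.0, 2.0))   # (2.0, 0.0)
# An external push adds 0.5 on top of the expected feedback.
print(split_afference(1.0, 2.5))   # (2.0, 0.5)
```

In a shared-control setting, that unexplained remainder is exactly the signal a robot teammate could read as the human partner's motor intent.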

Organizers: Katherine Kuchenbecker