Department Talks

The computational skin. Tactile perception based on slip movements.

IS Colloquium
  • 02 July 2018 • 14:30 - 15:30
  • Prof. Dr. Cornelius Schwarz
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Touch requires mechanical contact and is governed by the physics of friction. Frictional movements may convert the continuous 3D profile of textured objects into discrete and probabilistic movement events of the viscoelastic integument (skin/hair), called stick-slip movements (slips). This complex transformation may further be shaped by the microanatomy and the active movements of the sensing organ. Thus, the integument may realize a computation, transforming the tactile world in a context-dependent way - long before it even activates neurons. The possibility that the tactile world is perceived through these ‘fractured goggles’ of friction has been largely ignored by classical perceptual and neuroscientific work. I will present biomechanical, neuroscientific, and behavioral work supporting the slip hypothesis.

Organizers: Katherine Kuchenbecker


Haptic Engineering and Science at Multiple Scales

Talk
  • 20 June 2018 • 11:00 - 12:00
  • Yon Visell, PhD
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

I will describe recent research in my lab on haptics and robotics. It has been a longstanding challenge to realize engineering systems that can match the amazing perceptual and motor feats of biological systems for touch, including the human hand. Some of the difficulties of meeting this objective can be traced to our limited understanding of the mechanics, the high dimensionality of the signals, and the multiple length and time scales - physical regimes - involved. An additional source of richness and complication arises from the sensitive dependence of what we feel on what we do, i.e., on the tight coupling between touch-elicited mechanical signals, object contacts, and actions. I will describe research in my lab that has aimed at addressing these challenges, and will explain how the results are guiding the development of new technologies for haptics, wearable computing, and robotics.

Organizers: Katherine Kuchenbecker


  • Wenzhen Yuan
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Why can't current robots act intelligently in real-world environments? A major challenge lies in the lack of adequate tactile sensing technologies. Robots need tactile sensing to understand the physical environment and to detect contact states during manipulation. Progress requires advances not only in sensing hardware but also in software that can exploit the tactile signals. We developed a high-resolution tactile sensor, GelSight, which measures the geometry and traction field of the contact surface. To interpret the high-resolution tactile signals, we use both traditional statistical models and deep neural networks. I will describe my research on both exploration and manipulation. For exploration, I use active touch to estimate the physical properties of objects. This work has included learning the hardness of artificial objects, as well as estimating the general properties of natural objects via autonomous tactile exploration. For manipulation, I study a robot's ability to detect slip or incipient slip with tactile sensing during grasping. This research helps robots better understand and flexibly interact with the physical world.
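The abstract mentions detecting slip and incipient slip from high-resolution tactile signals. As a rough illustration of the underlying idea (not the actual GelSight pipeline), the sketch below flags slip from the displacement field of markers on a sensor surface: a large uniform displacement suggests full slip, while an inhomogeneous field (some markers stick while others move) suggests incipient slip. Function name and thresholds are hypothetical.

```python
import numpy as np

def detect_slip(marker_disp, shear_threshold=0.5, spread_threshold=0.3):
    """Classify contact state from 2D tactile-marker displacements.

    marker_disp: (N, 2) array of per-marker displacement vectors
    between two consecutive sensor frames (units are arbitrary here).

    Heuristic (illustrative only): full slip shows a large mean
    displacement; incipient slip shows high spread relative to the
    mean, i.e. parts of the contact stick while others move.
    """
    disp = np.asarray(marker_disp, dtype=float)
    mean_disp = disp.mean(axis=0)
    mean_mag = np.linalg.norm(mean_disp)                 # rigid shear component
    spread = np.linalg.norm(disp - mean_disp, axis=1).mean()  # inhomogeneity
    if mean_mag > shear_threshold:
        return "slip"
    if spread > spread_threshold * max(mean_mag, 1e-9):
        return "incipient slip"
    return "stable"
```

In practice such signals would feed a learned model rather than fixed thresholds, as the abstract's mention of statistical models and deep networks suggests.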

Organizers: Katherine Kuchenbecker


Making Haptics and its Design Accessible

IS Colloquium
  • 28 May 2018 • 11:00 - 12:00
  • Karon MacLean
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Today’s advances in tactile sensing and wearable, IoT, and context-aware computing are spurring new ideas about how to configure touch-centered interactions in terms of roles and utility, which in turn expose new technical and social design questions. But while haptic actuation, sensing, and control are improving, incorporating them into a real-world design process is challenging and poses a major obstacle to adoption into everyday technology. Some classes of haptic devices, e.g., grounded force feedback, remain expensive and limited in range. I’ll describe some recent highlights of an ongoing effort to understand how to support haptic designers and end-users. These include a wealth of online experimental design tools, DIY open-sourced hardware, and accessible means of creating, for example, expressive physical robot motions and evolving physically sensed expressive tactile languages. Elsewhere, we are establishing the value of haptic force feedback in embodied learning environments, to help kids understand physics and math concepts. This has inspired the invention of a low-cost, handheld, large-motion force feedback device that can be used in online environments or collaborative scenarios, and could be suitable for K-12 school contexts; this is ongoing research with innovative education and technological elements. All our work is available online, where possible as web tools, and we plan to push our research into a broader openhaptics effort.

Organizers: Katherine Kuchenbecker


  • Preeya Khanna
  • Heisenbergstr. 3, Room 2P4

Actions constitute the way we interact with the world, making motor disabilities such as Parkinson’s disease and stroke devastating. The neurological correlates of the injured brain are challenging to study and correct given the adaptation, redundancy, and distributed nature of our motor system. However, recent studies have used increasingly sophisticated technology to sample from this distributed system, improving our understanding of neural patterns that support movement in healthy brains, or compromise movement in injured brains. One approach to translating these findings into therapies that restore healthy brain patterns is with closed-loop brain-machine interfaces (BMIs). While closed-loop BMIs have been discussed primarily as assistive technologies, the underlying techniques may also be useful for rehabilitation.

Organizers: Katherine Kuchenbecker


Machine Learning for Tactile Manipulation

IS Colloquium
  • 13 April 2018 • 11:00 - 12:00
  • Jan Peters
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Today’s robots have motor abilities and sensors that exceed those of humans in many ways: they move more accurately and faster; their sensors see more and at higher precision; and, in contrast to humans, they can accurately measure even the smallest forces and torques. Robot hands with three, four, or five fingers are commercially available, and so are advanced dexterous arms. Indeed, modern motion-planning methods have rendered grasp trajectory generation a largely solved problem. Still, no robot to date matches the manipulation skills of industrial assembly workers, even though manipulation of mechanical objects remains essential for the industrial assembly of complex products. So, why are current robots still so bad at manipulation, and humans so good?

Organizers: Katherine Kuchenbecker


A New Perspective on Usability Applied to Robotics

Talk
  • 04 April 2018 • 14:00 - 15:00
  • Dr. Vincent Berenz
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

For many service robots, reactivity to changes in their surroundings is a must. However, developing software suitable for dynamic environments is difficult. Existing robotic middleware allows engineers to design behavior graphs by organizing communication between components. But because these graphs are structurally inflexible, they hardly support the development of complex reactive behavior. To address this limitation, we propose Playful, a software platform that applies reactive programming to the specification of robotic behavior. The front-end of Playful is a scripting language which is simple (only five keywords), yet results in the runtime-coordinated activation and deactivation of an arbitrary number of higher-level sensory-motor couplings. When using Playful, developers describe actions of various levels of abstraction via behavior trees. During runtime, an underlying engine applies a mixture of logical constructs to obtain the desired behavior. These constructs include conditional ruling, dynamic prioritization based on resource management, and finite state machines. Playful has been successfully used to program an upper-torso humanoid manipulator to perform lively interaction with any human approaching it.
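To make the idea of dynamic, resource-based prioritization concrete, here is a minimal hypothetical sketch in Python. It is emphatically not Playful's actual syntax or API; the class and function names are invented. Each engine cycle, for every contested resource (e.g. the robot's head), the behavior with the highest current priority whose activation condition holds wins that resource.

```python
class Behavior:
    """An illustrative sensory-motor coupling competing for one resource."""
    def __init__(self, name, resource, priority, condition):
        self.name = name
        self.resource = resource      # e.g. "head", "arm"
        self.priority = priority      # number, or callable(state) -> number
        self.condition = condition    # callable(state) -> bool

def step(behaviors, state):
    """One engine cycle: per resource, activate the highest-priority
    behavior whose activation condition currently holds."""
    active = {}
    for b in behaviors:
        if not b.condition(state):
            continue
        p = b.priority(state) if callable(b.priority) else b.priority
        current = active.get(b.resource)
        if current is None or p > current[0]:
            active[b.resource] = (p, b.name)
    return {res: name for res, (_, name) in active.items()}

# Hypothetical example: looking at a nearby person outranks idle scanning,
# and the priority rises dynamically as the person gets closer.
behaviors = [
    Behavior("look_at_person", "head",
             lambda s: 1.0 / (1.0 + s["person_dist"]),
             lambda s: s["person_visible"]),
    Behavior("idle_scan", "head", 0.1, lambda s: True),
]
```

Because priorities are re-evaluated every cycle against the current world state, behaviors switch on and off reactively without any fixed graph structure, which is the property the abstract attributes to the Playful engine.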

Organizers: Katherine Kuchenbecker, Mayumi Mohan, Alexis Block


  • Dr. Adam Spiers
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

This talk will focus on three topics of my research at Yale University, which centers on themes of human and robotic manipulation and haptic perception. My major research undertaking at Yale has been a quantitative study of daily upper-limb prosthesis use in unilateral amputees. This work aims to better understand the techniques employed by long-term users of artificial arms and hands in order to inform future prosthetic device design and therapeutic interventions. While past attempts to quantify prosthesis use have implemented either behavioral questionnaires or observations of specific tasks in structured laboratory settings, our approach involves participants completing many hours of self-selected household chores in their own homes while wearing a head-mounted video camera. I will discuss how we have addressed the processing of such a large and unstructured data set, in addition to our current findings. Complementary to my work in prosthetics, I will also discuss several novel robotic grippers that aim to enhance the grasping, manipulation, and object-identification capabilities of robotic systems. These grippers implement underactuated designs, machine learning approaches, or variable-friction surfaces to provide low-cost, model-free, and easily reproducible solutions to what have traditionally been considered complex problems in robotic manipulation, i.e., stable grasp acquisition, fast tactile object recognition, and within-hand object manipulation. Finally, I will present a brief overview of my efforts designing and testing shape-changing haptic interfaces, a largely unexplored feedback modality that I believe has great potential for discreetly communicating information to people with and without sensory impairments. This technology has been implemented in a pedestrian navigation system and evaluated in a variety of scenarios, including a large-scale immersive theatre production with visually impaired artistic collaborators and almost 100 participants.

Organizers: Katherine Kuchenbecker


  • Prof. Christian Wallraven
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 5H7

Already starting at birth, humans integrate information from several sensory modalities in order to form a representation of the environment - such as when a baby explores, manipulates, and interacts with objects. The combination of visual and touch information is one of the most fundamental sensory integration processes, as touch information (such as body-relative size, shape, texture, material, temperature, and weight) can easily be linked to the visual image, thereby providing a grounding for later visual-only recognition. Previous research on such integration processes has mainly focused on low-level object properties (such as curvature or surface granularity), so that little is known about how humans actually form a high-level multisensory representation of objects. Here, I will review research from our lab that investigates how the human brain processes shape using input from vision and touch. Using a large variety of novel, 3D-printed shapes, we were able to show that touch is actually as good at shape processing as vision, suggesting a common, multisensory representation of shape. We next conducted a series of imaging experiments (using anatomical, functional, and white-matter analyses) that chart the brain networks that process this shape representation. I will conclude the talk with a brief medley of other haptics-related research in the lab, including robot learning, braille, and haptic face recognition.

Organizers: Katherine Kuchenbecker


  • Haliza Mat Husin
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Background: Pre-pregnancy obesity and inadequate maternal weight gain during pregnancy can lead to adverse effects in the newborn, but also to metabolic, cardiovascular, and even neurological diseases later in the life of the offspring. Heart activity can be used as a proxy for the activity of the autonomic nervous system (ANS). The aim of this study is to evaluate the effect of pre-pregnancy weight, maternal weight gain, and maternal metabolism on the ANS of the fetus in healthy pregnancies.

Organizers: Katherine Kuchenbecker