Department Talks

Artificial haptic intelligence for human-machine systems

IS Colloquium
  • 24 October 2018 • 11:00–12:00
  • Veronica J. Santos
  • 5H7 at MPI-IS in Stuttgart

The functionality of artificial manipulators could be enhanced by artificial “haptic intelligence” that enables the identification of object features via touch for semi-autonomous decision-making and/or display to a human operator. This could be especially useful when complementary sensory modalities, such as vision, are unavailable. I will highlight past and present work to enhance the functionality of artificial hands in human-machine systems. I will describe efforts to develop multimodal tactile sensor skins, and to teach robots how to haptically perceive salient geometric features such as edges and fingertip-sized bumps and pits using machine learning techniques. I will describe the use of reinforcement learning to teach robots goal-based policies for a functional contour-following task: the closure of a ziplock bag. Our Contextual Multi-Armed Bandits approach tightly couples robot actions to the tactile and proprioceptive consequences of the actions, and selects future actions based on prior experiences, the current context, and a functional task goal. Finally, I will describe current efforts to develop real-time capabilities for the perception of tactile directionality, and to develop models for haptically locating objects buried in granular media. Real-time haptic perception and decision-making capabilities could be used to advance semi-autonomous robot systems and reduce the cognitive burden on human teleoperators of devices ranging from wheelchair-mounted robots to explosive ordnance disposal robots.
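
For readers unfamiliar with the approach mentioned above, the sketch below illustrates the general idea of a contextual multi-armed bandit that couples a discretized tactile/proprioceptive context to action selection. It is a minimal epsilon-greedy illustration, not the speaker's implementation: the context encoding, action set, reward, and all parameter values are assumptions.

```python
# Minimal epsilon-greedy contextual bandit for a contour-following task.
# Illustrative sketch only; the talk's actual Contextual Multi-Armed Bandits
# formulation, state encoding, and reward are not specified here, so all
# names and parameters below are assumptions.
import numpy as np

class ContextualBandit:
    def __init__(self, n_contexts, n_actions, epsilon=0.1, lr=0.2):
        self.q = np.zeros((n_contexts, n_actions))  # per-context action values
        self.epsilon = epsilon                      # exploration rate
        self.lr = lr                                # learning rate

    def select_action(self, context):
        # Explore with probability epsilon, otherwise exploit learned values.
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.q.shape[1])
        return int(np.argmax(self.q[context]))

    def update(self, context, action, reward):
        # Move the value estimate toward the observed reward.
        self.q[context, action] += self.lr * (reward - self.q[context, action])

# Hypothetical usage: contexts encode discretized tactile/proprioceptive
# states; actions are small fingertip motions along or across the bag seam.
bandit = ContextualBandit(n_contexts=16, n_actions=4)
context = 3                  # e.g., "edge centered on fingertip"
action = bandit.select_action(context)
reward = 1.0                 # e.g., progress toward bag closure
bandit.update(context, action, reward)
```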

Organizers: Katherine Kuchenbecker

  • Mariacarla Memeo
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

The increasing availability of online resources and the widespread practice of storing data on the internet raise the problem of their accessibility for visually impaired people. A translation from the visual domain to the available modalities is therefore necessary to study whether this access is at all possible. However, the translation of information from vision to touch is necessarily impaired, owing to the superiority of vision during the acquisition process. Yet compromises exist, as visual information can be simplified and sketched: a picture can become a map, and an object can become a geometrical shape. Under some circumstances, and with a reasonable loss of generality, touch can substitute for vision. In particular, when touch substitutes for vision, data can be differentiated by adding a further dimension to the tactile feedback, i.e., extending it to three dimensions instead of two. This mode was chosen because it mimics our natural way of following object profiles with the fingers. Specifically, regardless of whether a hand lying on an object is moving or not, our tactile and proprioceptive systems are both stimulated and tell us something about which object we are manipulating and what its shape and size might be. The goal of this talk is to describe how to exploit tactile stimulation to render digital information non-visually, so that cognitive maps associated with this information can be efficiently elicited from visually impaired persons. In particular, the focus is on delivering geometrical information in a learning scenario. Moreover, completely blind interaction with a virtual environment in a learning scenario has been little investigated, because visually impaired subjects are often passive agents of exercises with fixed environmental constraints. For this reason, during the talk I will provide my personal answer to the question: can visually impaired people manipulate dynamic virtual content through touch? This process is much more challenging than merely exploring and learning virtual content, but at the same time it leads to a more conscious and dynamic creation of the spatial understanding of an environment during tactile exploration.
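
As a rough illustration of extending tactile feedback into a third dimension, the sketch below maps a simple geometric shape to a height field that a tactile device could render under the fingertip. The shape, resolution, and height scaling are assumptions, not the speaker's system.

```python
# Illustrative sketch of the general idea described above: mapping a 2D
# visual representation (here a simple geometric shape) to a height field
# that a 3-DoF tactile device could display under the fingertip.
import numpy as np

def shape_to_height_map(size=64, radius=0.6, max_height_mm=10.0):
    """Render a filled circle as a tactile height map (mm)."""
    y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    inside = (x**2 + y**2) <= radius**2
    return inside.astype(float) * max_height_mm

def height_at(height_map, u, v):
    """Look up the height under a fingertip position (u, v) in [0, 1]."""
    rows, cols = height_map.shape
    i = min(int(v * (rows - 1)), rows - 1)
    j = min(int(u * (cols - 1)), cols - 1)
    return height_map[i, j]

hm = shape_to_height_map()
print(height_at(hm, 0.5, 0.5))    # center of the circle -> 10.0 mm
print(height_at(hm, 0.05, 0.05))  # corner, outside the circle -> 0.0 mm
```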

Organizers: Katherine Kuchenbecker


  • Gokhan Serhat
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Continuum structures in many fields need to be designed for optimal vibrational characteristics. Recent developments in finite element analysis (FEA) and numerical optimization methods allow the creation of more accurate computational models, which favors the design of superior systems and reduces the need for experimentation. In this talk, I will present my work on FEA-based optimization of thin shell structures for improved dynamic properties, with a focus on laminated composites. I will initially explain multi-objective optimization strategies for enhancing the load-carrying and vibrational performance of plate structures. The talk will continue with the design of curved panels for optimal free and forced dynamic responses. After that, I will present advanced methods that I developed for the modeling and optimization of variable-stiffness structures. Finally, I will outline state-of-the-art techniques for the numerical simulation of the finger in contact with surfaces and propose potential research directions.
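
To make the multi-objective aspect concrete, the sketch below shows the common weighted-sum scalarization over two competing design objectives. The closed-form objective functions are toy stand-ins for quantities that would normally come from FEA (e.g., compliance and the negative of the fundamental frequency) and are assumptions for illustration only; sweeping the weight traces an approximate Pareto front.

```python
# Sketch of weighted-sum scalarization for multi-objective design optimization.
# The two objectives are toy closed-form surrogates for FEA responses; they
# are not the speaker's models.
import numpy as np
from scipy.optimize import minimize

def compliance(theta):
    # Toy surrogate: compliance varies smoothly with the ply angle (radians).
    return 1.0 + 0.5 * np.cos(2 * theta)**2

def neg_fundamental_frequency(theta):
    # Toy surrogate: fundamental frequency peaks at an intermediate ply angle.
    return -(1.0 + np.sin(2 * theta)**2)

def scalarized(theta, w=0.5):
    # Weighted sum of the two objectives; sweeping w traces a Pareto front.
    return w * compliance(theta[0]) + (1 - w) * neg_fundamental_frequency(theta[0])

for w in (0.2, 0.5, 0.8):
    res = minimize(scalarized, x0=[0.3], args=(w,), bounds=[(0.0, np.pi / 2)])
    print(f"w={w:.1f}  optimal ply angle = {np.degrees(res.x[0]):.1f} deg")
```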

Organizers: Katherine Kuchenbecker


  • Prof. Peter Pott
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

The FLEXMIN haptic robotic system is a single-port tele-manipulator for robotic surgery in the small pelvis. Using a transanal approach, it allows bi-manual tasks such as grasping, monopolar cutting, and suturing within a footprint of Ø 160 mm × 240 mm. Forces of up to 5 N can be applied easily in all directions. In addition to providing low-latency, highly dynamic control over its movements, high-fidelity haptic feedback was realised using built-in force sensors, lightweight and friction-optimized kinematics, and dedicated parallel-kinematics input devices. After a brief description of the system and some of its key aspects, first evaluation results will be presented. In the second half of the talk, the Institute of Medical Device Technology will be presented. The institute was founded in July 2017 and has since started a number of projects in the fields of biomedical actuation, medical systems and robotics, and advanced light microscopy. To illustrate this, a few snapshots of ongoing work will be presented that serve as condensation nuclei for the future.
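
As background on how sensed forces are typically reflected to the operator of such a system, the sketch below shows one generic position-forward, force-backward teleoperation step with force saturation. The scaling factors and the 5 N limit are illustrative assumptions, not the FLEXMIN controller.

```python
# Minimal sketch of a position-forward / force-backward teleoperation step,
# the general scheme behind haptic feedback in tele-manipulators. All values
# below are assumptions for illustration.
def teleop_step(master_pos_mm, sensed_force_n,
                motion_scale=0.5, force_scale=1.0, force_limit_n=5.0):
    """Map master motion to a slave command and sensed force back to the master."""
    slave_cmd_mm = motion_scale * master_pos_mm          # scaled motion command
    feedback_n = force_scale * sensed_force_n            # force reflected to user
    feedback_n = max(-force_limit_n, min(force_limit_n, feedback_n))  # saturate
    return slave_cmd_mm, feedback_n

slave_cmd, feedback = teleop_step(master_pos_mm=12.0, sensed_force_n=3.2)
print(slave_cmd, feedback)  # 6.0 mm command, 3.2 N displayed at the handle
```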

Organizers: Katherine Kuchenbecker


The Computational Skin. Tactile Perception based on Slip Movements.

IS Colloquium
  • 02 July 2018 • 14:30–15:30
  • Prof. Dr. Cornelius Schwarz
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Touch requires mechanical contact and is governed by the physics of friction. Frictional movements may convert the continuous 3D profile of textural objects into discrete and probabilistic movement events of the viscoelastic integument (skin/hair) called stick-slip movements (slips). This complex transformation may further be determined by the microanatomy and the active movements of the sensing organ. Thus, the integument may realize a computation, transforming the tactile world in a context-dependent way, long before it even activates neurons. The possibility that the tactile world is perceived through these ‘fractured goggles’ of friction has been largely ignored by classical perceptual and neuroscientific work. I will present biomechanical, neuroscientific, and behavioral work supporting the slip hypothesis.
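
For intuition about stick-slip, the toy spring-slider simulation below shows an element that sticks until static friction is exceeded and then slips under lower kinetic friction, producing discrete slip events from a smooth drive. All parameters are illustrative assumptions, not a model of real skin or of the speaker's biomechanical work.

```python
# Toy spring-slider stick-slip simulation: a skin element coupled by a spring
# to a slowly moving drive sticks until static friction is exceeded, then
# slips under lower kinetic friction. Parameters are illustrative only.
import numpy as np

def stick_slip(n_steps=2000, dt=1e-3, k=50.0, m=0.01,
               mu_s=0.6, mu_k=0.4, normal_force=1.0, drive_speed=0.02):
    x, v = 0.0, 0.0              # slider (skin element) position and velocity
    sticking = True
    slip_onsets = []             # times at which the element breaks away
    for i in range(n_steps):
        drive = drive_speed * i * dt              # slowly advancing drive
        spring_force = k * (drive - x)            # elastic coupling to the drive
        if sticking:
            if abs(spring_force) > mu_s * normal_force:
                sticking = False                  # static friction exceeded
                slip_onsets.append(i * dt)
        if not sticking:
            friction = -np.sign(v if v != 0 else spring_force) * mu_k * normal_force
            a = (spring_force + friction) / m
            v_new = v + a * dt
            x += v_new * dt
            if v * v_new < 0 and abs(spring_force) <= mu_s * normal_force:
                v_new, sticking = 0.0, True       # velocity reversed: re-stick
            v = v_new
    return slip_onsets

print(len(stick_slip()), "slip events")
```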

Organizers: Katherine Kuchenbecker


Haptic Engineering and Science at Multiple Scales

Talk
  • 20 June 2018 • 11:00–12:00
  • Yon Visell, PhD
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

I will describe recent research in my lab on haptics and robotics. It has been a longstanding challenge to realize engineering systems that can match the amazing perceptual and motor feats of biological systems for touch, including the human hand. Some of the difficulties of meeting this objective can be traced to our limited understanding of the mechanics, to the high dimensionality of the signals, and to the multiple length and time scales (physical regimes) involved. An additional source of richness and complication arises from the sensitive dependence of what we feel on what we do, i.e., on the tight coupling between touch-elicited mechanical signals, object contacts, and actions. I will describe research in my lab that has aimed at addressing these challenges and will explain how the results are guiding the development of new technologies for haptics, wearable computing, and robotics.

Organizers: Katherine Kuchenbecker


  • Wenzhen Yuan
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Why can’t today’s robots act intelligently in real-world environments? A major challenge lies in the lack of adequate tactile sensing technologies. Robots need tactile sensing to understand the physical environment and to detect contact states during manipulation. Progress requires advances not only in sensing hardware but also in the software that exploits the tactile signals. We developed a high-resolution tactile sensor, GelSight, which measures the geometry and traction field of the contact surface. To interpret the high-resolution tactile signal, we use both traditional statistical models and deep neural networks. I will describe my research on both exploration and manipulation. For exploration, I use active touch to estimate the physical properties of objects; this work has included learning the hardness of artificial objects as well as estimating the general properties of natural objects via autonomous tactile exploration. For manipulation, I study the robot’s ability to detect slip or incipient slip with tactile sensing during grasping. This research helps robots better understand and more flexibly interact with the physical world.
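
As an illustration of how a deep network might map high-resolution tactile images to a contact-state label, the sketch below defines a small convolutional classifier for slip versus stable contact. The architecture, input size, and labels are assumptions and do not reproduce the GelSight pipeline described in the talk.

```python
# Illustrative sketch: a small CNN that classifies a single tactile image as
# stable contact vs. slip. Architecture and input size are assumptions.
import torch
import torch.nn as nn

class SlipClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)   # two classes: stable contact vs. slip

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z)

# Hypothetical usage on a batch of 64x64 RGB tactile images.
model = SlipClassifier()
fake_batch = torch.randn(4, 3, 64, 64)
logits = model(fake_batch)
print(logits.shape)  # torch.Size([4, 2])
```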

Organizers: Katherine Kuchenbecker


Making Haptics and its Design Accessible

IS Colloquium
  • 28 May 2018 • 11:00–12:00
  • Karon MacLean
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Today’s advances in tactile sensing and in wearable, IoT, and context-aware computing are spurring new ideas about how to configure touch-centered interactions in terms of roles and utility, which in turn expose new technical and social design questions. But while haptic actuation, sensing, and control are improving, incorporating them into a real-world design process is challenging and poses a major obstacle to adoption in everyday technology. Some classes of haptic devices, e.g., grounded force feedback, remain expensive and limited in range. I’ll describe some recent highlights of an ongoing effort to understand how to support haptic designers and end-users. These include a wealth of online experimental design tools, DIY open-sourced hardware, and accessible means of creating, for example, expressive physical robot motions and of evolving physically sensed expressive tactile languages. Elsewhere, we are establishing the value of haptic force feedback in embodied learning environments to help kids understand physics and math concepts. This has inspired the invention of a low-cost, handheld, large-motion force-feedback device that can be used in online environments or collaborative scenarios and could be suitable for K-12 school contexts; this is ongoing research with innovative educational and technological elements. All our work is available online, where possible as web tools, and we plan to push our research into a broader openhaptics effort.

Organizers: Katherine Kuchenbecker


  • Preeya Khanna
  • Heisenbergstr. 3, Room 2P4

Actions constitute the way we interact with the world, making motor disabilities such as Parkinson’s disease and stroke devastating. The neurological correlates of the injured brain are challenging to study and correct, given the adaptation, redundancy, and distributed nature of our motor system. However, recent studies have used increasingly sophisticated technology to sample from this distributed system, improving our understanding of the neural patterns that support movement in healthy brains or compromise movement in injured brains. One approach to translating these findings into therapies that restore healthy brain patterns is closed-loop brain-machine interfaces (BMIs). While closed-loop BMIs have been discussed primarily as assistive technologies, the underlying techniques may also be useful for rehabilitation.
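
To show the closed-loop structure such an interface implies, the sketch below runs a toy decode loop in which simulated neural features are mapped through a fixed linear decoder to an effector velocity. The decoder, features, and loop rate are illustrative assumptions, not the speaker's methods.

```python
# Minimal sketch of the structure of a closed-loop BMI: neural features are
# decoded into an effector command, the effector state is updated, and the
# user's feedback of that state shapes the next neural input. The linear
# decoder and simulated firing rates are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_dims = 20, 2
W = rng.normal(scale=0.05, size=(n_dims, n_neurons))   # assumed fixed decoder weights
cursor = np.zeros(n_dims)                              # effector (e.g., cursor) position
dt = 0.05                                              # 20 Hz decode loop

for step in range(100):
    firing_rates = rng.poisson(lam=5.0, size=n_neurons)   # stand-in neural features
    velocity = W @ firing_rates                            # decode intended velocity
    cursor += velocity * dt                                # update effector state
    # In a real closed-loop BMI, visual/haptic feedback of `cursor` would
    # influence the next neural activity; here the loop is open by necessity.

print(cursor)
```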

Organizers: Katherine Kuchenbecker


Machine Learning for Tactile Manipulation

IS Colloquium
  • 13 April 2018 • 11:00–12:00
  • Jan Peters
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Today’s robots have motor abilities and sensors that exceed those of humans in many ways: they move more accurately and faster, their sensors see more and at higher precision, and, in contrast to humans, they can accurately measure even the smallest forces and torques. Robot hands with three, four, or five fingers are commercially available, and so are advanced dexterous arms. Indeed, modern motion-planning methods have rendered grasp-trajectory generation a largely solved problem. Still, no robot to date matches the manipulation skills of industrial assembly workers, even though the manipulation of mechanical objects remains essential for the industrial assembly of complex products. So why are current robots still so bad at manipulation, while humans are so good?

Organizers: Katherine Kuchenbecker


A New Perspective on Usability Applied to Robotics

Talk
  • 04 April 2018 • 14:00–15:00
  • Dr. Vincent Berenz
  • Stuttgart 2P4

For many service robots, reactivity to changes in their surroundings is a must. However, developing software suitable for dynamic environments is difficult. Existing robotic middleware allows engineers to design behavior graphs by organizing communication between components. But because these graphs are structurally inflexible, they hardly support the development of complex reactive behavior. To address this limitation, we propose Playful, a software platform that applies reactive programming to the specification of robotic behavior. The front end of Playful is a simple scripting language (only five keywords) that nonetheless supports the coordinated runtime activation and deactivation of an arbitrary number of higher-level sensory-motor couplings. When using Playful, developers describe actions of various levels of abstraction via behavior trees. During runtime, an underlying engine applies a mixture of logical constructs to obtain the desired behavior. These constructs include conditional ruling, dynamic prioritization based on resource management, and finite state machines. Playful has been successfully used to program an upper-torso humanoid manipulator to perform lively interaction with any human approaching it.
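
As a generic illustration of resource-based dynamic prioritization among concurrently declared behaviors, the sketch below arbitrates a single actuator between two behaviors whose priorities depend on the current sensory context. This is not Playful's API or syntax; the behavior names, scoring scheme, and single "head" resource are assumptions for illustration.

```python
# Generic sketch of resource-based dynamic prioritization among concurrently
# declared behaviors. NOT Playful's API; names and scoring are assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Behavior:
    name: str
    resource: str                      # actuator the behavior needs exclusively
    score: Callable[[dict], float]     # dynamic priority from the sensor context

def arbitrate(behaviors, context):
    """Grant each resource to the highest-scoring applicable behavior."""
    winners = {}
    for b in behaviors:
        s = b.score(context)
        if s > 0 and (b.resource not in winners or s > winners[b.resource][1]):
            winners[b.resource] = (b.name, s)
    return {res: name for res, (name, _) in winners.items()}

behaviors = [
    Behavior("track_face", "head", lambda c: 1.0 if c["face_visible"] else 0.0),
    Behavior("look_around", "head", lambda c: 0.2),   # low-priority idle behavior
]
print(arbitrate(behaviors, {"face_visible": True}))    # {'head': 'track_face'}
print(arbitrate(behaviors, {"face_visible": False}))   # {'head': 'look_around'}
```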

Organizers: Katherine Kuchenbecker, Mayumi Mohan, Alexis Block