Department Talks
  • Yasemin Vardar
  • 2P4 in Heisenbergstr. 3

Sign-Up! is a career-building program for female post-docs in the Max Planck Society. The program aims to prepare post-doctoral researchers for their future scientific careers through several interactive training sessions and networking activities. As a selected member of this program, I will summarize the workshops I participated in this year. My talk will cover success factors in scientific careers, career planning, professional communication and leadership, self-presentation, and research funding.

Organizers: Katherine J. Kuchenbecker


Anthropomorphism in Surgical Robotics and Wearable Technologies

IS Colloquium
  • 03 June 2019 • 11:00–12:00
  • Dr Antonia Tzemanaki
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Over the past century, abdominal surgery has seen a rapid transition from open procedures to less invasive methods such as laparoscopy and robot-assisted minimally invasive surgery (R-A MIS), as they involve reduced blood loss, postoperative morbidity, and length of hospital stay. Furthermore, R-A MIS has offered refined accuracy and more ergonomic instruments for surgeons, further minimising trauma to the patient. However, training surgeons in MIS procedures is becoming increasingly long and arduous, while commercially available robotic systems adopt a design similar to conventional laparoscopic instruments, with limited novelty. Do these systems satisfy their users? What is the role and importance of haptics? Taking into account the input of end-users, as well as examining the high intricacy and dexterity of the human hand, can help to bridge the gap between R-A MIS and open surgery. By adopting designs inspired by the human hand, robotic tele-operated systems could become more accessible not only in the surgical domain but also, beyond it, in areas that benefit from user-centred design, such as stroke rehabilitation, and in areas where safety issues prevent the use of autonomous robots, such as assistive technologies and the nuclear industry.

Organizers: Katherine J. Kuchenbecker


Human Factors Research in Minimally Invasive Surgery

IS Colloquium
  • 23 May 2019 • 11:00–12:00
  • Caroline G. L. Cao, Ph.D.
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Health care is probably the last remaining unsafe critical system. A large proportion of reported medical errors occur in the hospital operating room (OR), a highly complex sociotechnical environment. As technology is being introduced into the OR faster than surgeons can learn to use it, surgical errors result from unfamiliar instrumentation; increased motor, perceptual, and cognitive demands on surgeons; and a lack of adequate training. Effective technology design for minimally invasive surgery requires an understanding of the system constraints of remote surgery and of the complex interaction between humans and technology in the OR. This talk will describe research activities in the Ergonomics in Remote Environments Laboratory at Wright State University, which address some of these human factors issues.

Organizers: Katherine J. Kuchenbecker


  • Hojin Lee
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Haptic technologies, in both their kinesthetic and tactile aspects, offer new opportunities for recent human-machine interactive applications. I believe that one of the essential roles of a researcher is to pioneer new insights and knowledge, and in this talk I will present my previous research on haptic technologies and human-machine interactive applications in two branches: laser-based mid-air haptics and sensorimotor skill learning. For the former, I will introduce our approach, called indirect laser radiation, and its applications; indirect laser radiation uses a laser and a light-absorbing elastic medium to evoke a tapping-like tactile sensation. For the latter, I will introduce our data-driven approach to both modeling and learning sensorimotor skills (specifically, driving) with kinesthetic assistance and artificial neural networks, which I call human-like haptic assistance. To unify these two branches of my earlier work and explore the potential of the sensory channel of touch, I will present a general research paradigm for human-machine interactive applications toward which current haptic technologies can aim in the future.
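
The data-driven branch lends itself to a compact illustration. The sketch below uses made-up demonstration data and an assumed state/torque representation rather than anything from the talk; it only shows the general pattern of fitting a small neural network to expert driving behaviour and blending its output with the driver's input.

    # Illustrative sketch (not the speaker's implementation): learn a mapping from
    # driving state to an assistive steering torque from recorded human demonstrations,
    # then blend it with the driver's input as a simple form of haptic assistance.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical demonstration data: each row is [lane_offset_m, heading_error_rad, speed_mps];
    # the target is the steering torque the expert applied at that instant.
    rng = np.random.default_rng(0)
    states = rng.uniform(-1.0, 1.0, size=(500, 3))
    expert_torque = -2.0 * states[:, 0] - 1.5 * states[:, 1] + 0.1 * rng.normal(size=500)

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(states, expert_torque)

    def assistive_torque(state, driver_torque, assistance_gain=0.5):
        """Blend the learned 'human-like' torque with the driver's own input."""
        learned = model.predict(np.asarray(state).reshape(1, -1))[0]
        return (1.0 - assistance_gain) * driver_torque + assistance_gain * learned

    print(assistive_torque([0.3, -0.05, 0.0], driver_torque=0.1))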

Organizers: Katherine J. Kuchenbecker


  • Ravali Gourishetti
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

Needle insertion is an essential skill in medical care; training has to be imparted not only to physicians but also to nurses and paramedics. In most needle insertion procedures, haptic feedback from the needle is the main stimulus that novices must be trained on. For better patient safety, the classical methods of training these haptic skills have to be replaced with simulators based on new robotic and graphics technologies. The main objective of this work is to develop analytical models of needle insertion (the special case of epidural anesthesia), incorporating biomechanical and psychophysical concepts, that simulate needle-tissue interaction forces in linear heterogeneous tissues, and to validate the models with a series of experiments. The biomechanical and perception models were validated in two stages: with and without human intervention. The second stage was a Turing-test validation with two experiments: 1) observing the perceptual difference between the simulated model and the physical phantom, and 2) verifying the effectiveness of the perceptual filter by comparing the unfiltered and filtered model responses. The results showed that the model could replicate the physical phantom tissues with good accuracy, and it can be further extended to a nonlinear heterogeneous model. The proposed needle-tissue interaction force models can improve realism and performance and enable future applications in needle simulators for heterogeneous tissue.

A needle insertion training simulator was developed from the simulated models using the Phantom Omni, and clinical trials were conducted to assess face validity and construct validity. The face-validity results showed that the degree of realism of the virtual environments and instruments received the lowest overall mean score, while ease of use and training of hand-eye coordination received the highest. The construct-validity results showed that the simulator could successfully differentiate the force and psychomotor signatures of anesthesiologists with less than five years and more than five years of experience. As a performance index for trainees, a novel measure, the Just Controllable Difference (JCD), was proposed, and a preliminary study of the JCD measure was conducted with two experiments on novices.

A further preliminary study on clinical training simulations, especially the needle insertion procedure in virtual environments, focused on two objectives: first, measuring the force JND for three fingers, and second, comparing these measures in non-immersive virtual reality (NIVR) and immersive virtual reality (IVR) through a psychophysical study using a force-matching task, the method of constant stimuli, and isometric force-probing stimuli. The results showed a better force JND in IVR than in NIVR, and a simple state-observer model was proposed to explain this improvement. This study quantitatively reinforces the use of IVR for the design of various medical simulators.
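
For readers unfamiliar with the psychophysical terms, the sketch below illustrates how a force JND can be estimated with the method of constant stimuli; the data, logistic fit, and threshold definitions are illustrative assumptions, not values or procedures from this work.

    # Illustrative sketch (not the speaker's protocol): estimate a force JND from
    # method-of-constant-stimuli data by fitting a logistic psychometric function.
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical data: comparison forces (N) against a 1.0 N reference, and the
    # proportion of trials on which the comparison was judged "stronger".
    comparison_force = np.array([0.80, 0.90, 0.95, 1.00, 1.05, 1.10, 1.20])
    p_stronger       = np.array([0.05, 0.20, 0.35, 0.50, 0.70, 0.85, 0.95])

    def logistic(x, mu, s):
        return 1.0 / (1.0 + np.exp(-(x - mu) / s))

    (mu, s), _ = curve_fit(logistic, comparison_force, p_stronger, p0=[1.0, 0.05])

    # Point of subjective equality (50%) and the 75% point; their difference is one
    # common definition of the JND, and JND / reference gives the Weber fraction.
    pse = mu
    x75 = mu + s * np.log(3.0)          # logistic(x75) = 0.75
    jnd = x75 - pse
    print(f"PSE = {pse:.3f} N, JND = {jnd:.3f} N, Weber fraction = {jnd / 1.0:.2%}")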

Organizers: Katherine J. Kuchenbecker


Perceptual and Affective Characteristics of Tactile Stimuli

Talk
  • 14 February 2019 • 15:00–16:00
  • Yongjae Yoo
  • 2P4 in Heisenbergstr. 3

With the advent of technology, tactile stimuli are being widely adopted in many human-computer interactions. However, their perceptual and emotional characteristics have not yet been studied in much depth. In this talk, to help in understanding these characteristics, I will introduce my perception and emotion studies, as well as my future research plans. For perceptual characteristics, I will introduce an estimation method for the perceived intensity of superimposed vibrations, verbal expressions for vibrotactile stimuli, and adjectival magnitude functions. I will then present a vibrotactile authoring tool that applies the adjectival magnitude functions. For affective characteristics, I will introduce my emotion studies, which investigate the effects of the physical parameters of vibrotactile and thermal stimuli on emotional responses using the valence-arousal space (V-A space). Then, as an application, I will present an emotion-augmenting method that uses tactile stimuli to change the emotion of visual stimuli on mobile devices.
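
As a rough illustration of the kind of mapping such studies produce, the sketch below uses a generic Stevens-type power function for perceived magnitude and an entirely hypothetical linear mapping into the valence-arousal plane; all function names and coefficients are assumptions, not findings from the talk.

    # Illustrative sketch with assumed (made-up) coefficients, not results from the talk.
    import numpy as np

    def perceived_magnitude(amplitude, k=10.0, exponent=0.7):
        """Generic Stevens power law: psi = k * amplitude^exponent (coefficients assumed)."""
        return k * amplitude ** exponent

    def valence_arousal(amplitude, frequency_hz):
        """Hypothetical linear mapping to the V-A plane, each axis clipped to [-1, 1]."""
        arousal = np.clip(0.8 * amplitude + 0.002 * frequency_hz - 0.5, -1.0, 1.0)
        valence = np.clip(0.3 - 0.6 * amplitude, -1.0, 1.0)   # assume strong buzzes are rated less pleasant
        return valence, arousal

    print(perceived_magnitude(0.5))
    print(valence_arousal(amplitude=0.5, frequency_hz=150))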

Organizers: Katherine J. Kuchenbecker


Artificial Haptic Intelligence for Human-Machine Systems

IS Colloquium
  • 25 October 2018 • 11:00–12:00
  • Veronica J. Santos
  • N2.025 at MPI-IS in Tübingen

The functionality of artificial manipulators could be enhanced by artificial “haptic intelligence” that enables the identification of object features via touch for semi-autonomous decision-making and/or display to a human operator. This could be especially useful when complementary sensory modalities, such as vision, are unavailable. I will highlight past and present work to enhance the functionality of artificial hands in human-machine systems. I will describe efforts to develop multimodal tactile sensor skins, and to teach robots how to haptically perceive salient geometric features such as edges and fingertip-sized bumps and pits using machine learning techniques. I will describe the use of reinforcement learning to teach robots goal-based policies for a functional contour-following task: the closure of a ziplock bag. Our Contextual Multi-Armed Bandits approach tightly couples robot actions to the tactile and proprioceptive consequences of the actions, and selects future actions based on prior experiences, the current context, and a functional task goal. Finally, I will describe current efforts to develop real-time capabilities for the perception of tactile directionality, and to develop models for haptically locating objects buried in granular media. Real-time haptic perception and decision-making capabilities could be used to advance semi-autonomous robot systems and reduce the cognitive burden on human teleoperators of devices ranging from wheelchair-mounted robots to explosive ordnance disposal robots.
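
For readers unfamiliar with the term, the sketch below shows the general shape of a contextual multi-armed bandit with epsilon-greedy selection; the tactile contexts, action set, and reward are stand-ins for illustration, not the actual system described in the talk.

    # Minimal contextual-bandit sketch: discretized tactile context, epsilon-greedy
    # action selection, and incremental value updates from an observed reward.
    import random
    from collections import defaultdict

    ACTIONS = ["slide_forward", "pinch_tighter", "lift_slightly", "lower_slightly"]

    class ContextualBandit:
        def __init__(self, epsilon=0.1):
            self.epsilon = epsilon
            self.value = defaultdict(float)   # (context, action) -> estimated reward
            self.count = defaultdict(int)

        def select(self, context):
            if random.random() < self.epsilon:
                return random.choice(ACTIONS)                              # explore
            return max(ACTIONS, key=lambda a: self.value[(context, a)])   # exploit

        def update(self, context, action, reward):
            self.count[(context, action)] += 1
            n = self.count[(context, action)]
            # incremental mean of the rewards observed for this context-action pair
            self.value[(context, action)] += (reward - self.value[(context, action)]) / n

    def tactile_context(pressure, shear):
        """Discretize tactile readings into a coarse context label (stand-in)."""
        return ("high_pressure" if pressure > 0.5 else "low_pressure",
                "high_shear" if shear > 0.5 else "low_shear")

    bandit = ContextualBandit()
    ctx = tactile_context(pressure=0.7, shear=0.2)
    action = bandit.select(ctx)
    bandit.update(ctx, action, reward=1.0)   # reward: progress along the bag seam (stand-in)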

Organizers: Katherine J. Kuchenbecker, Adam Spiers


Artificial Haptic Intelligence for Human-Machine Systems

IS Colloquium
  • 24 October 2018 • 11:00–12:00
  • Veronica J. Santos
  • 5H7 at MPI-IS in Stuttgart

The functionality of artificial manipulators could be enhanced by artificial “haptic intelligence” that enables the identification of object features via touch for semi-autonomous decision-making and/or display to a human operator. This could be especially useful when complementary sensory modalities, such as vision, are unavailable. I will highlight past and present work to enhance the functionality of artificial hands in human-machine systems. I will describe efforts to develop multimodal tactile sensor skins, and to teach robots how to haptically perceive salient geometric features such as edges and fingertip-sized bumps and pits using machine learning techniques. I will describe the use of reinforcement learning to teach robots goal-based policies for a functional contour-following task: the closure of a ziplock bag. Our Contextual Multi-Armed Bandits approach tightly couples robot actions to the tactile and proprioceptive consequences of the actions, and selects future actions based on prior experiences, the current context, and a functional task goal. Finally, I will describe current efforts to develop real-time capabilities for the perception of tactile directionality, and to develop models for haptically locating objects buried in granular media. Real-time haptic perception and decision-making capabilities could be used to advance semi-autonomous robot systems and reduce the cognitive burden on human teleoperators of devices ranging from wheelchair-mounted robots to explosive ordnance disposal robots.

Organizers: Katherine J. Kuchenbecker


Control Systems for a Surgical Robot on the Space Station

IS Colloquium
  • 23 October 2018 • 16:30–17:30
  • Chris Macnab
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

As part of a proposed design for a surgical robot on the space station, my research group has been asked to look at controls that can provide, quite literally, surgical precision. Due to excessive time delay, we envision a system in which a local model is controlled by a surgeon while the remote system on the space station follows along in a safe manner. Two of the major design considerations for the low-level feedback loops on the remote side are 1) the harmonic drives in a robot will cause excessive vibrations in a micro-gravity environment unless active damping strategies are employed, and 2) when interacting with human tissue, the robot must apply smooth control signals that result in precise positions and forces. Thus, we regard intelligent strategies that utilize nonlinear, adaptive, neural-network, and/or fuzzy control theory as the most suitable. However, space agencies, or their engineering sub-contractors, typically give the engineers involved in a control system design requirements stated as gain and phase margins, which are normally associated with PID or other traditional linear control schemes. We are currently endeavouring to create intelligent controls that have guaranteed gain and phase margins using the Cerebellar Model Articulation Controller.
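
For readers unfamiliar with the CMAC, the sketch below shows a minimal one-dimensional version of the idea (overlapping tilings, a weighted sum over active cells, and a simple adaptive update); it is illustrative only and is not the controller under development.

    # Minimal 1-D CMAC sketch: overlapping tilings quantize the input, the output is
    # the sum of the weights of the active cells, and weights adapt toward a target.
    import numpy as np

    class CMAC:
        def __init__(self, n_tilings=8, n_cells=32, x_min=-1.0, x_max=1.0, lr=0.2):
            self.n_tilings, self.n_cells, self.lr = n_tilings, n_cells, lr
            self.x_min, self.width = x_min, (x_max - x_min) / n_cells
            self.weights = np.zeros((n_tilings, n_cells))

        def _active_cells(self, x):
            cells = []
            for t in range(self.n_tilings):
                offset = t * self.width / self.n_tilings      # shift each tiling slightly
                idx = int((x - self.x_min + offset) / self.width)
                cells.append(min(max(idx, 0), self.n_cells - 1))
            return cells

        def output(self, x):
            return sum(self.weights[t, c] for t, c in enumerate(self._active_cells(x)))

        def train(self, x, target):
            error = target - self.output(x)
            for t, c in enumerate(self._active_cells(x)):
                self.weights[t, c] += self.lr * error / self.n_tilings

    cmac = CMAC()
    for _ in range(200):                      # learn a toy nonlinear mapping
        x = np.random.uniform(-1, 1)
        cmac.train(x, np.sin(np.pi * x))
    print(cmac.output(0.5), np.sin(np.pi * 0.5))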

Organizers: Katherine J. Kuchenbecker


  • Mariacarla Memeo
  • MPI-IS Stuttgart, Heisenbergstr. 3, Room 2P4

The increasing availability of online resources and the widespread practice of storing data over the internet raise the problem of their accessibility for visually impaired people. A translation from the visual domain to the available modalities is therefore necessary to study whether such access is possible at all. However, the translation of information from vision to touch is necessarily limited, owing to the superiority of vision during the acquisition process. Yet compromises exist, as visual information can be simplified and sketched: a picture can become a map, and an object can become a geometrical shape. Under some circumstances, and with a reasonable loss of generality, touch can substitute for vision. In particular, when touch substitutes for vision, data can be differentiated by adding a further dimension to the tactile feedback, i.e., extending it to three dimensions instead of two. This mode was chosen because it mimics our natural way of following object profiles with our fingers: whether or not a hand lying on an object is moving, our tactile and proprioceptive systems are both stimulated and tell us something about which object we are manipulating and what its shape and size might be. The goal of this talk is to describe how to exploit tactile stimulation to render digital information non-visually, so that cognitive maps associated with this information can be efficiently elicited from visually impaired people. In particular, the focus is on delivering geometrical information in a learning scenario. Moreover, completely blind interaction with a virtual environment in a learning scenario has been little investigated, because visually impaired subjects are often passive agents of exercises with fixed environmental constraints. For this reason, during the talk I will provide my personal answer to the question: can visually impaired people manipulate dynamic virtual content through touch? This process is much more challenging than merely exploring and learning virtual content, but at the same time it leads to a more conscious and dynamic spatial understanding of an environment during tactile exploration.
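
As a rough illustration of the "further dimension" idea, the sketch below maps a simplified picture to a relief whose height would be rendered under the fingertip; the function names, units, and scaling are assumptions for illustration only, not the actual rendering software.

    # Illustrative sketch: turn a simplified 2-D picture into a relief, so the third
    # dimension of the tactile feedback encodes what brightness would encode visually.
    import numpy as np

    def picture_to_relief(image, max_height_mm=5.0):
        """Map a grayscale image (values in [0, 1]) to a height map in millimetres."""
        image = np.asarray(image, dtype=float)
        span = image.max() - image.min()
        return max_height_mm * (image - image.min()) / (span if span > 0 else 1.0)

    def tactile_height(relief, x, y):
        """Height presented under the fingertip at integer map coordinates (x, y)."""
        x = int(np.clip(x, 0, relief.shape[1] - 1))
        y = int(np.clip(y, 0, relief.shape[0] - 1))
        return relief[y, x]

    # A simple geometric shape (a filled square) standing in for a simplified picture.
    picture = np.zeros((64, 64))
    picture[16:48, 16:48] = 1.0
    relief = picture_to_relief(picture)
    print(tactile_height(relief, 32, 32), tactile_height(relief, 2, 2))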

Organizers: Katherine J. Kuchenbecker