Commonly used in minimally invasive robotic surgery and hazardous material handling, telerobotic systems empower humans to manipulate objects by remotely controlling a robot. The user sends commands and receives multimodal feedback via the teleoperation interface, which is where we focus our attention. How can such systems support the operator in performing tasks with outcomes that are as good as (or even better than) those accomplished via direct manipulation? We work to create new ways to capture operator input, deliver haptic feedback, and otherwise augment the operator's abilities, and we systematically study how these technologies affect the operator.
Remotely accomplishing complex tasks such as surgical suturing benefits from a rich bidirectional interface that is optimized for human capabilities. Because the addition of force feedback tends to destabilize bilateral teleoperators, most such systems include no haptic cues; the operator thus has to learn to rely on what they can see. We work on inventing and refining clever ways of stably providing haptic feedback during teleoperation, often by focusing on tactile rather than kinesthetic cues. A main thrust of this work centers on vibrotactile feedback of the robot's contact vibrations, as this approach is both simple and highly effective. We are also investigating how visual augmented reality can enrich the operator's experience in minimally invasive robotic surgery.
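To make the vibrotactile idea concrete, the sketch below shows one minimal way such a pipeline could work: high-pass filtering an accelerometer trace from the robot's tool to isolate contact vibrations from slow arm motion, then scaling and clipping the result into a normalized actuator command. This is an illustrative sketch only; the filter coefficient, gain, and limit values are assumptions, not parameters from our actual systems.

```python
def highpass(samples, alpha=0.9):
    """First-order high-pass filter: passes contact vibrations and
    rejects low-frequency arm motion. alpha sets the cutoff and is an
    assumed value for this sketch."""
    out = []
    prev_x = samples[0]
    prev_y = 0.0
    for x in samples:
        y = alpha * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out

def to_drive_signal(filtered, gain=0.5, limit=1.0):
    """Scale filtered acceleration to a normalized vibrotactile
    actuator command, clipped to the safe range [-limit, limit]."""
    return [max(-limit, min(limit, gain * a)) for a in filtered]

# A constant (non-vibrating) input produces essentially no drive signal,
# so the operator feels only genuine contact transients.
drive = to_drive_signal(highpass([1.0] * 100))
```

In a real system the filtered signal would be streamed to a voice-coil actuator on the operator's handle at the sensor's sampling rate; the point here is only that the mapping from sensed acceleration to felt vibration can be very simple.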
Importantly, we study how the addition of these novel technologies affects the operator over both short and long time scales. Having direct access to physical contact information changes the perceptual and cognitive processing the operator needs to complete a task. Our investigations in this domain also show that the haptic signals captured during teleoperation contain significant information about the manual skill of the operator currently controlling the robot.
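As a rough illustration of how skill-relevant information could be extracted from such signals, the sketch below computes two simple summary features from a contact-vibration trace: its RMS energy and a count of discrete contact events. These particular features and the threshold value are hypothetical choices for this sketch, not the measures used in our studies.

```python
import math

def vibration_features(signal, contact_threshold=0.2):
    """Summarize a contact-vibration trace with two simple features.
    Both the feature set and the threshold are illustrative assumptions:
    smoother, lower-energy traces with fewer abrupt contacts might
    correlate with more skilled manipulation."""
    rms = math.sqrt(sum(a * a for a in signal) / len(signal))
    # Count excursions above the threshold as discrete contact events.
    events = 0
    above = False
    for a in signal:
        if abs(a) > contact_threshold:
            if not above:
                events += 1
            above = True
        else:
            above = False
    return {"rms": rms, "contacts": events}
```

Features like these could then feed a classifier or regression model that estimates operator skill from recorded teleoperation data.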