A human user can carry out tasks in a remote environment by controlling a robot; the interaction site might be just across the room from the user, several kilometers away, deep in the ocean, or orbiting above Earth's surface. Such a scenario is known as bilateral teleoperation or telemanipulation. The remote robot's job is to represent the user's actions in the remote environment; the user sends these commands and receives multimodal feedback via the teleoperation interface, which is where we focus our attention.
Remotely accomplishing complex tasks such as surgical suturing requires a rich bidirectional flow of information that is optimized for human capabilities. Because the addition of force feedback tends to destabilize bilateral teleoperators, most such systems include no haptic cues; the operator thus has to learn to rely on what he or she can see. We work on inventing and refining clever ways of stably providing haptic feedback during teleoperation, often by focusing on tactile rather than kinesthetic cues. One main thrust of our work centers on vibrotactile feedback of the robot's contact vibrations, an approach that is both simple and highly effective.
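As a minimal sketch of how such a vibrotactile pipeline can work, the snippet below band-pass filters a simulated accelerometer signal from the robot's tooling so that only contact-frequency vibrations are passed on to drive a tactile actuator at the operator's side. All parameters here (sample rate, center frequency, signal amplitudes) are illustrative assumptions, not values from our systems, and the filter is a standard audio-style biquad rather than any particular implementation of ours.

```python
import math

def bandpass_biquad(fs, f0, q):
    """Band-pass biquad coefficients (RBJ cookbook, constant 0 dB peak gain).

    fs: sample rate in Hz; f0: center frequency in Hz; q: quality factor.
    """
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)

def filter_signal(samples, coeffs):
    """Apply the biquad to a list of samples (Direct Form I)."""
    b0, b1, b2, a1, a2 = coeffs
    out, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

fs = 10000.0  # hypothetical accelerometer sample rate (Hz)
# Center the pass band near 250 Hz, close to peak human vibrotactile sensitivity.
coeffs = bandpass_biquad(fs, f0=250.0, q=0.7)

# Synthetic accelerometer trace: slow arm motion (2 Hz) plus a 250 Hz contact buzz.
n = 2000
accel = [0.5 * math.sin(2 * math.pi * 2 * i / fs)
         + 0.2 * math.sin(2 * math.pi * 250 * i / fs) for i in range(n)]

# The filtered signal keeps the contact buzz and rejects the low-frequency motion,
# so it can be scaled directly into a voice-coil actuator drive command.
drive = filter_signal(accel, coeffs)
```

The key design point this illustrates is that contact vibrations live in a much higher frequency band than voluntary arm motion, so a simple fixed filter can separate "what the tool is touching" from "how the tool is moving" without threatening closed-loop stability, because the vibration channel is played back open-loop.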
We also study how the addition of haptic cues affects the operator over both short and long time scales. Having direct access to physical contact information changes the sensory processing the operator needs to complete a task; different kinds of feedback have different effects that need to be understood both quantitatively and qualitatively. Our investigations in this domain also show that the haptic signals captured during teleoperation contain significant information about the manual skill of the operator currently controlling the robot.
While most of our teleoperation research has considered only manipulation tasks, we also study teleoperation of humanoid robots that have both task-oriented and social functions. We are interested in inventing lightweight and effective systems that enable the operator to control the robot's hand gestures and facial expressions with high fidelity.