A human teleoperates the head of Max, our Baxter robot, via camera-based pose estimation and facial emotion recognition.
Many situations arise where it is beneficial for a human to control the movements of a robot at a distance, such as handling hazardous materials, performing surgery deep inside the human body, or taking part in remote meetings with other people. In these scenarios, the robot's control interface has a significant influence on the effectiveness of the interaction.
This project aims to develop an intuitive and user-friendly interface for remotely controlling Max, our Baxter Research Robot (developed by Rethink Robotics). Given our ongoing research in social-physical human-robot interaction, we are particularly interested in an interface that allows the operator to control the robot in an expressive and physically interactive manner. We envision scenarios where the remote-controlled robot acts as an exercise partner or coach, performing collaborative tasks or playing interactive games with a human. Such a scenario may take place in an elderly care home or a physical rehabilitation center.
We are in the process of assembling a control interface suitable for the envisioned use case [ ]. In our present design, Max's arm and head movements are controlled via an inertial-sensor-based motion-capture suit (XSens MVN) worn by the operator. The same person simultaneously controls the robot's face via camera-based facial emotion recognition, as shown in the figure above. We will refine and evaluate this interface and then use it to prototype a wide range of social-physical interactions between a human and a robot.
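To give a flavor of one piece of such a pipeline, the sketch below shows how noisy per-frame emotion labels from a camera-based recognizer might be smoothed before driving the expression shown on the robot's face screen. All class names, labels, and thresholds here are hypothetical illustrations, not the actual implementation used with Max: the idea is simply that the displayed expression should only switch when one emotion clearly dominates a recent window of frames.

```python
from collections import Counter, deque

# Hypothetical smoothing stage: per-frame emotion labels from a
# camera-based recognizer are noisy, so the robot's displayed
# expression switches only when one label dominates a sliding window.
class ExpressionSmoother:
    def __init__(self, window=10, threshold=0.6):
        self.window = deque(maxlen=window)   # recent per-frame labels
        self.threshold = threshold           # fraction needed to switch
        self.current = "neutral"             # expression currently shown

    def update(self, label):
        """Feed one per-frame emotion label; return the expression to display."""
        self.window.append(label)
        top, count = Counter(self.window).most_common(1)[0]
        if top != self.current and count / len(self.window) >= self.threshold:
            self.current = top
        return self.current

# Example: a brief "neutral" flicker does not change the face,
# but a sustained run of "happy" frames does.
smoother = ExpressionSmoother(window=5, threshold=0.6)
for frame_label in ["neutral", "happy", "happy", "happy", "happy"]:
    shown = smoother.update(frame_label)
```

A debouncing step like this trades a small amount of latency for stability, which matters when the output drives a physically co-present robot face rather than a log file.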