Instrumentation, Data, and Algorithms for Visually Understanding Haptic Surface Properties

2018

Ph.D. Thesis

Autonomous robots need to efficiently walk over varied surfaces and grasp diverse objects. We hypothesize that the association between how such surfaces look and how they physically feel during contact can be learned from a database of matched haptic and visual data recorded from various end-effectors' interactions with hundreds of real-world surfaces. Testing this hypothesis required the creation of a new multimodal sensing apparatus, the collection of a large multimodal dataset, and the development of a machine-learning pipeline.

This thesis begins by describing the design and construction of the Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short), an untethered handheld sensing device that emulates the capabilities of the human senses of vision and touch. Its sensory modalities include RGBD vision, egomotion, contact force, and contact vibration. Three interchangeable end-effectors (a steel tooling ball, an OptoForce three-axis force sensor, and a SynTouch BioTac artificial fingertip) allow for different material properties at the contact point and provide additional tactile data. We then detail the calibration process for the motion and force sensing systems, as well as several proof-of-concept surface-discrimination experiments that demonstrate the reliability of the device and the utility of the data it collects.

The thesis then presents a large-scale dataset of multimodal surface-interaction recordings spanning 357 unique surfaces, including furniture, fabrics, outdoor fixtures, and items from several private and public material sample collections. Each surface was touched with one, two, or three end-effectors, with approximately one minute of tapping and dragging per end-effector at various forces and speeds. We hope that the broader community of robotics researchers will find wide-ranging applications for the published dataset.

Lastly, we demonstrate an algorithm that learns to estimate haptic surface properties given visual input. Surfaces were rated on hardness, roughness, stickiness, and temperature by the human experimenter and by a pool of purely visual observers; we then trained an algorithm to perform the same ratings task and to infer quantitative properties computed from the haptic data. Overall, predicting haptic properties from vision alone proved difficult for both humans and computers, but a hybrid algorithm combining a deep neural network with a support vector machine achieved correlations between predicted and actual regression outputs of approximately ρ = 0.3 to ρ = 0.5 on previously unseen surfaces.
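As a concrete illustration of the final learning step, below is a minimal sketch (Python with NumPy, scikit-learn, and SciPy) of support vector regression on deep visual features, scored by the Pearson correlation between predicted and actual haptic properties on held-out surfaces. The feature extractor is omitted and all data are random placeholders; the variable names and hyperparameters are illustrative assumptions, not the thesis code.

import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Placeholder stand-ins: one deep-feature vector per surface image and one
# scalar haptic property per surface (e.g., a roughness rating). In the
# pipeline described above, the features would come from a deep neural
# network applied to images of each surface.
n_surfaces, n_features = 357, 512
X = rng.normal(size=(n_surfaces, n_features))
y = rng.normal(size=n_surfaces)

# Hold out surfaces the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Support vector regression on top of the visual features, mirroring the
# hybrid deep-network + SVM design mentioned in the abstract.
scaler = StandardScaler().fit(X_train)
model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(scaler.transform(X_train), y_train)

# Score with the Pearson correlation between predicted and actual values,
# the rho reported above (roughly 0.3 to 0.5 for the real system).
rho, _ = pearsonr(model.predict(scaler.transform(X_test)), y_test)
print(f"correlation on unseen surfaces: rho = {rho:.2f}")

Presumably, one regressor of this kind would be trained per haptic property (hardness, roughness, stickiness, and temperature) and per quantitative measure derived from the haptic recordings.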

Author(s): Alexander L. Burka
Year: 2018
Month: August

Department(s): Haptic Intelligence
Research Project(s): Feeling With Your Eyes: Visual-Haptic Surface Interaction
BibTeX Type: Ph.D. Thesis (phdthesis)
Paper Type: Thesis

Address: Philadelphia, USA
School: University of Pennsylvania

Note: Department of Electrical and Systems Engineering

BibTeX

@phdthesis{Burka18-PHD-Surface,
  title = {Instrumentation, Data, and Algorithms for Visually Understanding Haptic Surface Properties},
  author = {Burka, Alexander L.},
  school = {University of Pennsylvania},
  address = {Philadelphia, USA},
  month = aug,
  year = {2018},
  note = {Department of Electrical and Systems Engineering},
  month_numeric = {8}
}