Many scenarios arise in which the high-frequency accelerations of a tool need to be captured and either displayed for a human to feel or analyzed by a computer system. For example, this approach provides a simple and realistic way for a surgeon to feel tactile information from a remotely controlled surgical robot. Similarly, vibrations are the main signal of interest when modeling haptic textures.
In most of these scenarios, 3D accelerations are captured from real contact interactions and used as a realistic vibration source. Because humans cannot perceive the direction of high-frequency vibrations, haptics researchers usually reduce these 3D accelerations to 1D signals and render them using a single-axis vibration actuator. Interestingly, this dimensional reduction can be performed in many ways, and the chosen approach has a substantial impact on the quality of the resulting waveform.
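To make the reduction problem concrete, here is a minimal sketch of two of the simplest possible 3D-to-1D conversions: keeping a single measured axis, and collapsing the vector to its Euclidean magnitude. The function names and the use of NumPy are illustrative assumptions, not the system's actual implementation; note also that the magnitude approach discards sign information, which distorts the spectrum of the waveform.

```python
import numpy as np

def reduce_single_axis(accel_3d, axis=2):
    """Keep only one axis of an (N, 3) acceleration signal,
    e.g., the axis aligned with the pen's long dimension."""
    return accel_3d[:, axis]

def reduce_vector_norm(accel_3d):
    """Collapse each 3D sample to its Euclidean magnitude.
    Simple, but rectifies the signal: the result is non-negative,
    so sign (direction-of-motion) information is lost."""
    return np.linalg.norm(accel_3d, axis=1)
```

For example, the sample `[0, 3, 4]` maps to `4.0` under the single-axis method (axis 2) and to `5.0` under the magnitude method, illustrating how different reductions yield different waveforms from the same input.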
This research project implements a real-time conversion system that simultaneously measures 3D accelerations using an In-Pen and renders the corresponding 1D vibrations using an Out-Pen. The user can freely interact with various objects using the In-Pen, which contains a 3-axis high-bandwidth accelerometer. The captured accelerations are converted to a single-axis signal, and the Out-Pen renders the reduced signal for the user to feel using a Tactile Labs Haptuator.
Our system can quickly switch among seven conversion methods, ranging from simply using a single measured axis to applying principal component analysis (PCA). After gathering informal feedback via conference demonstrations [ ], we are now investigating both the quantitative signal similarity and the qualitative perceptual similarity between the original 3D accelerations and the reduced 1D vibrations to determine which method is best suited to each scenario.
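As a sketch of the PCA-based end of the method spectrum, the following code projects a buffer of 3D acceleration samples onto their first principal component, yielding a 1D signal that preserves the direction of greatest vibration energy. This is an illustrative NumPy implementation under our own assumptions (batch processing of a centered buffer, eigendecomposition of the 3x3 covariance matrix), not necessarily the system's real-time variant; note that the sign of the principal axis is arbitrary.

```python
import numpy as np

def reduce_pca(accel_3d):
    """Project an (N, 3) acceleration buffer onto its first
    principal component, returning an (N,) 1D signal."""
    centered = accel_3d - accel_3d.mean(axis=0)
    # 3x3 covariance of the axes; the principal axis is the
    # eigenvector with the largest eigenvalue.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal = eigvecs[:, np.argmax(eigvals)]
    # Note: the eigenvector's overall sign is arbitrary, so the
    # output waveform may be inverted relative to the input.
    return centered @ principal
```

Because the projection maximizes retained variance, this method tends to keep more of the vibration energy than picking a fixed axis, at the cost of added computation and a buffer-dependent projection direction.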