

186 results

2018


Automatically Rating Trainee Skill at a Pediatric Laparoscopic Suturing Task

Oquendo, Y. A., Riddle, E. W., Hiller, D., Blinman, T. A., Kuchenbecker, K. J.

Surgical Endoscopy, 32(4):1840-1857, April 2018 (article)

DOI [BibTex]

Immersive Low-Cost Virtual Reality Treatment for Phantom Limb Pain: Evidence from Two Cases

Ambron, E., Miller, A., Kuchenbecker, K. J., Buxbaum, L. J., Coslett, H. B.

Frontiers in Neurology, 9(67):1-7, 2018 (article)

DOI [BibTex]

2017


Evaluation of high-fidelity simulation as a training tool in transoral robotic surgery

Bur, A. M., Gomez, E. D., Newman, J. G., Weinstein, G. S., O’Malley Jr., B. W., Rassekh, C. H., Kuchenbecker, K. J.

Laryngoscope, 127(12):2790-2795, December 2017 (article)

DOI [BibTex]

Using contact forces and robot arm accelerations to automatically rate surgeon skill at peg transfer

Brown, J. D., O’Brien, C. E., Leung, S. C., Dumon, K. R., Lee, D. I., Kuchenbecker, K. J.

IEEE Transactions on Biomedical Engineering, 64(9):2263-2275, September 2017 (article)

[BibTex]

Stiffness Perception during Pinching and Dissection with Teleoperated Haptic Forceps

Ng, C., Zareinia, K., Sun, Q., Kuchenbecker, K. J.

In Proceedings of the International Symposium on Robot and Human Interactive Communication (RO-MAN), pages: 456-463, August 2017 (inproceedings)

DOI [BibTex]

Ungrounded haptic augmented reality system for displaying texture and friction

Culbertson, H., Kuchenbecker, K. J.

IEEE/ASME Transactions on Mechatronics, 22(4):1839-1849, August 2017 (article)

[BibTex]

A Wrist-Squeezing Force-Feedback System for Robotic Surgery Training

Brown, J. D., Fernandez, J. N., Cohen, S. P., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 107-112, June 2017 (inproceedings)

Abstract
Over time, surgical trainees learn to compensate for the lack of haptic feedback in commercial robotic minimally invasive surgical systems. Incorporating touch cues into robotic surgery training could potentially shorten this learning process if the benefits of haptic feedback were sustained after it was removed. In this paper, we develop a wrist-squeezing haptic feedback system and evaluate whether it holds the potential to train novice da Vinci users to reduce the force they exert on a bimanual inanimate training task. Subjects were randomly divided into two groups according to a multiple baseline experimental design. Each of the ten participants moved a ring along a curved wire nine times while the haptic feedback was conditionally withheld, provided, and withheld again. The real-time tactile feedback of applied force magnitude significantly reduced the integral of the force produced by the da Vinci tools on the task materials, and this result remained even when the haptic feedback was removed. Overall, our findings suggest that wrist-squeezing force feedback can play an essential role in helping novice trainees learn to minimize the force they exert with a surgical robot.
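
The skill metric in this study is the time integral of the force applied to the task materials. As a rough illustration (our own sketch, not the authors' code), such a metric can be computed from logged force samples with the trapezoidal rule:

```python
import numpy as np

def force_integral(t, f):
    """Time integral of force magnitude (N*s) via the trapezoidal rule.

    t: sample times in seconds; f: force magnitudes in newtons.
    """
    t = np.asarray(t, dtype=float)
    f = np.abs(np.asarray(f, dtype=float))
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Toy check: a constant 2 N held for 3 s integrates to about 6 N*s.
t = np.linspace(0.0, 3.0, 301)
f = np.full_like(t, 2.0)
print(force_integral(t, f))
```

A lower integral for the same task indicates gentler tool-tissue interaction, which is why this quantity serves as the training outcome.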

DOI [BibTex]

Design of a Parallel Continuum Manipulator for 6-DOF Fingertip Haptic Display

Young, E. M., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 599-604, June 2017, Finalist for best poster paper (inproceedings)

Abstract
Despite rapid advancements in the field of fingertip haptics, rendering tactile cues with six degrees of freedom (6 DOF) remains an elusive challenge. In this paper, we investigate the potential of displaying fingertip haptic sensations with a 6-DOF parallel continuum manipulator (PCM) that mounts to the user's index finger and moves a contact platform around the fingertip. Compared to traditional mechanisms composed of rigid links and discrete joints, PCMs have the potential to be strong, dexterous, and compact, but they are also more complicated to design. We define the design space of 6-DOF parallel continuum manipulators and outline a process for refining such a device for fingertip haptic applications. Following extensive simulation, we obtain 12 designs that meet our specifications, construct a manually actuated prototype of one such design, and evaluate the simulation's ability to accurately predict the prototype's motion. Finally, we demonstrate the range of deliverable fingertip tactile cues, including a normal force into the finger and shear forces tangent to the finger at three extreme points on the boundary of the fingertip.

DOI [BibTex]

Handling Scan-time Parameters in Haptic Surface Classification

Burka, A., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 424-429, June 2017 (inproceedings)

DOI [BibTex]

High Magnitude Unidirectional Haptic Force Display Using a Motor/Brake Pair and a Cable

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 394-399, June 2017 (inproceedings)

Abstract
Clever electromechanical design is required to make the force feedback delivered by a kinesthetic haptic interface both strong and safe. This paper explores a one-dimensional haptic force display that combines a DC motor and a magnetic particle brake on the same shaft. Rather than a rigid linkage, a spooled cable connects the user to the actuators to enable a large workspace, reduce the moving mass, and eliminate the sticky residual force from the brake. This design combines the high torque/power ratio of the brake and the active output capabilities of the motor to provide a wider range of forces than can be achieved with either actuator alone. A prototype of this device was built, its performance was characterized, and it was used to simulate constant force sources and virtual springs and dampers. Compared to the conventional design of using only a motor, the hybrid device can output higher unidirectional forces at the expense of free space feeling less free.
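
The hybrid actuation idea can be sketched as a simple allocation rule. This is our assumption of how such a motor/brake pair could be coordinated, not the paper's controller: the brake, which can only resist motion, handles resistive commands, and the motor supplies whatever remains.

```python
# Hypothetical allocation of a commanded 1-D cable force between a
# particle brake (resistive only) and a DC motor (active), with made-up
# actuator limits in newtons.
def allocate(force_cmd, velocity, brake_max=20.0, motor_max=5.0):
    """Split a commanded cable force (N) between brake and motor.

    The brake contributes only when the command opposes the user's
    motion. Returns (brake_force, motor_force).
    """
    resistive = force_cmd * velocity < 0       # command opposes motion
    brake = min(abs(force_cmd), brake_max) if resistive else 0.0
    # The motor supplies whatever the brake cannot, up to its own limit.
    motor = min(abs(force_cmd) - brake, motor_max)
    sign = 1.0 if force_cmd >= 0 else -1.0
    return sign * brake, sign * motor

print(allocate(-15.0, 0.2))  # resisting motion: brake does the work
print(allocate(3.0, 0.2))    # assisting motion: motor only
```

Because the brake carries the large resistive loads, the motor can stay small, which is the power advantage the abstract describes.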

DOI [BibTex]

Physically Interactive Exercise Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at the 2017 IEEE World Haptics Conference (WHC), June 2017 (misc)

[BibTex]

Perception of force and stiffness in the presence of low-frequency haptic noise

Gurari, N., Okamura, A. M., Kuchenbecker, K. J.

PLoS ONE, 12(6):e0178605, June 2017 (article)

[BibTex]

Proton Pack: Visuo-Haptic Surface Data Recording

Burka, A., Kuchenbecker, K. J.

Hands-on demonstration presented at the 2017 IEEE World Haptics Conference (WHC), June 2017 (misc)

[BibTex]

Teaching a Robot to Collaborate with a Human Via Haptic Teleoperation

Hu, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the 2017 IEEE World Haptics Conference (WHC), June 2017 (misc)

[BibTex]

How Should Robots Hug?

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the 2017 IEEE World Haptics Conference (WHC), June 2017 (misc)

[BibTex]

Evaluation of a vibrotactile simulator for dental caries detection

Kuchenbecker, K. J., Parajon, R., Maggio, M. P.

Simulation in Healthcare, 12(3):148-156, June 2017 (article)

[BibTex]

Proton 2: Increasing the Sensitivity and Portability of a Visuo-haptic Surface Interaction Recorder

Burka, A., Rajvanshi, A., Allen, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pages: 439-445, May 2017 (inproceedings)

Abstract
The Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short) is a new handheld visuo-haptic sensing system that records surface interactions. We previously demonstrated system calibration and a classification task using external motion tracking. This paper details improvements in surface classification performance and removal of the dependence on external motion tracking, necessary before embarking on our goal of gathering a vast surface interaction dataset. Two experiments were performed to refine data collection parameters. After adjusting the placement and filtering of the Proton's high-bandwidth accelerometers, we recorded interactions between two differently-sized steel tooling ball end-effectors (diameter 6.35 and 9.525 mm) and five surfaces. Using features based on normal force, tangential force, end-effector speed, and contact vibration, we trained multi-class SVMs to classify the surfaces using 50 ms chunks of data from each end-effector. Classification accuracies of 84.5% and 91.5% respectively were achieved on unseen test data, an improvement over prior results. In parallel, we pursued on-board motion tracking, using the Proton's camera and fiducial markers. Motion tracks from the external and onboard trackers agree within 2 mm and 0.01 rad RMS, and the accuracy decreases only slightly to 87.7% when using onboard tracking for the 9.525 mm end-effector. These experiments indicate that the Proton 2 is ready for portable data collection.
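
As an informal illustration of the classification step described above, a multi-class SVM can be trained on per-chunk feature vectors. The data, feature layout, and SVM parameters below are synthetic stand-ins, not the authors' pipeline:

```python
# Sketch: classify surfaces from per-chunk features (e.g. normal force,
# tangential force, tool speed, vibration power) with a multi-class SVM.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_chunks, n_surfaces = 500, 5
labels = rng.integers(0, n_surfaces, size=n_chunks)
# Four hypothetical features per 50 ms chunk, shifted per surface class.
features = rng.normal(size=(n_chunks, 4)) + labels[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(
    features, labels, test_size=0.25, random_state=0, stratify=labels)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```

Scoring on held-out chunks mirrors the paper's evaluation on unseen test data, where classification accuracy is the figure of merit.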

DOI [BibTex]

An Interactive Augmented-Reality Video Training Platform for the da Vinci Surgical System

Carlson, J., Kuchenbecker, K. J.

Short paper presented at the Workshop on C4 Surgical Robots at the IEEE International Conference on Robotics and Automation (ICRA), May 2017 (misc)

Abstract
Teleoperated surgical robots such as the Intuitive da Vinci Surgical System facilitate minimally invasive surgeries, which decrease risk to patients. However, these systems can be difficult to learn, and existing training curricula on surgical simulators do not offer students the realistic experience of a full operation. This paper presents an augmented-reality video training platform for the da Vinci that will allow trainees to rehearse any surgery recorded by an expert. While the trainee operates a da Vinci in free space, they see their own instruments overlaid on the expert video. Tools are identified in the source videos via color segmentation and kernelized correlation filter tracking, and their depth is calculated from the da Vinci’s stereoscopic video feed. The user tries to follow the expert’s movements, and if any of their tools venture too far away, the system provides instantaneous visual feedback and pauses to allow the user to correct their motion. The trainee can also rewind the expert video by bringing either da Vinci tool very close to the camera. This combined and augmented video provides the user with an immersive and interactive training experience.

[BibTex]

Hand-Clapping Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at ACM/IEEE International Conference on Human-Robot Interaction (HRI), March 2017 (misc)

Abstract
Robots that work alongside humans might be more effective if they could forge a strong social bond with their human partners. Hand-clapping games and other forms of rhythmic social-physical interaction may foster human-robot teamwork, but the design of such interactions has scarcely been explored. At the HRI 2017 conference, we will showcase several such interactions taken from our recent work with the Rethink Robotics Baxter Research Robot, including tempo-matching, Simon says, and Pat-a-cake-like games. We believe conference attendees will be both entertained and intrigued by this novel demonstration of social-physical HRI.

[BibTex]

Automatic OSATS Rating of Trainee Skill at a Pediatric Laparoscopic Suturing Task

Oquendo, Y. A., Riddle, E. W., Hiller, D., Blinman, T. A., Kuchenbecker, K. J.

Surgical Endoscopy, Proceedings of the 2017 Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), 31(Supplement 1):S28, Springer, March 2017 (article)

Abstract
Introduction: Minimally invasive surgery has revolutionized surgical practice, but challenges remain. Trainees must acquire complex technical skills while minimizing patient risk, and surgeons must maintain their skills for rare procedures. These challenges are magnified in pediatric surgery due to the smaller spaces, finer tissue, and relative dearth of both inanimate and virtual simulators. To build technical expertise, trainees need opportunities for deliberate practice with specific performance feedback, which is typically provided via tedious human grading. This study aimed to validate a novel motion-tracking system and machine learning algorithm for automatically evaluating trainee performance on a pediatric laparoscopic suturing task using a 1–5 OSATS Overall Skill rating. Methods: Subjects (n=14) ranging from medical students to fellows performed one or two trials of an intracorporeal suturing task in a custom pediatric laparoscopy training box (Fig. 1) after watching a video of ideal performance by an expert. The position and orientation of the tools and endoscope were recorded over time using Ascension trakSTAR magnetic motion-tracking sensors, and both instrument grasp angles were recorded over time using flex sensors on the handles. The 27 trials were video-recorded and scored on the OSATS scale by a senior fellow; ratings ranged from 1 to 4. The raw motion data from each trial was processed to calculate over 200 preliminary motion parameters. Regularized least-squares regression (LASSO) was used to identify the most predictive parameters for inclusion in a regression tree. Model performance was evaluated by leave-one-subject-out cross validation, wherein the automatic scores given to each subject’s trials (by a model trained on all other data) are compared to the corresponding human rater scores.
Results: The best-performing LASSO algorithm identified 14 predictive parameters for inclusion in the regression tree, including completion time, linear path length, angular path length, angular acceleration, grasp velocity, and grasp acceleration. The final model’s raw output showed a strong positive correlation of 0.87 with the reviewer-generated scores, and rounding the output to the nearest integer yielded a leave-one-subject-out cross-validation accuracy of 77.8%. Results are summarized in the confusion matrix (Table 1). Conclusions: Our novel motion-tracking system and regression model automatically gave previously unseen trials overall skill scores that closely match scores from an expert human rater. With additional data and further development, this system may enable creation of a motion-based training platform for pediatric laparoscopic surgery and could yield insights into the fundamental components of surgical skill.
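
The scoring pipeline described above can be sketched in a few lines. The data below are synthetic, and the regularization strength, tree depth, and feature counts are our own illustrative choices, not the study's:

```python
# Sketch: LASSO selects predictive motion parameters, a regression tree
# maps them to an OSATS-style score, and performance is estimated with
# leave-one-subject-out cross-validation.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n_trials, n_params = 28, 50               # e.g. two trials per subject
subjects = np.repeat(np.arange(14), 2)
X = rng.normal(size=(n_trials, n_params))
# True skill depends on only a few parameters; the rest are noise.
score = np.clip(np.round(2.5 + X[:, 0] - X[:, 1]), 1, 5)

correct = 0
for train, test in LeaveOneGroupOut().split(X, score, groups=subjects):
    lasso = Lasso(alpha=0.1).fit(X[train], score[train])
    keep = np.flatnonzero(lasso.coef_)    # selected parameters
    if keep.size == 0:                    # fall back if nothing survives
        keep = np.arange(n_params)
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X[train][:, keep], score[train])
    pred = np.round(tree.predict(X[test][:, keep]))
    correct += int(np.sum(pred == score[test]))
print(f"LOSO accuracy: {correct / n_trials:.2f}")
```

Rounding the tree's continuous output to the nearest integer parallels how the study converts the model's raw output into a discrete skill score.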

[BibTex]

How Much Haptic Surface Data is Enough?

Burka, A., Kuchenbecker, K. J.

Workshop paper (5 pages) presented at the AAAI Spring Symposium on Interactive Multi-Sensory Object Perception for Embodied Agents, March 2017 (misc)

Abstract
The Proton Pack is a portable visuo-haptic surface interaction recording device that will be used to collect a vast multimodal dataset, intended for robots to use as part of an approach to understanding the world around them. In order to collect a useful dataset, we want to pick a suitable interaction duration for each surface, noting the tradeoff between data collection resources and completeness of data. One interesting approach frames the data collection process as an online learning problem, building an incremental surface model and using that model to decide when there is enough data. Here we examine how to do such online surface modeling and when to stop collecting data, using kinetic friction as a first domain in which to apply online modeling.
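
The online-modeling idea can be illustrated with a toy stopping rule for kinetic friction: incrementally fit the friction coefficient and stop collecting once the estimate stabilizes. This is our own sketch under stated assumptions, not the paper's algorithm:

```python
# Toy online model: estimate the kinetic friction coefficient mu from
# streaming (normal force, tangential force) pairs, stopping when the
# last `window` estimates span less than `tol`.
import numpy as np

def collect_until_stable(sample, tol=0.005, window=50, max_samples=5000):
    """Draw samples until the running estimate of mu converges.

    sample() returns one (f_n, f_t) pair in newtons.
    Returns (mu_estimate, n_samples_used).
    """
    fn_sq_sum = cross_sum = 0.0
    history = []
    for n in range(1, max_samples + 1):
        f_n, f_t = sample()
        fn_sq_sum += f_n * f_n
        cross_sum += f_n * f_t
        mu = cross_sum / fn_sq_sum   # least-squares slope through origin
        history.append(mu)
        recent = history[-window:]
        if n >= window and max(recent) - min(recent) < tol:
            return mu, n
    return history[-1], max_samples

rng = np.random.default_rng(2)
true_mu = 0.4

def noisy_sample():
    f_n = rng.uniform(1.0, 5.0)                   # normal force (N)
    f_t = true_mu * f_n + rng.normal(scale=0.05)  # tangential force (N)
    return f_n, f_t

mu, n = collect_until_stable(noisy_sample)
print(f"mu = {mu:.3f} after {n} samples")
```

The stopping rule captures the tradeoff the abstract raises: the model itself decides when additional interaction data stops improving the estimate.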

link (url) [BibTex]

Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces

Culbertson, H., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 10(1):63-74, January 2017 (article)

[BibTex]


Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task

Khurshid, R. P., Fitter, N. T., Fedalei, E. A., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 10(1):40-53, January 2017 (article)

[BibTex]

The tactile perception of transient changes in friction

Gueorguiev, D., Vezzoli, E., Mouraux, A., Lemaire-Semail, B., Thonnard, J.

Journal of The Royal Society Interface, 14(137), The Royal Society, 2017 (article)

Abstract
When we touch an object or explore a texture, frictional strains are induced by the tactile interactions with the surface of the object. Little is known about how these interactions are perceived, although it becomes crucial for the nascent industry of interactive displays with haptic feedback (e.g. smartphones and tablets) where tactile feedback based on friction modulation is particularly relevant. To investigate the human perception of frictional strains, we mounted a high-fidelity friction modulating ultrasonic device on a robotic platform performing controlled rubbing of the fingertip and asked participants to detect induced decreases of friction during a forced-choice task. The ability to perceive the changes in friction was found to follow Weber’s Law of just noticeable differences, as it consistently depended on the ratio between the reduction in tangential force and the pre-stimulation tangential force. The Weber fraction was 0.11 in all conditions demonstrating a very high sensitivity to transient changes in friction. Humid fingers experienced less friction reduction than drier ones for the same intensity of ultrasonic vibration but the Weber fraction for detecting changes in friction was not influenced by the humidity of the skin.
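
The Weber's-law result above reduces to a simple ratio test. The helper below is hypothetical (our own illustration of the reported 0.11 fraction, not code from the paper):

```python
# A friction drop is detectable when the reduction in tangential force,
# relative to the pre-stimulation tangential force, exceeds the Weber
# fraction reported in the paper.
WEBER_FRACTION = 0.11

def friction_drop_detectable(f_before, f_after, k=WEBER_FRACTION):
    """True if the relative reduction in tangential force exceeds k."""
    return (f_before - f_after) / f_before > k

print(friction_drop_detectable(1.00, 0.85))  # 15% drop: above threshold
print(friction_drop_detectable(1.00, 0.95))  # 5% drop: below threshold
```

Because the criterion is a ratio, the same absolute force change is easier to detect when the baseline tangential force is low.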

link (url) DOI [BibTex]

Feeling multiple edges: The tactile perception of short ultrasonic square reductions of the finger-surface friction

Gueorguiev, D., Vezzoli, E., Sednaoui, T., Grisoni, L., Lemaire-Semail, B.

In 2017 IEEE World Haptics Conference (WHC), pages: 125-129, 2017 (inproceedings)

DOI [BibTex]

2016


Qualitative User Reactions to a Hand-Clapping Humanoid Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 317-327, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]

Designing and Assessing Expressive Open-Source Faces for the Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 340-350, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]

Rhythmic Timing in Playful Human-Robot Social Motor Coordination

Fitter, N. T., Hawkes, D. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 296-305, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]

Using IMU Data to Demonstrate Hand-Clapping Games to a Robot

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pages: 851 - 856, October 2016, Interactive presentation given by Fitter (inproceedings)

[BibTex]

ProtonPack: A Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), pages: 58-65, September 2016, Oral presentation given by Burka (inproceedings)

[BibTex]

Equipping the Baxter Robot with Human-Inspired Hand-Clapping Skills

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication, pages: 105-112, August 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]

Reproducing a Laser Pointer Dot on a Secondary Projected Screen

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics (AIM), pages: 1645-1650, July 2016, Oral presentation given by Hu (inproceedings)

[BibTex]

Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

In Proceedings of the Biomechanics and Neural Control of Movement Conference (BANCOM), June 2016, Extended abstract. Poster presentation given by Fjeld (inproceedings)

[BibTex]

Deep Learning for Tactile Understanding From Visual and Haptic Data

Gao, Y., Hendricks, L. A., Kuchenbecker, K. J., Darrell, T.

In Proceedings of the IEEE International Conference on Robotics and Automation, pages: 536-543, May 2016, Oral presentation given by Gao (inproceedings)

[BibTex]

Robust Tactile Perception of Artificial Tumors Using Pairwise Comparisons of Sensor Array Readings

Hui, J. C. T., Block, A. E., Taylor, C. J., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 305-312, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Hui (inproceedings)

[BibTex]

Data-Driven Comparison of Four Cutaneous Displays for Pinching Palpation in Robotic Surgery

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 147-154, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Brown (inproceedings)

[BibTex]

One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]

Design and Implementation of a Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 350-352, April 2016, Work-in-progress paper. Poster presentation given by Burka (inproceedings)

[BibTex]

Objective assessment of robotic surgical skill using instrument contact vibrations

Gomez, E. D., Aggarwal, R., McMahan, W., Bark, K., Kuchenbecker, K. J.

Surgical Endoscopy, 30(4):1419-1431, April 2016 (article)

[BibTex]

Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Biomedical Engineering, 63(2):278-287, February 2016 (article)

[BibTex]

Psychophysical Power Optimization of Friction Modulation for Tactile Interfaces

Sednaoui, T., Vezzoli, E., Gueorguiev, D., Amberg, M., Chappaz, C., Lemaire-Semail, B.

In Haptics: Perception, Devices, Control, and Applications, pages: 354-362, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
Ultrasonic vibration and electrovibration can modulate the friction between a surface and a sliding finger. The power consumption of these devices is critical to their integration in modern mobile devices such as smartphones. This paper presents a simple control solution that reduces this power consumption by up to 68.8% by taking advantage of the limits of human perception.

[BibTex]

Peripheral vs. central determinants of vibrotactile adaptation

Klöcker, A., Gueorguiev, D., Thonnard, J. L., Mouraux, A.

Journal of Neurophysiology, 115(2):685-691, 2016, PMID: 26581868 (article)

Abstract
Long-lasting mechanical vibrations applied to the skin induce a reversible decrease in the perception of vibration at the stimulated skin site. This phenomenon of vibrotactile adaptation has been studied extensively, yet there is still no clear consensus on the mechanisms leading to vibrotactile adaptation. In particular, the respective contributions of 1) changes affecting mechanical skin impedance, 2) peripheral processes, and 3) central processes are largely unknown. Here we used direct electrical stimulation of nerve fibers to bypass mechanical transduction processes and thereby explore the possible contribution of central vs. peripheral processes to vibrotactile adaptation. Three experiments were conducted. In the first, adaptation was induced with mechanical vibration of the fingertip (51- or 251-Hz vibration delivered for 8 min, at 40× detection threshold). In the second, we attempted to induce adaptation with transcutaneous electrical stimulation of the median nerve (51- or 251-Hz constant-current pulses delivered for 8 min, at 1.5× detection threshold). Vibrotactile detection thresholds were measured before and after adaptation. Mechanical stimulation induced a clear increase of vibrotactile detection thresholds. In contrast, thresholds were unaffected by electrical stimulation. In the third experiment, we assessed the effect of mechanical adaptation on the detection thresholds to transcutaneous electrical nerve stimuli, measured before and after adaptation. Electrical detection thresholds were unaffected by the mechanical adaptation. Taken together, our results suggest that vibrotactile adaptation is predominantly the consequence of peripheral mechanoreceptor processes and/or changes in biomechanical properties of the skin.

link (url) DOI [BibTex]

Silent Expectations: Dynamic Causal Modeling of Cortical Prediction and Attention to Sounds That Weren’t

Chennu, S., Noreika, V., Gueorguiev, D., Shtyrov, Y., Bekinschtein, T. A., Henson, R.

Journal of Neuroscience, 36(32):8305-8316, Society for Neuroscience, 2016 (article)

Abstract
There is increasing evidence that human perception is realized by a hierarchy of neural processes in which predictions sent backward from higher levels result in prediction errors that are fed forward from lower levels, to update the current model of the environment. Moreover, the precision of prediction errors is thought to be modulated by attention. Much of this evidence comes from paradigms in which a stimulus differs from that predicted by the recent history of other stimuli (generating a so-called “mismatch response”). There is less evidence from situations where a prediction is not fulfilled by any sensory input (an “omission” response). This situation arguably provides a more direct measure of “top-down” predictions in the absence of confounding “bottom-up” input. We applied Dynamic Causal Modeling of evoked electromagnetic responses recorded by EEG and MEG to an auditory paradigm in which we factorially crossed the presence versus absence of “bottom-up” stimuli with the presence versus absence of “top-down” attention. Model comparison revealed that both mismatch and omission responses were mediated by increased forward and backward connections, differing primarily in the driving input. In both responses, modeling results suggested that the presence of attention selectively modulated backward “prediction” connections. Our results provide new model-driven evidence of the pure top-down prediction signal posited in theories of hierarchical perception, and highlight the role of attentional precision in strengthening this prediction.
SIGNIFICANCE STATEMENT: Human auditory perception is thought to be realized by a network of neurons that maintain a model of and predict future stimuli. Much of the evidence for this comes from experiments where a stimulus unexpectedly differs from previous ones, which generates a well-known “mismatch response.” But what happens when a stimulus is unexpectedly omitted altogether? By measuring the brain’s electromagnetic activity, we show that it also generates an “omission response” that is contingent on the presence of attention. We model these responses computationally, revealing that mismatch and omission responses only differ in the location of inputs into the same underlying neuronal network. In both cases, we show that attention selectively strengthens the brain’s prediction of the future.

link (url) DOI [BibTex]

Touch uses frictional cues to discriminate flat materials

Gueorguiev, D., Bochereau, S., Mouraux, A., Hayward, V., Thonnard, J.

Scientific Reports, 6:25553, Nature Publishing Group, 2016 (article)

[BibTex]

Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016 (misc)

[BibTex]

IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016 (misc)

[BibTex]

2015


Reducing Student Anonymity and Increasing Engagement

Kuchenbecker, K. J.

University of Pennsylvania Almanac, 62(18):8, November 2015 (article)

[BibTex]

Surgeons and Non-Surgeons Prefer Haptic Feedback of Instrument Vibrations During Robotic Surgery

Koehn, J. K., Kuchenbecker, K. J.

Surgical Endoscopy, 29(10):2970-2983, October 2015 (article)

[BibTex]


Displaying Sensed Tactile Cues with a Fingertip Haptic Device

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 8(4):384-396, October 2015 (article)

[BibTex]

Analysis of the Instrument Vibrations and Contact Forces Caused by an Expert Robotic Surgeon Doing FRS Tasks

Brown, J. D., O’Brien, C., Miyasaka, K., Dumon, K. R., Kuchenbecker, K. J.

In Proc. Hamlyn Symposium on Medical Robotics, pages: 75-76, London, England, June 2015, Poster presentation given by Brown (inproceedings)

[BibTex]
