A-SLIP is a complete system for real-time slip estimation in robotic grasping. The system integrates piezoelectric microphones into a parallel-jaw gripper and uses a convolutional neural network to process synchronized multi-channel audio spectrograms, jointly estimating slip presence, direction, and magnitude in the grasp plane.
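As a rough illustration of the kind of input the network consumes, here is a minimal NumPy sketch that converts synchronized multi-channel audio into log-mel spectrograms. All parameters (sample rate, FFT size, hop, number of mel bands) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Assumed front-end parameters -- illustrative only, not from the paper.
SR = 16000        # sample rate (Hz)
N_FFT = 512       # STFT window length
HOP = 128         # hop length between frames
N_MELS = 40       # number of mel bands


def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)


def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)


def mel_filterbank(sr, n_fft, n_mels):
    # Triangular mel filters spanning 0 Hz to the Nyquist frequency.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):
            fb[i - 1, k] = (k - l) / max(c - l, 1)
        for k in range(c, r):
            fb[i - 1, k] = (r - k) / max(r - c, 1)
    return fb


def log_mel(audio, sr=SR, n_fft=N_FFT, hop=HOP, n_mels=N_MELS):
    """audio: (channels, samples) -> (channels, n_mels, frames)."""
    window = np.hanning(n_fft)
    fb = mel_filterbank(sr, n_fft, n_mels)
    n_frames = 1 + (audio.shape[1] - n_fft) // hop
    out = np.zeros((audio.shape[0], n_mels, n_frames))
    for ch in range(audio.shape[0]):
        for t in range(n_frames):
            seg = audio[ch, t * hop : t * hop + n_fft] * window
            power = np.abs(np.fft.rfft(seg)) ** 2
            out[ch, :, t] = np.log(fb @ power + 1e-10)
    return out


# Example: 4 hypothetical microphone channels, 0.5 s of audio.
rng = np.random.default_rng(0)
audio = rng.standard_normal((4, SR // 2))
feats = log_mel(audio)
print(feats.shape)  # (4, 40, 59)
```

In practice a library such as librosa would replace this hand-rolled filterbank; the sketch only shows the channel-by-channel spectrogram layout implied by the text.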
The A-SLIP sensor consists of piezoelectric microphones embedded behind a textured silicone contact pad in a parallel-jaw gripper. The textured surface promotes structured vibrations during slip, while the piezoelectric microphones capture broadband acoustic signals with a minimal footprint.
A-SLIP uses a lightweight convolutional network that processes synchronized multi-channel log-mel spectrograms from the piezoelectric microphones. The model employs channel and temporal attention mechanisms to jointly predict slip presence, direction, and magnitude through a unified multi-objective learning framework.
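To make the multi-objective readout concrete, here is a tiny NumPy stand-in for the channel-attention pooling and the three prediction heads (presence, direction, magnitude). The layer shapes, the attention form, and all names are illustrative assumptions, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(1)


def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SlipHeads:
    """Hypothetical sketch: channel attention + three task heads."""

    def __init__(self, feat_dim=32):
        scale = 0.1
        self.attn_w = rng.standard_normal(feat_dim) * scale        # channel scorer
        self.w_presence = rng.standard_normal(feat_dim) * scale    # slip / no slip
        self.w_direction = rng.standard_normal((feat_dim, 2)) * scale  # grasp-plane vector
        self.w_magnitude = rng.standard_normal(feat_dim) * scale   # slip speed

    def forward(self, feats):
        # feats: (n_channels, feat_dim), e.g. per-microphone CNN embeddings.
        weights = softmax(feats @ self.attn_w)     # channel-attention weights
        pooled = weights @ feats                   # attention-pooled feature
        presence = sigmoid(pooled @ self.w_presence)
        direction = pooled @ self.w_direction
        direction = direction / (np.linalg.norm(direction) + 1e-8)  # unit vector
        magnitude = np.maximum(pooled @ self.w_magnitude, 0.0)      # non-negative
        return presence, direction, magnitude


model = SlipHeads()
embeddings = rng.standard_normal((4, 32))   # 4 hypothetical microphone channels
p, d, m = model.forward(embeddings)
```

The unified multi-objective framework mentioned above would train all three heads jointly (e.g. a weighted sum of classification and regression losses); only the output structure is sketched here.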
@inproceedings{a_slip_2026,
  title={A-SLIP: Acoustic Sensing for Continuous In-hand Slip Estimation},
  author={Yoo*, Uksang and Mao*, Yuemin and Oh, Jean and Ichnowski, Jeffrey},
  year={2026},
  note={Under review. * indicates equal contribution}
}