Humans' ability to navigate their environment depends heavily on the quality and speed of sensing. Tactile sensing, the sensation of touch and pressure, is especially important for both coarse and fine object manipulation tasks, such as handling parcels and writing. Humans use tactile data in real time for motor path planning, fast tactile perception, tactile object recognition, and overall mobility. Similarly, robots tasked with service activities that require interfacing with humans or objects need fast and reliable tactile feedback.
We aim to develop a new prototype of tactile sensor with integrated machine-learning capabilities. In our group, we are particularly interested in how this next-generation skin can be used to enhance physical human-robot collaboration.
Incrementally Learning Objects by Touch (2012)
Human beings not only possess the remarkable ability to distinguish objects through tactile feedback but are further able to improve upon recognition competence through experience. In this work, we explore tactile-based object recognition with learners capable of incremental learning.
We propose and compare two novel discriminative and generative tactile learners that produce probability distributions over objects during object grasping/palpation. To enable iterative improvement, our online methods incorporate training samples as they become available. We also describe incremental unsupervised learning mechanisms, based on novelty scores and extreme value theory, when teacher labels are not available.
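To make the online, incremental setting concrete, here is a minimal sketch of a generative classifier that incorporates labelled samples one at a time and outputs a probability distribution over object classes. It uses per-class running Gaussians over a single scalar feature (Welford's update); the class names and feature are hypothetical, and this is far simpler than the models in the paper, which are richer probabilistic learners.

```python
import math
from collections import defaultdict

class OnlineNaiveBayes:
    """Toy incremental generative classifier: maintains a running mean and
    variance per class (Welford's algorithm) over one tactile feature and
    returns a normalised probability distribution over object classes.
    Illustrative only; not the paper's actual models."""

    def __init__(self):
        self.n = defaultdict(int)       # samples seen per class
        self.mean = defaultdict(float)  # running mean per class
        self.m2 = defaultdict(float)    # running sum of squared deviations

    def update(self, label, x):
        # Incorporate one labelled training sample as it becomes available.
        self.n[label] += 1
        d = x - self.mean[label]
        self.mean[label] += d / self.n[label]
        self.m2[label] += d * (x - self.mean[label])

    def predict_proba(self, x):
        # Gaussian likelihood per class, weighted by empirical class priors.
        total = sum(self.n.values())
        scores = {}
        for c, n in self.n.items():
            var = max(self.m2[c] / n if n > 1 else 1.0, 1e-6)
            lik = math.exp(-(x - self.mean[c]) ** 2 / (2 * var)) \
                / math.sqrt(2 * math.pi * var)
            scores[c] = (n / total) * lik
        z = sum(scores.values()) or 1.0
        return {c: s / z for c, s in scores.items()}
```

Because each `update` is constant-time and touches only running statistics, the learner improves iteratively as grasps accumulate, without retraining from scratch.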
We present experimental results for both supervised and unsupervised learning tasks using the iCub humanoid, with tactile sensors on its five-fingered anthropomorphic hand, and 10 different object classes. Our classifiers perform comparably to state-of-the-art methods (C4.5 and SVM classifiers), and our findings indicate that tactile signals are highly relevant for making accurate object classifications. We also show that accurate “early” classifications are possible using only 20-30% of the grasp sequence. For unsupervised learning, our methods generate high-quality clusterings relative to the widely used sequential k-means and self-organising map (SOM) baselines, and we present analyses of the differences between the approaches.
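For reference, the sequential k-means baseline mentioned above can be sketched in a few lines: each incoming feature vector nudges its nearest centroid toward it, so clusters form from a data stream without storing past samples. This is the standard online variant used as a comparison point, not the paper's proposed unsupervised method; the learning-rate schedule shown (1/count per centroid) is one common choice.

```python
def sequential_kmeans(stream, centroids, lr=None):
    """Standard sequential (online) k-means. `stream` yields feature
    vectors; `centroids` is a list of initial centroid vectors, updated
    in place and returned. With lr=None, each centroid uses a decaying
    1/count learning rate; a fixed lr gives an exponential moving average."""
    counts = [1] * len(centroids)
    for x in stream:
        # Assign the sample to its nearest centroid (squared Euclidean).
        j = min(range(len(centroids)),
                key=lambda i: sum((c - a) ** 2
                                  for c, a in zip(centroids[i], x)))
        counts[j] += 1
        eta = lr if lr is not None else 1.0 / counts[j]
        # Move the winning centroid a step toward the sample.
        centroids[j] = [c + eta * (a - c) for c, a in zip(centroids[j], x)]
    return centroids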
Online Spatio-Temporal Gaussian Process Experts with Application to Tactile Classification, Harold Soh, Yanyu Su and Yiannis Demiris, IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 2012. (Cognitive Robotics Best Paper Award Finalist) [ PDF ]
Incrementally Learning Objects by Touch: Online Discriminative and Generative Models for Tactile-based Recognition, Harold Soh and Yiannis Demiris, IEEE Transactions on Haptics, 2014. [ Pre-print PDF | IEEE Link ]
Texture Identification by Touch (2019)
Empowering robots with the sense of touch may augment their understanding of the objects they interact with, and of the environment, beyond what standard sensory modalities (e.g., vision) provide. This work investigates the effect of hybridizing touch and sliding movements for tactile-based texture classification.
We develop three machine-learning methods within a framework to discriminate between surface textures; the first two methods use hand-engineered features, whilst the third leverages convolutional and recurrent neural network layers to learn feature representations from raw data.
To compare these methods, we constructed a dataset comprising tactile data from 23 textures gathered using the iCub platform under a loosely constrained setup, i.e., with nonlinear motion. In line with findings from neuroscience, our experiments show that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy on unseen test data.
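One simple way to picture "initial estimate from touch, refined by sliding" is naive Bayesian fusion: multiply the per-class probabilities produced from each modality and renormalise, under a conditional-independence assumption. This is an illustrative sketch only; the texture labels below are made up, and the paper's hybrid models learn the combination rather than applying a fixed fusion rule.

```python
def fuse_estimates(p_touch, p_slide):
    """Refine an initial touch-based class distribution with a sliding-based
    one by elementwise product and renormalisation (naive Bayesian fusion,
    assuming the two modalities are conditionally independent given the
    texture). Illustrative sketch, not the paper's learned architecture."""
    fused = {c: p_touch[c] * p_slide.get(c, 0.0) for c in p_touch}
    z = sum(fused.values()) or 1.0
    return {c: v / z for c, v in fused.items()}
```

For example, if touch alone weakly favours one texture but the sliding signal strongly favours another, the fused distribution shifts toward the sliding evidence while still reflecting the touch prior.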
Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach, Tasbolat Taunyazov, Hui Fang Koh, Yan Wu, Caixia Cai and Harold Soh, IEEE International Conference on Robotics and Automation (ICRA), 2019 [ PDF | Data@Github ]