Our ability to navigate the environment depends heavily on the quality and speed of our sensing. Tactile sensing, the sensation of touch and pressure, is especially important for both coarse and fine object manipulation tasks, such as handling parcels and writing. Humans use tactile information in real time for motor planning, rapid tactile perception, object recognition by touch, and overall mobility. Similarly, robots tasked with service activities that involve interacting with humans or objects need fast and reliable tactile feedback.

In our lab, we are particularly interested in how perception with next-generation skin can be used to enhance physical human-robot interaction for trustworthy collaboration.

Event-Driven Visual-Tactile Perception for Robots (2020)

In this work, we take crucial steps towards efficient visual-tactile perception for robotic systems. We draw inspiration from biological systems, which are asynchronous and event-driven. In contrast to resource-hungry deep learning methods, event-driven perception offers an alternative that promises power efficiency and low latency, features that are ideal for real-time mobile robots. More details can be found in the paper below.
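To make the event-driven idea concrete, below is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit of spiking networks, consuming an asynchronous stream of events in Python. The constants and function name are illustrative assumptions, not the model used in the paper; the point is that computation happens only when events arrive.

import numpy as np

# Minimal sketch of event-driven processing with a leaky integrate-and-fire
# (LIF) neuron. Between events the membrane potential only decays, so work
# is done solely when an event arrives, which is the source of the
# power-efficiency and low-latency properties mentioned above.
def lif_forward(event_times, tau=20.0, v_thresh=1.0, w=0.4):
    """Integrate input event (spike) times; return output spike times."""
    v, t_prev, out_spikes = 0.0, 0.0, []
    for t in sorted(event_times):
        v *= np.exp(-(t - t_prev) / tau)  # analytic decay since last event
        v += w                            # instantaneous jump per input event
        if v >= v_thresh:                 # threshold crossing: emit a spike
            out_spikes.append(t)
            v = 0.0                       # reset membrane after spiking
        t_prev = t
    return out_spikes

# A dense burst of events drives the neuron over threshold; sparse events do not.
print(lif_forward([1.0, 2.0, 3.0, 50.0, 90.0]))  # -> [3.0]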

Further Reading:

Event-Driven Visual-Tactile Sensing and Learning for Robots
Tasbolat Taunyazov, Weicong Sng, Hian Hian See, Brian Lim, Jethro Kuan, Abdul Fatir Ansari, Benjamin Tee, and Harold Soh
Robotics: Science and Systems Conference (RSS), 2020
[ PDF | Paper Site ]

BibTeX:

@inproceedings{taunyazov2020event,
    title     = {Event-Driven Visual-Tactile Sensing and Learning for Robots},
    author    = {Tasbolat Taunyazov and Weicong Sng and Hian Hian See and Brian Lim and Jethro Kuan and Abdul Fatir Ansari and Benjamin Tee and Harold Soh},
    booktitle = {Proceedings of Robotics: Science and Systems},
    year      = {2020},
    month     = {July}}

Texture Identification by Touch (2019)

Endowing robots with the sense of touch may augment their understanding of the objects and environments they interact with, beyond what standard sensory modalities (e.g., vision) provide. This work investigates the effect of hybridizing touch and sliding movements for tactile-based texture classification.

Tactile sensor activations while sliding the iCub forearm across a textured surface.

We develop three machine-learning methods within a framework to discriminate between surface textures; the first two methods use hand-engineered features, whilst the third leverages convolutional and recurrent neural network layers to learn feature representations from raw data.
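As an illustration of the third method's structure, here is a hedged PyTorch sketch: 1-D convolutions extract per-timestep features from raw taxel readings, and a recurrent layer aggregates them over the sliding motion. The 23-class output follows the text; the taxel count, kernel sizes, and hidden dimensions are assumptions for illustration only.

import torch
import torch.nn as nn

class ConvRNNTextureNet(nn.Module):
    """Illustrative conv + recurrent classifier for raw tactile sequences."""
    def __init__(self, n_taxels=60, n_classes=23, hidden=64):
        super().__init__()
        # 1-D convolutions over time, treating taxels as input channels.
        self.conv = nn.Sequential(
            nn.Conv1d(n_taxels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.gru = nn.GRU(32, hidden, batch_first=True)  # aggregate over time
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, time, taxels)
        h = self.conv(x.transpose(1, 2))         # -> (batch, 32, time)
        _, h_last = self.gru(h.transpose(1, 2))  # final hidden state
        return self.head(h_last[-1])             # -> (batch, n_classes) logits

logits = ConvRNNTextureNet()(torch.randn(4, 100, 60))
print(logits.shape)  # torch.Size([4, 23])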

To compare these methods, we constructed a dataset comprising tactile data from 23 textures gathered using the iCub platform under a loosely constrained setup, i.e., with nonlinear motion. In line with findings from neuroscience, our experiments show that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy over unseen test data.
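One simple way to realize this touch-then-slide refinement is to treat each stage's classifier output as independent evidence and combine the two in log space. The sketch below is a generic Bayesian fusion scheme under that independence assumption, not necessarily the exact method used in the paper.

import numpy as np

def fuse_posteriors(p_touch, p_slide):
    """Refine an initial touch-based class distribution with sliding evidence."""
    log_p = np.log(p_touch + 1e-12) + np.log(p_slide + 1e-12)
    p = np.exp(log_p - log_p.max())  # subtract max for numerical stability
    return p / p.sum()

p_touch = np.array([0.5, 0.3, 0.2])  # initial estimate from static touch
p_slide = np.array([0.6, 0.1, 0.3])  # evidence gathered while sliding
print(fuse_posteriors(p_touch, p_slide))  # the leading class becomes more certain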

Further Reading:

Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach, Tasbolat Taunyazov, Hui Fang Koh, Yan Wu, Caixia Cai and Harold Soh, IEEE International Conference on Robotics and Automation (ICRA), 2019 [ PDF | Data@Github ]

Incrementally Learning Objects by Touch (2012)

Human beings not only possess the remarkable ability to distinguish objects through tactile feedback, but can also improve their recognition competence through experience. In this work, we explore tactile-based object recognition with learners capable of incremental learning.

We propose and compare two novel tactile learners, one discriminative and one generative, that produce probability distributions over objects during object grasping/palpation. To enable iterative improvement, our online methods incorporate training samples as they become available. We also describe incremental unsupervised learning mechanisms, based on novelty scores and extreme value theory, for when teacher labels are not available.
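The sketch below illustrates the incremental loop in simplified form: an online generative learner keeps running per-class statistics, updates them one sample at a time, and spawns a new class when no existing class explains a sample well. Our actual models are online spatio-temporal Gaussian process experts with extreme-value-theory novelty scores; this diagonal-Gaussian stand-in, with made-up names and thresholds, only illustrates the update/novelty mechanics.

import numpy as np

class IncrementalTactileLearner:
    """Toy online generative classifier with novelty-based class creation."""
    def __init__(self, dim, novelty_thresh=-50.0):
        self.dim, self.thresh = dim, novelty_thresh
        self.counts, self.means, self.m2 = [], [], []  # per-class running stats

    def _log_lik(self, x, k):
        # Unnormalized diagonal-Gaussian log-likelihood of x under class k.
        var = self.m2[k] / max(self.counts[k] - 1, 1) + 1e-3
        return -0.5 * np.sum((x - self.means[k]) ** 2 / var + np.log(var))

    def predict(self, x):
        """Posterior over known classes, assuming a uniform prior."""
        ll = np.array([self._log_lik(x, k) for k in range(len(self.counts))])
        p = np.exp(ll - ll.max())
        return p / p.sum()

    def update(self, x, label=None):
        """Incorporate one sample; without a teacher label, spawn a new
        class when the best log-likelihood falls below the novelty threshold."""
        if label is None:
            lls = [self._log_lik(x, k) for k in range(len(self.counts))]
            if lls and max(lls) > self.thresh:
                label = int(np.argmax(lls))       # assign to best known class
            else:
                label = len(self.counts)          # novel object: open a class
        if label == len(self.counts):             # lazily create the class
            self.counts.append(0)
            self.means.append(np.zeros(self.dim))
            self.m2.append(np.zeros(self.dim))
        # Welford's online update of the class mean and variance.
        self.counts[label] += 1
        delta = x - self.means[label]
        self.means[label] += delta / self.counts[label]
        self.m2[label] += delta * (x - self.means[label])
        return label

learner = IncrementalTactileLearner(dim=3)
learner.update(np.zeros(3), label=0)         # supervised sample for class 0
print(learner.update(np.ones(3) * 10.0))     # unlabeled and far away -> class 1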

We present experimental results for both supervised and unsupervised learning tasks using the iCub humanoid, with tactile sensors on its five-fingered anthropomorphic hand, and 10 different object classes. Our classifiers perform comparably to state-of-the-art methods (C4.5 and SVM classifiers), and our findings indicate that tactile signals are highly relevant for making accurate object classifications. We also show that accurate “early” classifications are possible using only 20-30% of the grasp sequence. For unsupervised learning, our methods generate high-quality clusterings relative to the widely-used sequential k-means and self-organising map (SOM) baselines, and we present analyses of the differences between the approaches.
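For reference, the sequential k-means baseline mentioned above can be written in a few lines: each incoming sample moves its nearest centroid by a decaying step, so clusters are refined online as data streams in. The cluster count and initialization here are illustrative, not the settings used in our experiments.

import numpy as np

def sequential_kmeans(stream, k, dim, seed=0):
    """Online k-means: update the nearest centroid as each sample arrives."""
    rng = np.random.default_rng(seed)
    centroids = rng.normal(size=(k, dim))
    counts = np.zeros(k)
    for x in stream:
        j = int(np.argmin(np.linalg.norm(centroids - x, axis=1)))  # nearest centroid
        counts[j] += 1
        centroids[j] += (x - centroids[j]) / counts[j]  # running mean of assigned samples
    return centroids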

Further Reading:

Online Spatio-Temporal Gaussian Process Experts with Application to Tactile Classification, Harold Soh, Yanyu Su and Yiannis Demiris, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, Portugal, 2012. (Cognitive Robotics Best Paper Award Finalist) [ PDF ]

Incrementally Learning Objects by Touch: Online Discriminative and Generative Models for Tactile-Based Recognition, Harold Soh and Yiannis Demiris, IEEE Transactions on Haptics, 2014. [ Pre-print PDF | IEEE Link ]