Machine Learning in Robotics and Tactile Sensing


Robots can use RGB cameras and LiDAR to map environments, detect objects and estimate depth. However, vision alone isn’t enough. Cameras struggle in low light, objects can be occluded and small-scale surface details are hard to capture. When it comes to grasping fragile or irregular items, sight cannot replace touch. This is where tactile sensing becomes critical.
Traditional sensors each have limitations.
Tactile sensing can be implemented in several ways:
Vision-based (optical) sensors measure deformation of a flexible membrane using structured light or marker arrays.
Pros: highly information dense, relatively inexpensive.
Cons: bulky due to camera positioning.
Magnetic sensors detect changes in local magnetic fields caused by deformation.
Pros: small form factor, embeddable in thin membranes.
Cons: complex manufacturing, lower resolution.
Resistive and capacitive sensors measure changes in resistivity or capacitance under pressure.
Pros: capture complex surface dynamics.
Cons: low resolution and tuning challenges.
Each approach trades off resolution, scalability and practicality.
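As a rough illustration of what the resistive approach looks like in software, here is a minimal sketch that converts a grid of taxel resistance readings into a pressure map. The piezoresistive mapping, the calibration constants `r0` and `k`, and the taxel values are all hypothetical, not taken from any specific sensor:

```python
import numpy as np

def resistance_to_pressure(r, r0=10_000.0, k=5_000.0):
    """Map taxel resistance readings (ohms) to a rough pressure estimate.

    Assumes resistance drops as pressure rises (piezoresistive behaviour):
    pressure ~ k * (r0 / r - 1), clipped at zero. r0 (rest resistance)
    and k (gain) are hypothetical calibration constants.
    """
    r = np.asarray(r, dtype=float)
    return np.clip(k * (r0 / r - 1.0), 0.0, None)

# A 4x4 taxel grid: lower resistance where the object presses hardest.
taxels = np.array([
    [10_000, 10_000,  9_500, 10_000],
    [10_000,  8_000,  7_000,  9_500],
    [ 9_500,  7_000,  6_500,  9_000],
    [10_000,  9_500,  9_000, 10_000],
])
pressure_map = resistance_to_pressure(taxels)
```

Even this toy version shows the tuning challenge mentioned above: the output is only as meaningful as the per-taxel calibration behind it.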
One particularly interesting design is the Artificial Cilia Sensor.
This sensor uses:
The design enables:
This hybrid approach captures both surface detail and structural resistance, which is closer to how biological touch operates.
To test performance, a structured dataset was created:
Each sensor configuration produced:
This results in time-series tactile signals representing force dynamics during grasping.
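Time-series signals like these are typically sliced into fixed-length windows before being fed to a model. A minimal sketch of that preprocessing step, with a hypothetical 16-channel recording standing in for real sensor data:

```python
import numpy as np

def window_signals(signal, window=50, stride=25):
    """Slice a (timesteps, channels) tactile recording into
    overlapping fixed-length windows suitable for a classifier."""
    windows = [
        signal[start:start + window]
        for start in range(0, len(signal) - window + 1, stride)
    ]
    return np.stack(windows)  # shape: (n_windows, window, channels)

# Hypothetical recording: 200 timesteps from a 16-channel sensor.
recording = np.random.default_rng(0).normal(size=(200, 16))
samples = window_signals(recording)
print(samples.shape)  # (7, 50, 16)
```

The window and stride lengths are arbitrary here; in practice they would be tuned to the grasp duration and sampling rate.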
To interpret this high-dimensional signal data, two models were explored:
The goal: classify material and texture based purely on tactile data.
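The two models aren't detailed here, but the classification task itself can be sketched with a simple baseline. Below is a nearest-centroid classifier over flattened tactile features, using synthetic data for two hypothetical materials; it stands in for whatever models were actually used, purely to make the task concrete:

```python
import numpy as np

def fit_centroids(X, y):
    """Compute one mean feature vector (centroid) per class."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict(X, classes, centroids):
    """Assign each sample to the class with the nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

rng = np.random.default_rng(1)
# Two hypothetical materials with slightly different signal statistics.
X = np.vstack([rng.normal(0.0, 1.0, (40, 16)),
               rng.normal(1.5, 1.0, (40, 16))])
y = np.repeat([0, 1], 40)

classes, centroids = fit_centroids(X, y)
acc = (predict(X, classes, centroids) == y).mean()
```

A baseline like this is also useful as a sanity check: if a deep model can't beat it, the tactile features may not carry the signal you think they do.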
Performance was evaluated under two conditions:
Within-distribution: trained and tested on subsets of combined grasp data.
Cross-grasp generalisation: trained on one grasp set, tested on another.
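The difference between the two conditions comes down to how the data is split. A minimal sketch of both splits, with hypothetical data and grasp identifiers (the key point is that the cross-grasp split holds out entire grasps, not just random samples):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 32))          # flattened tactile features
y = rng.integers(0, 3, size=120)        # material/texture labels
grasp_id = np.repeat(np.arange(6), 20)  # which grasp produced each sample

# Condition 1: within-distribution -- shuffle all samples together.
perm = rng.permutation(len(X))
split = int(0.75 * len(X))
within_train, within_test = perm[:split], perm[split:]

# Condition 2: cross-grasp -- hold out entire grasps (here grasps 4 and 5),
# so no sample from a test grasp ever appears in training.
held_out = [4, 5]
cross_train = np.flatnonzero(~np.isin(grasp_id, held_out))
cross_test = np.flatnonzero(np.isin(grasp_id, held_out))

# No grasp appears on both sides of the cross-grasp split.
assert set(grasp_id[cross_train]).isdisjoint(grasp_id[cross_test])
```

The within-distribution split leaks grasp-specific dynamics into the test set, which is exactly why it looks easier than the cross-grasp condition.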
The performance drop highlights a core robotics challenge: generalising across unseen grasps and force patterns is significantly harder than learning within controlled distributions. This mirrors a broader ML reality: robustness under distribution shift remains difficult.
In practice, this plays out on the warehouse floor in highly dynamic environments. Amazon’s robotic systems operate in fabric storage pods where items of different shapes, weights and textures are densely packed together. Using 6-axis force-torque sensors in the end-effector, the robot doesn’t just “see” the item; it feels resistance as it makes contact. As the gripper closes, tactile feedback detects micro-slips, pressure distribution and unexpected deformation. If a grasp is unstable or infeasible, the system can adjust grip force in real time or escalate to a human operator in a collaborative (co-bot) setup.
Touch plays a critical role in preventing damage and improving planning through self-supervised updates from previous runs. This is not fully autonomous robotics replacing humans; it is collaborative, sensor-augmented systems.
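The adjust-or-escalate behaviour described above can be sketched as a toy control step. Everything here is hypothetical (the force units, thresholds and escalation rule are invented for illustration, not Amazon's actual controller):

```python
def adjust_grip(force, slip_detected, max_force=20.0, step=0.5):
    """Toy control step: tighten the grip when micro-slip is detected,
    escalate to a human operator once the force budget is exhausted.
    All values are hypothetical."""
    if not slip_detected:
        return force, "hold"
    if force + step > max_force:
        return force, "escalate"
    return force + step, "tighten"

# Simulated slip events arriving from the tactile stream.
force, action = 5.0, "hold"
for slip in [False, True, True, False, True]:
    force, action = adjust_grip(force, slip)
```

The escalation branch is the collaborative part: rather than forcing a grasp, the controller hands an infeasible pick to a person.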
Tactile sensing shifts robotics from rigid automation to adaptive manipulation.
But the research also surfaces deeper challenges.
It’s an exciting sector, and it will be great to see how it continues to develop.