The new Bi-Touch system, designed by scientists at the University of Bristol and based at the Bristol Robotics Laboratory, allows robots to carry out manual tasks by sensing what to do from a digital helper.
The findings show how an AI agent interprets its environment through tactile and proprioceptive feedback and then controls the robot's behaviours, enabling precise sensing, gentle interaction, and effective object manipulation to accomplish robotic tasks.
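The paper's title indicates the agent is trained with sim-to-real deep reinforcement learning. As a rough illustration of the observation-to-action flow described above, the sketch below shows a toy policy that fuses per-arm tactile and proprioceptive inputs into bimanual commands. All dimensions, names, and the linear policy itself are assumptions for illustration, not details from the Bi-Touch paper.

```python
import numpy as np

# Illustrative sketch only: dimensions and the linear policy are
# hypothetical, not taken from the Bi-Touch system.
TACTILE_DIM = 16   # flattened tactile features per arm (assumed)
PROPRIO_DIM = 7    # joint positions per arm (assumed)
ACTION_DIM = 6     # end-effector command per arm (assumed)

rng = np.random.default_rng(0)

class BimanualPolicy:
    """Toy policy mapping both arms' tactile + proprioceptive
    observations to bounded per-arm action commands."""

    def __init__(self):
        obs_dim = 2 * (TACTILE_DIM + PROPRIO_DIM)  # both arms
        self.W = rng.normal(scale=0.1, size=(2 * ACTION_DIM, obs_dim))

    def act(self, tactile_l, proprio_l, tactile_r, proprio_r):
        # Concatenate tactile and proprioceptive feedback from each arm.
        obs = np.concatenate([tactile_l, proprio_l, tactile_r, proprio_r])
        # tanh keeps commands in [-1, 1] for gentle, bounded interaction.
        action = np.tanh(self.W @ obs)
        return action[:ACTION_DIM], action[ACTION_DIM:]

policy = BimanualPolicy()
left_cmd, right_cmd = policy.act(
    rng.normal(size=TACTILE_DIM), rng.normal(size=PROPRIO_DIM),
    rng.normal(size=TACTILE_DIM), rng.normal(size=PROPRIO_DIM),
)
```

In a real sim-to-real pipeline the linear map would be a trained neural network and the observations would come from simulated, then physical, tactile sensors.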
This development could transform industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs.
Paper: ‘Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning’ by Yijiong Lin, Nathan Lepora et al. in IEEE Robotics and Automation Letters.