21 September, 2025 11:26 AM IST | Mumbai | A Correspondent
Pic/tohoku.ac.jp
Grabbing a cup of coffee may seem simple for humans, but replicating the seamless coordination of sight and touch in robots has long been a challenge. Now, an international research team has unveiled a breakthrough system that integrates visual and tactile information, enabling robotic arms to adapt more naturally to their environment.
Dubbed TactileAloha, the method builds on Stanford University's open-source ALOHA dual-arm robot platform, which traditionally relied on vision alone. By adding tactile sensing, the researchers found the robots could better distinguish textures and object orientations in tasks such as aligning Velcro strips.
The approach, powered by a vision-tactile transformer, significantly improved task success rates compared with conventional vision-only systems, according to findings published on July 2, 2025, in IEEE Robotics and Automation Letters.
Video link: https://youtu.be/tOuNN-fTDo8