Wednesday, September 18, 2024

Combining existing sensors with machine learning algorithms improves robots' intrinsic sense of touch

The intrinsic sense of touch. Writing or drawing on the robot's structure (A) is automatically interpreted using convolutional neural networks (B). Accurate reconstruction of the physical interaction is achieved through nonlinear dimensionality reduction with machine-learning techniques (C). The resulting intrinsic sense of touch offers various interaction modalities without the need for any explicit tactile sensors (D). Credit: Maged Iskandar

A team of roboticists at the German Aerospace Center's Institute of Robotics and Mechatronics finds that combining traditional internal force-torque sensors with machine-learning algorithms can give robots a new way to sense touch.

In their study, published in the journal Science Robotics, the group took an entirely new approach to giving robots a sense of touch, one that does not involve artificial skin.

For living creatures, touch is a two-way street; when you touch something, you feel its texture, temperature and other features. But you can also be touched, as when someone or something else comes in contact with a part of your body. In this new study, the research team found a way to emulate the latter type of touch in a robot by combining internal force-torque sensors with a machine-learning algorithm. 

Multi-point sensing and touch recognition during dynamic motion. Credit: Maged Iskandar

Recognizing that much of the sense of being touched arises from torque (the tension felt in the wrist, for example, when pressure is applied to the fingers), the researchers placed extra-sensitive force-torque sensors in the joints of a robot arm. The sensors detect pressure applied to the arm from multiple directions at once.
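To illustrate the general idea (not the study's actual method), the minimal Python sketch below infers an external contact from joint force-torque readings by comparing them with the torques the robot expects from its own dynamics and mapping the residual through the arm's Jacobian. The function names, the gravity-only dynamics model, and the single-contact assumption are placeholders for illustration.

```python
import numpy as np

def external_joint_torque(tau_measured, q, gravity_model):
    """Residual between measured joint torques and the torques predicted
    by the robot's own model (reduced here to a static gravity term)."""
    return tau_measured - gravity_model(q)

def contact_wrench_estimate(tau_ext, jacobian):
    """Map the joint-torque residual to a Cartesian force/torque (wrench)
    at an assumed contact frame using tau_ext = J^T * wrench, solved with
    a pseudo-inverse of the transposed Jacobian."""
    return np.linalg.pinv(jacobian.T) @ tau_ext

# Example with made-up numbers: a 7-joint arm and a 6x7 contact Jacobian.
tau_ext = external_joint_torque(np.zeros(7), np.zeros(7), lambda q: np.zeros(7))
wrench = contact_wrench_estimate(tau_ext, np.random.randn(6, 7))
```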

They then used a machine-learning application to teach the robot how to interpret various patterns of tension, allowing it to recognize different touch scenarios. The robot could tell, for example, when it was being touched at a particular place along its arm. The approach also did away with the need to cover the entire robot with an artificial sensing skin.
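The figure captions mention convolutional neural networks for interpreting the signals; the sketch below shows roughly what such a classifier could look like when applied to short windows of joint-torque data. The layer sizes, the number of joints, and the number of touch classes are arbitrary placeholders, not values reported by the researchers.

```python
import torch
import torch.nn as nn

class TouchClassifier(nn.Module):
    """Illustrative 1-D CNN mapping a window of joint-torque signals
    (channels = joints, length = time steps) to a touch class, such as
    which segment of the arm was pressed."""
    def __init__(self, n_joints=7, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_joints, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, torque_window):
        return self.net(torque_window)

# Example: a batch of 8 windows, 7 joints, 200 time steps each.
logits = TouchClassifier()(torch.randn(8, 7, 200))
```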

Touch recognition. The interpretation of digits written on the robot's surface as machine-readable commands is used to intuitively direct the robot (A). When the touch trajectory for writing the digit one is applied, the trajectory is recognized and the assigned task is executed; applying the digit three likewise triggers the corresponding task. Virtual function buttons can similarly be placed anywhere on the structure to assign high-level tasks (B). Credit: Maged Iskandar

The researchers found that the AI application made the arm so sensitive that it could identify which of the numbers painted on its arm was being pressed, and, in another test, recognize numbers drawn on its arm by a person's fingertip.
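As a hypothetical illustration of how such recognized gestures might be wired to commands, the snippet below maps a recognized digit to a high-level task and ignores low-confidence detections. The robot interface methods and the confidence threshold are invented for the example.

```python
# Hypothetical mapping from a digit drawn on the arm to a high-level task.
# The robot methods (move_to, open_gripper) are invented for illustration.
TASKS = {
    1: lambda robot: robot.move_to("home"),
    3: lambda robot: robot.open_gripper(),
}

def on_digit_recognized(robot, digit, confidence, threshold=0.9):
    """Run the task assigned to a recognized digit, skipping
    low-confidence recognitions so incidental contact does not
    trigger commands."""
    if confidence >= threshold and digit in TASKS:
        TASKS[digit](robot)
```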

This approach could open up new ways to interact with many types of robots, particularly those used in industrial environments, working closely alongside people.

by Bob Yirka, Tech Xplore

Source: Combining existing sensors with machine learning algorithms improves robots' intrinsic sense of touch (techxplore.com)
