Researchers from the Electronics and Telecommunications Research Institute (ETRI) are developing tactile sensors that detect pressure regardless of the direction it’s applied.
ETRI is introducing a tactile sensor that fits into a robotic finger whose stiffness and shape are similar to those of a human finger. The robotic finger can flexibly handle everything from hard objects to deformable soft objects.
“This sensor technology advances the interaction between robots and humans by a significant step and lays the groundwork for robots to be more deeply integrated into our society and industries,” said Kim Hye-jin, a principal researcher at ETRI’s Intelligent Components and Sensors Research Section.
ETRI entered into a mutual cooperation agreement with Wonik Robotics to jointly develop the technology. Wonik will bring its expertise in robotic hands, including its experience developing the “Allegro Hand,” to the project. Wonik has previously supplied robotic hands to companies including Meta Platforms, Google, NVIDIA, Microsoft, and Boston Dynamics. The two organizations jointly exhibited related achievements at the Smart Factory & Automotive Industry Exhibition, held at COEX in Seoul.
Overcoming the technical limitations of pressure sensors
ETRI’s research team says its technology can overcome the technical limitations of pressure sensors applied to existing robotic fingers. Previously, these sensors could show distorted signals depending on the direction in which an object was grasped. The team said the new sensor is also highly rated for its performance and reliability.
The sensor can detect pressure from various directions, even in a three-dimensional finger form, while also providing the flexibility to handle objects as naturally as a human hand, according to the team. These abilities make up the core of the technology.
ETRI was able to advance this robotic finger technology by integrating omnidirectional pressure sensing with a flexible air-chamber tactile sensor, high-resolution signal-processing circuitry, and an intelligent algorithm capable of determining an object’s stiffness in real time.
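ETRI has not published the details of its stiffness algorithm, but the basic idea of judging stiffness in real time can be illustrated with a minimal sketch: treat stiffness as the slope of contact force versus indentation depth while the finger closes on an object. The function name, units, and simple linear (Hooke’s-law) model below are illustrative assumptions, not ETRI’s implementation.

```python
# Minimal sketch of real-time stiffness estimation from tactile readings.
# Hypothetical illustration only: the units and the simple linear
# (Hooke's-law) model are assumptions, not ETRI's published algorithm.
import numpy as np

def estimate_stiffness(indentation_mm, force_n) -> float:
    """Estimate contact stiffness (N/mm) as the least-squares slope of
    force versus indentation depth collected while the finger closes."""
    depth = np.asarray(indentation_mm, dtype=float)
    force = np.asarray(force_n, dtype=float)
    # Fit force ≈ k * depth + b; the slope k approximates object stiffness.
    k, _b = np.polyfit(depth, force, deg=1)
    return float(k)

# A soft object deflects a lot for little force (low k);
# a hard object barely deflects for the same force (high k).
soft = estimate_stiffness([0.5, 1.0, 1.5, 2.0], [0.1, 0.2, 0.3, 0.4])    # ~0.2 N/mm
hard = estimate_stiffness([0.05, 0.1, 0.15, 0.2], [1.0, 2.0, 3.0, 4.0])  # ~20 N/mm
print(f"soft: {soft:.1f} N/mm, hard: {hard:.1f} N/mm")
```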
Additionally, the team enhanced the sensor’s precision in pressure detection by including LEDs that change color with the applied pressure, providing intuitive feedback to users. The team took this a step further and also integrated vibration detection and wireless communication capabilities to strengthen communication between humans and robots.
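The article does not describe how pressure is mapped to color, but the feedback idea can be sketched as a simple gradient from light to firm contact. The pressure range and the green-to-red scale below are illustrative assumptions, not ETRI’s calibration.

```python
# Minimal sketch of mapping a pressure reading to an LED color for
# intuitive feedback. The range and gradient are illustrative assumptions.
def pressure_to_rgb(pressure_kpa: float, max_kpa: float = 100.0):
    """Map 0..max_kpa to a green (light touch) -> red (firm press) gradient."""
    ratio = max(0.0, min(pressure_kpa / max_kpa, 1.0))  # clamp to [0, 1]
    red = int(255 * ratio)
    green = int(255 * (1.0 - ratio))
    return (red, green, 0)

print(pressure_to_rgb(10))   # mostly green: light contact
print(pressure_to_rgb(90))   # mostly red: strong pressure
```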
Unlike traditional sensors, which are placed directly where pressure is applied, these tactile sensors are not directly exposed to the contact area. This allows for stable operation over long periods, even under continuous contact. The team says this improves the scalability of applications for robotic hands.
Looking ahead to the future
ETRI says the development of intelligent robotic hands that can adjust their grip strength according to the stiffness of objects will bring about innovation in ultra-precise object recognition. The team expects the commercialization timeline to begin in the latter half of 2024.
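A grip strategy of this kind can be pictured as a simple policy that presses harder on rigid objects and more gently on deformable ones. The thresholds and force values in the sketch below are illustrative assumptions only, not ETRI’s control scheme.

```python
# Minimal sketch of a grip-force policy that scales with estimated object
# stiffness: press harder on rigid objects, gently on deformable ones.
# Thresholds and force values are illustrative assumptions only.
def target_grip_force(stiffness_n_per_mm: float) -> float:
    """Return a target grip force (N) for an estimated contact stiffness."""
    if stiffness_n_per_mm < 1.0:      # very soft (e.g., a ripe fruit)
        return 0.5
    if stiffness_n_per_mm < 10.0:     # moderately deformable (e.g., a paper cup)
        return 2.0
    return 8.0                        # rigid (e.g., a metal tool)

for k in (0.2, 5.0, 50.0):
    print(f"stiffness {k:5.1f} N/mm -> grip {target_grip_force(k):.1f} N")
```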
Sensitive and robust robotic fingers could help robots perform more complex and delicate tasks in various fields, including in the manufacturing and service sectors. ETRI expects that, through tactile sensor technology, robots will be able to manipulate a wide range of objects more precisely and significantly improve interaction with humans.
In the future, the research team plans to develop an entire robotic hand with these tactile sensors. Additionally, the researchers aim to extend their development to a super-sensory hand that surpasses human sensory capabilities, incorporating pressure, temperature, humidity, light, and ultrasound sensors.
Through its collaboration with Wonik, the team has developed a robotic hand capable of recognizing objects through tactile sensors and flexibly controlling force. The research team plans to continue various studies to enable robots to handle objects and perceive the world through sensors as human hands do.