A collaboration between NVIDIA and academic researchers is prepping robots for surgery. Researchers from the University of Toronto, UC Berkeley, ETH Zurich, Georgia Tech and NVIDIA developed ORBIT-Surgical, a simulation framework for training robots that could augment the skills of surgical teams while reducing surgeons’ cognitive load.
ORBIT-Surgical supports more than a dozen maneuvers inspired by the training curriculum for laparoscopic procedures, a.k.a. minimally invasive surgery. Examples include grasping small objects like needles, passing them from one arm to another, and placing them with high precision.
The researchers built the physics-based framework using NVIDIA Isaac Sim, a robotics simulation platform for designing, training and testing AI-based robots. They trained reinforcement learning and imitation learning algorithms on NVIDIA GPUs and used NVIDIA Omniverse, a platform for developing and deploying advanced 3D applications. The university and NVIDIA collaborators also used pipelines based on Universal Scene Description (OpenUSD) to enable photorealistic rendering.
The Intuitive Foundation, a nonprofit supported by robotic surgery leader Intuitive Surgical, provided the community-supported da Vinci Research Kit (dVRK). With it, the ORBIT-Surgical research team demonstrated how training a digital twin in simulation transfers to a physical robot in a lab environment in the video below.
The researchers presented ORBIT-Surgical this month at ICRA, the IEEE International Conference on Robotics and Automation in Yokohama, Japan. The open-source code package is now available on GitHub.
A stitch in AI saves nine with ORBIT-Surgical
ORBIT-Surgical is based on Isaac Orbit, a modular framework for robot learning built on Isaac Sim. Orbit supports a variety of libraries for reinforcement learning, where AI agents learn through trial and error, and imitation learning, where agents are trained to mimic ground-truth expert examples.
The surgical framework enables developers to train robots like the dVRK to manipulate both rigid and soft objects using reinforcement learning and imitation learning frameworks running on NVIDIA RTX GPUs.
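As a rough sketch of what the reinforcement learning side of that workflow can look like, the snippet below trains a PPO policy on a Gymnasium-registered task with Stable-Baselines3. The task ID is a hypothetical placeholder, and ORBIT-Surgical's actual environment wrappers and training scripts may differ.

```python
# Minimal sketch of a reinforcement learning workflow, assuming
# ORBIT-Surgical registers its benchmark tasks as Gymnasium
# environments. The task ID below is a hypothetical placeholder,
# not a documented ORBIT-Surgical name.
import gymnasium as gym
from stable_baselines3 import PPO

env = gym.make("Isaac-Lift-Needle-dVRK-v0")  # placeholder task ID

# Train a PPO policy on the GPU, then save it for later rollouts.
model = PPO("MlpPolicy", env, device="cuda", verbose=1)
model.learn(total_timesteps=1_000_000)
model.save("needle_lift_policy")
```

An imitation learning run would follow the same pattern, with the policy optimizer swapped for a trainer that fits the policy to recorded expert demonstrations.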
ORBIT-Surgical introduces more than a dozen benchmark tasks for surgical training, including one-handed tasks such as picking up a piece of gauze, inserting a shunt into a blood vessel (see video below) and lifting a suture needle to a specific position. It also includes two-handed tasks, such as handing a needle from one arm to the other, passing a threaded needle through a ring pole, and moving both arms to specific positions while avoiding obstacles.
By developing a surgical simulator that takes advantage of GPU acceleration and parallelization, the team said it was able to boost robot learning speed by an order of magnitude compared to existing surgical frameworks. The researchers found that the robot’s digital twin could be trained to complete tasks like inserting a shunt and lifting a suture needle in under two hours on a single NVIDIA RTX GPU.
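To get a feel for why parallelization helps, the sketch below steps a batch of environment copies together, so each simulation step yields many training transitions at once. It is an illustration only: it uses Gymnasium's CPU-side vector API and the same hypothetical task ID as above, whereas ORBIT-Surgical batches simulations directly on the GPU.

```python
# Illustration of environment parallelization. ORBIT-Surgical runs
# many simulations in parallel on the GPU; this sketch approximates
# the idea with Gymnasium's vector API and a placeholder task ID.
import gymnasium as gym

NUM_ENVS = 64  # illustrative; GPU-parallel simulators scale to thousands

envs = gym.vector.SyncVectorEnv(
    [lambda: gym.make("Isaac-Lift-Needle-dVRK-v0") for _ in range(NUM_ENVS)]
)

obs, infos = envs.reset(seed=0)
for _ in range(1_000):
    # Random actions stand in for a learned policy; every step yields
    # NUM_ENVS transitions, which is where the throughput gain comes from.
    actions = envs.action_space.sample()
    obs, rewards, terminations, truncations, infos = envs.step(actions)
envs.close()
```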
With the visual realism enabled by rendering in Omniverse, ORBIT-Surgical also allowed researchers to generate high-fidelity synthetic data, which could help train AI models for perception tasks such as segmenting surgical tools in real-world videos captured in the operating room.
A proof of concept showed that combining simulation and real-world data significantly improved the accuracy of an AI model at segmenting surgical needles in images, which the team said could help reduce the need for large, expensive real-world datasets for training such models.
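A minimal sketch of that mixed-data idea is below, using PyTorch and an off-the-shelf segmentation network. The random tensors stand in for Omniverse renders and labeled operating-room frames; none of this is taken from ORBIT-Surgical's released code.

```python
# Sketch of training a needle-segmentation model on a large synthetic
# set combined with a smaller real set. Random tensors stand in for
# actual (image, mask) pairs.
import torch
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset
import torchvision

def fake_split(n):
    images = torch.rand(n, 3, 256, 256)          # RGB frames
    masks = torch.randint(0, 2, (n, 256, 256))   # 0 = background, 1 = needle
    return TensorDataset(images, masks)

synthetic = fake_split(512)   # cheap to generate in simulation
real = fake_split(64)         # expensive to capture and label
loader = DataLoader(ConcatDataset([synthetic, real]), batch_size=8, shuffle=True)

# Any off-the-shelf segmentation network illustrates the point.
model = torchvision.models.segmentation.deeplabv3_resnet50(
    weights=None, weights_backbone=None, num_classes=2
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

model.train()
for images, masks in loader:
    optimizer.zero_grad()
    logits = model(images)["out"]         # (B, 2, H, W) per-pixel class logits
    loss = loss_fn(logits, masks.long())  # masks: (B, H, W) class indices
    loss.backward()
    optimizer.step()
```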
Read the paper behind ORBIT-Surgical, and learn more about NVIDIA-authored papers at ICRA.
About the author
Isha Salian writes on deep learning, science, and healthcare, among other topics, as part of NVIDIA’s corporate communications team. She first joined the company as an intern in summer 2015. Salian has a journalism M.A., as well as undergraduate degrees in communication and English, from Stanford University.
Editor’s note: This article was syndicated from NVIDIA’s blog.