Key Points
- AI-driven robots trained through video observation can now perform surgical tasks.
- Instead of programming each movement, the robot learns by watching hundreds of videos.
- The robot successfully handled needle manipulation, tissue lifting, and suturing.
- The AI model autonomously corrected errors, like retrieving a dropped needle.
Researchers at Johns Hopkins University and Stanford University have developed a surgical robot capable of performing tasks with human-like precision simply by watching recorded videos of real surgeries. The team's work uses the da Vinci Surgical System, a widely used robotic platform for minimally invasive surgery, to demonstrate how AI can acquire skills through imitation learning.
Traditionally, robotic surgery requires painstaking programming of every movement. The researchers bypassed this approach by employing imitation learning, which allows the robot to observe and replicate human actions. Hundreds of videos captured from wrist-mounted cameras demonstrated key tasks such as needle manipulation, tissue lifting, and suturing. From this data, the AI learned to interpret human hand movements as kinematic patterns, converting them into mathematical instructions the robot could execute.
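The core idea of imitation learning is that a policy maps observations to the actions a human demonstrator took in similar situations. The team's actual model learns from raw video; as a toy illustration only, the sketch below uses a nearest-neighbor policy over hypothetical (observation, action) pairs, where observations are simplified 2-D gripper positions and action labels are invented for this example.

```python
# Toy sketch of imitation learning as a nearest-neighbor lookup over
# recorded demonstrations. This is NOT the researchers' model, which
# learns from wrist-camera video; the data below is hypothetical.

def nearest_neighbor_policy(demonstrations):
    """Build a policy that imitates the closest recorded demonstration.

    demonstrations: list of (observation, action) pairs, where an
    observation is a tuple of floats and an action is any label.
    """
    def policy(observation):
        def sq_dist(pair):
            obs, _ = pair
            return sum((a - b) ** 2 for a, b in zip(obs, observation))
        # Imitate whatever the demonstrator did in the nearest state.
        _, action = min(demonstrations, key=sq_dist)
        return action
    return policy

# Hypothetical demonstrations: gripper position -> demonstrated action.
demos = [
    ((0.0, 0.0), "move_to_needle"),
    ((1.0, 0.2), "grasp_needle"),
    ((1.1, 1.0), "lift_tissue"),
]

policy = nearest_neighbor_policy(demos)
print(policy((0.9, 0.3)))  # nearest demo is (1.0, 0.2) -> "grasp_needle"
```

In practice the lookup is replaced by a neural network trained on the video data, which generalizes between demonstrated states rather than copying the single closest one, but the input-to-action structure is the same.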
The resulting AI model is remarkably capable, even autonomously correcting errors such as retrieving a dropped needle, a behavior it was never explicitly taught. According to Axel Krieger, a Johns Hopkins assistant professor and lead researcher, this method marks a "significant step forward in medical robotics," as the AI could predict the necessary robotic movements from visual input alone.
While robots performing surgeries might seem unsettling, their precision could sometimes surpass human capabilities, reducing medical errors and enhancing safety. This advancement in robotic surgery also allows human surgeons to concentrate on complex procedures requiring adaptability and decision-making skills beyond current AI capabilities.
The team's next goal is to train the AI to perform complete surgeries autonomously. The research aligns with a broader trend in healthcare robotics: companies such as Perceptive are also exploring AI-guided robots for procedures, as demonstrated in a recent dental operation.