Abstract
Autonomy in robotic surgery is very challenging in unstructured environments, especially when interacting with deformable soft tissues. This creates a challenge for model-based control methods, which must account for deformation dynamics during tissue manipulation. Previous works in vision-based perception can capture the geometric changes within the scene; however, their integration with the dynamic properties needed for accurate and safe model-based controllers has not been considered before. Given the mechanical coupling between the robot and the environment, it is crucial to develop a registered, simulated dynamical model. In this work, we propose an online, continuous, real-to-sim registration method to bridge 3D visual perception with position-based dynamics (PBD) modeling of tissues. The PBD method is employed to simulate soft tissue dynamics as well as rigid tool interactions for model-based control. Meanwhile, a vision-based strategy is used to generate 3D reconstructed point cloud surfaces that can be used to register and update the simulation, accounting for differences between the simulation and the real world. To verify this real-to-sim approach, tissue manipulation experiments were conducted on the da Vinci Research Kit. Our real-to-sim approach successfully reduced registration errors online, which is especially important for safety during autonomous control. Moreover, the results show higher accuracy in occluded areas than fusion-based reconstruction.
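The abstract's core simulation component, position-based dynamics, advances particle positions and then iteratively projects geometric constraints instead of integrating forces. The sketch below is a minimal illustration of that idea (a single PBD step with distance constraints); the function name, parameters, and data layout are our own assumptions, not the paper's implementation.

```python
import numpy as np

def pbd_step(x, v, inv_mass, edges, rest_len, dt=0.01, iters=10, stiffness=1.0):
    """One position-based dynamics step (illustrative sketch).

    x        : (N, 3) particle positions
    v        : (N, 3) particle velocities
    inv_mass : (N,) inverse masses (0 pins a particle in place)
    edges    : list of (i, j) index pairs with distance constraints
    rest_len : rest length for each edge
    """
    # Predict positions from current velocities (no external forces here).
    p = x + dt * v
    # Gauss-Seidel projection of each distance constraint.
    for _ in range(iters):
        for (i, j), l0 in zip(edges, rest_len):
            d = p[i] - p[j]
            dist = np.linalg.norm(d)
            w = inv_mass[i] + inv_mass[j]
            if dist < 1e-9 or w == 0.0:
                continue  # degenerate edge or both particles pinned
            # Move particles along the edge to restore the rest length,
            # weighted by inverse mass.
            corr = stiffness * (dist - l0) / (dist * w) * d
            p[i] -= inv_mass[i] * corr
            p[j] += inv_mass[j] * corr
    # Derive velocities from the corrected positions.
    v_new = (p - x) / dt
    return p, v_new
```

In a real-to-sim loop like the one the abstract describes, the reconstructed point cloud would supply correction targets (e.g. as pinned particles or extra constraints) so the simulated tissue surface tracks the observed one; that registration step is beyond this sketch.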
URL
https://arxiv.org/abs/2011.00800