**3.2 Dynamic fusion of digital twin model and real surgical scene**

The interactive scene observed by the surgeon during remote surgery is delayed, whereas the digital twin environment accurately reflects the real-time position and pose of the surgical instruments under the surgeon's control. Fusing the two provides complete remote-operation information, helping to ensure operational safety and avoid injury to intra-abdominal organs. Because both are time-varying scenes, a virtual-scene viewing coordinate system is first constructed from the endoscopic camera coordinate system. Artificial intelligence techniques such as deep learning and morphological image processing are then used to accurately segment the real instrument shaft, determine the control points and the scale relationship between the twin and real scenes, and inversely correct the coordinate-system deviation introduced by the camera and by parallax. Finally, virtual-real image fusion based on control-point registration accurately overlays the twin instrument model on the real endoscopic image.
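The segmentation step above can be sketched as follows. This is a minimal NumPy-only illustration, not the authors' implementation: it assumes a binary instrument-shaft mask has already been produced (e.g., by a CNN segmenter), cleans it with a morphological opening, and extracts the two extreme points along the shaft's principal axis as candidate control points. The function names and the endpoint heuristic are assumptions for illustration.

```python
import numpy as np

def binary_open(mask: np.ndarray, r: int = 1) -> np.ndarray:
    """Morphological opening (erosion, then dilation) with a (2r+1)^2 square
    structuring element, implemented with array shifts (NumPy only)."""
    def shifts(m: np.ndarray) -> np.ndarray:
        h, w = m.shape
        p = np.pad(m, r)  # zero-pad so border pixels erode correctly
        return np.stack([p[r + dy:r + dy + h, r + dx:r + dx + w]
                         for dy in range(-r, r + 1) for dx in range(-r, r + 1)])
    eroded = shifts(mask).all(axis=0)   # pixel survives only if full neighborhood is set
    return shifts(eroded).any(axis=0)   # dilation restores the eroded shape's extent

def shaft_control_points(mask: np.ndarray) -> np.ndarray:
    """Fit the principal axis of the mask pixels (PCA) and return the two
    extreme points along it, usable as registration control points (x, y)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centre = pts.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov((pts - centre).T))
    axis = vecs[:, -1]                  # eigenvector of the largest eigenvalue
    t = (pts - centre) @ axis           # 1-D coordinate along the shaft axis
    return np.stack([centre + t.min() * axis, centre + t.max() * axis])
```

The opening suppresses isolated false-positive pixels from the segmenter, and the PCA endpoints give stable, repeatable points on the shaft even when the mask boundary is noisy.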
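Determining the scale relationship and correcting the coordinate-system deviation from paired control points can be done with a closed-form similarity-transform fit. The sketch below uses the standard Umeyama (1991) least-squares solution, a common choice for this kind of control-point registration; the source does not specify the exact estimator, so treat this as an assumed stand-in. It recovers the scale s, rotation R, and translation t with dst ≈ s·R·src + t.

```python
import numpy as np

def similarity_transform(src: np.ndarray, dst: np.ndarray):
    """Closed-form least-squares similarity transform (Umeyama, 1991).
    src, dst: (N, d) arrays of paired control points (twin scene -> real image).
    Returns scale s, rotation R (d x d), translation t with dst ~ s * R @ src + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)        # cross-covariance of centred points
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against a reflection solution
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src  # optimal isotropic scale
    t = mu_d - s * R @ mu_s
    return s, R, t
```

Once s, R, t are estimated from the control points, every twin-model point can be mapped into the endoscopic image frame, which is what allows the virtual instrument to be drawn in register with the delayed real view.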
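The final fusion step, presenting the twin instrument model under the real endoscopic image, can be illustrated as a masked alpha-composite of the registered twin render over the endoscopic frame. This is a minimal sketch under assumed conventions (float images in [0, 1], a boolean render mask); the actual rendering pipeline is not specified in the source.

```python
import numpy as np

def fuse_twin_over_endoscope(endo: np.ndarray, twin_render: np.ndarray,
                             twin_mask: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Alpha-composite the registered twin-instrument render onto the real
    endoscopic frame. endo, twin_render: (H, W, 3) float images in [0, 1];
    twin_mask: (H, W) boolean mask of rendered instrument pixels."""
    out = endo.astype(float).copy()
    m = twin_mask[..., None]                      # broadcast mask over channels
    # Blend only where the twin instrument was rendered; keep the rest unchanged.
    return np.where(m, alpha * twin_render + (1.0 - alpha) * out, out)
```

Keeping alpha below 1 leaves the real tissue partially visible through the virtual instrument, so the surgeon can see both the delayed camera view and the real-time twin pose at once.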
