Title: Interactive Surgical Robot Control Using Mixed Reality
Authors: Hsin-Yu Mai; Yi-Shiuan Li; Han-Pang Huang
Type: conference paper
Publication year: 2025
Date available: 2026-04-21
ISBN: 9798331596408
DOI: 10.1109/CACS67552.2025.11288244
Scopus ID: 2-s2.0-105031779916
Scopus record: https://www.scopus.com/record/display.uri?eid=2-s2.0-105031779916&origin=resultslist
Repository: https://scholars.lib.ntu.edu.tw/handle/123456789/737397

Abstract: This paper proposes an MR-based interactive control framework for surgical robotics, specifically designed for ophthalmic scenarios. Leveraging Microsoft HoloLens 2 and MRTK3, the system enables users to intuitively operate a mobile robotic platform and a 6-degree-of-freedom robotic arm via natural hand gestures, integrating forward/inverse kinematics, trajectory planning, and Unity-ROS real-time communication. Users can define 3D target points and designate movement paths within the MR environment using customized gestures. The system then calculates joint angles through inverse kinematics and transmits the commands to the physical robot via a communication module. By incorporating spatial mesh generation and QR code-based CAD model registration, precise alignment between virtual and physical components is achieved. Interactive gesture control is also implemented to facilitate real-time path designation and execution. The system supports both manual and autonomous control modes, with A* path planning and Damped Least Squares applied for navigation and end-effector positioning. Additional modules, including autonomous path planning, CAD model visualization, spatial understanding, and UI interactions, form a comprehensive and immersive surgical simulation and training platform.
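The abstract names Damped Least Squares (DLS) for end-effector positioning. As a minimal illustration only (the paper's actual 6-DOF implementation is not given here), the sketch below applies the standard DLS update, dq = J^T (J J^T + lambda^2 I)^{-1} e, to a hypothetical planar 2-link arm in NumPy; link lengths, damping factor, and iteration count are illustrative assumptions.

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    # Forward kinematics of a planar 2-link arm (illustrative, not the paper's robot).
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q, l1=1.0, l2=1.0):
    # Analytic Jacobian of the 2-link forward kinematics.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def dls_ik(target, q0, damping=0.05, iters=200):
    # Damped least squares update: dq = J^T (J J^T + lambda^2 I)^-1 e.
    # The damping term keeps the solve well-conditioned near singularities.
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        e = target - fk(q)
        J = jacobian(q)
        q += J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), e)
    return q

# Usage: solve for joint angles reaching a reachable Cartesian target.
target = np.array([1.2, 0.8])
q = dls_ik(target, q0=[0.3, 0.3])
```

In a full pipeline such as the one the abstract describes, the resulting joint angles would be streamed to the robot over a Unity-ROS bridge rather than computed offline as here.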