Neurosurgery demands exceptional precision and comprehensive preoperative planning to ensure optimal patient outcomes. Despite technological advances, there remains a need for intuitive, accessible tools that enhance surgical preparation and medical education in this field. Traditional methods often lack the immersive experience surgeons need to visualize complex procedures and critical neurovascular structures, while existing advanced solutions can be cost-prohibitive or require specialized hardware. This research presents a markerless, web-based augmented reality (AR) application designed to address these challenges in neurointerventional preoperative planning and education. Using MediaPipe for precise facial localization and segmentation and React Three Fiber for immersive 3D visualization, the application offers an intuitive platform for complex preoperative procedures. A virtual 2-RPS parallel positioner, the "Skull-Bot" model, is projected onto the user's face in real time, simulating surgical tool control with high precision. Key features include importing and auto-scaling head anatomy to the user's dimensions and real-time auto-tracking of head movements once aligned. The web-based design enables simultaneous access by multiple users, facilitating collaboration during surgery and allowing medical students to observe live procedures. A pilot study with three participants evaluated the application's auto-scaling and auto-tracking capabilities through a series of head rotation exercises. This work contributes a cost-effective, accessible, and collaborative tool for improving neurosurgical planning and education, with the potential for better surgical outcomes and more comprehensive training of medical professionals.
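The abstract describes the system only at a high level; as a rough illustration of how the two named libraries can be wired together, the TSX sketch below (file paths, component names, and option values are assumptions, not the authors' implementation) uses the head pose reported by a MediaPipe FaceLandmarker to drive a placeholder mesh inside a React Three Fiber scene on every rendered frame.

```tsx
// Hypothetical sketch, not the authors' code: a MediaPipe face pose drives a
// placeholder "Skull-Bot" mesh in a React Three Fiber scene.
import { useRef } from "react";
import { useFrame } from "@react-three/fiber";
import * as THREE from "three";
import { FaceLandmarker, FilesetResolver } from "@mediapipe/tasks-vision";

// Load the face landmarker once; the model asset path is an assumption.
export async function createLandmarker(): Promise<FaceLandmarker> {
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
  );
  return FaceLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: "/models/face_landmarker.task" },
    runningMode: "VIDEO",
    outputFacialTransformationMatrixes: true, // per-frame 4x4 head pose
  });
}

// Stand-in for the Skull-Bot model; mounted inside an R3F <Canvas>.
export function TrackedSkullBot({ video, landmarker }: {
  video: HTMLVideoElement;
  landmarker: FaceLandmarker;
}) {
  const mesh = useRef<THREE.Mesh>(null!);
  const pose = useRef(new THREE.Matrix4());

  // Re-detect on every rendered frame and copy the head pose onto the mesh.
  useFrame(() => {
    const result = landmarker.detectForVideo(video, performance.now());
    const m = result.facialTransformationMatrixes?.[0];
    if (m && mesh.current) {
      // MediaPipe returns a flat 16-element matrix; depending on the
      // row/column-major convention a transpose may be required here.
      pose.current.fromArray(m.data);
      pose.current.decompose(
        mesh.current.position,
        mesh.current.quaternion,
        mesh.current.scale
      );
    }
  });

  return (
    <mesh ref={mesh}>
      <boxGeometry args={[0.1, 0.1, 0.1]} />
      <meshStandardMaterial color="gray" />
    </mesh>
  );
}
```

In a setup like this, auto-scaling could be layered on by scaling the imported head anatomy against inter-landmark distances from the same FaceLandmarker result, though the paper's exact scaling procedure is not reproduced here.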
This work was supported in part by the Hong Kong Research Grants Council (RGC) General Research Fund (GRF) under Grant 14211420, in part by the RGC Collaborative Research Fund under Grant CRF C4026-21GF, in part by the RGC Research Impact Fund under Grant R4020-22, and in part by the Guangdong Basic and Applied Basic Research Foundation (GBABF) under Grant 2021B1515120035. (Corresponding author: Hongliang Ren.)
@inproceedings{ho2024web,
author = {Ho, Hon Lung and Wang, Yupeng and Wang, An and Bai, Long and Ren, Hongliang},
title = {Web-based Augmented Reality with Auto-Scaling and Real-Time Head Tracking towards Markerless Neurointerventional Preoperative Planning and Training of Head-mounted Robotic Needle Insertion},
booktitle = {{The IEEE International Conference on Robotics and Biomimetics (ROBIO)}},
year = {2024},
}