This study aims to develop and test a computer vision-based method for tracking and navigating a bronchoscope during lung procedures such as biopsy or ablation. A vision-based algorithm was developed to track bronchoscope rotation and identify airway branches for navigation, and was tested on a phantom and a preclinical swine subject. The 3D airway tree was segmented from pre-procedural cone-beam CT, subdivided into segments, and its 3D coordinates were stored via centerline extraction. Feature-based rotational tracking was computed using SURF features and a brute-force matcher. Bifurcation detection was accomplished by image processing and blob detection. The bronchoscope was localized within the airway tree by relating the projection of the child branches, relative to the parent branch, to the 3D image. A sufficient number of features to identify the rotational position of the bronchoscope was found in 720 of 811 (89%) video frames, with an error of 3.2±2.2 degrees. Airway bifurcations were correctly identified in 29 of 31 (94%) cases, and the bronchoscope was correctly localized within a segment in seven of seven (100%) cases. In conclusion, a computer vision-based method for tracking in the airways accurately identified the rotation of a bronchoscope and classified bifurcations to assist in navigation without electromagnetic, position-detection, or fiber-optic shape-sensing technologies. Implementation of this technology could enable cost-controlled adoption of bronchoscopic techniques by trainees and might be utilized in low-resource settings unequipped with expensive robotic and tracking systems for the diagnosis and management of suspected lung cancer.
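To make the feature-based rotational tracking step concrete, below is a minimal Python/OpenCV sketch of estimating roll between a reference frame and the current video frame with SURF keypoints and a brute-force matcher. The thresholds, the RANSAC similarity fit, and the function name are illustrative assumptions, not the authors' implementation (SURF requires the opencv-contrib package).

```python
# Minimal sketch: estimate bronchoscope roll between a reference frame and the
# current video frame via SURF features + brute-force matching (OpenCV).
# Assumes opencv-contrib-python is installed (SURF lives in cv2.xfeatures2d).
import cv2
import numpy as np

def estimate_rotation_deg(ref_gray, cur_gray, min_matches=10):
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(ref_gray, None)
    kp2, des2 = surf.detectAndCompute(cur_gray, None)
    if des1 is None or des2 is None:
        return None  # not enough texture in one of the frames
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None  # too few features to trust a rotation estimate
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Robustly fit a similarity transform (rotation + translation + scale).
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    return float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))  # roll angle
```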
Purpose: An augmented reality (AR) system was developed to facilitate free-hand real-time needle guidance for transperineal prostate (TP) procedures and to overcome the limitations of a traditional guidance grid. Approach: The HoloLens AR system enables the superimposition of annotated anatomy derived from preprocedural volumetric images onto a patient and addresses the most challenging part of free-hand TP procedures by providing real-time needle tip localization and needle depth visualization during insertion. The AR system accuracy, or the image overlay accuracy (n = 56), and needle targeting accuracy (n = 24) were evaluated within a 3D-printed phantom. Three operators each used a planned-path guidance method (n = 4) and free-hand guidance (n = 4) to guide needles into targets in a gel phantom. Placement error was recorded. The feasibility of the system was further evaluated by delivering soft tissue markers into tumors of an anthropomorphic pelvic phantom via the perineum. Results: The image overlay error was 1.29 ± 0.57 mm, and the needle targeting error was 2.13 ± 0.52 mm. The planned-path guidance placements showed similar error compared to free-hand guidance (4.14 ± 1.08 mm versus 4.20 ± 1.08 mm, p = 0.90). The markers were successfully implanted either into or in close proximity to the target lesion. Conclusions: The HoloLens AR system can provide accurate needle guidance for TP interventions. AR support for free-hand lesion targeting is feasible and may provide more flexibility than grid-based methods, due to the real-time 3D and immersive experience during free-hand TP procedures.
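As an illustration of the needle depth visualization idea, the sketch below computes the remaining insertion depth along the planned trajectory, assuming the tracked needle tip, entry point, and target are available in a common coordinate frame; the function and variable names are hypothetical.

```python
# Minimal sketch: remaining insertion depth for an AR display, assuming the
# tracked needle tip and the planned entry/target are in one common frame.
import numpy as np

def remaining_depth_mm(tip, target, entry):
    axis = np.asarray(target, float) - np.asarray(entry, float)
    axis /= np.linalg.norm(axis)
    # Signed distance from tip to target along the planned trajectory:
    # positive while short of the target, negative once past it.
    return float(np.dot(np.asarray(target, float) - np.asarray(tip, float), axis))
```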
Purpose: This study aims to analyze a social distance monitoring and contact tracing assistance tool for preventing the spread of COVID-19 in a busy indoor hospital working environment. Method: A camera-based tool was developed that estimates the physical distance between multiple individuals in real time, tracks individuals, and records their contact time when social distance requirements are violated, for retrospective review. Both stereo- and monocular-camera tools were implemented, and their accuracy and efficiency were evaluated and compared. Video was captured by a ZED M camera mounted close to the ceiling of a lab space. Three people within the field of view of the camera completed various movements. The distance (binary, <6 feet or >6 feet) and contact time between each pair were recorded as ground truth and compared to the video software analysis. Additionally, the contact time between any two individuals was calculated and compared to ground truth. Results: The overall accuracy of social distance detection was 95.1% and 74.4%, with a false-negative rate (the tool predicts individuals are far enough apart when they are actually too close) of 7.2% and 23.5% for the stereo and monocular tools, respectively. Conclusions: A stereo-camera social distance monitoring and contact tracing assistance tool can accurately detect social distance among multiple people and keep an accurate contact record for each individual. While the monocular-camera tool provided some level of certainty, the stereo-camera tool was shown to be superior.
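A minimal sketch of the distance-violation and contact-time bookkeeping, assuming per-person 3D positions (in meters) are already provided by the stereo person tracker; the function name and threshold constant are illustrative.

```python
# Minimal sketch: flag <6 ft (~1.83 m) pairs from per-frame 3D positions and
# accumulate contact time per pair. Positions would come from the stereo
# (ZED M) person tracker; this bookkeeping is an illustrative assumption.
from itertools import combinations
import math

SIX_FEET_M = 1.8288

def update_contacts(positions, contact_time, dt):
    """positions: {person_id: (x, y, z) in meters}; dt: frame period in s."""
    violations = []
    for a, b in combinations(sorted(positions), 2):
        d = math.dist(positions[a], positions[b])
        if d < SIX_FEET_M:
            violations.append((a, b, d))
            contact_time[(a, b)] = contact_time.get((a, b), 0.0) + dt
    return violations
```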
Purpose: This study aims to investigate the accuracy of a cross-platform augmented reality (AR) system for percutaneous needle interventions irrespective of operator error. In particular, we study the effect of the relative position and orientation of the AR device and the marker, the location of the target, and the needle angle on the overlay accuracy. Method: A needle guidance AR platform developed with Unity and the Vuforia SDK was used to display a planned needle trajectory for targets via mobile and wearable devices. To evaluate the system accuracy, a custom phantom embedded with metal fiducial markers and an adjustable needle guide was designed to mimic different relative position and orientation scenarios of the smart device and the marker. After segmenting images of CT-visible fiducial markers as well as different needle trajectories, error was defined by comparing them to the corresponding augmented target/needle trajectory projected by the smartphone and smart-glasses devices. Results: The augmentation errors for targets and needle trajectories were reported as a function of marker position and orientation, as well as the location of the targets. Overall, the image overlay error for the needle trajectory was 0.28±0.32° (max = 0.856°) and 0.41±0.23° (max = 0.805°) using the iPhone and HoloLens glasses, respectively. The overall image overlay error for targets was 1.75±0.59 mm for the iPhone and 1.74±0.86 mm for the HoloLens. Conclusions: The image overlay error caused by different sources can be quantified for different AR devices.
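The overlay error metrics reported above reduce to the angle between the segmented and augmented trajectory vectors and the Euclidean distance between fiducial and augmented target points; a minimal numpy sketch, with hypothetical function names, follows.

```python
# Minimal sketch: overlay error metrics, assuming segmented (CT) and augmented
# (AR-projected) trajectories/targets are available as 3D vectors/points.
import numpy as np

def trajectory_angle_error_deg(v_ct, v_ar):
    u = np.asarray(v_ct, float) / np.linalg.norm(v_ct)
    w = np.asarray(v_ar, float) / np.linalg.norm(v_ar)
    # Clip guards against round-off pushing the dot product outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(np.dot(u, w), -1.0, 1.0))))

def target_overlay_error_mm(p_ct, p_ar):
    return float(np.linalg.norm(np.asarray(p_ct, float) - np.asarray(p_ar, float)))
```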
Purpose: A new version of a grid-template-mimicking, MR-compatible robot was developed to assist during in-gantry MRI-guided focal laser ablation of prostate cancer. The robot replaces the grid template, provides higher positioning resolution, and allows autonomous needle alignment directly from the targeting and navigation software, with the needle insertion performed manually for safety. Method: A substantially more compact design was prototyped to allow comfortable accommodation between the patient's legs inside the MRI bore. The controller software was reconfigured and embedded into the custom navigation and multi-focal ablation software, OncoNav (NIH). OncoNav performs robot-to-image registration, target planning, robot control, ablation planning, and 3D temperature analysis for monitoring. For the free-space accuracy study, five targets were selected across the workspace and the robot was commanded to each target five times. A thermochromic phantom study, using an acrylamide gel with color-changing ink, was then designed to test the overall workflow. Four spherical metal fiducials were embedded in the phantom at different locations. After targeting, laser ablation was applied at two of the targets. Finally, the phantom was sliced for gross observation of guidance and treatment accuracy. Results: Free-space accuracy was 0.38±0.27 mm. The overall targeting accuracy in the phantom, including robot, registration, and insertion error, was 2.17±0.47 mm. Ablation successfully covered ellipsoids around the targets, and the workflow was acceptably smooth. Conclusions: The new robot can accurately assist in targeting small targets for focal laser ablation. Sterility and regulatory hurdles will be addressed with specific design approaches as the next step.
In transperineal prostate biopsy or ablation, a grid template is typically used to guide the needle. This guidance method has limited positioning resolution and lacks needle angulation options referenced to ultrasound imaging or TRUS-MRI fusion targets. To overcome these limitations, a novel augmented reality (AR) system was developed that uses smart see-through glasses and a smartphone as needle guidance devices for transperineal prostate procedures. The AR system comprises an MRI/CT scanner, pre-procedural image analysis and visualization software, AR devices (smart glasses, smartphone), a newly developed AR app, and a local network. The AR app displays the lesion and the planned needle trajectory, derived from the pre-procedural images, on the AR devices. A specially designed image marker frame affixed to the patient's perineum allowed the AR devices to track and anchor the pre-procedural image to the patient. The displayed needle plan is always referenced to the patient and remains independent of the position and orientation of the devices. Multiple devices can be used simultaneously and communicate via the local network. We evaluated the AR system accuracy with an iPhone and R-7 glasses in a phantom study. The image overlay accuracy was 0.58±0.43° for the iPhone and 1.62±1.52° for the R-7 glasses. The accuracy of iPhone guidance was 1.9±0.97 mm (lateral) and 1.0±0.5 mm (in-direction); the accuracy of R-7 guidance was 2.8±1.4 mm (lateral) and 2.3±1.5 mm (in-direction). An AR system using smart glasses and a smartphone can provide accurate needle guidance and a see-through-the-skin display for needle-based transperineal prostate interventions such as biopsy and ablation.
Accurate needle placement in CT-guided interventions largely depends on the physician's visuospatial skills. To reduce the reliance on operator experience and enhance accuracy, we developed an augmented reality (AR) system using smart see-through glasses to facilitate and assist bedside needle angle guidance. The AR system was developed using Unity and the Vuforia SDK. It displays the planned needle angle on the glasses' see-through screens in real time based on the glasses' orientation. The displayed angle is always referenced to the CT table and independent of the physical orientation of the glasses. The see-through feature allows the operator to continuously compare the actual needle with the planned needle angle. The glasses' orientation was tracked by the built-in gyroscope, and the offset between the embedded gyroscope and the glasses' display frame was pre-calibrated. A quick one-touch calibration method between the glasses and the CT frame was implemented. Hardware accuracy and guidance accuracy were evaluated in phantom studies. In the first test, a needle was inserted into the phantom and scanned with CT; the angle measured in the CT scan was set on the glasses, a snapshot was taken through the lens, and the needle vector was compared with the guideline in the saved snapshot. The hardware accuracy was within 0.98 ± 0.85 degrees. In the second test, after each insertion guided by the glasses, a CT scan was taken to measure the insertion angle error. The guidance accuracy was within 1.33 ± 0.73 degrees. Smart glasses can provide accurate guidance for needle-based interventions with minimal disturbance of the standard clinical workflow.
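A minimal sketch of how a one-touch calibration could reference the gyroscope orientation to the CT frame: capture the glasses' orientation at the calibration tap, then express the planned needle direction in the current glasses frame for display. The quaternion convention, class, and method names are assumptions.

```python
# Minimal sketch: reference gyroscope orientation to the CT table via a
# one-touch calibration snapshot. Uses scipy for quaternion algebra; the
# calibration protocol shown here is an illustrative assumption.
from scipy.spatial.transform import Rotation as R

class AngleGuide:
    def __init__(self):
        self.ct_from_glasses0 = None  # set at one-touch calibration

    def calibrate(self, glasses_quat_xyzw):
        # Operator aligns the glasses with the CT frame and taps once;
        # store the inverse of the glasses-to-world rotation at that moment.
        self.ct_from_glasses0 = R.from_quat(glasses_quat_xyzw).inv()

    def planned_dir_in_glasses(self, glasses_quat_xyzw, planned_dir_ct):
        # Express the CT-frame needle plan in the current glasses frame so the
        # guideline can be drawn on the see-through display.
        if self.ct_from_glasses0 is None:
            raise RuntimeError("run one-touch calibration first")
        ct_from_glasses = self.ct_from_glasses0 * R.from_quat(glasses_quat_xyzw)
        return ct_from_glasses.inv().apply(planned_dir_ct)
```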
Monitoring temperature during a cone-beam CT (CBCT) guided ablation procedure is important for preventing over-treatment and under-treatment. Ideal temperature monitoring requires generating a thermometry map. Previously, this was attempted using CBCT scans of a pig shoulder undergoing ablation [1]. We extend that work by using CBCT scans of real patients and incorporating more processing steps. Because organs move and deform between scans, we register the scans before comparing them, and then automatically locate the needle tip and the ablation zone. To cope with image noise and artifacts, we employ a robust change metric that takes windows around each pixel and uses an equation inspired by Time Delay Analysis to calculate the error between windows, under the assumption that there is an ideal spatial offset. Once the change map is generated, we correlate the change data with measured temperature data at key points in the region, which allows us to transform the change map into a thermal map. This thermal map provides an estimate of the size and temperature of the ablation zone. We evaluated our procedure on a data set of 12 patients who underwent a total of 24 ablation procedures and were able to generate reasonable thermal maps with varying degrees of accuracy; the average error ranged from 2.7 to 16.2 degrees Celsius. In addition to estimates of the ablation zone size for surgical guidance, 3D visualizations of the ablation zone and needle are also produced.
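Since the abstract does not give the exact equation, the following sketch substitutes a windowed sum-of-squared-differences searched over small spatial offsets, capturing the spirit of the Time-Delay-Analysis-inspired metric: the change value at a pixel is the residual after the best offset.

```python
# Minimal sketch: window-based change metric robust to small misalignment.
# For each pixel, compare a window in the baseline and intermittent 2D scans
# over a small set of spatial offsets and keep the lowest residual. SSD is an
# assumed stand-in for the paper's exact equation.
import numpy as np

def change_map(baseline, intermittent, half=3, max_shift=2):
    h, w = baseline.shape
    out = np.zeros_like(baseline, dtype=float)
    for y in range(half + max_shift, h - half - max_shift):
        for x in range(half + max_shift, w - half - max_shift):
            ref = baseline[y-half:y+half+1, x-half:x+half+1]
            best = np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    cur = intermittent[y+dy-half:y+dy+half+1,
                                       x+dx-half:x+dx+half+1]
                    best = min(best, float(np.mean((ref - cur) ** 2)))
            out[y, x] = best  # residual change after the best spatial offset
    return out
```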
KEYWORDS: Thermometry, Calibration, Computed tomography, Data modeling, Temperature metrology, Sensors, Signal to noise ratio, Software development, 3D modeling, Image processing
Monitoring temperature, and therefore the final treatment zone achieved, during a cone-beam CT (CBCT) guided ablation can prevent overtreatment and undertreatment. A novel method is proposed to detect changes in consecutive CBCT images obtained from projection reconstructions during an ablation procedure, and the possibility of using this method to generate thermometry maps from CBCT images, which can serve as an input function for ablation treatment planning, is explored. The method uses a baseline and an intermittent CBCT scan, which are routinely acquired to confirm the needle position and monitor the progress of the ablation. Accurate registration is required, and is assumed in vitro and ex vivo. A Wronskian change detector algorithm is applied to the compensated images to obtain a difference image between the intermittent and baseline scans. Finally, an experimentally determined calibration is applied to obtain the corresponding temperature at each pixel or voxel, yielding a thermal map. We applied the Wronskian change detector to detect the difference between two CBCT images, which have a low signal-to-noise ratio, and calibrated the Wronskian change model to temperature data using a gel phantom. We tested the temperature mapping with water and gel phantoms as well as a pig shoulder. The experimental results show that this method can detect temperature change within 5°C for a voxel size of 1 mm³ (within clinical relevancy) and, by consequence, delineate the ablation zone. These preliminary results show that CBCT thermometry is possible and promising, but may require pre-processing, such as registration for motion compensation between the baseline and intermittent scans. Further quantitative evaluations must be conducted for validation prior to clinical assessment and translation. CBCT is a widely available technology that could make thermometry clinically practical as an enabling component of iterative ablation treatment planning.
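A minimal sketch of a standard windowed Wronskian change detector followed by a linear temperature calibration; the calibration constants are placeholders standing in for the experimentally determined gel-phantom calibration.

```python
# Minimal sketch: Wronskian change detector over a sliding window, then a
# linear calibration to temperature. The window form follows the standard
# Wronskian detector; the constants a, b are assumed placeholders.
import numpy as np

def wronskian_change(f, g, half=2, eps=1e-6):
    """f: baseline 2D image, g: intermittent 2D image."""
    h, w = f.shape
    out = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            r = (g[y-half:y+half+1, x-half:x+half+1] + eps) / \
                (f[y-half:y+half+1, x-half:x+half+1] + eps)
            out[y, x] = float(np.mean(r * r - r))  # ~0 when unchanged
    return out

def to_temperature(change, a=1.0, b=37.0):
    # Linear calibration determined experimentally on a gel phantom (assumed).
    return a * change + b
```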
We present a method for optimizing the placement of multiple ablation probes to provide efficient coverage of a tumor for thermal therapy while respecting clinical needs such as limiting the sites of probe insertion at the pleura/liver surface, choosing secure probe trajectories and locations, avoiding ablation of critical structures, and reducing both ablation of healthy tissue and overlap of ablation zones. The ablation optimizer treats each ablation location independently, and the number of ablation probe placements is itself treated as a variable to be optimized. This allows us to potentially feed back the ablation result after deployment and re-optimize the next steps of the plan. The optimization method uses a new class of derivative-free algorithms for solving a non-linear mixed-variable problem with hard and soft constraints derived from clinical images (a sketch of the objective follows below). Our method uses a discretization of the ablation volume, which can accommodate irregular shapes of the ablation zone. The non-gradient-based strategy produces new candidates that yield a feasible solution within a few iterations. In our simulation experiments, this strategy typically reduced the ablation zone overlap and the volume of ablated healthy tissue by 46% and 29%, respectively, in a single iteration, with a feasible solution found within 35 iterations. Our optimization method provides an efficient implementation for planning the coverage of a tumor while respecting clinical constraints. The ablation planning can be combined with navigation assistance to enable accurate translation of, and feedback on, the plan.
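A minimal sketch of the kind of objective such an optimizer might minimize on a discretized volume: a hard penalty on uncovered tumor voxels plus soft penalties on ablation zone overlap and ablated healthy tissue. Spherical ablation zones and the weights are assumptions; the paper's derivative-free mixed-variable search is not reproduced here.

```python
# Minimal sketch: score a candidate set of probe placements on a discretized
# volume. Spherical ablation zones and the weights are illustrative
# assumptions standing in for the paper's constraint formulation.
import numpy as np

def score(probes, tumor_pts, healthy_pts, radius,
          w_uncovered=100.0, w_overlap=1.0, w_healthy=1.0):
    """probes: (k, 3) ablation centers; *_pts: (n, 3) voxel centers (mm)."""
    def counts(pts):
        d = np.linalg.norm(pts[:, None, :] - probes[None, :, :], axis=2)
        return (d <= radius).sum(axis=1)  # ablations covering each voxel
    ct = counts(tumor_pts)
    ch = counts(healthy_pts)
    uncovered = int((ct == 0).sum())   # hard constraint: drive to zero
    overlap = int((ct > 1).sum())      # soft: multiply-ablated tumor voxels
    healthy = int((ch > 0).sum())      # soft: ablated healthy-tissue voxels
    return w_uncovered * uncovered + w_overlap * overlap + w_healthy * healthy
```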
In this paper, we present a surgical assistant system for implanting a prosthetic aortic valve transapically, under MRI guidance, in a beating heart. The system integrates an MR imaging system, a robotic system, and user interfaces for a surgeon to plan the procedure and manipulate the robot. A compact robotic delivery module mounted on a robotic arm is used for delivering both balloon-expandable and self-expanding prostheses. The system provides different user interfaces at different stages of the procedure. A compact fiducial pattern placed close to the volume of interest is proposed for robot registration, and the image processing and transformation recovery methods using this fiducial in MRI are presented. The registration accuracy obtained with this compact fiducial is comparable to that of the larger multi-spherical marker registration method: the registration errors of the two methods are 0.62±0.50 deg (mean ± std. dev.) and 0.63±0.72 deg (mean ± std. dev.), respectively. We evaluated each of the components and show that they can work together to form a complete system for transapical aortic valve replacement.
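As a generic stand-in for fiducial-based robot registration (not the paper's compact-pattern method), the sketch below recovers a rigid robot-to-image transform from corresponding fiducial centroids via the Kabsch/Arun SVD solution.

```python
# Minimal sketch: rigid point-based registration (Kabsch/Arun) between fiducial
# centroids segmented in MRI and their known positions on the robot. A generic
# stand-in shown for orientation only, not the paper's specific method.
import numpy as np

def rigid_register(robot_pts, image_pts):
    """Both (n, 3) with corresponding rows. Returns R (3x3), t (3,)."""
    pr, pi = robot_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (robot_pts - pr).T @ (image_pts - pi)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the determinant is -1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = pi - R @ pr
    return R, t  # image_pts ~ R @ robot_pts + t
```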