Low-cost augmented reality goggles enable precision fluorescence-guided cancer surgery

Disparities in surgical outcomes often result from subjective rather than objective decisions dictated by surgical training, experience, and available resources. To improve outcomes, surgeons have adopted advancements in robotics, endoscopy, and intra-operative imaging, including fluorescence-guided surgery (FGS), which highlights tumors in real time without using ionizing radiation. However, like many medical innovations, technical, economic, and logistic challenges have hindered widespread adoption of FGS beyond high-resource centers. To overcome these impediments, we developed the fully wearable and battery-powered fluorescence imaging augmented reality Raspberry Pi-based goggle system (FAR-Pi). A novel device design ensures distance-independent coalignment between real and augmented FAR-Pi views and offers higher spatial resolution, depth of focus, and fluorescence detection sensitivity than existing bulkier, pricier, and wall-powered technologies. When paired with pan-tumor-targeting fluorescent agents such as LS301, FAR-Pi objectively identifies tumors in vivo. As an open-source, affordable, and adaptable system, FAR-Pi is poised to democratize access to FGS and improve health outcomes worldwide.

video signal to be transferred via HDMI cables, ensured video signal fidelity over 5 meters (Supplement, Sections 3A-C, 5A).

Laser diode array control and synchronization

To synchronize frame capture and laser excitation, we developed a custom 'HAT'

In vivo evaluation

The ability of the FAR-Pi system to detect in vivo accumulation of an intravenously administered cancer-targeting fluorophore [22] was tested using a subcutaneous breast cancer model in n=4 mice. Figure 5C shows a coaligned overlay of the visible camera image with the NIR-camera-derived LS301 SBR spatial map for four different excitation filter options in one mouse. Across all mice, the average LS301 peak SBR signal (Figure 5C,

In a see-through FGS system, misalignment between the projected fluorescence signal and the surgeon's view could result in misinterpretation of tumor position and inaccurate tissue excision. We therefore measured misalignment between the camera and surgeon's view by placing a resolution target 50 cm from a camera that mimics the surgeon's eye, and then positioning the FAR-Pi imaging module either at (1) mid-forehead, (2) centered at eye
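The SBR spatial maps in Figure 5C are per-pixel ratios of NIR fluorescence intensity to a tumor-free background level. The sketch below illustrates one plausible way to compute such a map in NumPy; the `sbr_map` helper, the ROI convention, and the toy values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sbr_map(nir_frame, background_roi):
    """Per-pixel signal-to-background ratio (SBR) map.

    nir_frame: 2-D grayscale NIR image.
    background_roi: (row_slice, col_slice) over a tumor-free region
    used to estimate the background intensity.
    """
    frame = nir_frame.astype(float)
    background = frame[background_roi].mean()
    return frame / max(background, 1e-6)  # guard against zero background

# Toy example: a bright 'tumor' patch on a uniform dim background.
frame = np.full((64, 64), 10.0)
frame[20:30, 20:30] = 40.0  # simulated fluorophore accumulation
sbr = sbr_map(frame, (np.s_[0:10], np.s_[0:10]))
peak_sbr = sbr.max()  # 4.0 for this toy frame
```

In practice the background ROI would be drawn over healthy tissue in the same field of view, and the resulting map colormapped and overlaid on the visible-camera image.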

In each case, a four-point projective transformation derived from matching fiducial markers on resolution target images captured at a 50 cm calibration distance corrected the misalignment (Figure 6A, B, C, 'Corrected'). However, with the target moved 10 cm closer to or farther from the calibration distance, vertical and horizontal alignment errors

an external monitor to a wearable head-mounted system, head-mounted see-through FGS systems continue to be wall-powered, complex, expensive, relatively bulky, and replete with significant mismatch between the detected NIR signal and the surgeon's view. In this work, we show that inexpensive off-the-shelf components coupled with tools of the

(illumination module, imaging module, computation module, and heads-up display) are compact, accessible, and wearable.
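A four-point projective transformation is fully determined by four non-degenerate fiducial correspondences. A minimal NumPy sketch of the estimation step is below (the function names are illustrative; in practice OpenCV's `cv2.getPerspectiveTransform` computes the same 3×3 matrix):

```python
import numpy as np

def four_point_homography(src, dst):
    """Estimate the 3x3 projective transform mapping four src points
    onto four dst points, by solving the standard 8x8 linear system
    with the bottom-right matrix entry fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply homography H to a single (x, y) point."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical fiducial positions on the resolution target as seen by
# the NIR camera (src) and by the camera standing in for the eye (dst).
src = [(0, 0), (100, 0), (100, 100), (0, 100)]
dst = [(10, 5), (110, 8), (105, 108), (8, 102)]
H = four_point_homography(src, dst)
```

Because the transform is derived at a single calibration distance, residual errors grow as the target moves away from that distance, which motivates the distance-dependence measurements reported here.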

Existing FGS systems employ large and expensive cart-based laser diodes, effectively tethering the surgeon to the cart with optical fibers [6]. We found that a low-cost, compact, fully wearable, and battery-powered laser diode array provides similar irradiance to tethered systems while freeing the surgeon from cable connections to a cart or wall power. Some groups have utilized less bulky LED modules, but these have broad spectra and, unlike the laser diode array, require additional clean-up filters to ensure that their NIR tails do not overlap with the emitted fluorescence signal [17]. A halogen lamp [18] is a relatively inexpensive alternative, but it also requires additional optical filters and poses a challenge for wearable devices given that it is wall-powered and generates significant heat.

Though commercial FGS systems rely upon more expensive camera sensors (i.e., CCD, EMCCD, ICCD, or sCMOS) with ≥10-bit depth and enhanced NIR quantum efficiency [6], we found that a low-cost RPiV2 NoIR CMOS camera sensor operating at 8-bit depth with

RPiV2 sensor (sensitive to visible light only) in a compact (~6 cm³) 3D-printed enclosure. We found that a software-based projective transform, determined from a calibration step at one distance, could be applied at any distance in real time to achieve optically coaligned dual white light and NIR imaging. This represents an innovative approach in which software correction overcomes inaccuracies inherent in aligning components using 3D
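Applying the calibrated transform per frame amounts to warping each NIR frame into the visible camera's pixel grid before overlay. A self-contained inverse-mapping sketch is below; the helper name is illustrative, and on the Pi a library routine such as OpenCV's `cv2.warpPerspective` would be the real-time choice rather than this nearest-neighbour loop-free version.

```python
import numpy as np

def warp_nir_to_visible(nir, H, out_shape):
    """Warp a NIR frame into visible-camera coordinates using the
    calibrated homography H (nearest-neighbour inverse mapping)."""
    Hinv = np.linalg.inv(H)
    rows, cols = out_shape
    v, u = np.mgrid[0:rows, 0:cols]                    # output pixel grid
    pts = np.stack([u.ravel(), v.ravel(), np.ones(u.size)])
    sx, sy, sw = Hinv @ pts                            # back-map to NIR coords
    x = np.rint(sx / sw).astype(int)
    y = np.rint(sy / sw).astype(int)
    valid = (x >= 0) & (x < nir.shape[1]) & (y >= 0) & (y < nir.shape[0])
    out = np.zeros(out_shape, nir.dtype)
    out[v.ravel()[valid], u.ravel()[valid]] = nir[y[valid], x[valid]]
    return out
```

The warped NIR map can then be colormapped and alpha-blended onto the white-light frame for the heads-up display.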

While a custom-built AR HMD would provide more engineering flexibility, our goal was to make the FAR-Pi simple to source and build, and therefore we repurposed off-the-shelf AR glasses (the Rokid Air) for medical imaging. Off-the-shelf augmented reality glasses can be challenging to utilize when feeding a non-stereoscopic camera feed into them

(Figure 7, S12). Assembly details, including alternative approaches to achieve dual-camera streaming and to make the computational module head-mounted, are provided in Supplement Section 4 (Figures S8, S10, S11, S13, and S14).

We modified the RPiCM4 device tree to allow synchronization of laser excitation with every other camera frame, and then subtracted consecutive frame pixel-by-pixel grayscale
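The alternating-excitation scheme lends itself to simple pairwise frame differencing, presumably to suppress ambient light and isolate laser-excited fluorescence. A minimal NumPy sketch under that assumption (helper name and toy values are illustrative, not the authors' code):

```python
import numpy as np

def fluorescence_difference(frames):
    """Given a stream alternating laser-ON and laser-OFF frames,
    subtract each OFF frame from the preceding ON frame pixel-by-pixel
    to isolate the laser-excited component (clipped to 8-bit range)."""
    on = frames[0::2].astype(np.int16)   # even indices: laser ON
    off = frames[1::2].astype(np.int16)  # odd indices: laser OFF
    n = min(len(on), len(off))
    return np.clip(on[:n] - off[:n], 0, 255).astype(np.uint8)

# Toy stream: ambient level 20 everywhere; fluorescence adds 30 when lit.
stream = np.full((4, 8, 8), 20, np.uint8)
stream[0::2, 2:4, 2:4] += 30  # laser-ON frames carry extra signal
diff = fluorescence_difference(stream)
```

Subtraction in a wider signed dtype before clipping avoids the unsigned-integer wraparound that naive `uint8` arithmetic would produce.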