Background: Selective internal radiation therapy (SIRT) with Yttrium-90 microspheres is an effective treatment for liver cancer and liver metastases. Yttrium-90 is predominantly a high-energy beta emitter; the beta particles produce bremsstrahlung radiation as they interact with tissue, making post-therapy imaging of the activity distribution feasible. Nevertheless, image quality and quantification are limited by the continuous bremsstrahlung energy spectrum, which makes resolution modelling and the estimation of attenuation and scatter challenging.
Methods: In this study, a modified hybrid kernelised expectation maximisation (HKEM) algorithm is used to improve resolution and contrast and to reduce noise. The iterative part of the kernel was frozen at the 72nd sub-iteration to avoid over-fitting of noise and background. A NEMA phantom with spherical inserts was used to optimise and validate the algorithm, and data from 5 patients treated with selective internal radiation therapy were used to demonstrate the clinical relevance of the method.
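The kernelised EM update described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the image is modelled as x = K·α, the EM update is applied to the coefficients α, and in the hybrid variant the kernel K is rebuilt from the anatomical prior and the current estimate until it is frozen at a chosen sub-iteration. The function names, the toy Gaussian kernel, and the system matrix A are all assumptions for illustration.

```python
import numpy as np

def hkem_reconstruct(y, A, make_kernel, prior, n_subiter=120, freeze_at=72):
    """Illustrative hybrid kernelised EM (HKEM) sketch.

    The image is modelled as x = K @ alpha, where K is built from an
    anatomical prior and (hybrid variant) the current functional estimate.
    Freezing K after `freeze_at` sub-iterations stops noise feeding back
    into the kernel. Names and structure are assumptions for illustration.
    """
    n_vox = A.shape[1]
    alpha = np.ones(n_vox)                       # kernel coefficients
    K = make_kernel(prior, np.ones(n_vox))
    sens = K.T @ (A.T @ np.ones(A.shape[0]))     # sensitivity: K^T A^T 1
    for it in range(n_subiter):
        x = K @ alpha                            # current image estimate
        ratio = y / np.maximum(A @ x, 1e-12)     # measured / expected counts
        alpha *= (K.T @ (A.T @ ratio)) / np.maximum(sens, 1e-12)
        if it < freeze_at:                       # rebuild kernel until frozen
            K = make_kernel(prior, K @ alpha)
            sens = K.T @ (A.T @ np.ones(A.shape[0]))
    return K @ alpha

def gaussian_kernel(prior, current, sigma=1.0):
    """Toy kernel: row-normalised Gaussian similarity on (prior, current)."""
    feat = np.stack([prior, current], axis=1)
    d2 = ((feat[:, None, :] - feat[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    return K / K.sum(axis=1, keepdims=True)
```

In practice the kernel is sparse (nearest neighbours only) and A is the full SPECT system model with attenuation and scatter; the sketch only shows the update structure and the role of the frozen kernel.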
Results: The results show a maximum improvement of 56% in region-of-interest mean recovery coefficient at fixed coefficient of variation, and better identification of the hot volumes in the NEMA phantom. Similar improvements were achieved with the patient data, with a 47% improvement in mean value over the reconstruction used as the clinical gold standard.
Conclusions: Such quantitative improvements could enable more accurate SPECT-based dosimetry calculations for patients treated with selective internal radiation therapy, as well as clearer visualisation of the position of cancerous lesions in the liver.