
Artificial intelligence-based analysis of whole-body bone scintigraphy: The quest for the optimal deep learning algorithm and comparison with human observer performance

2023.01.18.

Ghasem et al., Zeitschrift für Medizinische Physik, 2023

Purpose

Whole-body bone scintigraphy (WBS) is one of the most widely used modalities for diagnosing malignant bone disease at an early stage. However, the procedure is time-consuming and requires vigilance and experience. Moreover, interpreting WBS scans in the early stages of disease can be challenging because the patterns often resemble a normal appearance and are prone to subjective interpretation. To simplify this gruelling, subjective, and error-prone task, we developed deep learning (DL) models to automate two major analyses, namely (i) classification of scans as normal or abnormal and (ii) discrimination between malignant and non-neoplastic bone disease, and compared their performance with that of human observers.

Methods

Bone scintigraphy imaging

WBS scans were acquired 2-4 hours after intravenous injection of 555-925 MBq of 99mTc-MDP on a dual-head gamma camera (Siemens Symbia Encore, Siemens ECAM IP1 and Mediso AnyScan® S) equipped with low-energy high-resolution parallel-hole collimators, with the patient supine, arms down. The energy acquisition window was centered on 140 keV with a 20% width, at a scan speed of 12-15 cm/min in continuous mode and a matrix size of 1024 × 256.
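The "20% window" in the protocol is a symmetric energy window whose total width is 20% of the photopeak energy. A quick check of the resulting limits for the 140 keV peak of 99mTc:

```python
# Bounds of a symmetric 20% energy window centred on the 140 keV
# photopeak of 99mTc (total window width = 20% of the peak energy).
peak_kev = 140.0
window = 0.20

half_width = peak_kev * window / 2          # 10% of the peak on each side
lower = peak_kev - half_width
upper = peak_kev + half_width
print(f"{lower:.0f}-{upper:.0f} keV")       # 126-154 keV
```

Counts recorded outside this 126-154 keV band are rejected as scattered photons.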

Figure: An instance of normal and pathological cases according to nuclear medicine physicians’ reports.

After applying our exclusion criteria to 7188 patients from three different centers, 3772 and 2248 patients were enrolled for the first and second analyses, respectively. Data were split into training and test sets, with a fraction of the training data held out for validation. Ten different CNN models were applied in single-view and dual-view (anterior and posterior views) input modes to find the optimal model for each analysis. In addition, three feature-aggregation methods, namely squeeze-and-excitation (SE), spatial pyramid pooling (SPP), and attention-augmented (AA) fusion, were used to combine the features of the two views in the dual-view models.
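The aggregation modules operate on CNN feature maps; the paper does not publish its implementation, but the squeeze-and-excitation idea applied to concatenated dual-view features can be sketched as follows (random weights, NumPy only, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def squeeze_excite(features, reduction=4):
    """Schematic squeeze-and-excitation gating over channel features.

    features: (batch, channels) globally pooled feature vectors.
    A bottleneck MLP produces per-channel sigmoid weights that
    re-scale the input channels ("excitation").
    """
    b, c = features.shape
    # Illustrative random weights; in a real model these are learned.
    w1 = rng.standard_normal((c, c // reduction)) * 0.1
    w2 = rng.standard_normal((c // reduction, c)) * 0.1
    z = np.maximum(features @ w1, 0.0)            # ReLU bottleneck
    gate = 1.0 / (1.0 + np.exp(-(z @ w2)))        # sigmoid channel weights
    return features * gate                        # re-weighted channels

# Dual-view input: concatenate anterior and posterior channel features,
# then let the SE block decide how much each channel contributes.
ant = rng.standard_normal((2, 64))
post = rng.standard_normal((2, 64))
fused = squeeze_excite(np.concatenate([ant, post], axis=1))
print(fused.shape)  # (2, 128)
```

SPP and AA aggregation differ in how they pool and mix the two views, but the overall pattern, fusing per-view features before the classifier head, is the same.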

Figure: Workflow of applied deep learning models. Ant: anterior, Post: posterior, SPP: spatial pyramid pooling, SE: squeeze-and-excitation, AA: attention-augmented.

Model performance was reported as the area under the receiver operating characteristic (ROC) curve (AUC), accuracy, sensitivity, and specificity, and models were compared with the DeLong test applied to the ROC curves. The test dataset was also read by three nuclear medicine physicians (NMPs) with different levels of experience to compare the performance of AI and human observers.
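As a sketch of how these metrics are computed from binary labels and model scores (plain NumPy; the DeLong test additionally needs the covariance of the paired AUC estimates and is omitted here):

```python
import numpy as np

def evaluate(y_true, y_score, threshold=0.5):
    """AUC, accuracy, sensitivity, specificity for a binary classifier."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    # AUC = probability that a random positive outranks a random
    # negative (ties count one half).
    auc = (np.mean(pos[:, None] > neg[None, :])
           + 0.5 * np.mean(pos[:, None] == neg[None, :]))
    y_pred = (y_score >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    acc = np.mean(y_pred == y_true)
    sens = tp / np.sum(y_true == 1)   # true positive rate
    spec = tn / np.sum(y_true == 0)   # true negative rate
    return auc, acc, sens, spec

# Toy example: 1 = abnormal, 0 = normal.
auc, acc, sens, spec = evaluate([0, 0, 1, 1], [0.1, 0.6, 0.4, 0.9])
print(round(auc, 2), acc, sens, spec)  # 0.75 0.5 0.5 0.5
```

The threshold here is fixed at 0.5 for illustration; the AUC itself is threshold-free, which is why it is the primary comparison metric in the study.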

Results

DenseNet121_AA (DenseNet121 with dual-view input aggregated by AA) and InceptionResNetV2_SPP achieved the highest performance (AUC = 0.72) in the first and second analyses, respectively. On average, the InceptionV3 and InceptionResNetV2 CNN models and dual-view input with AA aggregation performed best in the first analysis, while DenseNet121 and InceptionResNetV2 as CNN models and dual-view input with AA aggregation achieved the best results in the second. Moreover, the AI models significantly outperformed the human observers in the first analysis, whereas their performance was comparable in the second, although the AI models assessed the scans in drastically less time.

Conclusion

The models designed in this study are a positive step toward improving and optimizing WBS interpretation. With DL models trained on larger and more diverse cohorts, AI could potentially assist physicians in the assessment of WBS images.

Full article in Zeitschrift für Medizinische Physik.
