Background: Cytomorphology is an essential method of phenotype diagnostics in lab hematology. Recently, we integrated automated scanning of peripheral blood smears and AI-based classification of blood cell images into our diagnostic routine (data reported at ASH 2021); this approach has been further evaluated in the BELUGA study (NCT04466059). Here we present a modified workflow for scanning and assessment of bone marrow (BM) smears, including classification of the full range of benign and malignant BM cell types.

Aim: To provide a cloud-based AI tool to support manual review of BM smears.

Methods: We established a 3-step, fully automated, AI-based workflow, starting with a 10x pre-scan for identification of areas of interest (AOI), followed by capture of a variable number of AOI at 40x magnification (with oil). The resulting fields of view (FOV, 2048x1496 pixels) were processed by 2 independent deep neural networks (DNN) for object detection and classification.

First, 37 BM scans (10x magnification) were annotated by hematology experts using freely drawn polygon labels, distinguishing 6 quality classes of areas for cytomorphological investigation. 185,000 grid images of the annotated regions were used to train a DNN to identify AOI of optimal quality and flag positions for subsequent scanning at 40x magnification. For scanning, we used a MetaSystems (Altlussheim, Germany) Metafer system (Zeiss Axio Imager.Z2 microscope, automatic slide feeder SFx80, automated oil disperser).
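The preparation of grid images from an annotated pre-scan can be sketched as follows; the tile size is an illustrative assumption, not a parameter reported by the authors.

```python
import numpy as np

def tile_prescan(image: np.ndarray, tile: int = 256) -> list:
    """Cut a 10x pre-scan into non-overlapping square grid images.

    Tiles that would extend past the image border are discarded.
    The 256-px tile size is an illustrative assumption.
    """
    h, w = image.shape[:2]
    return [
        image[y:y + tile, x:x + tile]
        for y in range(0, h - tile + 1, tile)
        for x in range(0, w - tile + 1, tile)
    ]
```

Each grid image would then carry the quality-class label of the annotated region it falls in, yielding the training pairs for the AOI DNN.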

To train an object detection (OD) model, we ran the AOI detection network on 19 BM smears, which provided 173 FOV (40x) of BM cell layers. Each single cell was marked by human investigators using exact polygon labels on a digital interactive pencil display (Cintiq DTK2260K0A from Wacom). In total, 19,697 cells were labeled. We set up a supervised ML model using the labeled images as input: we fine-tuned an ImageNet-pretrained TernausNet (UNet16) model on our dataset and applied a postprocessing step for validation at the cell level rather than the pixel level. To reduce overfitting, image augmentation was applied.
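The cell-level validation step could be implemented along these lines: connected components of the predicted foreground mask are treated as candidate cells and filtered by area, so that acceptance happens per object rather than per pixel. This is a minimal sketch under assumed parameters (the area cutoff is illustrative); scipy is used for component labeling.

```python
import numpy as np
from scipy import ndimage

def mask_to_cells(mask: np.ndarray, min_area: int = 50) -> list:
    """Turn a binary segmentation mask into cell-level objects.

    Connected foreground components are labeled; components smaller
    than `min_area` pixels (illustrative cutoff) are discarded as
    noise. Returns one boolean mask per accepted cell.
    """
    labels, n = ndimage.label(mask > 0)
    cells = []
    for i in range(1, n + 1):
        component = labels == i
        if component.sum() >= min_area:
            cells.append(component)
    return cells
```

Validating at the object level lets a detection count as correct even when its pixel outline deviates slightly from the human polygon label.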

Then the OD model was used to acquire 156,049 precisely outlined single cell images (excluding artifact classes) from 8,047 FOVs from 147 BM smears, covering normal/reactive patterns and all kinds of BM neoplasms. Depending on smear quality, 0 to 248 intact single cells were captured per FOV. From these images, a balanced set of 38,372 images was classified by hematology experts into 23 predefined cell types, including signature cell types of different leukemias (e.g. hairy cells, APL cells) as well as tumour cells. The final training set comprised 476 to 29,370 (median: 2,178) cell images per cell type.

Using these images as input, we set up a supervised ML model that outputs predicted probabilities for the 23 predefined classes. We used ImageNet-pretrained Xception as the base model and trained, evaluated and deployed it using Amazon SageMaker. Owing to the large data volume, most of the computation was run on a cloud-based web platform.
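The classifier's output stage, predicted probabilities over the 23 classes, amounts to a softmax over the network's final-layer logits. A minimal numpy sketch (the class count is taken from the text; everything else is generic):

```python
import numpy as np

CELL_TYPES = 23  # number of predefined BM cell classes

def predict_probabilities(logits: np.ndarray) -> np.ndarray:
    """Convert raw classifier logits of shape (batch, 23) into
    per-class probabilities via a numerically stable softmax."""
    z = logits - logits.max(axis=1, keepdims=True)  # avoid overflow
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)
```

Each row sums to 1, so a reviewer can read the output directly as the model's confidence distribution over the 23 cell types.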

Results: The AOI identification DNN detected relevant regions in BM smears in an average of 6 min. Subsequent acquisition of high-resolution images from the 50 positions with the highest quality scores took an average of 1:30 min.

The OD DNN identified nucleated cells with 96% sensitivity and 90% specificity in an independent test set of 11 FOV. In order not to miss relevant objects, recall was overweighted relative to precision (5:1), leaving correction of false-positive labels to human reviewers in the envisaged digital workflow. For object classification (OC) of single cells, the third DNN predicted the 23 predefined BM cell types with acceptable median accuracy (86%) in an independent test set of 12,798 images. For 7 critical pathological cell forms (including blasts, APL cells, hairy cells, lymphoma cells, tumour cells and inv(16) eosinophils), median accuracy was likewise 86%.
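One common way to realize a 5:1 weighting of recall over precision is to tune the detection confidence threshold against an F-beta score with beta = 5, which treats recall as five times as important as precision. The sketch below illustrates that idea on a validation set; the authors' exact tuning procedure is not specified, so this is an assumed implementation.

```python
import numpy as np

def f_beta(precision: float, recall: float, beta: float = 5.0) -> float:
    """F-beta score; beta > 1 weights recall over precision."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

def pick_threshold(scores: np.ndarray, labels: np.ndarray,
                   beta: float = 5.0) -> float:
    """Choose the confidence cutoff maximizing F-beta on a validation set.

    scores: detector confidences; labels: 1 = true object, 0 = artifact.
    With beta = 5 the cutoff drifts low, trading extra false positives
    (left to human review) for fewer missed cells.
    """
    best_t, best_f = 0.0, -1.0
    for t in np.unique(scores):
        pred = scores >= t
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f = f_beta(p, r, beta)
        if f > best_f:
            best_t, best_f = t, f
    return float(best_t)
```

A large beta pushes the chosen threshold down whenever lowering it recovers true objects, which matches the stated design goal of not missing relevant cells.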

Conclusion: The combination of 3 independent DNNs provides a fully automated workflow for high-throughput hematology labs, comprising high-resolution scanning of AOI in BM smears and the identification and classification of single cells. Overview images (10x magnification) and single cell images (40x magnification with oil) can be reviewed by humans in digital galleries, prioritizing cases with pathological cell types identified by AI. Given the large volumes of data processed, the use of cloud-based platforms is indispensable.

Pohlkamp:MLL Munich Leukemia Laboratory: Current Employment. Nadarajah:MLL Munich Leukemia Laboratory: Current Employment. Maschek:MLL Munich Leukemia Laboratory: Current Employment. Drescher:MetaSystems GmbH: Current Employment. Hänselmann:MetaSystems GmbH: Current Employment. Haferlach:MLL Munich Leukemia Laboratory: Current Employment, Other: Ownership. Kern:MLL Munich Leukemia Laboratory: Current Employment, Other: Ownership. Haferlach:Munich Leukemia Laboratory: Current Employment, Other: Part ownership.

Author notes


Asterisk with author names denotes non-ASH members.
