Deep-learning-based segmentation of individual tooth and bone with periodontal ligament interface details for simulation purposes

Publication: Contribution to journal › Journal article › Research › peer-reviewed


Abstract

The process of constructing precise geometry of human jaws from cone beam computed tomography (CBCT) scans is crucial for building finite element models and treatment planning. Despite the success of deep learning techniques, they struggle to accurately identify delicate features such as thin structures and the gaps at the tooth-bone interfaces where the periodontal ligament resides, especially when trained on limited data. Therefore, segmented geometries obtained through automated methods still require extensive manual adjustment to achieve a smooth and organic 3D geometry that is suitable for simulations. In this work, we require the model to provide anatomically correct segmentation of teeth and bones that preserves the space for the periodontal ligament layers. To accomplish the task with few accurate labels, we pre-train a modified MultiPlanar UNet as the backbone model using inferior segmentations, i.e., tooth-bone segmentation with no space at the tooth-bone interfaces, and fine-tune the model on accurate delineations with a dedicated loss function that accounts for the interface space. We demonstrate that our approach can produce proper tooth-bone segmentations with gap interfaces that are fit for simulations when applied to human jaw CBCT scans. Furthermore, we propose a marker-based watershed segmentation applied to the MultiPlanar UNet probability map to separate individual teeth. This is advantageous when segmentation is challenged by common artifacts from restorative materials, or by similar intensities at tooth-tooth interfaces in the presence of crowded teeth. Code and segmentation results are available at https://github.com/diku-dk/AutoJawSegment.
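The abstract mentions fine-tuning with a dedicated loss function that accounts for the interface space, without specifying its form. Below is a minimal, hypothetical sketch of one way such a gap-aware loss could look in PyTorch: a cross-entropy that up-weights voxels near the tooth-bone interface so the model is penalized more for closing the periodontal-ligament gap. The function name, the `gap_mask` input, and the weighting scheme are illustrative assumptions, not the paper's actual loss; consult the linked repository for the real implementation.

```python
# Hypothetical sketch of a gap-aware fine-tuning loss (assumption, not
# the paper's loss): plain cross-entropy with extra weight on voxels in
# or near the tooth-bone interface.
import torch
import torch.nn.functional as F

def gap_weighted_ce(logits, target, gap_mask, gap_weight=5.0):
    """Cross-entropy with extra weight on interface ("gap") voxels.

    logits:   (N, C, D, H, W) raw network outputs
    target:   (N, D, H, W) integer class labels (e.g. background/tooth/bone)
    gap_mask: (N, D, H, W) boolean mask of voxels near the tooth-bone
              interface, e.g. obtained by dilating the tooth and bone
              masks and intersecting them (an assumption)
    """
    per_voxel = F.cross_entropy(logits, target, reduction="none")
    # Weight 1.0 everywhere, gap_weight on interface voxels.
    weights = 1.0 + (gap_weight - 1.0) * gap_mask.float()
    return (weights * per_voxel).mean()
```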
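The instance-separation step is more concretely specified: a marker-based watershed run on the UNet probability map. The sketch below, using scikit-image and SciPy, shows the general recipe under stated assumptions: high-confidence cores seed one marker per tooth, and the watershed floods the inverted probability map so basin boundaries fall in low-probability regions such as tooth-tooth interfaces. The function name, array layout, and threshold values are illustrative, not taken from the paper.

```python
# A minimal sketch of marker-based watershed instance separation on a
# UNet probability map, assuming an array `tooth_prob` of shape (D, H, W)
# with per-voxel tooth probabilities. Thresholds are illustrative.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def separate_teeth(tooth_prob, fg_thresh=0.5, marker_thresh=0.9):
    """Split a binary tooth mask into individually labeled teeth."""
    # Foreground: every voxel plausibly belonging to some tooth.
    foreground = tooth_prob > fg_thresh
    # Markers: connected high-confidence cores, ideally one per tooth.
    markers, n_teeth = ndi.label(tooth_prob > marker_thresh)
    # Flood from the markers over the inverted probability map,
    # restricted to the foreground; basin boundaries end up in
    # low-probability regions such as tooth-tooth interfaces.
    labels = watershed(-tooth_prob, markers=markers, mask=foreground)
    return labels, n_teeth
```

Seeding from high-confidence cores rather than from the binary mask itself is what lets the watershed split touching teeth: two adjacent teeth may merge at a 0.5 threshold but still have separate cores above 0.9.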

Original language: English
Journal: IEEE Access
Volume: 11
Pages (from-to): 102460-102470
ISSN: 2169-3536
DOI
Status: Published - 2023

Bibliographic note

Publisher Copyright:
Author
