Full metadata record
DC field / Value / Language
dc.contributor.author: Schmidt, Christian
dc.contributor.author: Overhoff, Heinrich Martin
dc.contributor.editor: Skala, Václav
dc.date.accessioned: 2024-07-29T18:22:30Z
dc.date.available: 2024-07-29T18:22:30Z
dc.date.issued: 2024
dc.identifier.citation: WSCG 2024: full papers proceedings: 32. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 247-254. (en)
dc.identifier.issn: 2464-4625 (online)
dc.identifier.issn: 2464-4617 (print)
dc.identifier.uri: http://hdl.handle.net/11025/57396
dc.format: 8 s. (cs)
dc.format.mimetype: application/pdf
dc.language.iso: en (en)
dc.publisher: Václav Skala - UNION Agency (en)
dc.rights: © Václav Skala - UNION Agency (en)
dc.subject: nádor prsu (cs)
dc.subject: klasifikace (cs)
dc.subject: konvoluční neuronové sítě (cs)
dc.subject: ultrazvuk (cs)
dc.subject: nádorové podoblasti (cs)
dc.title: Deep learning-based classification of breast tumors using selected subregions of lesions in sonograms (en)
dc.type: konferenční příspěvek (cs)
dc.type: conferenceObject (en)
dc.rights.access: openAccess (en)
dc.type.version: publishedVersion (en)
dc.description.abstract-translated: Breast cancer, a prevalent disease among women, demands early detection for better clinical outcomes. While mammography is widely used for breast cancer screening, its limitations in, e.g., dense breast tissue necessitate additional diagnostic tools. Ultrasound breast imaging provides valuable tumor information (features) used for standardized reporting, aiding in the screening process and precise biopsy targeting. Previous studies have demonstrated that the classification of regions of interest (ROIs) containing only the lesion outperforms whole-image classification. Therefore, our objective is to identify essential lesion features within such ROIs that are sufficient for accurate tumor classification, enhancing the robustness of diagnostic image acquisition. For our experiments, we employ convolutional neural networks (CNNs) to first segment suspicious lesions' ROIs. In a second step, we generate different ROI subregions: top/bottom halves, horizontal subslices, and ROIs with cropped-out center areas. Subsequently, these ROI subregions are classified into benign vs. malignant lesions with a second CNN. Our results indicate that outermost ROI subslices perform better than inner ones, likely due to increased contour visibility. Removing the inner 66% of the ROI did not significantly impact classification outcomes (p = 0.35). Classifying half ROIs did not negatively impact accuracy compared to whole ROIs, with the bottom ROI performing slightly better than the top ROI, despite significantly lower image contrast in that region. Therefore, even visually less favorable images can be reliably analyzed when the lesion's contour is depicted. In conclusion, our study underscores the importance of understanding tumor features in ultrasound imaging, supporting enhanced diagnostic approaches to improve breast cancer detection and management. (en)
dc.subject.translated: breast tumor (en)
dc.subject.translated: classification (en)
dc.subject.translated: convolutional neural networks (en)
dc.subject.translated: CNN (en)
dc.subject.translated: ultrasound (en)
dc.subject.translated: tumor subregions (en)
dc.identifier.doi: https://doi.org/10.24132/CSRN.3401.25
dc.type.status: Peer reviewed (en)
Appears in Collections: WSCG 2024: Full Papers Proceedings

Files in This Item:

File          Description  Size     Format
C41-2024.pdf  Full text    1.25 MB  Adobe PDF


Use this identifier to cite or link to this item: http://hdl.handle.net/11025/57396

All items in DSpace are protected by copyright, with all rights reserved.