MOSBY enables multi-omic inference and spatial biomarker discovery from whole slide images
Yasin Şenbabaoğlu, Vignesh Prabhakar, Aminollah Khormali

et al.

Scientific Reports, Journal Year: 2024, Volume and Issue: 14(1)

Published: Aug. 6, 2024

The utility of deep neural nets has been demonstrated for mapping hematoxylin-and-eosin (H&E) stained image features to the expression of individual genes. However, these models have not been employed to discover clinically relevant spatial biomarkers. Here we develop MOSBY (Multi-Omic translation of whole Slide images for spatial Biomarker discoverY), which leverages contrastive self-supervised pretraining to extract improved H&E image features, learns a mapping between these features and bulk omic profiles (RNA, DNA, protein), and utilizes tile-level information to discover spatial biomarkers. We validate gene set predictions with spatial transcriptomic and serially-sectioned CD8 IHC data. We demonstrate that MOSBY-inferred colocalization features have survival-predictive power orthogonal to gene expression and enable concordance indices highly competitive with survival-trained multimodal networks. We identify (1) an ER stress-associated feature as a chemotherapy-specific risk factor in lung adenocarcinoma, and (2) the colocalization of T effector cell vs cysteine signatures as a negative prognostic factor in multiple cancer indications. The discovery of biologically interpretable biomarkers showcases the utility of the model in unraveling novel insights into cancer biology as well as informing clinical decision-making.
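To make the idea above concrete, here is a minimal sketch (Python with NumPy; not the published MOSBY code) of the general recipe: tile-level H&E embeddings are mean-pooled to a slide-level feature, a ridge regression maps that feature to a bulk signature score, and the inferred score is evaluated for survival-predictive power with Harrell's concordance index. All shapes, the ridge penalty, and the synthetic data are illustrative assumptions.

# Minimal sketch (not the published MOSBY pipeline).
import numpy as np

rng = np.random.default_rng(0)

# Assumed inputs: per-slide tile embeddings (from a self-supervised encoder),
# matched bulk signature scores, and survival labels.
n_slides, n_tiles, dim = 40, 100, 64
tile_embeddings = rng.normal(size=(n_slides, n_tiles, dim))
bulk_signature = rng.normal(size=n_slides)             # e.g. an ER-stress signature score
time_to_event = rng.exponential(24.0, size=n_slides)   # months
event_observed = rng.integers(0, 2, size=n_slides).astype(bool)

# 1) Mean-pool tiles to a slide-level representation.
slide_features = tile_embeddings.mean(axis=1)

# 2) Ridge regression (closed form) from slide features to the bulk signature.
lam = 1.0
X, y = slide_features, bulk_signature
w = np.linalg.solve(X.T @ X + lam * np.eye(dim), X.T @ y)
inferred_signature = X @ w

# 3) Harrell's C-index: fraction of comparable pairs ordered correctly
#    (higher inferred risk should pair with the earlier observed event).
def concordance_index(time, event, risk):
    concordant, comparable = 0.0, 0
    for i in range(len(time)):
        for j in range(len(time)):
            if event[i] and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

print("C-index of inferred signature:",
      round(concordance_index(time_to_event, event_observed, inferred_signature), 3))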

Language: English

CIMIL-CRC: A clinically-informed multiple instance learning framework for patient-level colorectal cancer molecular subtypes classification from H&E stained images

Hadar Hezi, M Gelber, Alexander Balabanov

et al.

Computer Methods and Programs in Biomedicine, Journal Year: 2024, Volume and Issue: 259, P. 108513 - 108513

Published: Nov. 19, 2024

Language: English

Citations: 1

Prediction of cancer treatment response from histopathology images through imputed transcriptomics
Danh-Tai Hoang, Gal Dinstag, Leandro C. Hermida

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2022, Volume and Issue: unknown

Published: June 9, 2022

Advances in artificial intelligence have paved the way for leveraging hematoxylin and eosin (H&E)-stained tumor slides for precision oncology. We present ENLIGHT-DeepPT, an approach for predicting response to multiple targeted and immunotherapies from H&E slides. In difference from existing approaches that aim to predict treatment response directly from the slides, ENLIGHT-DeepPT is an indirect two-step approach consisting of (1) DeepPT, a new deep-learning framework that predicts genome-wide mRNA expression from the slides, and (2) ENLIGHT, which predicts response based on the DeepPT-inferred expression values. DeepPT successfully predicts transcriptomics in all 16 TCGA cohorts tested and generalizes well to two independent datasets. Importantly, ENLIGHT-DeepPT identifies true responders in five independent patient datasets involving four different treatments spanning six cancer types, with an overall odds ratio of 2.44, increasing the baseline response rate by 43.47% among predicted responders, without the need for any response data during training. Furthermore, its prediction accuracy on these datasets is comparable to that of supervised models trained directly on the images with cross validation on the same cohort. Its future application could provide clinicians with rapid treatment recommendations for an array of therapies and, importantly, may contribute to advancing precision oncology in developing countries. Statement of Significance: this is the first approach shown to predict response to multiple targeted and immune therapies from H&E slides. In distinction from previous approaches, it does not require training data specific to each drug/indication, and it could further help oncologists advance precision oncology in underserved, low-income regions.
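As a concrete, hypothetical illustration of the indirect two-step idea and the response metrics quoted above, the Python sketch below (not the published ENLIGHT-DeepPT code) maps slide features to inferred expression, derives a simple response score from assumed target genes, and computes an odds ratio and response-rate lift on synthetic data; every number and threshold here is an assumption.

# Illustrative two-step sketch (not the published ENLIGHT-DeepPT code).
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (stand-in for DeepPT): a linear map from slide features to a few genes.
n_patients, feat_dim, n_genes = 200, 32, 5
slide_features = rng.normal(size=(n_patients, feat_dim))
W = rng.normal(size=(feat_dim, n_genes))
inferred_expr = slide_features @ W

# Step 2 (stand-in for ENLIGHT): score = mean expression of assumed target genes;
# call the top tertile "predicted responders".
response_score = inferred_expr[:, :3].mean(axis=1)
predicted_responder = response_score > np.quantile(response_score, 2 / 3)

# Synthetic ground-truth response, loosely tied to the prediction for illustration.
true_response = rng.random(n_patients) < (0.2 + 0.3 * predicted_responder)

def odds_ratio(pred, truth):
    a = np.sum(pred & truth)       # predicted responder, responded
    b = np.sum(pred & ~truth)      # predicted responder, did not respond
    c = np.sum(~pred & truth)      # not predicted, responded
    d = np.sum(~pred & ~truth)     # not predicted, did not respond
    return (a * d) / (b * c)

baseline_rate = true_response.mean()
rate_in_predicted = true_response[predicted_responder].mean()
print("odds ratio:", round(odds_ratio(predicted_responder, true_response), 2))
print("response-rate lift: %.1f%%" % (100 * (rate_in_predicted - baseline_rate) / baseline_rate))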

Language: English

Citations: 6

Cross-linking breast tumor transcriptomic states and tissue histology
Muhammad Dawood, Mark Eastwood, Mostafa Jahanifar

et al.

Cell Reports Medicine, Journal Year: 2023, Volume and Issue: 4(12), P. 101313 - 101313

Published: Dec. 1, 2023

Identification of the gene expression state of a cancer patient from routine pathology imaging and characterization of its phenotypic effects have significant clinical and therapeutic implications. However, prediction of the expression of individual genes from whole slide images (WSIs) is challenging due to the co-dependent or correlated expression of multiple genes. Here, we use a purely data-driven approach to first identify groups of genes with co-dependent expression and then predict their status from WSIs using a bespoke graph neural network. These gene groups allow us to capture the expression state of a patient with a small number of binary variables that are biologically meaningful and carry histopathological insights for individual cases. Prediction based on these gene groups allows associating histological phenotypes (cellular composition, mitotic counts, grading, etc.) with the underlying gene expression patterns and opens avenues for gaining biological insights from routine imaging directly.
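A minimal sketch of the gene-grouping idea on synthetic data is shown below (Python with scikit-learn): genes are clustered by co-expression, each group's activity is binarized per patient, and a plain logistic regression, used here only as a stand-in for the paper's bespoke graph neural network, predicts the binary group status from image-derived features. All sizes and data are assumptions.

# Illustrative gene-grouping sketch (not the paper's pipeline).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_patients, n_genes, n_groups, img_dim = 300, 120, 6, 48

expression = rng.normal(size=(n_patients, n_genes))      # synthetic bulk expression
image_features = rng.normal(size=(n_patients, img_dim))  # synthetic WSI-derived features

# 1) Cluster genes into co-expression groups using their correlation profiles.
gene_corr = np.corrcoef(expression.T)                    # genes x genes
groups = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(gene_corr)

# 2) One binary variable per group: is the group's mean expression above the median?
group_scores = np.stack(
    [expression[:, groups == g].mean(axis=1) for g in range(n_groups)], axis=1)
group_status = (group_scores > np.median(group_scores, axis=0)).astype(int)

# 3) Predict each group's binary status from image features.
for g in range(n_groups):
    auc = cross_val_score(LogisticRegression(max_iter=1000),
                          image_features, group_status[:, g],
                          cv=5, scoring="roc_auc").mean()
    print(f"group {g}: cross-validated AUROC = {auc:.2f}")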

Language: English

Citations: 3

Digital profiling of cancer transcriptomes from histology images with grouped vision attention
Yuanning Zheng, Marija Pizurica, Francisco Carrillo‐Pérez

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2023, Volume and Issue: unknown

Published: Sept. 30, 2023

Cancer is a heterogeneous disease that demands precise molecular profiling for better understanding and management. Recently, deep learning has demonstrated potential for the cost-efficient prediction of molecular alterations from histology images. While transformer-based architectures have enabled significant progress in non-medical domains, their application to histology images remains limited due to small dataset sizes coupled with the explosion of trainable parameters. Here, we develop SEQUOIA, a transformer model to predict cancer transcriptomes from whole-slide images. To enable the full potential of transformers, we first pre-train the model using data from 1,802 normal tissues. Then, we fine-tune and evaluate it in 4,331 tumor samples across nine cancer types. The performance is assessed at individual gene and pathway levels through Pearson correlation analysis and root mean square error. The generalization capacity is validated across two independent cohorts comprising 1,305 tumors. In predicting the expression of 25,749 genes, the highest performance is observed in cancers of the breast, kidney and lung, where SEQUOIA accurately predicts the expression of 11,069, 10,086 and 8,759 genes, respectively. The accurately predicted genes are associated with the regulation of inflammatory response, cell cycles and metabolisms. Although trained at the tissue level, we showcase the model's ability to predict spatial gene expression patterns using spatial transcriptomics datasets. Leveraging its prediction performance, we develop a digital signature of the risk of recurrence in breast cancer. SEQUOIA deciphers clinically relevant gene expression patterns from histology images, opening avenues for improved cancer management and personalized therapies.
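The toy PyTorch sketch below illustrates the general recipe described above, not the published SEQUOIA model: a small transformer encoder attends over tile embeddings, a linear head regresses per-gene expression, and performance is summarized by per-gene Pearson correlation and RMSE; all sizes, data, and the training loop are assumptions.

# Toy transformer regression sketch (not the published SEQUOIA model).
import torch
import torch.nn as nn

torch.manual_seed(0)
n_slides, n_tiles, dim, n_genes = 64, 50, 32, 10
tiles = torch.randn(n_slides, n_tiles, dim)       # assumed tile embeddings per slide
expression = torch.randn(n_slides, n_genes)       # matched (synthetic) bulk expression

class SlideRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, n_genes)

    def forward(self, x):
        tokens = self.encoder(x)                  # attention over tiles of a slide
        return self.head(tokens.mean(dim=1))      # mean-pool tiles, regress genes

model = SlideRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):                           # toy training loop on synthetic data
    optimizer.zero_grad()
    loss = loss_fn(model(tiles), expression)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    pred = model(tiles)
    rmse = torch.sqrt(((pred - expression) ** 2).mean(dim=0))
    centered_p = pred - pred.mean(dim=0)
    centered_e = expression - expression.mean(dim=0)
    pearson = (centered_p * centered_e).sum(dim=0) / (
        centered_p.norm(dim=0) * centered_e.norm(dim=0))
    print("mean per-gene Pearson r:", round(pearson.mean().item(), 3))
    print("mean per-gene RMSE:", round(rmse.mean().item(), 3))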

Language: English

Citations: 2
