GTPLM-GO: Enhancing Protein Function Prediction Through Dual-Branch Graph Transformer and Protein Language Model Fusing Sequence and Local–Global PPI Information
Haotian Zhang, Yundong Sun, Yansong Wang

et al.

International Journal of Molecular Sciences, Year: 2025, Issue: 26(9), pp. 4088 - 4088

Published: April 25, 2025

Currently, protein–protein interaction (PPI) networks have become an essential data source for protein function prediction. However, methods utilizing graph neural networks (GNNs) face significant challenges in modeling PPI networks. A primary issue is over-smoothing, which occurs when multiple GNN layers are stacked to capture global information. This architectural limitation inherently impairs the integration of local and global information within PPI networks, thereby limiting prediction accuracy. To effectively utilize this information, we propose GTPLM-GO, a protein function prediction method based on a dual-branch Graph Transformer and a protein language model. The method achieves collaborative modeling of local and global PPI information through two branches: a graph neural network and a linear attention-based encoder. GTPLM-GO integrates local–global PPI information with the functional semantic encoding constructed by the protein language model, overcoming the inadequate information extraction of existing methods. Experimental results demonstrate that GTPLM-GO outperforms advanced network-based and sequence-based methods on datasets of varying scales.

Language: English


Cited by: 0