Unified Deep Learning of Molecular and Protein Language Representations with T5ProtChem

Thomas Kelly, Song Xia, Jieyu Lu, Yingkai Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

Deep learning has revolutionized difficult tasks in chemistry and biology, yet existing language models often treat these domains separately, relying on concatenated architectures and independently pretrained weights. These approaches fail to fully exploit the shared atomic foundations of molecular and protein sequences. Here, we introduce T5ProtChem, a unified model based on the T5 architecture, designed to simultaneously process molecular and protein sequences. Using a new pretraining objective, ProtiSMILES, T5ProtChem bridges the molecular and protein domains, enabling efficient, generalizable protein-chemical modeling. The model achieves state-of-the-art performance on tasks such as binding affinity prediction and reaction prediction, while showing strong performance in protein function prediction. Additionally, it supports novel applications, including covalent binder classification and sequence-level adduct prediction. These results demonstrate the versatility of unified language models for drug discovery, protein engineering, and other interdisciplinary efforts in computational biology and chemistry.
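The abstract describes T5-style pretraining over joint protein and small-molecule sequences. As a rough illustration only, the sketch below shows T5's span-corruption objective applied to a character-tokenized protein sequence concatenated with a SMILES string; the joint "ProtiSMILES" input format, the `<sep>` token, and the span positions are assumptions for this example, not the paper's actual specification.

```python
def t5_span_corrupt(tokens, spans, sentinel="<extra_id_{}>"):
    """T5-style span corruption: replace (start, length) spans with sentinels.

    Returns (corrupted_input, target): the input has each span replaced by a
    numbered sentinel token; the target lists each sentinel followed by the
    tokens that were removed, as in T5 denoising pretraining.
    """
    corrupted, target = [], []
    pos = 0
    for i, (start, length) in enumerate(sorted(spans)):
        corrupted.extend(tokens[pos:start])          # keep tokens before the span
        corrupted.append(sentinel.format(i))         # mask the span with a sentinel
        target.append(sentinel.format(i))            # target echoes the sentinel...
        target.extend(tokens[start:start + length])  # ...followed by the masked tokens
        pos = start + length
    corrupted.extend(tokens[pos:])                   # keep the tail
    return corrupted, target

# Hypothetical joint protein+SMILES input (the actual T5ProtChem tokenization
# and separator are assumptions here).
protein = list("MKTAYIAK")      # amino acids as character tokens
smiles = list("CC(=O)O")        # acetic acid SMILES, character-tokenized
joint = protein + ["<sep>"] + smiles

inp, tgt = t5_span_corrupt(joint, [(2, 2), (10, 3)])
print(" ".join(inp))  # masked joint sequence fed to the encoder
print(" ".join(tgt))  # denoising target for the decoder
```

Because both domains share one vocabulary and one objective, a single encoder-decoder can learn cross-domain context, which is the property the abstract attributes to the unified model.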

Original language: English (US)
Pages (from-to): 3990-3998
Number of pages: 9
Journal: Journal of Chemical Information and Modeling
Volume: 65
Issue number: 8
DOIs
State: Published - Apr 28 2025

ASJC Scopus subject areas

  • General Chemistry
  • General Chemical Engineering
  • Computer Science Applications
  • Library and Information Sciences
