Binding affinity predictions with hybrid quantum-classical convolutional neural networks.

L Domingo, M Djukic, C Johnson, F Borondo
Author Information
  1. L Domingo: Grupo de Sistemas Complejos, Universidad Politécnica de Madrid, 28035, Madrid, Spain. laia@ingenii.dev.
  2. M Djukic: Ingenii Inc., New York, USA.
  3. C Johnson: Ingenii Inc., New York, USA.
  4. F Borondo: Departamento de Química, Universidad Autónoma de Madrid, 28049, Cantoblanco, Madrid, Spain.

Abstract

Central to drug design is the identification of biomolecules that bind uniquely and robustly to a target protein while minimizing their interactions with others. Accordingly, precise binding affinity prediction, which enables the accurate selection of suitable candidates from an extensive pool of potential compounds, can greatly reduce the expenses associated with experimental protocols. In this respect, recent advances have revealed that deep learning methods show superior performance compared to other traditional computational methods, especially with the advent of large datasets. These methods, however, are complex and very time-intensive, representing a clear bottleneck for their development and practical application. In this context, the emerging field of quantum machine learning holds promise for enhancing numerous classical machine learning algorithms. In this work, we take one step forward and present a hybrid quantum-classical convolutional neural network that reduces the complexity of its classical counterpart by 20% while maintaining optimal performance in the predictions. This also yields cost and time savings of up to 40% in the training stage, which translates into a substantial speed-up of the drug design process.
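The abstract does not spell out the architecture, but hybrid quantum-classical CNNs of this kind typically replace an early convolutional filter with a small parameterized quantum circuit (a "quanvolutional" layer), whose per-qubit measurement expectations become feature-map channels for a classical CNN head. The sketch below is a minimal NumPy statevector simulation of such a 4-qubit filter under standard assumptions (RY angle encoding, a ring of CNOTs, Pauli-Z readout); the function names and circuit layout are illustrative, not the authors' implementation.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, qubit, n):
    """Apply a 1-qubit gate to `qubit` of an n-qubit statevector."""
    psi = np.moveaxis(state.reshape([2] * n), qubit, 0)
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    return np.moveaxis(psi, 0, qubit).reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply CNOT: flip `target` where `control` is |1>."""
    psi = np.moveaxis(state.reshape([2] * n).copy(), (control, target), (0, 1))
    psi[1] = psi[1, ::-1].copy()
    return np.moveaxis(psi, (0, 1), (control, target)).reshape(-1)

def quanv_patch(patch):
    """'Quanvolutional' filter: encode a 2x2 patch of pixel values in
    [0, 1] with RY rotations, entangle with a ring of CNOTs, and return
    the four per-qubit <Z> expectations as output channels."""
    n = 4
    state = np.zeros(2 ** n)
    state[0] = 1.0                                     # start in |0000>
    for q, x in enumerate(np.asarray(patch).flatten()):
        state = apply_1q(state, ry(np.pi * x), q, n)   # angle encoding
    for q in range(n):
        state = apply_cnot(state, q, (q + 1) % n, n)   # ring entangler
    p = (np.abs(state) ** 2).reshape([2] * n)
    # <Z_q> = P(q = 0) - P(q = 1)
    return np.array([1.0 - 2.0 * p.take(1, axis=q).sum() for q in range(n)])

def quanv_layer(image):
    """Slide the quantum filter over a grayscale image with stride 2,
    producing 4 feature maps at half the spatial resolution."""
    h, w = image.shape
    out = np.zeros((h // 2, w // 2, 4))
    for i in range(0, h - 1, 2):
        for j in range(0, w - 1, 2):
            out[i // 2, j // 2] = quanv_patch(image[i:i + 2, j:j + 2])
    return out
```

In this setup, an 8x8 input slice would become a (4, 4, 4) stack of quantum feature maps fed to the remaining classical convolutional and dense layers, which is one way a quantum layer can stand in for a classical convolutional block and shrink the trainable-parameter count.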

Grants

  1. LCF/BQ/DR20/11790028/'la Caixa' Foundation
