Comparison of Deep Learning Models Using Protein Sequence Data

A Comparative Experiment of Major Deep Learning Models for Protein Function Prediction

  • Jeongmin Lee (Dept. of Computer Convergence Electronic Engineering, Bio Big Data Convergence, Sunmoon University)
  • Hyun Lee (Division of Computer Engineering, Sunmoon University)
  • Received : 2021.12.10
  • Accepted : 2022.02.22
  • Published : 2022.06.30

Abstract

Proteins are the basic units of all life activities, and understanding them is essential for studying life phenomena. Since machine learning with artificial neural networks emerged, many researchers have tried to predict protein function from protein sequences alone. Many combinations of deep learning models have been reported in the literature, but their methods differ, no standard methodology exists, and each is tailored to different data, so there has been no direct comparative analysis of which algorithms are better suited to protein data. In this paper, we applied the same data to CNN, LSTM, and GRU models, the representative algorithms most frequently used in protein function prediction research, and compared and evaluated the single-model performance of each algorithm in terms of accuracy and speed; the final evaluation scale is reported as micro-averaged precision, recall, and F1-score. The combined CNN-LSTM and CNN-GRU models were evaluated in the same way. Through this study, we confirmed that LSTM performs well as a single model on simple classification problems, that a stacked CNN is more suitable as a single model on complex classification problems, and that CNN-LSTM is relatively better among the combined models.
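The final evaluation scale above is micro-averaged precision, recall, and F1-score. As a minimal illustration (not the paper's own code), micro averaging pools true positives, false positives, and false negatives over all classes before computing each metric:

```python
def micro_prf(y_true, y_pred):
    """Micro-averaged precision, recall, and F1 for a multi-class task.

    Micro averaging sums per-class counts first: TP = correct predictions
    of class c, FP = predictions of c that were wrong, FN = instances of c
    that were missed, each summed over all classes c.
    """
    classes = set(y_true) | set(y_pred)
    tp = fp = fn = 0
    for c in classes:
        tp += sum(1 for t, p in zip(y_true, y_pred) if t == p == c)
        fp += sum(1 for t, p in zip(y_true, y_pred) if p == c and t != c)
        fn += sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: 5 sequences, 3 hypothetical class labels
y_true = ["A", "A", "B", "C", "C"]
y_pred = ["A", "B", "B", "C", "A"]
p, r, f = micro_prf(y_true, y_pred)  # 3 of 5 correct -> 0.6, 0.6, 0.6
```

Note that for single-label classification, where every sequence receives exactly one prediction, the three micro scores coincide with plain accuracy; they diverge in multi-label settings.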

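The compared models consume protein sequences as their only input. The paper's exact preprocessing is not shown here, but a common scheme (an assumption for illustration, not necessarily the authors' pipeline) maps the 20 standard amino acids to integer indices and pads each sequence to a fixed length, the usual input format for a CNN/LSTM/GRU embedding layer:

```python
# Hypothetical preprocessing sketch: integer-encode amino acids and pad
# sequences to a fixed length for a neural network embedding layer.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues
AA_TO_INT = {aa: i + 1 for i, aa in enumerate(AMINO_ACIDS)}  # 0 = padding

def encode(seq, max_len=100):
    """Map a protein sequence to a fixed-length list of integer codes.

    Unknown residues (e.g. 'X') map to 0, like padding; longer sequences
    are truncated and shorter ones are right-padded with 0.
    """
    codes = [AA_TO_INT.get(aa, 0) for aa in seq[:max_len]]
    return codes + [0] * (max_len - len(codes))

vec = encode("MKVLAA", max_len=8)  # -> [11, 9, 18, 10, 1, 1, 0, 0]
```

A CNN then slides filters over such vectors (after embedding) to pick up local sequence motifs, while LSTM/GRU layers read them position by position to capture longer-range order.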


Acknowledgement

This work was supported by the BK21 FOUR program (Brain Korea 21, fourth stage) funded by the Ministry of Education and the National Research Foundation of Korea.

References

  1. A. Dalkiran, A. S. Rifaioglu, M. J. Martin, R. Cetin-Atalay, V. Atalay, T. Dogan, "ECPred: A tool for the prediction of the enzymatic functions of protein sequences based on the EC nomenclature," BMC Bioinformatics, Vol.19, No.1, pp.334, 2018. https://doi.org/10.1186/s12859-018-2368-y
  2. A. Amidi, S. Amidi, D. Vlachakis, V. Megalooikonomou, N. Paragios, E. I. Zacharaki, "EnzyNet: Enzyme classification using 3D convolutional neural networks on spatial representation," PeerJ, Vol.6, pp.e4750, 2018. https://doi.org/10.7717/peerj.4750
  3. X. Xiao, L. Duan, G. Xue, G. Chen, P. Wang, W. R. Qiu, "MF-EFP: Predicting multi-functional enzymes function using improved hybrid multi-label classifier," IEEE Access, Vol.8, pp.50276-50284, 2020. https://doi.org/10.1109/access.2020.2979888
  4. N. Strodthoff, P. Wagner, M. Wenzel, and W. Samek, "UDSMProt: Universal deep sequence models for protein classification," Bioinformatics, Vol.36, Iss.8, pp.2401-2409, 2020. https://doi.org/10.1093/bioinformatics/btaa003
  5. R. Semwal, I. Aier, P. Tyagi, and P. K. Varadwaj, "DeEPn: A deep neural network based tool for enzyme functional annotation," Journal of Biomolecular Structure and Dynamics, Vol.39, No.8, pp.2733-2743, 2021. https://doi.org/10.1080/07391102.2020.1754292
  6. J. Y. Ryu, H. U. Kim, and S. Y. Lee, "Deep learning enables high-quality and high-throughput prediction of enzyme commission numbers," Proceedings of the National Academy of Sciences, Vol.116, No.28, pp.13996-14001, 2019. https://doi.org/10.1073/pnas.1821905116
  7. S. A. Memon, K. A. Khan, and H. Naveed, "HECNet: A hierarchical approach to enzyme function classification using a Siamese Triplet Network," Bioinformatics, Vol.36, No.17, pp.4583-4589, 2020. https://doi.org/10.1093/bioinformatics/btaa536
  8. C. Mirabello and B. Wallner, "rawMSA: End-to-end deep learning using raw multiple sequence alignments," PLoS ONE, Vol.14, No.8, pp.e0220182, 2019. https://doi.org/10.1371/journal.pone.0220182
  9. Y. Guo, W. Li, B. Wang, H. Liu, and D. Zhou, "DeepACLSTM: Deep asymmetric convolutional long short-term memory neural models for protein secondary structure prediction," BMC Bioinformatics, Vol.20, No.1, pp.341, 2019. https://doi.org/10.1186/s12859-019-2940-0
  10. E. C. Alley, G Khimulya, S. Biswas, M. AlQuraishi, G. M. Church, "Unified rational protein engineering with sequence-based deep representation learning," Nature Methods, Vol.16, No.12, pp.1315-1322, 2019. https://doi.org/10.1038/s41592-019-0598-1
  11. Y. Jiang, D. Wang, and D. Xu, "DeepDom: Predicting protein domain boundary from sequence alone using stacked bidirectional LSTM," Pacific Symposium on Biocomputing, Vol.24, pp.66-75, 2019.
  12. S. Min, H. Kim, B. Lee, and S. Yoon, "Protein transfer learning improves identification of heat shock protein families," PLoS ONE, Vol.16, No.5, pp.e0251865, 2021. https://doi.org/10.1371/journal.pone.0251865
  13. C. Claudel-Renard, C. Chevalet, T. Faraut, and D. Kahn, "Enzyme-specific profiles for genome annotation: PRIAM," Nucleic Acids Research, Vol.31, No.22, pp.6633-6639, 2003. https://doi.org/10.1093/nar/gkg847
  14. R. d. O. Almeida and G. T. Valente, "Predicting metabolic pathways of plant enzymes without using sequence similarity: Models from machine learning," The Plant Genome, Vol.13, No.3, pp.e20043, 2020.
  15. N. Q. K. Le, E. K. Y. Yapp, and H. Yeh, "ET-GRU: Using multi-layer gated recurrent units to identify electron transport proteins," BMC Bioinformatics, Vol.20, No.1, pp.377, 2019. https://doi.org/10.1186/s12859-019-2972-5
  16. Z. Tao, B. Dong, Z. Teng, and Y. Zhao, "The classification of enzymes by deep learning," IEEE Access, Vol.8, pp.89802-89811, 2020. https://doi.org/10.1109/access.2020.2992468
  17. O. B. Sezer and A. M. Ozbayoglu, "Algorithmic financial trading with deep convolutional neural networks: Time series to image conversion approach," Applied Soft Computing, Vol.70, pp.525-538, 2018. https://doi.org/10.1016/j.asoc.2018.04.024
  18. K. Bhardwaj, "Convolutional Neural Network(CNN/ConvNet) in stock price movement prediction," arXiv:2106.01920, 2021.
  19. S. Hochreiter, "Untersuchungen zu dynamischen neuronalen Netzen," Diplom thesis, Institut für Informatik, Technische Universität München, 1991.
  20. Y. Bengio, P. Simard, and P. Frasconi, "Learning long-term dependencies with gradient descent is difficult," IEEE Transactions on Neural Networks, Vol.5, No.2, pp.157-166, 1994. https://doi.org/10.1109/72.279181
  21. K. Cho et al., "Learning phrase representations using RNN encoder-decoder for statistical machine translation," arXiv preprint arXiv:1406.1078, 2014.
  22. Y. Kim, "Convolutional neural networks for sentence classification," in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp.1746-1751, 2014.
  23. D. Masters and C. Luschi, "Revisiting small batch training for deep neural networks," arXiv:1804.07612, 2018.