References
- S. Lim and S. Lee, "Research Trends in Artificial Intelligence Language Models", Information and Communication Magazine, Vol. 40, No. 3, pp. 42-50, 2023.
- M. Shanahan, "Talking about Large Language Models", Communications of the ACM, Vol. 67, No. 2, pp. 68-79, 2024. https://doi.org/10.1145/3624724
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez et al., "Attention is All You Need", Advances in Neural Information Processing Systems, pp. 5998-6008, 2017.
- J. Devlin, M. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", North American Chapter of the Association for Computational Linguistics, pp. 4171-4186, 2019.
- A. Radford, K. Narasimhan, T. Salimans, and I. Sutskever, "Improving Language Understanding by Generative Pre-Training", OpenAI, 2018.
- LDCC. (2024, February 28). LDCC/LDCC-SOLAR-10.7B. Hugging Face. https://huggingface.co/LDCC/LDCC-SOLAR-10.7B
- Yanolja. (2024, March 16). Yanolja/Bookworm-10.7B-v0.4-DPO. Hugging Face. https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO
- Dopeornope. (2024, January 15). DopeorNope/SOLARC-M-10.7B. Hugging Face. https://huggingface.co/DopeorNope/SOLARC-M-10.7B
- Meta. (2023, November 13). Meta-Llama/Llama-2-13b-hf. Hugging Face. https://huggingface.co/meta-llama/Llama-2-13b-hf
- Heavytail. (2024, January 28). Heavytail/Kullm-Solar. Hugging Face. https://huggingface.co/heavytail/kullm-solar
- Beomi. (2023, May 3). Beomi/KoAlpaca-Polyglot-12.8B. Hugging Face. https://huggingface.co/beomi/KoAlpaca-Polyglot-12.8B