NLP (2)

[Paper Review] Attention is All You Need (2025. 1. 24.)
Paper Details
Title: Attention is All You Need
Authors: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin
Conference: NeurIPS 2017 (31st Conference on Neural Information Processing Systems)
Year of Publication: 2017
Link: https://arxiv.org/abs/1706.03762
Key Focus: The Transformer replaces traditional RNN- and CNN-based sequence mod...

[Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2024. 11. 11.)
Paper Details
Title: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Authors: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
Conference: NAACL 2019
Year of Publication: 2019
Link: https://arxiv.org/abs/1810.04805
Key Focus: This paper introduces BERT (Bidirectional Encoder Representations from Transformers), a novel language representation model that pr...