Studying natural language models from the beginning
2022. 7. 26. 21:52 · Artificial_Intelligence/Natural Language Processing
These are study notes for checking my understanding of natural language models.
1. 단순 신경망
2. RNN
3. LSTM
4. GRU
5. Seq2Seq (Sequence to Sequence)
6. Attention Mechanism
7. Teacher Forcing
8. Beam Search Algorithm
9. Transformer (Encoder, Decoder)
10. BERT
11. RoBERTa
12. ALBERT
13. Embedding / Encoding
14. Knowledge Distillation
15. Self-Explaining
16. Sentence BERT