[Paper Study]

1. [Paper Review] Attention Is All You Need


2. [Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
