# Contrastive Learning
[Story Generation #2] Genre-Controllable Story Generation via Supervised Contrastive Learning (WWW, 2022)
Challenge: With advances in pretrained language models, controllable text generation has been drawing attention. However, it still falls short when it comes to story-specific controllability!
TS-TCC Summary and Analysis - 1
While working on my capstone design project for graduation, I ended up studying contrastive learning; I am writing this post to summarize the TS-TCC paper I used in that study. Code: https://github.com/emadeldeen24/TS-TCC Paper: ht

Incremental False Negative Detection for Contrastive Learning (ICLR / 2022)
Mitigates the false negative problem in contrastive learning by incrementally detecting and removing false negatives

Boosting Contrastive Learning with Relation Knowledge Distillation (AAAI/ 2022)
Proposes a relation knowledge distillation technique that links cluster-based and contrast-based approaches
[Brief Notes] Adversarial Self-Supervised Contrastive Learning (NeurIPS 2020)
Some notes on Adversarial, Self-Supervised, and Contrastive Learning
[Brief Notes] MixCo: Mix-up Contrastive Learning for Visual Representation (arXiv 2020)
Some notes on MixCo

CoDA: Contrast-Enhanced and Diversity-Promoting Data Augmentation for Natural Language Understanding (ICLR / 2021)
Proposes a data augmentation technique that produces informative augmented data by sequentially stacking adversarial training on back-translation and using a consistency loss together with a contrastive loss

Similarity Learning & Contrastive Learning
Similarity Learning & Contrastive Learning (1)

Neighborhood Contrastive Learning for Novel Class Discovery (CVPR / 2021) paper review
Proposes Neighborhood Contrastive Learning with a hard negative generation method for the Novel Class Discovery task

Large-Margin Contrastive Learning with Distance Polarization Regularizer (ICML / 2021) paper review
Proposes a method that improves conventional contrastive learning by using a Distance Polarization Regularizer
Review: Understanding Contrastive Loss
Contrastive loss is a hardness-aware loss function (hardness-aware: it encourages clearly separable representations — far between classes, close within a class). The temperature controls the strength of the penalty on hard negative samples.
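The hardness-aware role of temperature can be illustrated with a minimal InfoNCE-style sketch (a toy NumPy example, not code from any of the reviewed papers): a lower temperature sharpens the softmax over similarities, so negatives that are close to the anchor dominate the loss.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss for a single anchor (illustrative).

    Lower temperature sharpens the softmax over similarities, so hard
    negatives (those most similar to the anchor) are penalized more --
    the 'hardness-aware' behaviour described above.
    """
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # similarity to the positive first, then to each negative
    sims = np.array([cos(anchor, positive)] + [cos(anchor, n) for n in negatives])
    logits = sims / temperature
    # numerically stable log-softmax; the positive sits at index 0
    m = logits.max()
    log_probs = logits - m - np.log(np.exp(logits - m).sum())
    return -log_probs[0]
```

With an easy positive and dissimilar negatives, lowering the temperature drives the loss toward zero, while a high temperature leaves a flatter distribution and a larger loss.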
SimCSE: Simple Contrastive Learning of Sentence Embeddings
This post covers applying dropout-based contrastive learning to pre-trained language models (PLMs) trained with a masked language modeling (MLM) objective, such as BERT and RoBERTa, in both unsupervised and supervised settings.
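The unsupervised SimCSE idea — the same sentence passed through the encoder twice with different dropout masks yields a positive pair, with other in-batch sentences as negatives — can be sketched as follows. The encoder here is a hypothetical stand-in (a linear layer with dropout), not BERT; the objective structure is what matters.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W, drop_rate=0.1):
    """Toy stand-in for a PLM encoder: linear projection + dropout.

    Two passes over the same input produce two different 'views' of each
    sentence, because each pass samples a fresh dropout mask -- this is
    the SimCSE positive-pair construction.
    """
    h = x @ W
    mask = rng.random(h.shape) > drop_rate
    return h * mask / (1.0 - drop_rate)

def simcse_unsup_loss(batch, W, temperature=0.05):
    """Unsupervised SimCSE objective: for each row, its second dropout
    view is the positive; the other rows' views are in-batch negatives."""
    z1 = encode(batch, W)  # first dropout pass
    z2 = encode(batch, W)  # second dropout pass
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature       # cosine similarity matrix
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the matching view (diagonal) as the target
    return -np.mean(np.diag(log_probs))
```

The diagonal of the similarity matrix holds the positive pairs, so the loss is an in-batch cross-entropy, just as in other contrastive objectives.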

Self-Guided Contrastive Learning for BERT Sentence Representations (ACL / 2021) paper review
A contrastive learning method for improving BERT's sentence representations