Paper Reviews

1. [Paper Review] Revisiting Random Channel Pruning for Neural Network Compression

2. [Paper Review] When to Prune? A Policy towards Early Structural Pruning

4. [Paper Review] CHIP: CHannel Independence-based Pruning for Compact Neural Networks

5. [Paper Review] Learning N:M Fine-Grained Structured Sparse Neural Networks From Scratch

7. [Paper Review] Training data-efficient image transformers & distillation through attention

8. [Paper Review] Swin Transformer: Hierarchical Vision Transformer using Shifted Windows
