[Lab #3-2] Cross-lingual learning for text processing: A survey (Expert Systems with Applications, 2021)

누렁이 · April 23, 2023

DSAIL-LAB


0. Abstract


1. Introduction

Goal

Contribution

2. Definition of cross-lingual learning

2.1 Definition

2.2 Terminology note

3. Methodology

3.1 Selection

3.2 Observed aspects

3.3 Limitations

4. Tasks, datasets and languages

5. Cross-lingual resources and technologies

5.1 Multilingual distributional representations

5.2 Parallel corpus

5.3 Word alignments

5.4 Machine translation

5.5 Universal features

5.6 Bilingual dictionary

5.7 Pre-trained multilingual language models

5.8 Language features

5.9 Others

6. Transfer paradigms

6.1 Label transfer

6.1.1 Correspondence

6.1.2 Projection

6.1.3 Zero-shot and multi-source learning

6.1.4 Reducing noise

6.2 Feature transfer

6.3 Parameter transfer

6.3.1 Language independent representations

6.3.2 Joint learning techniques

6.3.3 Cascade learning techniques

6.3.4 Layer-specific strategies

6.3.5 Multi-source learning

6.3.6 Pre-trained multilingual language models

6.4 Representation transfer

6.5 Comparison and summary

7. Future directions

7.1 Multilingual datasets

7.2 Standardization of linguistic resources

7.3 Pre-trained multilingual language models

7.4 Truly low-resource languages and excluded languages

7.6 Curse of multilinguality

7.7 Combination with multitask learning

7.8 Word alignments

7.9 Machine translation

8. Conclusions
