Factorization of the neural parameter space for zero-shot multi-lingual and multi-task transfer. Code for the paper:
Edoardo M. Ponti, Ivan Vulić, Ryan Cotterell, Marinela Parović, Roi Reichart and Anna Korhonen. 2020. Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages. [arXiv]
If you use this software for academic research, please cite the paper in question:
@misc{ponti2020parameter,
title={Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages},
author={Edoardo M. Ponti and Ivan Vulić and Ryan Cotterell and Marinela Parovic and Roi Reichart and Anna Korhonen},
year={2020},
eprint={2001.11453},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
The code requires:
- python 3.5.2
- pytorch 1.1.0
Obtain the data from Universal Dependencies (for POS tagging) and WikiAnn (for NER):
./tools/get_data.sh
Run the model and the baselines. For instance, to train and evaluate the parameter space factorization model with a low-rank factor covariance:
python src/run_matrix_completion.py --mode lrcmeta --rank_cov 10
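As an illustration of what a low-rank factor covariance means in practice, the sketch below samples a flattened parameter vector from a Gaussian whose covariance is a low-rank factor plus a diagonal term, with the rank playing the role of `--rank_cov`. This is a minimal, hypothetical sketch for intuition, not the repository's implementation; all names in it are invented.

```python
import torch

def sample_params(mean, factor, log_diag, n_samples=1):
    """Sample from N(mean, F F^T + diag(exp(log_diag))).

    mean:     (d,)   mean of the parameter posterior
    factor:   (d, r) low-rank covariance factor F (r would be rank_cov)
    log_diag: (d,)   log-variances of the diagonal correction
    """
    d, r = factor.shape
    eps_lr = torch.randn(n_samples, r)   # noise through the low-rank factor
    eps_d = torch.randn(n_samples, d)    # independent per-dimension noise
    # Reparameterized sample: mean + F eps_lr + diag-scaled eps_d
    return mean + eps_lr @ factor.t() + eps_d * torch.exp(0.5 * log_diag)

# Toy usage: 100 parameters, covariance rank 10 (mirroring --rank_cov 10)
mean = torch.zeros(100)
factor = 0.1 * torch.randn(100, 10)
log_diag = torch.full((100,), -4.0)
samples = sample_params(mean, factor, log_diag, n_samples=5)
print(samples.shape)
```

Storing the factor and the diagonal instead of a full covariance keeps the cost linear in the number of parameters rather than quadratic.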
The part of the code for multilingual BERT is adapted from HuggingFace's Transformers. The linked repository contains a copy of the original license and the citation for the library.