Data from this paper can be downloaded here.
Code is available here.
- mlama1.1.zip: 53 languages [40 MB]
- mlama1.1-all.zip: all data [122 MB]
If you use the provided data in your work, please cite the following paper:
@inproceedings{kassner2021multilingual,
    title = "Multilingual {LAMA}: Investigating Knowledge in Multilingual Pretrained Language Models",
    author = {Kassner, Nora and Dufter, Philipp and Sch{\"u}tze, Hinrich},
    booktitle = "to appear in Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics",
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
}
Contact: Nora Kassner, Philipp Dufter