Presented at NAACL 2024!
With my colleague, Tom Lupicki, I presented our paper, “Experiments in Mamba Sequence Modeling and NLLB-200 Fine-Tuning for Low Resource Multilingual Machine Translation”, at NAACL 2024.
This paper was our submission to Shared Task 1 at the AmericasNLP 2024 Workshop, co-located with NAACL 2024. We found that a careful fine-tuning curriculum substantially improved the performance of a pre-trained machine translation model on low-resource languages. Our submission won second place in the shared task!