Kubishi Research Group

Comparing LLM-Based Translation Approaches for Extremely Low-Resource Languages

  • artificial-intelligence
  • large-language-models
  • language-revitalization
Jared Coleman, Ruben Rosales, Kira Toal, Diego Cuadros, Nick Leeds, Bhaskar Krishnamachari, Khalil Iskarous
LoResMT @ EACL 2026 - Workshop on Low-Resource Machine Translation
March 28, 2026

Abstract

We present a comprehensive evaluation and extension of the LLM-Assisted Rule-Based Machine Translation (LLM-RBMT) paradigm, an approach that combines the strengths of rule-based methods and Large Language Models (LLMs) to support translation in no-resource settings. We introduce a robust new implementation (the Pipeline Translator) that generalizes the LLM-RBMT approach and enables flexible adaptation to novel constructions. We benchmark it against four alternatives (Builder, Instructions, RAG, and Fine-tuned translators) on a curated dataset of 150 English sentences, comparing them on translation quality and runtime. The Pipeline Translator consistently achieves the best overall performance. The LLM-RBMT methods (Pipeline and Builder) also offer an important advantage: they align naturally with evaluation strategies that prioritize grammaticality and semantic fidelity over surface-form overlap, which is critical for endangered languages, where mistranslation carries high risk.
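The abstract describes benchmarking five translator variants on a shared sentence set, comparing translation quality and runtime. A minimal sketch of such a harness is shown below; the translator callables and result layout are illustrative assumptions, not the paper's actual implementation.

```python
import time

# Placeholder translator backends. The paper's five approaches
# (Pipeline, Builder, Instructions, RAG, Fine-tuned) would plug in
# here as callables; these stand-ins exist only to make the
# harness runnable.
def pipeline_translator(sentence: str) -> str:
    return sentence  # placeholder: identity "translation"

def builder_translator(sentence: str) -> str:
    return sentence.lower()  # placeholder output

def benchmark(translators, sentences):
    """Run each translator over the sentence set, recording
    outputs and wall-clock runtime per translator."""
    results = {}
    for name, translate in translators.items():
        start = time.perf_counter()
        outputs = [translate(s) for s in sentences]
        elapsed = time.perf_counter() - start
        results[name] = {"outputs": outputs, "runtime_s": elapsed}
    return results

if __name__ == "__main__":
    sentences = ["the dog runs", "I see the mountain"]
    translators = {
        "pipeline": pipeline_translator,
        "builder": builder_translator,
    }
    for name, r in benchmark(translators, sentences).items():
        print(f"{name}: {len(r['outputs'])} outputs in {r['runtime_s']:.4f}s")
```

In practice, the outputs collected here would be fed to a downstream evaluation step (e.g. human or automatic scoring of grammaticality and semantic fidelity), which the paper argues matters more than surface-form overlap for endangered languages.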

