Research at Lilt
We're narrowing the gap between translation research and practice
Language translation is at once one of the oldest applications of digital computers and a rapidly developing research area. Research is essential to our mission because much of what we do hasn't been done before.
The Lilt team has collectively published over 100 papers at the top conferences in Natural Language Processing, Human-Computer Interaction, and Machine Learning.
We also build things. The team has made fundamental contributions to open-source MT decoders (Phrasal and Jane), commercial MT systems (Google Translate), and NLP pipelines (Stanford CoreNLP).
We are advised by the former head of the Google Translate team and the two professors who supervised the original research at Stanford.
Jeff Heer Associate Professor, University of Washington
Chris Manning Professor, Stanford University
Franz Och Chief Scientist, HLI
- Joern Wuebker, Spence Green, John DeNero, Sasa Hasan, and Minh-Thang Luong. 2016. Models and Inference for Prefix-Constrained Machine Translation. To appear in ACL.
- Joern Wuebker, Spence Green, and John DeNero. 2015. Hierarchical Incremental Adaptation for Statistical Machine Translation. In EMNLP.
- Spence Green. 2015. Beyond Post-editing: Advances in interactive translation environments. In ATA Chronicle.
- Spence Green, Sida Wang, Jason Chuang, Jeffrey Heer, and Christopher D. Manning. 2014. Human Effort and Machine Learnability in Computer Aided Translation. In EMNLP.
- Spence Green, Jason Chuang, Jeffrey Heer, and Christopher D. Manning. 2014. Predictive Translation Memory: A Mixed-Initiative System for Human Language Translation. In UIST.
Try the New Standard in Translation Automation