Jun 15, 2016 · We propose a semi-supervised approach for training NMT models on the concatenation of labeled (parallel corpora) and unlabeled (monolingual ...
2016. Semi-Supervised Learning for Neural Machine Translation. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics ( ...
Apr 2, 2023 · This paper presents a simple yet effective method to tackle this problem for low-resource languages by augmenting high-quality sentence pairs ...
This work proposes a semi-supervised approach for training NMT models on the concatenation of labeled (parallel) and unlabeled (monolingual) corpora, ...
Earlier uses of monolingual data: an N-gram language model in SMT (Koehn et al., 2007); monolingual corpora as decipherment (Ravi and Knight, 2011); integrating a neural language model into NMT.
Aug 27, 2019 · We have presented a semi-supervised approach to training bidirectional neural machine translation models. The central idea is to introduce ...
This system can provide a reliable translation system for any language using only 6,000 parallel sentences. The second scenario is adapting a model that was ...
We propose a semi-supervised approach for training NMT models on the concatenation of labeled (parallel corpora) and unlabeled (monolingual corpora) data. The ...
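The snippets above describe training on a concatenation of parallel and monolingual data. A minimal sketch of how such a mixed training set might be assembled follows; the function names and the reconstruction placeholder (treating a monolingual sentence as its own target, autoencoder-style) are illustrative assumptions, not the cited authors' implementation.

    # Sketch: building mixed semi-supervised training batches for NMT.
    # Labeled pairs carry a translation objective; unlabeled sentences are
    # paired with themselves as a stand-in reconstruction objective.
    import random
    from typing import Iterator, List, Tuple

    def parallel_examples(pairs: List[Tuple[str, str]]) -> List[Tuple[str, str, str]]:
        """Labeled data: (source, target) pairs from a parallel corpus."""
        return [(src, tgt, "translation") for src, tgt in pairs]

    def monolingual_examples(sentences: List[str]) -> List[Tuple[str, str, str]]:
        """Unlabeled data: each sentence becomes its own target."""
        return [(s, s, "reconstruction") for s in sentences]

    def mixed_batches(parallel, monolingual, batch_size: int = 4) -> Iterator[List]:
        """Concatenate labeled and unlabeled examples, shuffle, yield batches."""
        data = parallel_examples(parallel) + monolingual_examples(monolingual)
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            yield data[i:i + batch_size]

    if __name__ == "__main__":
        parallel = [("ni hao", "hello"), ("xie xie", "thank you")]
        mono = ["good morning", "see you later", "how are you"]
        for batch in mixed_batches(parallel, mono, batch_size=3):
            for src, tgt, objective in batch:
                print(f"{objective:14s} {src!r} -> {tgt!r}")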
Mar 29, 2024 · In a nutshell, semi-supervised learning (SSL) is a machine learning technique that uses a small portion of labeled data and lots of unlabeled ...
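As a small, generic illustration of that definition, the sketch below uses self-training (pseudo-labeling): fit on the small labeled portion, label the unlabeled points the model is confident about, and refit. The toy dataset, model choice, and confidence threshold are assumptions for illustration only and are not tied to any paper cited above.

    # Semi-supervised learning via self-training (pseudo-labeling).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Small labeled set and a larger unlabeled set drawn from the same two clusters.
    X_labeled = np.vstack([rng.normal(0, 1, (5, 2)), rng.normal(4, 1, (5, 2))])
    y_labeled = np.array([0] * 5 + [1] * 5)
    X_unlabeled = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])

    # 1) Train on the small labeled portion.
    model = LogisticRegression().fit(X_labeled, y_labeled)

    # 2) Pseudo-label confidently predicted unlabeled points and retrain.
    proba = model.predict_proba(X_unlabeled)
    confident = proba.max(axis=1) > 0.95
    X_aug = np.vstack([X_labeled, X_unlabeled[confident]])
    y_aug = np.concatenate([y_labeled, proba[confident].argmax(axis=1)])
    model = LogisticRegression().fit(X_aug, y_aug)

    print(f"pseudo-labeled {confident.sum()} of {len(X_unlabeled)} unlabeled points")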
A curated list of resources for Neural Semi-Supervised Learning (NSSL). The choice of what I initially included was made subjectively, based on what I ...