Retrofitting Structure-aware Transformer Language Model for End Tasks
Hao Fei, Yafeng Ren, Donghong Ji
Sep 16, 2020

Abstract: We consider retrofitting structure-aware Transformer-based language model for facilitating end tasks by proposing to exploit syntactic distance to encode ... Experimental results show that the retrofitted structure-aware Transformer language model achieves improved perplexity, meanwhile inducing accurate syntactic ... The model not only preserves the structural strengths of the Transformer but also brings in rich, implicit linguistic knowledge, making it a valuable asset in ...
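The snippets above do not spell out how syntactic distance is defined or injected into the model. As a point of reference only, one common formulation from prior work on structure-aware language modeling assigns, to each pair of adjacent words, a distance equal to the height of their lowest common ancestor in the constituency tree; splitting the sentence at the largest distance then recovers the tree top-down. The sketch below illustrates that notion with an NLTK-style parse tree; the tree, the example sentence, and the helper name syntactic_distances are illustrative assumptions, not the paper's actual implementation.

```python
from nltk import Tree


def syntactic_distances(tree):
    """Return one distance per adjacent leaf pair: the height of the
    lowest common ancestor (LCA) of the two leaves in the parse tree."""
    leaf_positions = tree.treepositions('leaves')  # left-to-right leaf indices
    distances = []
    for left, right in zip(leaf_positions, leaf_positions[1:]):
        # The LCA's position is the longest common prefix of the two positions.
        depth = 0
        while depth < min(len(left), len(right)) and left[depth] == right[depth]:
            depth += 1
        lca = tree[left[:depth]]
        distances.append(lca.height())
    return distances


if __name__ == "__main__":
    # Hypothetical example sentence and parse, for illustration only.
    t = Tree.fromstring(
        "(S (NP (DT the) (NN cat)) "
        "(VP (VBD sat) (PP (IN on) (NP (DT the) (NN mat)))))"
    )
    print(t.leaves())                 # ['the', 'cat', 'sat', 'on', 'the', 'mat']
    print(syntactic_distances(t))     # [3, 6, 5, 4, 3]; max sits at the S-level split
</
```

In this toy example the largest distance falls between "cat" and "sat", i.e. at the NP/VP boundary, so greedily splitting at the maximum distance reproduces the constituency structure.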