Oct 2, 2020 · The main purpose of this task is to train a light language model and evaluate it over four different NLP tasks. We comprehensively consider the ...
The first large-scale Chinese Language Understanding Evaluation (CLUE) benchmark is introduced, an open-ended, community-driven project that brings together ...
This shared task examines the performance of light language models on four common NLP tasks: Text Classification, Named Entity Recognition, Anaphora Resolution ...
Oct 2, 2020 · This paper introduces our solution to the Natural Language Processing and Chinese Computing (NLPCC) challenge of Light Pre-Training Chinese ...
We propose a new pre-trained language model called MacBERT that mitigates the gap between the pre-training and fine-tuning stage by masking the word with its ...
Oct 13, 2021 · Abstract: Although pre-trained models (PLMs) have achieved remarkable improvements in a wide range of NLP tasks, they are expensive in terms ...
This work introduces a new Transformer model called Cached Transformer, which uses Gated Recurrent Cached (GRC) attention to extend the self-attention mechanism ...
[ACL2023-Findings] Shuo Wen Jie Zi is a new learning paradigm that enhances the semantic understanding ability of Chinese PLMs with dictionary ...
Pre-trained Language Models (PLMs) have proven to be beneficial for various downstream NLP tasks. Recently, GPT-3, with 175 billion parameters and 570 GB ...