Structured Code Representations Enable Data-Efficient Adaptation of Code Language Models
Jan 19, 2024

Current language models tailored for code tasks often adopt the pre-training-then-fine-tuning paradigm from natural language processing, modeling source code as plain text.
The work represents programs as parse trees and adapts pre-trained models on serialized concrete syntax trees (CSTs), finding that a small amount of continual pre-training allows data-efficient adaptation across a variety of tasks.
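As a rough illustration only: the paper feeds models serialized concrete syntax trees produced by a real parser, while the sketch below uses Python's built-in `ast` module (which yields an abstract, not concrete, syntax tree) to show what a linearized parse-tree representation of source code can look like. The function name and bracketed format are illustrative assumptions, not the paper's actual serialization scheme.

```python
import ast

def serialize_tree(node: ast.AST) -> str:
    """Recursively linearize a parse tree into a bracketed string,
    similar in spirit to feeding a serialized tree to a language model.
    NOTE: uses Python's AST as a stand-in for the CSTs in the paper."""
    children = [serialize_tree(child) for child in ast.iter_child_nodes(node)]
    label = type(node).__name__  # node type becomes the tree label
    if not children:
        return label
    return f"({label} {' '.join(children)})"

source = "def add(a, b):\n    return a + b\n"
print(serialize_tree(ast.parse(source)))
```

Running this prints a nested, parenthesized tree whose tokens are node types (`Module`, `FunctionDef`, `Return`, ...) rather than raw source text, which is the kind of structure-aware input the structured approach exposes to the model.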
Structured Code Representations Enable Data-Efficient Adaptation of Code Language Models (2024), arXiv, Agarwal, Mayank, et al.
Modeling code as plain text overlooks the well-defined and unambiguous structures inherent in programming languages.
In this work, we explore data-efficient adaptation of pre-trained code models by further pre-training and fine-tuning them with program structures.
Recent studies have shown that code language models at scale demonstrate significant performance gains on downstream tasks.