Nov 1, 2022 · We propose a new pruning method that retains important connections and can achieve a high pruning rate while preserving good prediction accuracy ...
In this paper, we focus on reducing the costs of Gated Recurrent Units (GRUs) for time-series prediction tasks and we propose a new pruning method that can ...
Sep 13, 2022 · One-shot pruning of gated recurrent unit neural network by sensitivity for time-series prediction · Neurocomputing (IF 6), Pub Date: 2022-09 ...
PyTorch implementation of "One-shot Pruning of Gated Recurrent Unit Neural Network by Sensitivity for Time-series Prediction" by Hong Tang, Xiangzheng Ling ...
Sep 13, 2022 · We show a new pruning pipeline that needs no "prune-retrain" cycles, which significantly reduces the training costs. In addition, we test the ...
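The snippets above describe a one-shot pipeline: score every weight once, remove the least important fraction, and skip the usual prune-retrain loop. A minimal sketch of that idea, assuming a common |weight × gradient| saliency score (the paper's exact sensitivity criterion may differ, and `prune_one_shot` is a hypothetical helper, not the authors' API):

```python
# Hypothetical one-shot pruning sketch: score once, prune once, no retraining loop.
# Saliency is approximated as |w * g|; the paper's sensitivity measure may differ.

def prune_one_shot(weights, grads, sparsity):
    """Zero out the fraction `sparsity` of weights with the lowest saliency."""
    saliency = [abs(w * g) for w, g in zip(weights, grads)]
    k = int(len(weights) * sparsity)  # number of weights to remove
    threshold = sorted(saliency)[k - 1] if k else float("-inf")
    return [0.0 if s <= threshold else w
            for s, w in zip(saliency, weights)]

weights = [0.9, -0.01, 0.5, 0.02, -0.7]
grads   = [0.1,  0.5,  0.2, 0.1,   0.3]
pruned = prune_one_shot(weights, grads, sparsity=0.4)
# The two lowest-saliency weights are zeroed: [0.9, 0.0, 0.5, 0.0, -0.7]
```

In a real GRU the same ranking would be applied to the flattened gate matrices (update, reset, candidate), and ties at the threshold may prune slightly more than the target fraction.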
People also ask
Our method consistently beats random pruning. 3.2 LINGUISTIC SEQUENCE PREDICTION. We assess our models on 3 sequence prediction benchmarks: 1) WikiText-2 (wiki2) ...
Our objective is data efficient (requiring only 64 data points to prune the network), easy to implement, and produces 95% sparse GRUs that significantly improve ...
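The data-efficiency claim above (only 64 data points needed) suggests the sensitivity scores are estimated from a small calibration batch rather than the full training set. A sketch of that estimation step, under the same hypothetical |w × g| saliency assumption as before:

```python
# Hypothetical sketch: estimate per-weight sensitivity from a small calibration
# set by averaging |w * g| saliency over the per-sample gradients.

def estimate_sensitivity(per_sample_grads, weights):
    """Average |w * g| over a handful of calibration samples (e.g. 64)."""
    n = len(per_sample_grads)
    scores = [0.0] * len(weights)
    for grads in per_sample_grads:
        for i, (w, g) in enumerate(zip(weights, grads)):
            scores[i] += abs(w * g) / n
    return scores

weights = [1.0, -2.0]
per_sample_grads = [[0.5, 0.1],   # gradients from sample 1
                    [0.5, 0.3]]   # gradients from sample 2
scores = estimate_sensitivity(per_sample_grads, weights)
# scores ≈ [0.5, 0.4]
```

The resulting scores would then feed the one-shot threshold step; with a 95% sparsity target, only the top 5% of scores keep their weights.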
This case study uses Recurrent Neural Networks (RNNs) to predict electricity consumption based on historical data. The "Electricity Consumption" dataset ...
Pruning RNNs reduces the size of the model and can also help achieve significant inference time speed-up using sparse matrix multiply. Benchmarks show that ...
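The inference speed-up mentioned above comes from storing the pruned weight matrix in a sparse format so the matrix-vector product only touches non-zero entries. A self-contained sketch using the standard CSR layout (this is generic sparse-matrix code, not taken from the paper's implementation):

```python
# Hypothetical sketch of the sparse speed-up: a 95%-sparse weight matrix in CSR
# form, so y = A @ x skips all zeroed (pruned) weights.

def to_csr(dense):
    """Convert a dense row-major matrix into (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0.0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return values, col_idx, row_ptr

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x, iterating only over stored non-zeros."""
    y = []
    for r in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[r], row_ptr[r + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

A = [[0.0, 2.0, 0.0],   # a toy pruned weight matrix
     [0.0, 0.0, 0.0],
     [1.0, 0.0, 3.0]]
vals, cols, ptr = to_csr(A)
y = csr_matvec(vals, cols, ptr, [1.0, 1.0, 1.0])
# y = [2.0, 0.0, 4.0], computed from 3 multiplies instead of 9
```

In practice one would use an optimized sparse kernel (e.g. a BLAS-backed CSR routine) rather than Python loops; the layout and the work saved are the same.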