A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Discretization-Aware Architecture Search
[article]
2020
arXiv
pre-print
This paper presents discretization-aware architecture search (DA2S), whose core idea is to add a loss term that pushes the super-network towards the desired topology, so that the accuracy ...
The search cost of neural architecture search (NAS) has been largely reduced by weight-sharing methods. ...
To alleviate the above issue, we propose discretization-aware architecture search (DA2S). ...
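The loss term described in this snippet can be illustrated with a toy entropy penalty on architecture weights: minimizing the entropy of the softmax over architecture parameters pushes the continuous super-network toward a near-discrete (one-hot) topology. This is a minimal sketch of the general idea, not the paper's actual loss; all names here are illustrative.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of architecture logits."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def entropy_loss(alpha):
    """Entropy of the softmax over architecture parameters alpha.
    Minimizing it drives the operation weights toward one-hot."""
    p = softmax(alpha)
    return -sum(pi * math.log(pi + 1e-12) for pi in p)

ambiguous = [0.0, 0.0, 0.0, 0.0]      # uniform weights -> high entropy
decided = [8.0, 0.0, 0.0, 0.0]        # near one-hot -> low entropy
assert entropy_loss(ambiguous) > entropy_loss(decided)
```

Adding such a term to the search objective penalizes ambiguous operation mixtures, which is one way to shrink the gap between the continuous super-network and its discretized final model.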
arXiv:2007.03154v1
fatcat:ggaf7owxnjamxioxldv23bldqa
Equivariance-aware Architectural Optimization of Neural Networks
[article]
2023
arXiv
pre-print
We further present evolutionary and differentiable neural architecture search (NAS) algorithms that utilize these mechanisms respectively for equivariance-aware architectural optimization. ...
This motivates algorithmically optimizing the architectural constraints imposed by equivariance. ...
Our two equivariance-aware NAS algorithms take distinct approaches: EquiNAS E searches for architectures composed of discretely equivariant layers, while EquiNAS D searches for continuous mixtures of equivariance ...
arXiv:2210.05484v3
fatcat:q7wju37renfx3fswrfftldbkey
DiNTS: Differentiable Neural Network Topology Search for 3D Medical Image Segmentation
[article]
2021
arXiv
pre-print
Recently, neural architecture search (NAS) has been applied to automatically search high-performance networks for medical image segmentation. ...
The discretization of the searched optimal continuous model in differentiable scheme may produce a sub-optimal final discrete model (discretization gap). ...
guaranteed discretization algorithm and a discretization-aware topology loss for the search stage to minimize the discretization gap. We develop a memory-usage-aware search method which is able to search ...
arXiv:2103.15954v1
fatcat:a77g2iy6s5gftmgmz5dqzfn26e
NADS: Neural Architecture Distribution Search for Uncertainty Awareness
[article]
2020
arXiv
pre-print
NADS searches for a distribution of architectures that perform well on a given task, allowing us to identify common building blocks among all uncertainty-aware architectures. ...
To address these problems, we first seek to identify guiding principles for designing uncertainty-aware architectures, by proposing Neural Architecture Distribution Search (NADS). ...
Specifically, let A denote our discrete architecture search space and α ∈ A be an architecture in this space. ...
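The setup in this snippet, a discrete search space A with architectures α ∈ A, and a distribution over that space rather than a single point, can be sketched in miniature. This is an illustrative toy, not the NADS algorithm: the per-layer candidate operations and probabilities below are invented for the example.

```python
import random

# Toy per-layer candidate operations standing in for the discrete space A.
OPS = ["conv3x3", "conv5x5", "skip", "pool"]

def sample_architecture(probs, rng):
    """Draw one discrete architecture alpha from a per-layer distribution.

    `probs` holds one weight vector over OPS for each layer, so the
    product of these categoricals defines a distribution over A.
    """
    return tuple(rng.choices(OPS, weights=layer_p)[0] for layer_p in probs)

rng = random.Random(0)
probs = [[0.7, 0.1, 0.1, 0.1], [0.25, 0.25, 0.25, 0.25]]  # 2-layer toy model
arch = sample_architecture(probs, rng)
assert len(arch) == 2 and all(op in OPS for op in arch)
```

Searching over the distribution parameters (here, `probs`) instead of a single α is what lets such methods inspect which building blocks are common across all high-probability architectures.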
arXiv:2006.06646v1
fatcat:bqzwefudvvezjndhntvgzbs6lq
Power-aware multimedia: concepts and design perspectives
2007
IEEE Circuits and Systems Magazine
Discrete Cosine Transform designs. ...
In this article, we focus on introducing power-aware concepts and considerations into the architecture design of a video coder, followed by discussions of existing power-aware Motion Estimation and ...
It is a hybrid coding architecture based on Discrete Cosine Transform (DCT) and Motion Estimation/ Motion Compensation (ME/MC). ...
doi:10.1109/mcas.2007.4299440
fatcat:jaftd26bizdzdai2sjz6kljdfa
QuantNAS for super resolution: searching for efficient quantization-friendly architectures against quantization noise
[article]
2024
arXiv
pre-print
Another way is a neural architecture search that automatically discovers new, more efficient solutions. ...
Thus, anyone can design a proper search space based on an existing architecture and apply our method to obtain better quality and efficiency. ...
Entropy regularization was proposed in Discretization-Aware search [21]. ...
arXiv:2208.14839v4
fatcat:afg7sz6ymjhktfwogvc5y7krpu
Pareto-Frontier-aware Neural Architecture Generation for Diverse Budgets
[article]
2021
arXiv
pre-print
To this end, we propose a Pareto-Frontier-aware Neural Architecture Generator (NAG) which takes an arbitrary budget as input and produces the Pareto optimal architecture for the target budget. ...
Existing methods often perform an independent architecture search for each target budget, which is inefficient and unnecessary. ...
To achieve this, we evenly sample a set of discrete budgets from the range of possible values and maximize the expected reward of the searched architectures over these budgets to approximate the Pareto ...
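The budget-sampling step described in this snippet, evenly sampling discrete budgets from a range and averaging the reward of the architecture chosen for each, can be sketched as follows. The function names and the stand-in reward are assumptions for illustration, not the paper's implementation.

```python
def even_budgets(lo, hi, k):
    """k evenly spaced discrete budgets in [lo, hi]."""
    step = (hi - lo) / (k - 1)
    return [round(lo + i * step) for i in range(k)]

def expected_reward(reward_fn, budgets):
    """Average reward of the searched architectures over sampled budgets,
    approximating coverage of the whole Pareto frontier."""
    return sum(reward_fn(b) for b in budgets) / len(budgets)

budgets = even_budgets(100, 500, 5)
assert budgets == [100, 200, 300, 400, 500]
avg = expected_reward(lambda b: b / 500.0, budgets)  # toy stand-in reward
assert abs(avg - 0.6) < 1e-9
```

Maximizing this expectation trains one generator to do well across the budget range, instead of running a separate search per budget.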
arXiv:2103.00219v1
fatcat:mjrlpdvgnnhurb3rhehcdefyru
DDNAS: Discretized Differentiable Neural Architecture Search for Text Classification
[article]
2023
arXiv
pre-print
This paper presents a novel NAS method, Discretized Differentiable Neural Architecture Search (DDNAS), for text representation learning and classification. ...
Neural Architecture Search (NAS) has shown promising capability in learning text representation. ...
The architecture variants of RNN are first explored. Dodge et al. [12] perform structure learning with group lasso regularization to search sparsity-aware RNN architectures. Merrill et al. ...
arXiv:2307.06005v1
fatcat:7brcqb4gzvgx3o44lp4c2pdhay
Dynamic Routing Networks
[article]
2020
arXiv
pre-print
Extensive efforts have been made to improve the accuracy with expert-designed or algorithm-searched architectures. ...
Therefore, customizing the model capacity in an instance-aware manner is much needed for higher inference efficiency. ...
Neural Architecture Search. There has been increasing interest in automated neural architecture search (NAS). ...
arXiv:1905.04849v5
fatcat:wmvllbypgvdbzgm3itypu3uhdq
Extensible Proxy for Efficient NAS
[article]
2022
arXiv
pre-print
Furthermore, to make Eproxy adaptive to different downstream tasks/search spaces, we propose a Discrete Proxy Search (DPS) to find the optimized training settings for Eproxy with only a handful of benchmarked ...
Neural Architecture Search (NAS) has become a de facto approach in the recent trend of AutoML to design deep neural networks (DNNs). ...
The searched Eproxy can accurately evaluate the quality of network architectures and make Eproxy search-space/downstream-task aware. ...
arXiv:2210.09459v1
fatcat:xjoajhyekngzvezgmzafvleeee
Pareto-aware Neural Architecture Generation for Diverse Computational Budgets
[article]
2022
arXiv
pre-print
To address these issues, we propose a Pareto-aware Neural Architecture Generator (PNAG) which only needs to be trained once and dynamically produces the Pareto optimal architecture for any given budget ...
More critically, these independent search processes cannot share their learned knowledge (i.e., the distribution of good architectures) with each other and thus often result in limited search results. ...
Thus, we develop a Pareto-aware Neural Architecture Generator (PNAG) to explicitly learn the whole Pareto frontier. ...
arXiv:2210.07634v1
fatcat:hhij2xicyzg2bg2lpqonltdr4q
You Only Search Once: On Lightweight Differentiable Architecture Search for Resource-Constrained Embedded Platforms
[article]
2022
arXiv
pre-print
However, to obtain the architecture that meets the given performance constraint, previous hardware-aware differentiable NAS methods have to repeat a plethora of search runs to manually tune the hyper-parameters ...
Benefiting from the search efficiency, differentiable neural architecture search (NAS) has evolved as the most dominant alternative to automatically design competitive deep neural networks (DNNs). ...
Thus, we further integrate the latency predictor into LightNAS to achieve hardware-aware architecture search. ...
arXiv:2208.14446v1
fatcat:oanv5cfiejfcbi6bwyixauv5ja
AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search
[article]
2021
arXiv
pre-print
We incorporate a task-oriented knowledge distillation loss to provide search hints and an efficiency-aware loss as search constraints, which enables a good trade-off between efficiency and effectiveness ...
Motivated by the necessity and benefits of task-oriented BERT compression, we propose a novel compression method, AdaBERT, that leverages differentiable Neural Architecture Search to automatically compress ...
Here we solve this problem by modeling the searched architecture {K, o_{i,j}} as discrete variables that obey discrete probability distributions P_K = [θ^K_1, ..., θ^K_{K_max}] and P_o = [θ^o_1, ..., ...
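Discrete architecture variables like the P_K and P_o in this snippet are commonly kept differentiable during search with a Gumbel-Softmax relaxation: sampling a soft one-hot vector from the distribution parameters at a temperature τ. The sketch below shows that relaxation in general; the variable names are placeholders, not AdaBERT's code.

```python
import math
import random

def gumbel_softmax(theta, tau, rng):
    """Sample a soft one-hot vector from logits theta at temperature tau.

    As tau -> 0 the sample approaches a hard one-hot choice, while the
    operation stays differentiable with respect to theta.
    """
    g = [-math.log(-math.log(rng.random())) for _ in theta]  # Gumbel noise
    z = [(t + gi) / tau for t, gi in zip(theta, g)]
    m = max(z)
    es = [math.exp(v - m) for v in z]
    s = sum(es)
    return [e / s for e in es]

rng = random.Random(0)
y = gumbel_softmax([2.0, 0.5, 0.1], tau=0.5, rng=rng)
assert len(y) == 3 and abs(sum(y) - 1.0) < 1e-9
```

Each sampled vector acts as a (soft) selection among candidate operations, so gradients from the task loss can update the distribution parameters directly.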
arXiv:2001.04246v2
fatcat:hma3lfdrqnf77axtib26qxafpq
Neighborhood-Aware Neural Architecture Search
[article]
2021
arXiv
pre-print
Based on our formulation, we propose neighborhood-aware random search (NA-RS) and neighborhood-aware differentiable architecture search (NA-DARTS). ...
Existing neural architecture search (NAS) methods often return an architecture with good search performance but generalizes poorly to the test setting. ...
Neighborhood-Aware Search Algorithms: We propose neighborhood-aware random search and neighborhood-aware DARTS by applying our formulation to random search (sampling-based) and DARTS (gradient-based), respectively ...
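The neighborhood-aware idea in these snippets, rating an architecture by the average performance over itself and its nearby variants rather than its point performance, can be sketched on a toy encoding. Everything below (the one-edit neighborhood, the stand-in performance function) is an illustrative assumption, not the paper's formulation.

```python
def one_edit_neighbors(arch, ops):
    """All architectures differing from `arch` in exactly one position."""
    out = []
    for i, cur in enumerate(arch):
        for op in ops:
            if op != cur:
                out.append(arch[:i] + (op,) + arch[i + 1:])
    return out

def neighborhood_score(arch, ops, perf):
    """Average performance over the architecture and its one-edit neighbors,
    favoring flat, robust regions of the search space."""
    neigh = [arch] + one_edit_neighbors(arch, ops)
    return sum(perf(a) for a in neigh) / len(neigh)

OPS = ("a", "b")
arch = ("a", "a")
assert one_edit_neighbors(arch, OPS) == [("b", "a"), ("a", "b")]
score = neighborhood_score(arch, OPS, perf=lambda a: a.count("a"))
assert abs(score - 4 / 3) < 1e-9  # mean of perfs 2, 1, 1
```

Selecting by this averaged score is what makes the search prefer architectures whose good performance survives small perturbations, which is the stated motivation for better test-time generalization.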
arXiv:2105.06369v2
fatcat:xiuzkml7rnbzjm3t5xywcgrotu
Towards Cardiac Intervention Assistance: Hardware-aware Neural Architecture Exploration for Real-Time 3D Cardiac Cine MRI Segmentation
[article]
2020
arXiv
pre-print
In this work, we present the first hardware-aware multi-scale neural architecture search (NAS) framework for real-time 3D cardiac cine MRI segmentation. ...
Experimental results on ACDC MICCAI 2017 dataset demonstrate that our hardware-aware multi-scale NAS framework can reduce the latency by up to 3.5 times and satisfy the real-time constraints, while still ...
Finally, we will discuss how the network is optimized and how to decode the discrete architecture once the search finishes. ...
arXiv:2008.07071v2
fatcat:jpj7ogvtwjb2dirio5v254xa5e
Showing results 1 — 15 out of 69,628 results