Sep 19, 2018 · Abstract: Recently, path norm was proposed as a new capacity measure for neural networks with Rectified Linear Unit (ReLU) activation function, which takes the rescaling-invariant property of ReLU into account. Motivated by this, we propose a new norm, Basis-path Norm, based on a group of linearly independent paths to measure the capacity of neural networks more accurately. A generalization error bound is established based on this basis-path norm, and it is shown that it explains the generalization behaviors of ReLU networks more accurately.
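The rescaling invariance mentioned in the abstract can be made concrete with a small numerical sketch (my own illustration, not code from the paper). For a two-layer ReLU network f(x) = W2·relu(W1·x), the l1 path norm sums the products of absolute weights along every input-to-hidden-to-output path; scaling a hidden unit's incoming weights by c > 0 while dividing its outgoing weights by c changes neither the network function nor the path norm:

```python
import numpy as np

def path_norm(W1, W2):
    # l1 path norm of a 2-layer ReLU net f(x) = W2 @ relu(W1 @ x):
    # the sum over all input -> hidden -> output paths of the product
    # of absolute weights along the path, which collapses to a matrix
    # product of the entrywise absolute values.
    return np.sum(np.abs(W2) @ np.abs(W1))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # hidden x input
W2 = rng.normal(size=(2, 4))   # output x hidden

# ReLU rescaling: multiply the incoming weights of hidden unit j by c
# and divide its outgoing weights by c.  Since relu(c*z) = c*relu(z)
# for c > 0, the network computes the same function.
c, j = 5.0, 1
W1r, W2r = W1.copy(), W2.copy()
W1r[j, :] *= c
W2r[:, j] /= c

# Each path through unit j picks up a factor c from W1 and 1/c from W2,
# so every path product (and hence the path norm) is unchanged.
print(np.isclose(path_norm(W1, W2), path_norm(W1r, W2r)))
```

The plain l2 norm of the weights, by contrast, does change under this rescaling, which is why the path norm is the better-matched capacity measure for ReLU networks.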
Nov 24, 2023 · This work introduces the first toolkit around path-norms that is fully able to encompass general DAG ReLU networks with biases, ...
Introduction. The statistical complexity, or capacity, of unregularized feed-forward neural networks, as a function of the network size and depth, ...
... optimize the value vector of the basis paths of neural networks with little extra cost ... Capacity control of ReLU neural networks by basis-path norm. arXiv ...
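The "group of linearly independent paths" behind the basis-path norm can also be illustrated numerically (a sketch under my own assumed two-layer architecture, not the paper's general construction). Taking logs of absolute weights turns each path product into a sum, so every path corresponds to a 0/1 indicator vector over the weights, and the number of independent paths is the rank of the stacked indicator matrix. That rank comes out to (#weights − #hidden units), reflecting the one rescaling degree of freedom per hidden ReLU unit:

```python
import itertools
import numpy as np

# For f(x) = W2 @ relu(W1 @ x) with d inputs, h hidden units, o outputs,
# each path (i -> j -> k) has value W1[j, i] * W2[k, j].  In log space a
# path is an indicator vector over the d*h + h*o weights; the number of
# linearly independent paths is the rank of the matrix stacking them.
d, h, o = 3, 4, 2
n_w1 = d * h

rows = []
for i, j, k in itertools.product(range(d), range(h), range(o)):
    v = np.zeros(n_w1 + h * o)
    v[j * d + i] = 1.0            # indicator for weight W1[j, i]
    v[n_w1 + k * h + j] = 1.0     # indicator for weight W2[k, j]
    rows.append(v)

rank = np.linalg.matrix_rank(np.array(rows))
print(rank)  # d*h + h*o - h = 12 + 8 - 4 = 16 independent paths
```

Here there are d·h·o = 24 paths in total but only 16 independent ones; a set of 16 basis paths determines the values of all remaining paths, which is what makes optimizing the basis-path values (as in the snippet above) cheap.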