2,145 Hits in 4.7 sec

Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning

Ali Rahimi, Benjamin Recht
2008 Neural Information Processing Systems  
Specifically, we consider architectures that compute a weighted sum of their inputs after passing them through a bank of arbitrary randomized nonlinearities.  ...  We analyze shallow random networks with the help of concentration of measure inequalities.  ...  Then with w fixed, it fits the weights α optimally via a simple convex optimization (Algorithm 1, the Weighted Sums of Random Kitchen Sinks fitting procedure).  ...
dblp:conf/nips/RahimiR08 fatcat:gez25agn7nhavmucv4h5posm6y
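
A minimal sketch of the fitting procedure the snippet describes: draw the random parameters w once, pass the inputs through the randomized nonlinearities, and fit only the outer weights α by convex optimization. Random Fourier cosine features and a ridge-regularized least-squares step stand in for the paper's constrained minimization here; both are illustrative assumptions, not the paper's exact Algorithm 1.

    import numpy as np

    rng = np.random.default_rng(0)

    def rks_fit(X, y, n_features=200, sigma=1.0, lam=1e-3):
        """Weighted sums of random kitchen sinks, toy version: the random
        parameters stay fixed; only the outer weights alpha are fit."""
        d = X.shape[1]
        W = rng.normal(scale=1.0 / sigma, size=(n_features, d))  # random directions
        b = rng.uniform(0, 2 * np.pi, size=n_features)           # random phases
        Z = np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)      # randomized nonlinearities
        # Convex step: ridge-regularized least squares for alpha.
        alpha = np.linalg.solve(Z.T @ Z + lam * np.eye(n_features), Z.T @ y)
        return W, b, alpha

    def rks_predict(X, W, b, alpha):
        Z = np.sqrt(2.0 / W.shape[0]) * np.cos(X @ W.T + b)
        return Z @ alpha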

Deep learning with t-exponential Bayesian kitchen sinks [article]

Harris Partaourides, Sotirios Chatzis
2018 arXiv   pre-print
On the other hand, shallow models that compute weighted sums of their inputs, after passing them through a bank of arbitrary randomized nonlinearities, have been recently shown to enjoy good test error  ...  of weights.  ...  In the same vein, the machine learning community has recently examined a bold, yet quite promising possibility: postulating weighted sums of random kitchen sinks (RKS) (Rahimi and Recht, 2009).  ...
arXiv:1802.03651v1 fatcat:6agtzxdz3jbbvky4avqjdybply

Image Classification by Throwing Quantum Kitchen Sinks at Tensor Networks [article]

Nathan X. Kodama
2022 arXiv   pre-print
In contrast, a random feature approach known as quantum kitchen sinks provides comparable performance, but leverages non-local feature maps.  ...  Several variational quantum circuit approaches to machine learning have been proposed in recent years, with one promising class of variational algorithms involving tensor networks operating on states resulting  ...  Acknowledgments and Disclosure of Funding NXK acknowledges support from a Microsoft Quantum Internship, where this collaboration started.  ... 
arXiv:2208.13895v1 fatcat:zeh562p5c5avfj5y2mtz3zfucu

Quantum Kitchen Sinks: An algorithm for machine learning on near-term quantum computers [article]

C. M. Wilson (Rigetti Computing, Institute for Quantum Computing, University of Waterloo), J. S. Otterbach, N. Tezak, R. S. Smith, A. M. Polloreno, Peter J. Karalekas, S. Heidel, M. Sohaib Alam, G. E. Crooks, and M. P. da Silva
2019 arXiv   pre-print
Here we describe one such hybrid algorithm for machine learning tasks by building upon the classical algorithm known as random kitchen sinks.  ...  Our technique, called quantum kitchen sinks, uses quantum circuits to nonlinearly transform classical inputs into features that can then be used in a number of machine learning algorithms.  ...  Acknowledgements-We acknowledge helpful discussions with Matthew Harrigan.  ... 
arXiv:1806.08321v2 fatcat:mam4bhcwxfh2jbfdrnk5yx4vom
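
As the abstract describes it, quantum kitchen sinks set the parameters of small quantum circuits with random linear maps of the classical input and use measurement outcomes as features for a classical learner. Below is a toy classical simulation of a single-qubit version; the episode count, parameter distributions, and single RX-rotation circuit are illustrative assumptions, not the paper's settings.

    import numpy as np

    rng = np.random.default_rng(1)

    def qks_features(X, n_episodes=100, scale=1.0):
        """Toy single-qubit quantum kitchen sinks, simulated classically.
        Episode e encodes x as a circuit parameter theta_e = w_e . x + b_e,
        applies RX(theta_e) to |0>, and records one measured bit."""
        W = rng.normal(scale=scale, size=(n_episodes, X.shape[1]))
        b = rng.uniform(0, 2 * np.pi, size=n_episodes)
        theta = X @ W.T + b                      # one circuit parameter per episode
        p1 = np.sin(theta / 2.0) ** 2            # Born probability of measuring 1
        return (rng.random(p1.shape) < p1).astype(float)  # sampled bits as features

The resulting binary feature matrix can then be fed to any classical linear model, e.g. logistic regression.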

À la Carte - Learning Fast Kernels [article]

Zichao Yang and Alexander J. Smola and Le Song and Andrew Gordon Wilson
2014 arXiv   pre-print
We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.  ...  We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions.  ...  Weighted sums of random kitchen sinks: Replacing minimization with randomization in learning. In Neural Information Processing Systems, 2009. C. E. Rasmussen and C. K. I. Williams.  ...
arXiv:1412.6493v1 fatcat:h2agr4ia4bghfh7djolsolh3vm
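
A toy of the paper's central move, learning the spectral frequencies instead of sampling them once and freezing them: parameterize each frequency as omega = mu + softplus(rho) * eps with a fixed base sample eps, and backpropagate a regression loss through the random-feature map. PyTorch, the Adam optimizer, the squared loss, and the per-feature (rather than grouped) parameterization are assumptions for brevity; X and y are torch tensors.

    import torch

    torch.manual_seed(0)

    def fit_spectral(X, y, m=100, steps=500, lr=1e-2):
        """Gradient-based learning of Fourier frequencies in a random-feature
        expansion, rather than fixing them as in plain random kitchen sinks."""
        d = X.shape[1]
        eps = torch.randn(m, d)                      # fixed base sample
        mu = torch.zeros(m, d, requires_grad=True)   # learnable location
        rho = torch.zeros(m, d, requires_grad=True)  # learnable scale (pre-softplus)
        b = 2 * torch.pi * torch.rand(m)             # random phases, kept fixed
        alpha = torch.zeros(m, requires_grad=True)   # outer weights
        opt = torch.optim.Adam([mu, rho, alpha], lr=lr)
        for _ in range(steps):
            omega = mu + torch.nn.functional.softplus(rho) * eps
            Z = (2.0 / m) ** 0.5 * torch.cos(X @ omega.T + b)
            loss = ((Z @ alpha - y) ** 2).mean()
            opt.zero_grad(); loss.backward(); opt.step()
        return mu, rho, alpha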

Fock State-enhanced Expressivity of Quantum Machine Learning Models [article]

Beng Yee Gan, Daniel Leykam, Dimitris G. Angelakis
2021 arXiv   pre-print
The data-embedding process is one of the bottlenecks of quantum machine learning, potentially negating any quantum speedups. In light of this, more effective data-encoding strategies are necessary.  ...  Our work sheds some light on the unique advantages offered by quantum photonics for the expressive power of quantum machine learning models.  ...  ACKNOWLEDGMENTS This research was supported by the National Research Foundation, Prime Minister's Office, Singapore, the Ministry of Education, Singapore under the Research Centres of Excellence programme  ...
arXiv:2107.05224v1 fatcat:y3t74nkejzgl5nvysnlzzoilge

McKernel: A Library for Approximate Kernel Expansions in Log-linear Time [article]

J. D. Curtó, I. C. Zarza, Feng Yang, Alex Smola, Fernando de la Torre, Chong Wah Ngo, and Luc van Gool
2020 arXiv   pre-print
Based on Random Kitchen Sinks [Rahimi and Recht 2007], we provide a C++ library for Large-scale Machine Learning.  ...  McKernel introduces a framework to use approximate kernel expansions in the mini-batch setting with Stochastic Gradient Descent (SGD) as an alternative to Deep Learning.  ...  We account for both the theoretical underpinnings and the practical implications to establish the building blocks of a unifying theory of learning.  ...
arXiv:1702.08159v15 fatcat:5crh6xonbfdzpj7oswpowdbrpu
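
The framing in the abstract, a linear model on an approximate kernel expansion trained with mini-batch SGD, reduces to something like the sketch below. A plain random Fourier map stands in for the library's log-linear expansion, and the batch size, learning rate, and squared loss are assumptions.

    import numpy as np

    rng = np.random.default_rng(2)

    def sgd_on_random_features(X, y, m=512, epochs=5, batch=64, lr=0.1, sigma=1.0):
        """Mini-batch SGD on the squared loss over a fixed random feature map."""
        n, d = X.shape
        W = rng.normal(scale=1.0 / sigma, size=(m, d))
        b = rng.uniform(0, 2 * np.pi, size=m)
        feat = lambda A: np.sqrt(2.0 / m) * np.cos(A @ W.T + b)
        alpha = np.zeros(m)
        for _ in range(epochs):
            for idx in np.array_split(rng.permutation(n), max(n // batch, 1)):
                Z = feat(X[idx])
                alpha -= lr * Z.T @ (Z @ alpha - y[idx]) / len(idx)  # squared-loss gradient
        return W, b, alpha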

Fock state-enhanced expressivity of quantum machine learning models

Beng Yee Gan, Daniel Leykam, Dimitris G. Angelakis
2022 EPJ Quantum Technology  
In light of this, more effective data-encoding strategies are necessary.  ...  Abstract: The data-embedding process is one of the bottlenecks of quantum machine learning, potentially negating any quantum speedups.  ...  minimizing a cost function by training the circuit parameters. (2) Kernel methods, which employ fixed circuits, with training carried out on observables only. (3) Random kitchen sinks, which use a set  ...
doi:10.1140/epjqt/s40507-022-00135-0 fatcat:4z36gyehxzflhm6lkbcbyrovi4

Doubly stochastic large scale kernel learning with the empirical kernel map [article]

Nikolaas Steenbergen, Sebastian Schelter, Felix Bießmann
2016 arXiv   pre-print
The main problem with kernel methods is that the kernel matrix grows quadratically with the number of data points.  ...  With the rise of big data sets, the popularity of kernel methods declined and neural networks took over again.  ...  In the context of large-scale kernel learning this method was popularized by Rahimi and Recht under the name of random kitchen sinks [16].  ...
arXiv:1609.00585v2 fatcat:bgseqpsabvemppqxm3hqylypha
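
For scale: the kernel matrix K is n x n, so its storage alone grows quadratically with the number of data points, which is exactly what random-feature approaches sidestep. Below is a compressed sketch of the doubly stochastic idea, sampling one data point and one fresh random feature per step, loosely after the Dai et al. (2014) functional-gradient scheme that this line of work builds on; the step size and the RBF-type feature are assumptions, and the paper's empirical-kernel-map variant differs in detail.

    import numpy as np

    rng = np.random.default_rng(3)

    def doubly_stochastic_fit(X, y, T=2000, gamma=0.5, sigma=1.0):
        """Each step is stochastic twice over: in the data point AND in the
        random feature, so no kernel matrix is ever materialized."""
        n, d = X.shape
        Ws, bs, alphas = [], [], []
        for t in range(1, T + 1):
            i = rng.integers(n)                         # random data point
            w = rng.normal(scale=1.0 / sigma, size=d)   # fresh random feature
            b = rng.uniform(0, 2 * np.pi)
            # prediction at X[i] from all features drawn so far
            f = sum(a * np.sqrt(2.0) * np.cos(X[i] @ wj + bj)
                    for a, wj, bj in zip(alphas, Ws, bs))
            Ws.append(w)
            bs.append(b)
            # squared-loss stochastic functional gradient step
            alphas.append(-(gamma / t) * (f - y[i]) * np.sqrt(2.0) * np.cos(X[i] @ w + b))
        return Ws, bs, alphas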

Fastfood: Approximate Kernel Expansions in Loglinear Time [article]

Quoc Viet Le, Tamas Sarlos, Alexander Johannes Smola
2014 arXiv   pre-print
These two matrices can be used in lieu of Gaussian matrices in Random Kitchen Sinks proposed by Rahimi and Recht (2009), thereby speeding up the computation for a large range of kernel functions.  ...  Experiments show that we achieve similar accuracy to full kernel expansions and Random Kitchen Sinks while being 100x faster and using 1000x less memory.  ...  Experiments: In the following we assess the performance of Random Kitchen Sinks and Fastfood. The results show that Fastfood performs as well as Random Kitchen Sinks in terms of accuracy.  ...
arXiv:1408.3060v1 fatcat:p6e3onmconav3bhenarufruuri
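
A sketch of the substitution the abstract describes: the Gaussian matrix of random kitchen sinks is replaced by a product of diagonal and Hadamard matrices, so a projection costs O(d log d) via the fast Walsh-Hadamard transform instead of O(d^2). The row-rescaling diagonal S of the full construction is omitted for brevity, and d must be a power of two.

    import numpy as np

    rng = np.random.default_rng(4)

    def fwht(x):
        """Unnormalized fast Walsh-Hadamard transform, O(d log d)."""
        x = x.copy()
        h = 1
        while h < len(x):
            for i in range(0, len(x), 2 * h):
                a = x[i:i + h].copy()
                b = x[i + h:i + 2 * h].copy()
                x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
            h *= 2
        return x

    def make_fastfood(d, sigma=1.0):
        """V = H G Pi H B / (sigma sqrt(d)), used in lieu of a Gaussian matrix
        (the length-correcting diagonal S is dropped in this sketch)."""
        B = rng.choice([-1.0, 1.0], size=d)   # random signs
        Pi = rng.permutation(d)               # random permutation
        G = rng.normal(size=d)                # random Gaussian scaling
        return lambda x: fwht(G * fwht(B * x)[Pi]) / (sigma * np.sqrt(d))

The projected vector then enters the usual cosine feature map, exactly as a Gaussian projection would.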

GURLS vs LIBSVM: Performance Comparison of Kernel Methods for Hyperspectral Image Classification

Nikhila Haridas, V. Sowmya, K. P. Soman
2015 Indian Journal of Science and Technology  
Moreover, the GURLS package provides an implementation of the Random Kitchen Sinks algorithm, which can easily handle high-dimensional hyperspectral images at much lower computational cost than LIBSVM.  ...  The proposed work compares the performance of different kernel methods available in the GURLS package with the library for Support Vector Machines, namely LIBSVM.  ...  One possible way to fix the weight matrix W is to choose the W that minimizes the sum of squared L2 norms of the errors for each of the n equations.  ...
doi:10.17485/ijst/2015/v8i24/80843 fatcat:3mwzv24x35ho3hnhuv5x4sc44e
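
The weight-fitting rule quoted in the snippet, choosing W to minimize the summed squared L2 errors, has the standard closed form below; the ridge term lam reflects the regularized least squares at the core of GURLS and is an assumption here.

    import numpy as np

    def regularized_least_squares(A, Y, lam=1e-6):
        """argmin_W ||A W - Y||_F^2 + lam * ||W||_F^2 via the normal
        equations (A^T A + lam I) W = A^T Y; lam = 0 gives plain least squares."""
        d = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ Y)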

Adversarially Masked Video Consistency for Unsupervised Domain Adaptation [article]

Xiaoyu Zhu, Junwei Liang, Po-Yao Huang, Alex Hauptmann
2024 arXiv   pre-print
It consists of two novel designs. The first module, the Generative Adversarial Domain Alignment Network, aims to learn domain-invariant representations.  ...  It simultaneously learns a mask generator and a domain-invariant encoder in an adversarial way. The domain-invariant encoder is trained to minimize the distance between the source and target domains.  ...  Row 2 shows the performance of GADAN with the naive pseudolabeling method. Row 3 shows the performance of GADAN with MCL. Here we replace masks produced by AMG with random tubes.  ...
arXiv:2403.16242v1 fatcat:c2lawojk2rfcrn47mbctu5lis4

Multiple Adaptive Bayesian Linear Regression for Scalable Bayesian Optimization with Warm Start [article]

Valerio Perrone, Rodolphe Jenatton, Matthias Seeger, Cedric Archambeau
2017 arXiv   pre-print
Typically, BO is powered by a Gaussian process (GP), whose algorithmic complexity is cubic in the number of evaluations.  ...  We develop a multiple adaptive Bayesian linear regression model as a scalable alternative whose complexity is linear in the number of observations.  ...  Random Fourier representation: An alternative approach is to use random kitchen sinks (RKS) for a random Fourier basis expansion [17].  ...
arXiv:1712.02902v1 fatcat:fybbni2izfdlnjthmnrwzgpcme
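
How the random-feature expansion makes the surrogate cheap, per the abstract: replace the GP with Bayesian linear regression on m random Fourier features, so fitting is linear in the number of observations while the predictive uncertainty that BO needs stays in closed form. The prior and noise scales below are assumptions.

    import numpy as np

    rng = np.random.default_rng(5)

    def blr_on_rks(X, y, m=100, sigma=1.0, noise=0.1, prior=1.0):
        """Bayesian linear regression on a random Fourier basis: the weight
        posterior is Gaussian with closed-form mean and covariance."""
        d = X.shape[1]
        W = rng.normal(scale=1.0 / sigma, size=(m, d))
        b = rng.uniform(0, 2 * np.pi, size=m)
        feat = lambda A: np.sqrt(2.0 / m) * np.cos(A @ W.T + b)
        Z = feat(X)
        S = np.linalg.inv(Z.T @ Z / noise**2 + np.eye(m) / prior**2)  # posterior cov
        mu = S @ Z.T @ y / noise**2                                   # posterior mean
        def predict(Xs):
            Zs = feat(Xs)
            var = noise**2 + np.einsum('ij,jk,ik->i', Zs, S, Zs)
            return Zs @ mu, var          # predictive mean and variance
        return predict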

What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation? [article]

Anthony Strittmatter
2021 arXiv   pre-print
Recent studies have proposed causal machine learning (CML) methods to estimate conditional average treatment effects (CATEs).  ...  In this study, I investigate whether CML methods add value compared to conventional CATE estimators by re-evaluating Connecticut's Jobs First welfare experiment.  ...  of the kitchen sink covariates.  ... 
arXiv:1812.06533v3 fatcat:acnmc47m5rhlnljc7lsrlhty34

Photonic Quantum Computing For Polymer Classification [article]

Alexandrina Stoyanova, Taha Hammadia, Arno Ricou, Bogdan Penkovsky
2022 arXiv   pre-print
The hybrid approach combines one of three methods, Gaussian Kernel Method, Quantum-Enhanced Random Kitchen Sinks, or Variational Quantum Classifier, implemented by linear quantum photonic circuits (LQPCs), with a classical deep neural network (DNN) feature extractor.  ...  Kitchen Sinks (RKS), and Variational Quantum Classifier (VQC) implemented using linear Quantum Photonic Circuits (QPCs) [31].  ...
arXiv:2211.12207v1 fatcat:nkq5ha5a4bfzhipbzwyzlags7m
Showing results 1 — 15 out of 2,145 results