13 Hits in 6.5 sec

Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals [article]

Daniel Hsu, Clayton Sanford, Rocco Servedio, Emmanouil-Vasileios Vlatakis-Gkaragkounis
2022 arXiv   pre-print
Recent work of Diakonikolas et al. (2021) shows that any Statistical Query (SQ) algorithm for agnostically learning the class of intersections of k halfspaces over ℝ^n to constant excess error either must  ...  We consider the well-studied problem of learning intersections of halfspaces under the Gaussian distribution in the challenging agnostic learning model.  ...  also for learning intersections (and other functions) of halfspaces using membership queries (Kwek and Pitt, 1998; Gopalan et al., 2012) .  ... 
arXiv:2202.05096v1 fatcat:jpmmq7m6qnacvj3gf7zqyafhpi
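Several of the hits below concern the same concept class: intersections of halfspaces evaluated on Gaussian inputs. As a point of reference, here is a minimal sketch of that class in Python; the function name, the zero thresholds, and the dimensions are illustrative assumptions, not taken from any of the papers listed.

    import numpy as np

    def intersection_of_halfspaces(X, W, b):
        """Label each row of X as +1 iff it lies in every halfspace w_i . x >= b_i."""
        inside_all = (X @ W.T >= b).all(axis=1)   # (m, k) comparisons, AND over the k halfspaces
        return np.where(inside_all, 1, -1)

    # k = 3 random halfspaces through the origin; inputs drawn from the standard Gaussian on R^n.
    rng = np.random.default_rng(0)
    n, k, m = 10, 3, 1000
    W = rng.standard_normal((k, n))
    b = np.zeros(k)
    X = rng.standard_normal((m, n))            # Gaussian marginals
    labels = intersection_of_halfspaces(X, W, b)

Agnostic learning, as in the entry above, asks for a hypothesis whose error on an arbitrary labeled distribution is within a small excess of the best such intersection; the SQ lower bounds quantify how hard that is.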

Unconditional lower bounds for learning intersections of halfspaces

Adam R. Klivans, Alexander A. Sherstov
2007 Machine Learning  
We prove new lower bounds for learning intersections of halfspaces, one of the most important concept classes in computational learning theory.  ...  Our main result is that any statistical-query algorithm for learning the intersection of √n halfspaces in n dimensions must make 2^Ω(√n) queries.  ...  Klivans and Sherstov (2006) have recently given the first representation-independent (cryptographic) hardness results for PAC learning intersections of halfspaces.  ... 
doi:10.1007/s10994-007-5010-1 fatcat:y554mbcivjf6xdikd224bpujdq
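For readers unfamiliar with the statistical query (SQ) model in which this lower bound is stated: the learner never sees labeled examples, only estimates of expectations that are accurate to within a tolerance τ. The sketch below simulates such an oracle by empirical averaging; the names are assumptions, and the tolerance noise is drawn at random here although the model allows it to be adversarial.

    import numpy as np

    def sq_oracle(query, X, y, tau, rng):
        """Return E[query(x, label)] up to an additive error of at most tau."""
        true_expectation = np.mean([query(xi, yi) for xi, yi in zip(X, y)])
        return true_expectation + rng.uniform(-tau, tau)   # model allows adversarial noise within +/- tau

    # Example query: correlation of the label with the sign of the first coordinate.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5000, 20))
    y = np.sign(X[:, 0])                                   # labels given by a single halfspace
    corr = sq_oracle(lambda xi, yi: yi * np.sign(xi[0]), X, y, tau=0.01, rng=rng)

A query lower bound such as 2^Ω(√n) counts how many calls of this form any algorithm must make, independent of its running time.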

On PAC learning algorithms for rich Boolean function classes

Lisa Hellerstein, Rocco A. Servedio
2007 Theoretical Computer Science  
We give an overview of the fastest known algorithms for learning various expressive classes of Boolean functions in the Probably Approximately Correct (PAC) learning model.  ...  : sparse polynomial threshold functions over the Boolean cube {0,1}^n and sparse GF(2) polynomials over {0,1}^n.  ...  The first author was supported in part by NSF Award IIS-0534908, while visiting the University of Wisconsin, Madison.  ... 
doi:10.1016/j.tcs.2007.05.018 fatcat:cfpv3cqkrvbu3ndw7x2vk6j5he

On PAC Learning Algorithms for Rich Boolean Function Classes [chapter]

Rocco A. Servedio
2006 Lecture Notes in Computer Science  
We give an overview of the fastest known algorithms for learning various expressive classes of Boolean functions in the Probably Approximately Correct (PAC) learning model.  ...  : sparse polynomial threshold functions over the Boolean cube {0,1}^n and sparse GF(2) polynomials over {0,1}^n.  ...  Acknowledgement We thank Adam Klivans for helpful suggestions in preparing this survey, and we thank an anonymous referee for useful suggestions regarding the presentation.  ... 
doi:10.1007/11750321_42 fatcat:iqkiulnpd5at7a7enenxwxxjgy

Learning convex polyhedra with margin [article]

Lee-Ad Gottlieb, Eran Kaufman, Aryeh Kontorovich, Gabriel Nivasch
2021 arXiv   pre-print
Our learning algorithm constructs a consistent polyhedron as an intersection of about t log t halfspaces with constant-size margins in time polynomial in t (where t is the number of halfspaces forming  ...  We present an improved algorithm for quasi-properly learning convex polyhedra in the realizable PAC setting from data with a margin.  ...  Acknowledgments We thank Sasho Nikolov, Bernd Gärtner and David Eppstein for helpful discussions.  ... 
arXiv:1805.09719v3 fatcat:45r3lo7zhvbllgfyo7wq4txgna

Learning Using 1-Local Membership Queries [article]

Galit Bary
2015 arXiv   pre-print
In this work we study a new model of local membership queries (Awasthi et al., 2012), which tries to resolve the problem of artificial queries.  ...  There is a stronger model, in which the algorithm can also query for labels of new examples it creates.  ...  For example, learning automata, logarithmic-depth circuits, and intersections of polynomially many halfspaces are all intractable, assuming the security of various cryptographic schemes (Kearns and  ... 
arXiv:1512.00165v1 fatcat:u4efkvd2jbd47j6m67frgaudpa

Lower Bounds and Hardness Amplification for Learning Shallow Monotone Formulas

Vitaly Feldman, Homin K. Lee, Rocco A. Servedio
2011 Journal of Machine Learning Research  
This hardness amplification for learning builds on the ideas in the work of O'Donnell (2004) on hardness amplification for approximating functions using small circuits, and is applicable to a number of  ...  In this paper we give the first unconditional lower bounds for learning problems of this sort by showing that polynomial-time algorithms cannot learn shallow monotone Boolean formulas under the uniform  ...  Remark 13 This hardness amplification also applies to algorithms using membership queries since membership queries to g ⊗ f can be easily simulated using membership queries to f .  ... 
dblp:journals/jmlr/FeldmanLS11 fatcat:gboaogbqszcolhxx7k5xhvl5ya

Distributional PAC-Learning from Nisan's Natural Proofs [article]

Ari Karchmer
2023 arXiv   pre-print
Carmosino et al. (2016) demonstrated that natural proofs of circuit lower bounds for Λ imply efficient algorithms for learning Λ-circuits, but only over the uniform distribution, with membership queries  ...  The main applications of our result are new distributional PAC-learning algorithms for depth-2 majority circuits, polytopes and DNFs over natural target distributions, as well as the nonexistence of encoded-input  ...  Part of this research was completed while I was visiting the Simons Institute for the Theory of Computing.  ... 
arXiv:2310.03641v2 fatcat:mpv6pe7hm5dgfdysouykgc2k7u

Arithmetic Circuits: A survey of recent results and open questions

Amir Shpilka, Amir Yehudayoff
2009 Foundations and Trends® in Theoretical Computer Science  
As examples we mention the connection between polynomial identity testing and lower bounds of Kabanets and Impagliazzo, the lower bounds of Raz for multilinear formulas, and two new approaches for proving  ...  Being a more structured model than Boolean circuits, one could hope that the fundamental problems of theoretical computer science, such as separating P from NP, will be easier to solve for arithmetic circuits  ...  [SV09] give an n^O(d+k)-time black-box PIT algorithm for the sum of k depth-d ROFs.  ... 
doi:10.1561/0400000039 fatcat:vejtujygx5ddjkm2crbxh2udcq

Computational Applications Of Noise Sensitivity

Ryan O'Donnell
2018
Using our noise sensitivity estimates for functions of Boolean halfspaces we obtain new polynomial and quasipolynomial time algorithms for learning intersections, thresholds, and other functions of halfspaces  ...  The theorem lets us prove a new result about the hardness on average of NP: If NP is (1 − 1/poly(n))-hard for circuits of polynomial size, then it is in fact (1/2 + o(1))-hard for circuits of polynomial  ...  Building on work of Blum, Chalasani, Goldman, and Slonim [BCGS98] and Baum [Bau91a], Kwek and Pitt [KP98] gave a membership query algorithm for learning the intersection of k halfspaces in ℝ^n with  ... 
doi:10.1184/r1/6604301.v1 fatcat:t2hfiie6n5gb3ee6sxu4z76zc4
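The thesis abstract turns on the noise sensitivity of Boolean functions: NS_δ(f) is the probability that f changes value when each input bit is independently flipped with probability δ. A small Monte Carlo sketch of this definition follows; the function and parameter names are illustrative, not drawn from the thesis.

    import numpy as np

    def estimate_noise_sensitivity(f, n, delta, trials, rng):
        """Estimate NS_delta(f) = Pr[f(x) != f(y)] for x uniform on {-1,1}^n and
        y obtained from x by flipping each coordinate independently with probability delta."""
        disagreements = 0
        for _ in range(trials):
            x = rng.choice([-1, 1], size=n)
            flips = rng.random(n) < delta
            y = np.where(flips, -x, x)
            disagreements += int(f(x) != f(y))
        return disagreements / trials

    # A single halfspace (majority): its noise sensitivity grows like O(sqrt(delta)).
    rng = np.random.default_rng(1)
    majority = lambda x: int(np.sum(x) >= 0)
    ns = estimate_noise_sensitivity(majority, n=101, delta=0.01, trials=2000, rng=rng)

Low noise sensitivity yields Fourier concentration on low-degree terms, which is the route from such estimates to learning algorithms for intersections and other functions of halfspaces.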

On randomization in sequential and distributed algorithms

Rajiv Gupta, Scott A. Smolka, Shaji Bhaskar
1994 ACM Computing Surveys  
Included with each algorithm is a discussion of its correctness and its computational complexity.  ...  ), universal hashing (choosing the hash function dynamically and at random), interactive probabilistic proof systems (a new method of program testing), dining philosophers (a classical problem in distributed  ...  Efficient probabilistic algorithms are presented for the problems of line segment intersection, convex hull, polygon triangulation, and halfspace partitions of point sets.  ... 
doi:10.1145/174666.174667 fatcat:mwufckvt5vawlostdlhcv7rxwm

Approximation Algorithms and New Models for Clustering and Learning

Pranjal Awasthi
2018
In addition, the techniques used seem fairly general and are likely to be applicable to a wider class of problems. We also propose a new model for learning with queries.  ...  We also study a different model for clustering which introduces a limited amount of interaction with the user.  ...  Our results are for learning on log-Lipschitz distributions over the Boolean cube, which we denote by {−1,1}^n (or sometimes by {0,1}^n).  ... 
doi:10.1184/r1/6714833.v1 fatcat:hqcbfbfq5jhvtd7d5cgo4gag7i

Subject Index to Volumes 1–75

2001 Information Processing Letters  
Subject-index entries matching the query, including: learning from good examples, from positive examples, and from queries; learning halfspaces; multivariate polynomials; nearly monotone k-term DNF; Boolean functions; learning by example; conjunctions with noise; convex bodies; DNF expressions; and range queries.  ... 
doi:10.1016/s0020-0190(01)00175-2 fatcat:5y67tfm6yfbblakrus5nnhs73e