We present a new algorithm for learning a convex set in $\mathbb{R}^n$ given labeled examples drawn from any Gaussian distribution. The efficiency of the algorithm depends on the dimension of the \emph{normal subspace}, the span of vectors orthogonal to supporting hyperplanes of the convex set. The key step of the algorithm uses a Singular Value Decomposition (SVD) to approximate the relevant normal subspace. The complexity of the resulting algorithm is $\mathrm{poly}(n)\,2^{\tilde{O}(k)}$ for an arbitrary convex set with normal subspace of dimension $k$. For the important special case of the intersection of $k$ halfspaces, the complexity is
\[ \mathrm{poly}(n,k,1/\epsilon) + n \cdot \min\left\{ k^{O(\log k/\epsilon^4)},\; (k/\epsilon)^{O(k\log(1/\epsilon))} \right\} \]
to learn a hypothesis that correctly classifies $1-\epsilon$ of the unknown Gaussian distribution. This improves substantially on existing algorithms and is the first algorithm to achieve a fixed polynomial dependence on $n$. The proof of our main result is based on a monotonicity property of Gaussian space.
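The SVD/spectral step described above can be illustrated with a small sketch. This is a toy demonstration of the underlying idea, not the paper's algorithm: for a standard Gaussian, coordinates orthogonal to the normal subspace are independent of membership in the convex set, so the second-moment matrix of the positive examples deviates from the identity only on the normal subspace. The convex set used here (an intersection of two axis-aligned halfspaces) and all parameter values are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 20, 2, 200_000  # ambient dimension, normal-subspace dimension, samples

# Hypothetical convex set: intersection of k halfspaces {x : <w_i, x> <= 1},
# whose normal subspace is span(e_1, ..., e_k).
W = np.zeros((k, n))
W[np.arange(k), np.arange(k)] = 1.0

# Labeled examples from the standard Gaussian.
X = rng.standard_normal((m, n))
inside = np.all(X @ W.T <= 1.0, axis=1)
P = X[inside]  # positive (inside) examples

# Covariance of positive examples: identity on directions orthogonal to the
# normal subspace; eigenvalues shrink below 1 on the normal subspace
# (truncating a standard Gaussian reduces its variance).
C = np.cov(P, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)

# Keep the k directions whose eigenvalues deviate most from 1.
idx = np.argsort(np.abs(eigvals - 1.0))[::-1][:k]
V = eigvecs[:, idx]  # estimated orthonormal basis of the normal subspace
```

Projecting the true normals `W` onto the column span of `V` recovers them almost exactly, since the eigenvalue gap (roughly $0.63$ versus $1$ here) dominates the sampling noise.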
