Description

We give sublinear-time approximation algorithms for some optimization problems arising in machine learning, such as training linear classifiers and finding minimum enclosing balls. Our algorithms can be extended to some kernelized versions of these problems, such as SVDD, hard-margin SVM, and $L_2$-SVM, for which sublinear-time algorithms were not known before. These new algorithms use a combination of novel sampling techniques and randomized implementations of online learning algorithms. We give lower bounds which show that the running times of our algorithms are nearly best possible in the unit-cost RAM model. We also give implementations of our algorithms in the semi-streaming setting, obtaining the first low-pass, sublinear-time algorithms with polylogarithmic space and an arbitrary approximation factor. Finally, we show that our algorithms, which are Monte Carlo, can be made Las Vegas with an additional linear-time step, and that such linear work is necessary for Las Vegas results.
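The abstract does not spell out the sampling techniques themselves, but a standard primitive in this line of work is $\ell_2$-sampling: estimating an inner product by reading only a few randomly chosen coordinates of one of the vectors. The sketch below illustrates that primitive only; it is not the speakers' algorithm (which combines such estimates with multiplicative-weights and online-learning updates), and the function name, parameters, and demo data are our own illustrative choices.

```python
import numpy as np


def l2_sample_inner_product(x, w, k, rng):
    """Unbiased estimate of <x, w> from k coordinate samples.

    Coordinate i is drawn with probability x_i^2 / ||x||^2 (l2-sampling),
    and each draw contributes w_i * ||x||^2 / x_i, whose expectation is
    exactly <x, w>.  Only k entries of w are read, so the estimate costs
    O(k) accesses to w instead of O(d).
    """
    # In the sublinear setting ||x||^2 is assumed to be known or maintained;
    # here we just compute it for the demonstration.
    norm_sq = float(np.dot(x, x))
    probs = x * x / norm_sq                 # l2-sampling distribution over coordinates
    idx = rng.choice(len(x), size=k, p=probs)
    samples = w[idx] * norm_sq / x[idx]     # zero coordinates are never drawn, so no division by zero
    return samples.mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 100_000
    x = rng.standard_normal(d)
    # Correlated vectors, so the inner product is large relative to the noise of the estimator.
    w = 0.8 * x + 0.6 * rng.standard_normal(d)
    print("exact  <x, w>:", np.dot(x, w))
    print("sampled estimate (k=2000):", l2_sample_inner_product(x, w, 2000, rng))
```

Each draw has variance on the order of $\|x\|^2\|w\|^2$, so averaging $k$ draws gives an additive error of roughly $\|x\|\,\|w\|/\sqrt{k}$; this is why the demo uses correlated vectors whose inner product is large compared to that error.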
