

Linear support vector machines (SVMs) have become popular for solving classification tasks due to their fast and simple online application to large-scale data sets. However, many problems are not linearly separable. For these problems kernel-based SVMs are often used, but unlike their linear variant they suffer from various drawbacks in terms of computational and memory efficiency. Their response can be represented only as a function of the set of support vectors, which has been experimentally shown to grow linearly with the size of the training set. In this paper we propose a novel locally linear SVM classifier with a smooth decision boundary and bounded curvature. We show how the functions defining the classifier can be approximated using local codings, and how this model can be optimized in an online fashion by performing stochastic gradient descent with the same convergence guarantees as the standard gradient descent method for linear SVMs. Our method achieves performance comparable to the state of the art whilst being significantly faster than competing kernel SVMs. We generalise this model to locally finite-dimensional kernel SVMs.
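The idea described above, a classifier that is linear in a local coding of the input and trained by stochastic gradient descent on the hinge loss, can be illustrated with a minimal sketch. This is not the paper's implementation: the inverse-distance local coding, the function names (`local_coding`, `train_llsvm`), and all hyperparameters are illustrative assumptions, and the regularised SGD step follows a generic Pegasos-style update rather than the exact scheme in the paper.

```python
import numpy as np

def local_coding(x, anchors, k=2):
    """Illustrative local coding: inverse-distance weights on the
    k nearest anchor points, normalised to sum to one."""
    d = np.linalg.norm(anchors - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-8)
    gamma = np.zeros(len(anchors))
    gamma[idx] = w / w.sum()
    return gamma

def train_llsvm(X, y, anchors, lr=0.1, lam=1e-3, epochs=50, seed=0):
    """SGD on the hinge loss of a locally linear SVM:
    f(x) = sum_v gamma_v(x) * (w_v . x + b_v)."""
    rng = np.random.default_rng(seed)
    W = np.zeros((len(anchors), X.shape[1]))
    b = np.zeros(len(anchors))
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x, t = X[i], y[i]
            g = local_coding(x, anchors)
            score = g @ (W @ x + b)
            W -= lr * lam * W              # L2 regulariser shrinkage
            if t * score < 1:              # hinge-loss subgradient step
                W += lr * t * np.outer(g, x)
                b += lr * t * g
    return W, b

def predict(x, W, b, anchors):
    g = local_coding(x, anchors)
    return np.sign(g @ (W @ x + b))
```

With anchors placed at the four XOR corners, each local model is effectively a separate linear SVM, so the sketch separates the non-linearly-separable XOR problem while every update remains a cheap linear-SVM-style gradient step.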
