We propose a PAC-Bayes sample compression approach to kernel methods that can accommodate any bounded similarity function and show that the support vector machine (SVM) classifier is a particular case of a more general class of data-dependent classifiers known as majority votes of sample-compressed classifiers. We provide novel risk bounds for these majority votes and learning algorithms that minimize these bounds.
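To make the "majority vote of sample-compressed classifiers" concrete, here is a minimal, hedged sketch: each voter is defined by a single training example and a bounded similarity function, and the prediction is the sign of the weighted vote. When the similarity is a positive semi-definite kernel and the weights come from SVM training, this reduces to the familiar SVM decision rule. The function and variable names below (`bounded_similarity`, `majority_vote`, `weights`) are illustrative choices, not notation from the paper.

```python
import numpy as np

def bounded_similarity(x, xi, gamma=1.0):
    """An RBF-style similarity, bounded in (0, 1].

    Any bounded similarity function would do here; it need not be a
    positive semi-definite kernel, which is the point of the approach.
    """
    return np.exp(-gamma * np.sum((x - xi) ** 2))

def majority_vote(x, X_train, y_train, weights, gamma=1.0):
    """Weighted majority vote of single-example ("sample-compressed")
    classifiers: sign of sum_i w_i * y_i * k(x, x_i)."""
    score = sum(w * yi * bounded_similarity(x, xi, gamma)
                for w, xi, yi in zip(weights, X_train, y_train))
    return 1 if score >= 0 else -1

# Toy usage: two 1-D clusters with uniform weights for illustration.
X = np.array([[-2.0], [-1.5], [1.5], [2.0]])
y = np.array([-1, -1, 1, 1])
w = np.ones(len(X)) / len(X)
print(majority_vote(np.array([1.8]), X, y, w))   # near the positive cluster -> 1
print(majority_vote(np.array([-1.9]), X, y, w))  # near the negative cluster -> -1
```

With SVM-trained weights and a PSD kernel, `score` is exactly the SVM decision function, illustrating the paper's claim that the SVM is a particular case of this broader family.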