Michael Muller, Anna Kantosalo, et al.
CHI 2024
In this article we describe and analyze sublinear-time approximation algorithms for some optimization problems arising in machine learning, such as training linear classifiers and finding minimum enclosing balls. Our algorithms can be extended to some kernelized versions of these problems, such as SVDD, hard-margin SVM, and L2-SVM, for which sublinear-time algorithms were not previously known. These new algorithms combine novel sampling techniques with a new multiplicative update algorithm. We give lower bounds showing that the running times of many of our algorithms are nearly the best possible in the unit-cost RAM model.
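The abstract names two ingredients, sampling and a multiplicative update, without spelling out how they fit together. The sketch below is a minimal, illustrative reading of that combination for the simplest problem mentioned, training a linear classifier by maximizing the minimum margin max_{||w||<=1} min_i <w, x_i>. The function name, step size, clipping, and the particular sampling scheme are assumptions made for illustration; this is not the paper's exact algorithm.

```python
import numpy as np

def sublinear_margin_sketch(X, T=2000, eta=0.05, seed=0):
    """Hedged sketch of a primal-dual scheme for max_{||w||<=1} min_i <w, x_i>:
    a multiplicative-weights update on a distribution over examples, combined
    with sampling so that each iteration reads one row of X on the primal side
    and one column of X on the dual side. Rows of X are assumed to have norm
    <= 1; all names and constants are illustrative choices."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = np.full(n, 1.0 / n)          # dual weights over examples
    w = np.zeros(d)
    w_avg = np.zeros(d)

    for _ in range(T):
        # Primal step: sample one example from p and take a projected
        # stochastic gradient step toward it.
        i = rng.choice(n, p=p)
        w = w + eta * X[i]
        nrm = np.linalg.norm(w)
        if nrm > 1.0:
            w /= nrm                 # project back onto the unit ball
        w_avg += w / T

        # Dual step: estimate every margin <w, x_i> by reading a single
        # coordinate j drawn with probability w_j^2 / ||w||^2 (an unbiased
        # l2-sampling estimate), clip it to [-1, 1] to keep the exponent
        # bounded, and apply a multiplicative update that up-weights
        # low-margin examples.
        sq = w ** 2
        tot = sq.sum()
        if tot > 0.0:
            j = rng.choice(d, p=sq / tot)
            est = np.clip(X[:, j] * tot / w[j], -1.0, 1.0)
        else:
            est = np.zeros(n)
        p = p * np.exp(-eta * est)
        p /= p.sum()

    return w_avg                     # averaged iterate as the approximate solution
```

The point of the sketch is the per-iteration cost: the primal step reads one row of X and the dual step reads one column, so each iteration touches on the order of n + d entries rather than all nd of them, which is the sense in which algorithms of this kind can run in time sublinear in the input size.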
Ken C.L. Wong, Satyananda Kashyap, et al.
Pattern Recognition Letters
Guojing Cong, David A. Bader
Journal of Parallel and Distributed Computing
Jehanzeb Mirza, Leonid Karlinsky, et al.
NeurIPS 2023