Recently, sample complexity bounds have been derived for problems involving linear functions, such as neural networks and support vector machines. In many of these theoretical studies, the concept of covering numbers has played an important role, so it is useful to study covering numbers for linear function classes. In this paper, we investigate two closely related methods for deriving upper bounds on these covering numbers. The first method, already employed in some earlier studies, relies on Maurey's lemma; the second uses techniques from the mistake-bound framework in online learning. We compare the results of these two methods, as well as their consequences for some learning formulations.