Tang, Benyang, and Dominic Mazzoni, 2006: Multiclass reduced-set support
vector machines. Proceedings of the 23rd International Conference on Machine Learning (ICML 2006), Pittsburgh, PA.
Abstract
An improved method is presented for reducing
the number of support vectors in a support
vector machine (SVM) classifier, typically
leading to dramatically faster classification speed with minimal additional error.
The method, which we call RS+, is based
on Burges' constructive approach to generating
SVM reduced sets, but introduces several
improvements which, when combined,
result in a faster, more robust algorithm. The
constructive approach depends on the ability
to approximate the pre-image of a feature
space vector, which is a difficult nonlinear
optimization problem. To address this, the
RS+ method uses a combination of differential
evolution and gradient descent to find
better pre-images. Optimal SVM weights
and bias are found for the reduced-set vectors
by using the SVM training algorithm with
a modified kernel matrix, leading to further
improvements in accuracy. Additionally, we
demonstrate how reduced-set methods can be
applied to multi-class problems made up of
several binary SVMs, resulting in additional
gains beyond what would be achieved by reducing
each binary SVM separately. Our experiments
show that RS+ is quite successful,
for example achieving the same accuracy as
the Burges method on the USPS digit classification problem with 1/3 as many reduced-set
vectors.
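The pre-image problem mentioned above can be illustrated with a minimal sketch. Assuming a Gaussian (RBF) kernel, finding the pre-image z of a feature-space vector Psi = sum_i alpha_i phi(x_i) amounts to maximizing f(z) = sum_i alpha_i k(x_i, z). The sketch below shows only a plain gradient-ascent stage (the paper combines this kind of local search with differential evolution for the global stage, which is not shown); the function name, learning rate, and step count are illustrative choices, not the paper's:

```python
import numpy as np

def rbf(X, z, sigma):
    # Gaussian (RBF) kernel between each row of X and a single point z
    return np.exp(-np.sum((X - z) ** 2, axis=1) / (2 * sigma ** 2))

def preimage_gradient_ascent(X, alpha, sigma, z0, lr=0.5, steps=200):
    """Approximate the pre-image z of Psi = sum_i alpha_i phi(x_i)
    by gradient ascent on f(z) = sum_i alpha_i k(x_i, z), which for a
    normalized RBF kernel is equivalent to minimizing ||Psi - phi(z)||^2.
    Illustrative local-search stage only; lr and steps are arbitrary."""
    z = z0.astype(float).copy()
    for _ in range(steps):
        k = alpha * rbf(X, z, sigma)                   # weighted kernel values
        grad = (k[:, None] * (X - z)).sum(axis=0) / sigma ** 2
        z += lr * grad                                 # ascend toward higher f(z)
    return z
```

With a single support vector the objective is maximized exactly at that vector, so the ascent should recover it from a nearby start; with many support vectors the objective is multimodal, which is why a global method such as differential evolution helps.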