Published 2000 | Version v1
Publication

Fast training of Support Vector Machines for Regression

Description

We propose here a fast way to perform the gradient computation in Support Vector Machine (SVM) learning when samples are positioned on an m-dimensional grid. Our method takes advantage of the particular structure of the constrained quadratic programming problem arising in this case. We show how this structure is connected to the properties of block Toeplitz matrices and how these properties can be used to speed up the computation of matrix-vector products.
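The record itself contains no code. As an illustration of the general principle the description relies on (Toeplitz structure making matrix-vector products cheap), the sketch below shows the standard FFT-based Toeplitz matrix-vector product via circulant embedding; it is not the paper's algorithm, and the names `toeplitz_matvec`, `c`, `r`, and `x` are assumptions introduced here for illustration only.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_matvec(c, r, x):
    """O(n log n) product of an n x n Toeplitz matrix with a vector.

    c -- first column of the Toeplitz matrix (length n)
    r -- first row of the Toeplitz matrix (length n, r[0] must equal c[0])
    x -- vector of length n
    """
    n = len(x)
    # Embed the Toeplitz matrix in a (2n-1) x (2n-1) circulant matrix whose
    # first column is [c_0, ..., c_{n-1}, r_{n-1}, ..., r_1]. A circulant
    # matrix is diagonalised by the DFT, so its matvec is a fast convolution.
    circ_col = np.concatenate([c, r[:0:-1]])
    x_pad = np.concatenate([x, np.zeros(n - 1)])
    y = np.fft.ifft(np.fft.fft(circ_col) * np.fft.fft(x_pad))
    # The leading n entries of the circulant product give the Toeplitz product.
    return y[:n].real

# Quick check against a dense matrix-vector product.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 6
    c = rng.standard_normal(n)
    r = np.concatenate([[c[0]], rng.standard_normal(n - 1)])
    x = rng.standard_normal(n)
    T = toeplitz(c, r)
    assert np.allclose(T @ x, toeplitz_matvec(c, r, x))
```

For samples on an m-dimensional grid the kernel matrix acquires a block Toeplitz structure with Toeplitz blocks, and the same FFT idea extends to multi-dimensional transforms, which is the kind of speed-up the description refers to.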

Additional details

Created:
March 27, 2023
Modified:
November 30, 2023