
# On the V-gamma dimension for regression in Reproducing Kernel Hilbert spaces

T. Evgeniou and M. Pontil (1999)

In: Lecture Notes in Computer Science, Algorithmic Learning Theory, Tokyo, Japan.

This paper presents a computation of the $V_\gamma$ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS), for both the Support Vector Machine (SVM) regression $\epsilon$-insensitive loss function $L_\epsilon$ and general $L_p$ loss functions. The $V_\gamma$ dimension is shown to be finite, which in turn proves uniform convergence in probability for regression machines in RKHS subspaces that use the $L_\epsilon$ or general $L_p$ loss functions; a novel proof of this result is given. The paper also computes an upper bound on the $V_\gamma$ dimension under some conditions, which leads to an approach for estimating the empirical $V_\gamma$ dimension from a set of training data.
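For concreteness, the two loss families the abstract refers to can be sketched as follows. This is a minimal illustration using the standard textbook definitions of the SVM $\epsilon$-insensitive loss and the $L_p$ loss, not code from the paper itself; the function names are hypothetical.

```python
import numpy as np

def eps_insensitive_loss(y, f_x, eps):
    """Standard SVM regression epsilon-insensitive loss:
    L_eps(y, f(x)) = max(0, |y - f(x)| - eps).
    Errors smaller than eps incur zero loss."""
    return np.maximum(np.abs(y - f_x) - eps, 0.0)

def lp_loss(y, f_x, p):
    """General L_p loss: L_p(y, f(x)) = |y - f(x)|^p."""
    return np.abs(y - f_x) ** p
```

For example, with `eps = 0.1`, a prediction within 0.1 of the target contributes zero loss, while larger deviations are penalized linearly; with `p = 2`, `lp_loss` reduces to the familiar squared-error loss.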