
I'm studying machine learning and I feel there is a strong relationship between the concept of VC dimension and the more classical (statistical) concept of degrees of freedom.

Can anyone explain such a connection?

stochazesthai

2 Answers


As stated by Prof. Yaser Abu-Mostafa:

Degrees of freedom are an abstraction of the effective number of parameters. The effective number is based on how many dichotomies one can get, rather than on how many real-valued parameters are used. In the case of the 2-dimensional perceptron, one can think of slope and intercept (plus a binary degree of freedom for which region goes to +1), or one can think of 3 parameters w_0, w_1, w_2 (though the weights can be simultaneously scaled up or down without affecting the resulting hypothesis). The degrees of freedom, however, are 3 because we have the flexibility to shatter 3 points, not because of one way or another of counting the number of parameters.

[Figure: 2-d perceptron]
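The shattering claim in the quote can be checked numerically. The sketch below (an illustration I am adding, not from the original answer) brute-forces integer weight vectors for a 2-d perceptron h(x) = sign(w_0 + w_1 x_1 + w_2 x_2) and counts how many of the possible +1/−1 labelings of a point set it can realize: all 8 for 3 non-collinear points (shattered), but only 14 of 16 for 4 points, since the two XOR labelings of the unit square are not linearly separable.

```python
from itertools import product

def separable(points, labels, grid=range(-3, 4)):
    """Brute-force search for weights (w0, w1, w2) such that
    sign(w0 + w1*x1 + w2*x2) matches the given labeling on every point.
    The small integer grid suffices for the tiny point sets used here."""
    for w0, w1, w2 in product(grid, repeat=3):
        if all(y * (w0 + w1 * x1 + w2 * x2) > 0
               for (x1, x2), y in zip(points, labels)):
            return True
    return False

def count_dichotomies(points):
    """Number of +1/-1 labelings of the points that a 2-d perceptron realizes."""
    return sum(separable(points, labels)
               for labels in product((-1, 1), repeat=len(points)))

three = [(0, 0), (1, 0), (0, 1)]          # non-collinear triple
four  = [(0, 0), (1, 0), (0, 1), (1, 1)]  # unit square

print(count_dichotomies(three))  # 8  -> every dichotomy: 3 points are shattered
print(count_dichotomies(four))   # 14 -> the 2 XOR labelings are missing
```

Since some set of 3 points is shattered but no set of 4 points is, the VC dimension of the 2-d perceptron is 3, matching the 3 effective degrees of freedom in the quote.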

Azrael

The VC dimension is explained very well in this paper, in Section 2.1 and onward, with the basic lemmas and proofs given; it is worth going through.

Bhaskar