Statistical Learning Theory
Aftersleep Books - 2005-06-20 | © Copyright 2004 - www.aftersleep.com
In an earlier book published by Springer-Verlag, Vapnik develops the basics of the theory. However, to keep the mathematical level accessible to computer scientists and engineers, he avoided the proofs needed for full mathematical rigor. This is an advanced text that provides that rigorous development. Although the preface and chapter 0 give the reader an idea of what is to come, the rest of the text is difficult reading.
The theory has been quite successful at attacking the pattern recognition/classification problem and provides a basis for understanding support vector machines. However, Vapnik sees a much broader application to statistical inference in general, in settings where the classical parametric approach fails.
If you have a strong background in probability theory, you should be able to wade through the book and get something out of it. If not, I recommend reading section 7.9 of "The Elements of Statistical Learning" by Hastie, Tibshirani, and Friedman. That will give you an easily understandable view of the VC dimension. Sections 12.2 and 12.3 of their text will also give you some appreciation for support vector machines and the error-rate bounds obtainable for them based on the VC dimension.
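To get a feel for how those VC-dimension error bounds behave, the classical bound (true risk bounded by empirical risk plus a capacity term that grows with the VC dimension h and shrinks with the sample size n) can be sketched numerically. This is a minimal illustration only; the function and parameter names are my own, and the formula follows the standard textbook statement of the bound:

```python
import math

def vc_bound(emp_risk, h, n, eta=0.05):
    """Classical VC generalization bound (a sketch, names are mine).

    With probability at least 1 - eta, the true risk is bounded by the
    empirical risk plus a confidence term depending on the VC dimension
    h and the sample size n.
    """
    confidence_term = math.sqrt(
        (h * (math.log(2 * n / h) + 1) - math.log(eta / 4)) / n
    )
    return emp_risk + confidence_term

# The bound loosens as capacity h grows and tightens as n grows:
loose = vc_bound(emp_risk=0.10, h=50, n=1000)
tight = vc_bound(emp_risk=0.10, h=5, n=1000)
```

Note how the bound formalizes the capacity/sample-size trade-off the book develops: richer function classes (larger h) demand more data before the guarantee becomes useful.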