Vladimir N. Vapnik

From ETHW

==Biography==
{{Biography
 
|Associated organizations=AT&T; NEC Laboratories
|Fields of study=Artificial intelligence
}}
Vladimir N. Vapnik’s pioneering work became the foundation of a new research field known as “statistical learning theory,” which has transformed how computers learn to tackle complex problems. Working with Alexey Chervonenkis in Moscow during the late 1960s and early 1970s, Dr. Vapnik developed Vapnik-Chervonenkis (VC) learning theory. This theory established a fundamental quantity, now known as the VC dimension, that characterizes the capacity of learning machines. Dr. Vapnik later created structural risk minimization, a principle for controlling the generalization behavior described by VC theory. Dr. Vapnik’s research was largely unknown to the Western world until his arrival in the United States shortly before the collapse of the Soviet Union. Working at AT&T Laboratories in Holmdel, NJ, during the 1990s, he put his theories into practical use with support vector machine (SVM) algorithms for recognizing complex patterns in data in classification and regression tasks. SVMs have become a method of choice for machine learning.
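To illustrate the idea behind the SVMs mentioned above: a linear SVM seeks a separating boundary with maximum margin. The following is a rough, self-contained sketch that trains such a classifier by subgradient descent on the hinge loss; it is a toy illustration, not Vapnik’s original quadratic-programming formulation, and all names and parameter values here are arbitrary choices for the example.

```python
# Toy linear SVM: minimize lam*||w||^2 + average hinge loss via
# subgradient descent. Labels must be +1 or -1. Illustrative only.

def train_svm(points, labels, lam=0.01, lr=0.1, epochs=200):
    dim = len(points[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Point is inside the margin (or misclassified):
                # step along the hinge-loss subgradient plus regularizer.
                w = [wi + lr * (y * xi - 2 * lam * wi)
                     for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # Correctly classified with margin: only shrink w.
                w = [wi - lr * 2 * lam * wi for wi in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data: one cluster per class.
X = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_svm(X, y)
```

In practice the dual quadratic program (which yields the support vectors and admits kernels) is solved instead, but the margin-maximizing objective is the same.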


A member of the U.S. National Academy of Engineering and a Fellow of NEC Laboratories America, Dr. Vapnik is currently a professor at Columbia University in New York.


[[Category:Computational and artificial intelligence|Vapnik]]  
{{DEFAULTSORT:Vapnik}}
[[Category:Neural networks|Vapnik]]
 
[[Category:Automation]]

Latest revision as of 17:09, 1 March 2016
