Workshop on Cognition and Control

Competitive Classification
Alon Orlitsky
Alon Orlitsky received B.Sc. degrees in Mathematics and Electrical Engineering from Ben Gurion University in 1980 and 1981, and M.Sc. and Ph.D. degrees in Electrical Engineering from Stanford University in 1982 and 1986.
 
From 1986 to 1996 he was with the Communications Analysis Research Department of Bell Laboratories. He spent the following year as a quantitative analyst at D.E. Shaw and Company, an investment firm in New York City. In 1997 he joined the University of California, San Diego, where he is currently a professor of Electrical and Computer Engineering and of Computer Science and Engineering, and directs the Information Theory and Applications Center and the Center for Wireless Communications. His research concerns information theory, statistical modeling, machine learning, and speech recognition.
 
Alon is a recipient of the 1981 ITT International Fellowship and the 1992 IEEE W.R.G. Baker Paper Award, and co-recipient of the 2006 Information Theory Society Paper Award. He co-authored two papers for which his students received student-paper awards: the 2003 Capocelli Prize and the 2010 ISIT Student Paper Award. He is a fellow of the IEEE, and holds the Qualcomm Chair for Information Theory and its Applications at UCSD.
Abstract: 
Combining recent computer-science and information-theoretic approaches, we derive competitive algorithms for the classical problem of classification. A classifier takes two training sequences, each generated by a different distribution, and determines which of the two distributions generated a third, test sequence. With no assumptions on the support size or the distance between the two distributions, we construct a linear-complexity classifier that requires at most n^(3/2) samples to attain the n-sample accuracy of the best classifier designed with all essential knowledge of the two distributions. Conversely, we show that for any classifier, there are distributions that require at least n^(7/6) samples to achieve the n-sample accuracy of the best classifier designed with knowledge of the distributions. Joint work with Jayadev Acharya, Hirakendu Das, Ashkan Jafarpour, Shengjun Pan, and Ananda Suresh.
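For intuition about the setup, here is a minimal Python sketch (not the algorithm from the talk) of the benchmark classifier that knows both distributions and decides by likelihood ratio, alongside a naive plug-in variant that estimates the distributions from the two training sequences. The support size k, the add-one smoothing, and all names are illustrative assumptions; the talk's classifier makes no support-size assumption.

import numpy as np

def loglik(seq, dist):
    # Log-probability of an integer-coded sample sequence under a discrete distribution.
    return np.sum(np.log(dist[seq]))

def oracle_classify(test, p, q):
    # Benchmark: with full knowledge of p and q, pick the distribution
    # under which the test sequence is more likely.
    return 1 if loglik(test, p) >= loglik(test, q) else 2

def plugin_classify(test, train1, train2, k):
    # Naive plug-in variant (illustrative only): estimate p and q from the
    # training sequences with add-one smoothing, then apply the same rule.
    p_hat = (np.bincount(train1, minlength=k) + 1) / (len(train1) + k)
    q_hat = (np.bincount(train2, minlength=k) + 1) / (len(train2) + k)
    return 1 if loglik(test, p_hat) >= loglik(test, q_hat) else 2

# Example usage with synthetic distributions:
rng = np.random.default_rng(0)
k = 100
p, q = rng.dirichlet(np.ones(k)), rng.dirichlet(np.ones(k))
train1 = rng.choice(k, size=1000, p=p)
train2 = rng.choice(k, size=1000, p=q)
test = rng.choice(k, size=200, p=p)   # truly drawn from p
print(oracle_classify(test, p, q), plugin_classify(test, train1, train2, k))

The gap the talk quantifies is how many extra training samples such a knowledge-free classifier needs to match the oracle's n-sample accuracy.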
Date: 
February 23rd
Time: 
2:00 pm
Room: 
Larsen 234
