Author: missergirl (猫猫)
Board: NCTU-STAT99G
Title: [Seminar Announcement] 10/8 Institute of Statistics Seminar
Time: Thu Oct 7 15:59:23 2010
Joint Seminar, Institutes of Statistics, National Chiao Tung University and National Tsing Hua University
Title: Density Estimation with Minimization of U-divergence
Speaker: Prof. Kanta Naito
(Department of Mathematics, Shimane University, Japan)
Time: Friday, October 8, 2010, 10:40-11:30 a.m.
(Tea reception 10:20-10:40 a.m. in Room 429, Institute of Statistics, NCTU)
Venue: Room 427, Assembly Building I, NCTU
Abstract
Recently, there has been renewed widespread interest in supervised learning
with regard to regression, classification, and pattern recognition. Boosting
is known as a promising technique with feasible computational algorithms, and
it has received a great deal of attention. In contrast to supervised
learning, boosting approaches to unsupervised learning, such as density
estimation, appear to be less developed. Although it is understood that
unsupervised learning is more difficult than supervised learning,
there is a need for an effective learning method for density estimation.
The purpose of this study is to develop a general but practical learning
method for multivariate density estimation. Especially the proposed method
for density estimation is based on the stagewise minimization of the
U-divergence. The U-divergence is a general divergence measure involving a
convex function U which includes the Kullback-Leibler divergence and the
squared norm as special cases. The algorithm to yield the density estimator
is closely related to the boosting algorithm and it is shown that the usual
kernel density estimator can also be seen as a special case of the proposed
estimator. Non-asymptotic error bounds of the proposed estimators are
developed and numerical experiments show that the proposed estimators often
perform better than a kernel density estimator with a sophisticated bandwidth
matrix. The research is a joint work with Shinto Eguchi of The Institute of
Statistical Mathematics
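
For reference, the U-divergence is commonly written in the following form
(this follows the formulation used by Eguchi and coauthors; the talk may use
slightly different notation), where u = U' and \xi = u^{-1}:

  D_U(g, f) = \int \left\{ U(\xi(f(x))) - U(\xi(g(x)))
              - g(x)\left[\xi(f(x)) - \xi(g(x))\right] \right\} dx

Taking U(t) = e^t (so \xi(s) = \log s) recovers the extended Kullback-Leibler
divergence \int \{ g \log(g/f) - g + f \} dx, and taking U(t) = t^2/2
(so \xi(s) = s) gives the squared norm \frac{1}{2}\int (g - f)^2 dx, the two
special cases mentioned in the abstract.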
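
The abstract does not spell out the stagewise algorithm, so the sketch below
is only a toy illustration of the general idea, specialized to the
squared-norm (L2) case in one dimension; it is not the speaker's actual
method. The Gaussian kernel, bandwidth h, step size alpha, and random
candidate search are all illustrative assumptions.

  import numpy as np

  def gauss(x, mu, h):
      # Gaussian kernel N(x; mu, h^2); broadcasts over array arguments.
      return np.exp(-0.5 * ((x - mu) / h) ** 2) / (h * np.sqrt(2 * np.pi))

  def l2_risk(weights, centers, h, data):
      # Empirical L2 risk (up to a constant depending only on the true density):
      #   0.5 * \int f^2 - (1/n) * sum_i f(x_i),
      # for the mixture f = sum_a w_a N(.; c_a, h^2).
      # Uses \int N(.; a, h^2) N(.; b, h^2) dx = N(a; b, 2 h^2).
      cross = gauss(centers[:, None], centers[None, :], np.sqrt(2.0) * h)
      int_f_sq = weights @ cross @ weights
      f_at_data = weights @ gauss(centers[:, None], data[None, :], h)
      return 0.5 * int_f_sq - f_at_data.mean()

  def stagewise_l2_density(data, h=0.3, alpha=0.1, n_steps=50, seed=0):
      # Start from a single kernel at the sample mean, then greedily mix in
      # data-centred kernels that most reduce the empirical L2 risk:
      #   f_{t+1} = (1 - alpha) * f_t + alpha * N(.; c, h^2).
      rng = np.random.default_rng(seed)
      centers = np.array([data.mean()])
      weights = np.array([1.0])
      for _ in range(n_steps):
          best = None
          candidates = rng.choice(data, size=min(20, len(data)), replace=False)
          for c in candidates:
              cand_centers = np.append(centers, c)
              cand_weights = np.append((1.0 - alpha) * weights, alpha)
              risk = l2_risk(cand_weights, cand_centers, h, data)
              if best is None or risk < best[0]:
                  best = (risk, cand_centers, cand_weights)
          _, centers, weights = best
      return lambda x: weights @ gauss(centers[:, None], np.asarray(x)[None, :], h)

  # Example: fit a two-component target density.
  rng = np.random.default_rng(1)
  data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(2.0, 0.5, 200)])
  f_hat = stagewise_l2_density(data)
  print(f_hat(np.linspace(-5.0, 5.0, 5)))

Each step forms the convex combination f_{t+1} = (1 - alpha) f_t
+ alpha K(., c), which mirrors the stagewise, boosting-like flavour described
in the abstract; a full treatment would also choose alpha and the bandwidth
data-adaptively.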
--
※ Posted from: PTT (ptt.cc)
◆ From: 140.113.114.213