Board: ck47th320
Title: Re: Has anyone heard of the "chernoff distance"?
Posted from: 不良牛牧场 (Tue Apr 24 05:22:12 2001)
Relayed via: Ptt!warpnews!SimFarm
: Let Σ^{-1} = cΣ1^{-1} + (1-c)Σ2^{-1}; finding the eigenvector matrix of Σ^{-1}Σ1 completes the diagonalization.
What I originally said was to write \Sigma_1 = V \Lambda V^T, where V is the orthonormal basis of
the eigenvectors of \Sigma_1.
So first, write \Sigma_1 as \sqrt{\Sigma_1}*\sqrt{\Sigma_1} (the symmetric square root), then change variables to
x' = \Sigma_1^{-1/2}*x,
and then the original distributions become
N_1 \sim N(0, I)
N_2 \sim N(0, \Sigma_2')
where \Sigma_2' = \Sigma_1^{-1/2}*\Sigma_2*\Sigma_1^{-1/2}.
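As a sanity check, here is a minimal numerical sketch of this first step in
Python/NumPy. Sigma1 and Sigma2 are hypothetical positive-definite covariances
made up for illustration only, not the matrices of the original problem.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical zero-mean covariances (illustration only).
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
Sigma1 = A @ A.T + np.eye(3)
Sigma2 = B @ B.T + np.eye(3)

# Symmetric inverse square root of Sigma1 via Sigma1 = V diag(w) V^T.
w, V = np.linalg.eigh(Sigma1)
Sigma1_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T

# Under the change of variables x' = Sigma1^{-1/2} x the covariances
# transform by congruence: Sigma -> Sigma1^{-1/2} Sigma Sigma1^{-1/2}.
Cov1_prime = Sigma1_inv_sqrt @ Sigma1 @ Sigma1_inv_sqrt   # becomes the identity
Sigma2_prime = Sigma1_inv_sqrt @ Sigma2 @ Sigma1_inv_sqrt

print(np.allclose(Cov1_prime, np.eye(3)))                 # True: N_1 is now N(0, I)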
Second, find the orthonormal eigenvector basis of \Sigma_2', call it V_2,
then
x'' = V_2^{-1}*x'  (= V_2^T*x', since V_2 is orthogonal)
and then
N_1 \sim N(0, I)   (unchanged, because rotating by an orthogonal V_2 leaves I alone)
N_2 \sim N(0, diag{\lambda_1,...,\lambda_n}), where \lambda_1,...,\lambda_n are the eigenvalues of \Sigma_2'.
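Continuing the same sketch (reusing Sigma2_prime from above), the second step:

# Eigendecomposition of Sigma2'; V2 is orthogonal, so V2^{-1} = V2^T.
lam, V2 = np.linalg.eigh(Sigma2_prime)

# Rotation x'' = V2^T x': the identity stays the identity, Sigma2' becomes diagonal.
Cov1_dd = V2.T @ Cov1_prime @ V2
Cov2_dd = V2.T @ Sigma2_prime @ V2

print(np.allclose(Cov1_dd, np.eye(3)))      # True
print(np.allclose(Cov2_dd, np.diag(lam)))   # True: simultaneous diagonalization done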
Then the integral takes a much easier form.
Remember to include the Jacobian determinant in the integrand for each change of
variables. (Maybe the Jacobian factors turn out to be unnecessary; I'm not
quite sure.)
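To see the easier form end to end (still with the made-up Sigma1, Sigma2, lam
from the sketch above), one can compute the Chernoff coefficient
\int p_1^c p_2^{1-c} dx once coordinate-by-coordinate in the diagonalized frame
and once directly in the original coordinates; the two agree, which also
suggests that the Jacobian factors cancel in the end.

c = 0.3   # hypothetical exponent

# (a) In the x'' coordinates the integral factors into 1-D pieces:
#     \int N(t; 0, 1)^c N(t; 0, lam_i)^{1-c} dt = lam_i^{c/2} / sqrt(c*lam_i + 1 - c)
#     (lam_i is the variance of the i-th coordinate under N_2).
rho_diag = np.prod(lam ** (c / 2) / np.sqrt(c * lam + 1 - c))

# (b) Closed form in the original coordinates (zero means):
#     rho = |Sigma1|^{-c/2} |Sigma2|^{-(1-c)/2} |c*Sigma1^{-1} + (1-c)*Sigma2^{-1}|^{-1/2}
A_mat = c * np.linalg.inv(Sigma1) + (1 - c) * np.linalg.inv(Sigma2)
rho_direct = (np.linalg.det(Sigma1) ** (-c / 2)
              * np.linalg.det(Sigma2) ** (-(1 - c) / 2)
              * np.linalg.det(A_mat) ** (-0.5))

print(np.isclose(rho_diag, rho_direct))     # True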
: But how to express this eigenvector matrix in terms of Σ1, Σ2 (or Σ1^{-1}, Σ2^{-1}) and c, (1-c)
: is the real headache (or do we have to introduce the individual eigenvectors and eigenvalues as parameters and eliminate them at the end?)
: On the other hand, the problem starts from cΣ1+(1-c)Σ2, but the results all contain cΣ1^{-1}+(1-c)Σ2^{-1},
: and I don't see how the conversion between them is done.
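For what it is worth, a determinant identity that may be the link between the two
forms (note the convex weights land on the opposite covariances, which depends on
which density carries the exponent c) can be checked numerically with the same
hypothetical Sigma1, Sigma2 and c from the sketch above:

# |c*Sigma1^{-1} + (1-c)*Sigma2^{-1}| * |Sigma1| * |Sigma2| = |c*Sigma2 + (1-c)*Sigma1|
lhs = (np.linalg.det(c * np.linalg.inv(Sigma1) + (1 - c) * np.linalg.inv(Sigma2))
       * np.linalg.det(Sigma1) * np.linalg.det(Sigma2))
rhs = np.linalg.det(c * Sigma2 + (1 - c) * Sigma1)
print(np.isclose(lhs, rhs))                 # True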
: : One problem, though, is that I cannot distinguish between the scalar product and the
: : inner product. So I have no idea what your b(x) really is. Ah!
: : I just got what you mean. All products are matrix products, am I correct?
: -------Yes!
: : Comparing the answer with the original problem, they do look alike. The logarithm part just comes from the Jacobian
: : matrix. You can see whether this approach works.
: : Still, 笨汉 really lives up to the name -- typing all of this out is terrifying!
--
Origin:<不良牛牧场> zoo.ee.ntu.edu.tw (140.112.18.36)
Welcome to SimFarm BBS -- From : [chihw.student.Princeton.EDU]