Board: ck47th320
Title: Re: Has anyone heard of "chernoff distance"?
Posted from: 不良牛牧場 (Tue Apr 24 05:22:12 2001)
Relayed via: Ptt!warpnews!SimFarm
: Let Σ^{-1} = cΣ1^{-1} + (1-c)Σ2^{-1}; computing the eigenvector matrix of Σ^{-1}*Σ1 then completes the diagonalization.
What I originally meant was to decompose \Sigma_1 = V \Lambda V^T, where V is the orthonormal basis of
the eigenvectors of \Sigma_1.
So first, decompose \Sigma_1 as \sqrt{\Sigma_1}*\sqrt{\Sigma_1}, then choose
x' = \sqrt{\Sigma_1}^{-1}*x,
and then the original distributions become
N_1\sim N(0, I)
N_2\sim N(0, \Sigma_2')
where \Sigma_2' = \sqrt{\Sigma_1}^{-1}*\Sigma_2*\sqrt{\Sigma_1}^{-1}.
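(To spell this step out: if x \sim N(0, \Sigma_1), then
Cov(x') = \sqrt{\Sigma_1}^{-1} \Sigma_1 \sqrt{\Sigma_1}^{-1} = I,
and if x \sim N(0, \Sigma_2), then
Cov(x') = \sqrt{\Sigma_1}^{-1} \Sigma_2 \sqrt{\Sigma_1}^{-1} = \Sigma_2',
which is exactly the N_1 \sim N(0, I) and N_2 \sim N(0, \Sigma_2') written above.)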
Second, find the orthonormal basis of eigenvectors of \Sigma_2', call it V_2,
then
x"=V_2^{-1}*x'
then
N_1\sim N(0, I)
N_2\sim N(0, diag{\lambda_1,...,\lambda_n})
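(Again spelling it out: V_2 is orthonormal, so V_2^{-1} = V_2^T, and therefore
Cov(x") = V_2^T I V_2 = I under N_1, while
Cov(x") = V_2^T \Sigma_2' V_2 = diag{\lambda_1,...,\lambda_n} under N_2,
the \lambda_i being the eigenvalues of \Sigma_2'.)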
Then the integral takes a much easier form.
Remember to include the Jacobian determinant in the integrand for each change of
variables. (Maybe the Jacobian factor turns out to be unnecessary; I'm not
quite sure.)
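(A sketch of that easier form, assuming zero means as above and writing
A = V_2^{-1}*\sqrt{\Sigma_1}^{-1} for the combined transformation: since
p_i(x) = |det A| * q_i(Ax), where q_i is the density of x" = Ax, the factor
|det A|^c * |det A|^{1-c} coming from the densities exactly cancels the
|det A|^{-1} coming from dx, so
\int p_1(x)^c p_2(x)^{1-c} dx = \int q_1(x")^c q_2(x")^{1-c} dx",
i.e. the Jacobian drops out in the end. With q_1 = N(0, I) and
q_2 = N(0, diag{\lambda_1,...,\lambda_n}) this factors into one-dimensional integrals:
= \prod_i \lambda_i^{-(1-c)/2} (2\pi)^{-1/2} \int exp(-(c + (1-c)/\lambda_i) t^2 / 2) dt
= \prod_i \lambda_i^{c/2} / (c\lambda_i + (1-c))^{1/2}.)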
: But how to express this eigenvector matrix in terms of Σ1, Σ2 (or Σ1^{-1}, Σ2^{-1}) and c, (1-c)
: is the real headache (or do we have to assume some parameters, e.g. the individual eigenvectors and eigenvalues, and cancel them out at the end?)
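(One remark that may help here: with Σ^{-1} = cΣ1^{-1} + (1-c)Σ2^{-1} as above,
Σ^{-1}*Σ1 = cI + (1-c)Σ2^{-1}*Σ1, so its eigenvectors are exactly those of
Σ2^{-1}*Σ1 and do not depend on c at all; only the eigenvalues involve c and (1-c).)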
: On another note, the problem starts from cΣ1+(1-c)Σ2, but the results all contain
: cΣ1^{-1}+(1-c)Σ2^{-1}, and I don't know how that conversion is done.
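(If it helps, my guess is that the conversion is just the identity
cΣ1^{-1} + (1-c)Σ2^{-1} = Σ1^{-1} * (cΣ2 + (1-c)Σ1) * Σ2^{-1},
so that, taking determinants,
|cΣ1^{-1} + (1-c)Σ2^{-1}| = |cΣ2 + (1-c)Σ1| / (|Σ1| |Σ2|),
which trades the inverse combination for the plain convex combination, with the
roles of c and (1-c) swapped between Σ1 and Σ2.)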
: : But one problem is that I cannot distinguish between the scalar product and the
: : inner product, so I have no idea what your b(x) really is. Ah!
: : I just got what you mean. All products are matrix products, am I correct?
: -------Yes!
: : Looking at the original problem next to the answer, it does look similar. The logarithm part
: : just comes from the Jacobian matrix. You can see whether this approach works.
: : But 笨漢 really lives up to the name! I can't believe you actually typed all of that out; terrifying!
--
Origin:<不良牛牧場> zoo.ee.ntu.edu.tw (140.112.18.36)
Welcome to SimFarm BBS -- From : [chihw.student.Princeton.EDU]