If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is just the sum H(X)+H(Y) of their individual entropies. If they are not mutually independent, then their joint entropy is H(X)+H(Y)-I(X;Y), where I(X;Y) is the mutual information of X and Y.
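As a quick illustration, here is a minimal Python sketch (the joint distribution p_xy is made up for demonstration) that computes H(X), H(Y), and H(X,Y) for a pair of correlated binary variables and recovers I(X;Y) from the identity above:

import numpy as np

# Hypothetical joint distribution of two correlated binary variables.
# p_xy[i, j] = P(X = i, Y = j); rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits of a probability vector, skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_x = entropy(p_xy.sum(axis=1))   # H(X): marginal over Y
h_y = entropy(p_xy.sum(axis=0))   # H(Y): marginal over X
h_xy = entropy(p_xy.ravel())      # H(X,Y): joint entropy

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
i_xy = h_x + h_y - h_xy

print(f"H(X)   = {h_x:.4f} bits")
print(f"H(Y)   = {h_y:.4f} bits")
print(f"H(X,Y) = {h_xy:.4f} bits")
print(f"I(X;Y) = {i_xy:.4f} bits")

# H(X,Y) equals H(X) + H(Y) only when X and Y are independent (I(X;Y) = 0);
# here the variables are correlated, so I(X;Y) > 0 and H(X,Y) < H(X) + H(Y).
assert np.isclose(h_xy, h_x + h_y - i_xy)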
