Last updated: 2022/12/24
The conditional entropy of a random variable Y given X (i.e., Y conditioned on X), denoted H(Y|X), equals H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
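This identity can be checked numerically. The sketch below (using a small hypothetical joint distribution chosen for illustration) computes H(Y|X) via the chain rule, H(Y|X) = H(X,Y) − H(X), and confirms it matches H(Y) − I(Y;X), with I(Y;X) = H(X) + H(Y) − H(X,Y):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y)
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())       # H(X)
h_y = entropy(py.values())       # H(Y)
h_xy = entropy(joint.values())   # H(X, Y)

mi = h_x + h_y - h_xy            # I(Y; X)
h_y_given_x = h_xy - h_x         # H(Y|X) via the chain rule

# The identity from the definition: H(Y|X) = H(Y) - I(Y; X)
assert abs(h_y_given_x - (h_y - mi)) < 1e-12
print(f"H(Y|X) = {h_y_given_x:.4f}, H(Y) - I(Y;X) = {h_y - mi:.4f}")
```

Note that the identity is algebraic, so it holds for any joint distribution, not just this example; the code simply makes the bookkeeping between the three entropies and the mutual information concrete.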