Last updated: 2024/07/30
(information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained per average instance of a character in a stream of characters.
Answer: information entropy
Source dictionary entry
information entropy
noun
(information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained per average instance of a character in a stream of characters.
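The "average information content per character" in the definition can be made concrete with Shannon's formula H = -Σ p·log₂(p), where p is the probability of each symbol. The following is an illustrative sketch (not part of the dictionary entry) that estimates symbol probabilities from character frequencies in a stream:

```python
# Illustrative sketch: Shannon entropy in bits, H = -sum(p * log2(p)),
# estimated from the character frequencies of a stream.
from collections import Counter
from math import log2

def entropy_bits(stream: str) -> float:
    """Average information content, in bits, per character of the stream."""
    counts = Counter(stream)
    n = len(stream)
    # Each term is the probability of a character times its surprisal.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform two-symbol stream carries 1 bit per character;
# a stream of a single repeated symbol carries 0 bits.
print(entropy_bits("abababab"))  # 1.0
print(entropy_bits("aaaa"))      # 0.0 (well, -0.0)
```

A stream where every character is equally likely maximizes this value, matching the intuition that entropy measures how uncertain one is about the next character.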