Last updated: 2022/12/24
The researchers found that the original texts spanned a variety of entropy values in different languages, reflecting differences in grammar and structure. But strangely, the difference in entropy between the original, ordered text and the randomly scrambled text was constant across languages. This difference is a way to measure the amount of information encoded in word order, Montemurro says. The amount of information lost when they scrambled the text was about 3.5 bits per word.
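The measurement described above — entropy of the ordered text minus entropy of the scrambled text — can be illustrated with a toy estimator. This is only a minimal sketch: it approximates the ordered-text entropy with a bigram conditional entropy and the scrambled-text entropy with a unigram entropy, whereas the actual study used far more sophisticated entropy estimators on large corpora, so the numbers here will not match the reported ~3.5 bits per word. The example text is invented for illustration.

```python
import math
from collections import Counter

def unigram_entropy(words):
    # H(W): per-word entropy when order is ignored,
    # i.e. what remains after scrambling the text.
    n = len(words)
    counts = Counter(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def bigram_conditional_entropy(words):
    # H(W_i | W_{i-1}): per-word entropy when the previous
    # word is known -- a crude stand-in for the ordered text.
    pairs = list(zip(words, words[1:]))
    n = len(pairs)
    pair_counts = Counter(pairs)
    prev_counts = Counter(prev for prev, _ in pairs)
    h = 0.0
    for (prev, cur), c in pair_counts.items():
        p_pair = c / n                  # joint probability of the pair
        p_cond = c / prev_counts[prev]  # probability of cur given prev
        h -= p_pair * math.log2(p_cond)
    return h

# Hypothetical example text, not from the study.
text = "the cat sat on the mat and the dog sat on the rug".split()

# The gap between the two estimates is the information
# carried by (local) word order, in bits per word.
order_info = unigram_entropy(text) - bigram_conditional_entropy(text)
```

On a real corpus this gap quantifies how much predictability word order adds; the study's finding is that, measured with proper estimators, the gap is roughly constant (~3.5 bits per word) across languages.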