David Cressey wrote:
> Instead of hijacking another topic, I'll start this topic.
> David BL suggested that a way to quantify information is the number of bits
> needed to encode, allowing for compression.
I think the term being looked for is Kolmogorov complexity: the length
of the shortest encoding from which the data can be reconstructed.
There's been a _lot_ of work done in this area.
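If you want to play with the bits-needed-to-encode idea, the compressed
size of a string gives a crude upper bound on it. A quick sketch in
Python (zlib here is just a convenient stand-in for "any compressor";
the exact numbers depend on which one you pick):

import os
import zlib

def description_bits(data):
    # Crude upper bound on information content: the number of bits
    # in the zlib-compressed representation of the data.
    return 8 * len(zlib.compress(data, 9))

# Regular data compresses well; random data barely compresses at all.
print(description_bits(b"ab" * 1000))      # far fewer than 16000 bits
print(description_bits(os.urandom(2000)))  # close to 16000 bits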
> I said I preferred entropy as
> the measure of information, and suggested that the two measures might in
> some way be equivalent. Someone else recalled the concept of entropy from
> the study of statistical mechanics and thermodynamics in physics.
It also gets a good workout in machine learning. Many of the classical
algorithms (decision trees in their various forms) have information
gain, which is defined in terms of entropy, at their heart. (The
precise definition is closely related to the number of bits required
for representation.)
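To make that concrete, here's a back-of-the-envelope sketch of entropy
and information gain (the toy data and names are mine, not from any
particular package):

from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy in bits: H = -sum(p_i * log2(p_i))
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, partitions):
    # Parent entropy minus the size-weighted entropy of the child
    # partitions -- the quantity ID3/C4.5-style learners maximize.
    n = len(labels)
    remainder = sum(len(p) / n * entropy(p) for p in partitions)
    return entropy(labels) - remainder

# Toy example: a split that separates the two classes fairly well.
parent = ["yes"] * 6 + ["no"] * 6
split = [["yes"] * 5 + ["no"], ["no"] * 5 + ["yes"]]
print(entropy(parent))                  # 1.0 bit (50/50 classes)
print(information_gain(parent, split))  # about 0.35 bits gained

A decision-tree learner just evaluates that gain for every candidate
split and picks the largest.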
I vaguely recall that some of the more theoretical machine-learning
results (such as learnability and optimality results) rely on a notion
of entropy.
[... snip interesting perspective ...]
> All of this goes back to the 1960s, and some of it to the 1940s. Is entropy
> still widely used in information science?
> Is it relevant to database theory?
Joe