
Shannon's Entropy

The term entropy denotes a general measure of uncertainty. It is not a very sophisticated idea, yet a fundamental one, first introduced by Shannon in 1948. Uncertainty is associated with the amount of data required to resolve it, so entropy can also be thought of as a measure or quantifier of information. The quantity was originally defined for random variables that take each of a set of states with some probability (reminiscent of Markov chain models). Shannon's entropy has become a very useful way of evaluating structure and patterns in data: the lower the entropy, the more regularity is already inherent in the data and the less new information it carries. In a sense, the entropy indicates how much can be learned from the data and how much is still unknown.
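To make the notion precise, the entropy of a discrete random variable X that takes n possible states with probabilities p_1, ..., p_n is

    H(X) = - \sum_{i=1}^{n} p_i \log_2 p_i

measured in bits when the logarithm is taken to base 2. A fair coin, for example, attains the maximum possible entropy of 1 bit, whereas a coin biased to land heads 90% of the time has an entropy of only about 0.47 bits, reflecting how much less uncertain its outcome is. The short sketch below computes this quantity directly from the definition; the function name entropy and the choice of Python are illustrative and not part of the original text.

    import math

    def entropy(probabilities):
        """Shannon entropy (in bits) of a discrete probability distribution.

        `probabilities` is a sequence of non-negative values summing to 1;
        states with zero probability contribute nothing, by the usual
        convention that 0 * log(0) = 0.
        """
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally uncertain: 1 bit.
    print(entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, hence lower entropy.
    print(entropy([0.9, 0.1]))   # ~0.469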
