Friday, April 20, 2007

Lab 10

Hartley and Shannon have different ways of measuring uncertainty. Hartley's measure is the amount of uncertainty associated with a finite set of alternatives, quantified as the amount of information needed to remove that uncertainty, i.e., the information revealed once we know the outcome. It is given by the Hartley function H(A) := log_b(|A|), where |A| is the number of alternatives. If the log base is 2, the uncertainty is measured in bits; if it is the natural logarithm, the unit is the nat. Shannon entropy, on the other hand, is the average amount of uncertainty associated with a set of weighted alternatives, measured by the average amount of information needed to remove the uncertainty: H(X) := -Σ p(x) log_b p(x), summed over the possible values x of the random variable X. The main difference is that Shannon entropy is an average over a random variable: it gives the average amount of information the recipient is missing when they do not know the value of the variable, whereas the Hartley measure treats every alternative as equally likely. When the distribution is uniform over |A| alternatives, the two measures agree.
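As a quick illustration (my own sketch, not part of the lab), here is a minimal Python version of both measures; the names hartley and shannon_entropy are just what I chose to call them:

import math

def hartley(n, base=2):
    """Hartley measure: uncertainty of a set of n equally likely alternatives."""
    return math.log(n, base)

def shannon_entropy(probs, base=2):
    """Shannon entropy: average uncertainty of a set of weighted alternatives."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Four equally likely alternatives: the Hartley measure is log2(4) = 2 bits,
# and Shannon entropy over the uniform distribution agrees with it.
print(hartley(4))                             # 2.0
print(shannon_entropy([0.25] * 4))            # 2.0

# A biased distribution: the average uncertainty drops below 2 bits.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.357

The second example shows the point of the weighting: once one alternative is much more likely than the others, less information is needed on average to remove the uncertainty.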
