Maximum entropy probability distribution


In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of probability distributions.
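A minimal sketch (not from the article) illustrating the definition on a finite support: among all distributions over the same set of outcomes, the uniform distribution attains the largest Shannon entropy H(p) = -Σ pᵢ log pᵢ. The function and example values below are purely illustrative.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats; terms with p_i == 0 contribute 0 by convention."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum-entropy member of this class
skewed  = [0.70, 0.10, 0.10, 0.10]
point   = [1.00, 0.00, 0.00, 0.00]   # minimum entropy: no uncertainty at all

for name, p in [("uniform", uniform), ("skewed", skewed), ("point", point)]:
    print(f"{name:8s} H = {shannon_entropy(p):.4f} nats")
# uniform  H = 1.3863   (= log 4, the largest value achievable over 4 outcomes)
# skewed   H = 0.9404
# point    H = 0.0000
```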

If nothing is known about a distribution except that it belongs to a certain class, then the maximum entropy distribution for that class is often assumed "by default", according to the principle of maximum entropy. The reason is twofold: first, maximizing entropy, in a sense, means minimizing the amount of prior information built into the distribution; second, many physical systems tend to move towards maximal entropy configurations over time.
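As a hedged illustration of choosing the maximum entropy distribution "by default" (this example is an assumption, not taken from the article): among all distributions over the faces {1, …, 6} of a die with a prescribed mean, the entropy maximizer has the exponential-family form pᵢ = exp(λi)/Z(λ), and λ can be found numerically, for instance by bisection.

```python
import math

faces = range(1, 7)

def tilted(lam):
    """Exponential-family distribution p_i proportional to exp(lam * i)."""
    weights = [math.exp(lam * i) for i in faces]
    z = sum(weights)
    return [w / z for w in weights]

def mean(lam):
    return sum(i * p for i, p in zip(faces, tilted(lam)))

def maxent_die(target_mean, lo=-10.0, hi=10.0, tol=1e-10):
    """Bisect for lam so the tilted distribution matches the required mean."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    return tilted((lo + hi) / 2)

# With mean 3.5 the maximum entropy answer is the uniform distribution (lam = 0);
# with mean 4.5 the probabilities tilt towards the higher faces.
print([round(p, 4) for p in maxent_die(3.5)])
print([round(p, 4) for p in maxent_die(4.5)])
```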
