
entropy


<theory> A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).

The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits or characters) will be much harder, if not impossible, to compress in this way.
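As a concrete illustration, here is a minimal run-length encoder, a sketch in Python; the function name and output representation are illustrative and not part of this entry:

	from itertools import groupby

	def run_length_encode(s):
	    # Collapse each run of identical symbols into a (symbol, count) pair.
	    return [(symbol, len(list(group))) for symbol, group in groupby(s)]

	run_length_encode("0" * 1000000)  # [('0', 1000000)] -- a tiny description
	run_length_encode("0110")         # [('0', 1), ('1', 2), ('0', 1)] -- no saving

An ordered input collapses to one pair; a random input yields roughly one pair per symbol, so nothing is saved.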

Shannon's formula gives the entropy H(M) of a message M in bits:

	H(M) = -log2 p(M)

where p(M) is the probability of message M. The less probable a message, the more bits are needed to convey it.
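To make the formula concrete, the sketch below computes -log2 p(M) for a single message, and the average entropy per symbol of a string, estimating symbol probabilities from observed frequencies. The helper names are assumptions for illustration:

	import math
	from collections import Counter

	def information_content(p):
	    # Shannon's formula: bits needed for a message of probability p.
	    return -math.log2(p)

	def entropy_per_symbol(s):
	    # Average of -log2 p over the symbols of s, weighted by frequency.
	    n = len(s)
	    return sum(-(c / n) * math.log2(c / n) for c in Counter(s).values())

	information_content(0.5)          # 1.0 -- a fair coin flip carries one bit
	entropy_per_symbol("0" * 10)      # 0.0 -- perfectly ordered, no information
	entropy_per_symbol("0101010101")  # 1.0 -- two equally likely symbols

Note how the million-"0" string from the earlier example has entropy of zero bits per symbol, matching its extreme compressibility.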

(1998-11-23)


