Constraint (information theory)

Constraint in information theory is the degree of statistical dependence between or among variables.
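As an illustrative sketch (the article itself gives no formula), the constraint between two discrete variables X and Y can be quantified by their mutual information, I(X;Y) = H(X) + H(Y) − H(X,Y), which is zero exactly when the variables are statistically independent and grows with the strength of their dependence. The function below (the name and the example tables are illustrative, not from the source) estimates this quantity from a joint probability table.

```python
import numpy as np

def mutual_information(joint) -> float:
    """Mutual information I(X;Y) in bits from a joint probability table.

    `joint[i, j]` is P(X = i, Y = j).  I(X;Y) = H(X) + H(Y) - H(X, Y)
    is zero iff X and Y are independent, and increases with the degree
    of statistical dependence (the "constraint") between them.
    """
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()          # normalise the table
    px = joint.sum(axis=1)               # marginal P(X)
    py = joint.sum(axis=0)               # marginal P(Y)

    def entropy(p):
        p = p[p > 0]                     # ignore zero-probability cells
        return -np.sum(p * np.log2(p))

    return entropy(px) + entropy(py) - entropy(joint.ravel())

# X and Y always equal: 1 bit of constraint.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # ~1.0
# Uniform joint table: no constraint.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # ~0.0
```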

Garner[1] provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint) with applications to pattern recognition and psychology.
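Garner's total constraint among a set of variables is commonly identified with what is now usually called total correlation (multi-information), the amount by which the summed marginal entropies exceed the joint entropy; the following is a hedged sketch of that measure rather than Garner's own notation.

```latex
% Total constraint among X_1, ..., X_n, written as the total correlation
% (multi-information).  It is zero iff the variables are mutually
% independent, and for n = 2 it reduces to the mutual information I(X_1; X_2).
\[
  C(X_1, \ldots, X_n) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1, \ldots, X_n)
\]
```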

References

  1. ^ Garner, W. R. (1962). Uncertainty and Structure as Psychological Concepts. New York: John Wiley & Sons.