This book is a user-generated collection of Wikipedia articles that can be easily saved, rendered electronically, and ordered as a printed book from PediaPress. If you are the creator of this book and need help, see Help:Books (general tips) and WikiProject Wikipedia-Books (questions and assistance).
Information

Statistics

- Algorithmic information theory
- Algorithmic probability
- Alternating decision tree
- Approximate entropy
- Ascendency
- Binary combinatory logic
- Binary entropy function
- Binary lambda calculus
- C4.5 algorithm
- CHAID
- Chain rule for Kolmogorov complexity
- Chaitin's constant
- Cheung–Marks theorem
- Computational indistinguishability
- Conditional entropy
- Conditional mutual information
- Cramér–Rao bound
- Cross entropy
- Decision rules
- Decision stump
- Decision tree
- Decision tree learning
- Decision tree model
- Differential entropy
- Entropy (information theory)
- Entropy encoding
- Entropy estimation
- Entropy in thermodynamics and information theory
- Exformation
- Fisher information
- Fisher information metric
- Gene expression programming
- Gibbs algorithm
- Gradient boosting
- Grafting (decision trees)
- ID3 algorithm
- Immerman–Szelepcsényi theorem
- Incremental decision tree
- Inequalities in information theory
- Information gain in decision trees
- Information gain ratio
- Information theory
- Jeffreys prior
- Joint entropy
- Kolmogorov complexity
- Kolmogorov structure function
- Kullback–Leibler divergence
- Landauer's principle
- Linear partial information
- Logistic model tree
- Maximum entropy probability distribution
- Measure-preserving dynamical system
- Minimum description length
- Minimum message length
- Multivariate mutual information
- Mutual information
- Negentropy
- Nonextensive entropy
- Nyquist–Shannon sampling theorem
- Observed information
- Partition function (mathematics)
- Perplexity
- Pointwise mutual information
- Principle of maximum entropy
- Pruning (decision trees)
- Pseudorandom ensemble
- Pseudorandom generator
- Queap
- Random forest
- Randomness tests
- Rényi entropy
- Schwartz–Zippel lemma
- Self-information
- Shannon's source coding theorem
- Shannon–Hartley theorem
- Topological entropy
- Transfer entropy
- Tsallis entropy
- Variation of information