Definition by Topic
A
B
- Bayes factor
- Bayesian information criterion
- Bayesian model averaging
- Bernoulli distribution
- Beta distribution
- Beta-binomial data
- Beta-binomial distribution
- Beta-distributed data
- Binomial distribution
- Binomial observations
- Bivariate normal distribution
- Brier scoring rule
C
- Categorical distribution
- Central moment
- Characteristic function
- Chi-squared distribution
- Coefficient of determination
- Conditional differential entropy
- Conditional entropy
- Conditional independence
- Conditional probability distribution
- Confidence interval
- Conjugate and non-conjugate prior distribution
- Constant
- Continuous uniform distribution
- Corrected Akaike information criterion
- Correlation
- Correlation matrix
- Corresponding forward model
- Covariance
- Covariance matrix
- Critical value
- Cross-covariance matrix
- Cross-entropy
- Cross-validated log model evidence
- Cumulant-generating function
- Cumulative distribution function
D
- Data
- Deviance
- Deviance information criterion
- Differential cross-entropy
- Differential entropy
- Dirichlet distribution
- Dirichlet-distributed data
- Discrete and continuous random variable
- Discrete uniform distribution
E
- Empirical and theoretical prior distribution
- Empirical Bayes
- Empirical Bayes prior distribution
- Empirical Bayesian log model evidence
- Encompassing model
- Estimation matrix
- Event space
- ex-Gaussian distribution
- Exceedance probability
- Expected value
- Expected value of a random matrix
- Expected value of a random vector
- Explained sum of squares
- Exponential distribution
F
- F-contrast for contrast-based inference in multiple linear regression
- F-distribution
- F-statistic
- Family evidence
- Flat, hard and soft prior distribution
- Full probability model
- Full width at half maximum
G
H
I
- Independent and identically distributed
- Informative and non-informative prior distribution
- Interaction sum of squares
- Inverse general linear model
J
- Joint cumulative distribution function
- Joint differential entropy
- Joint entropy
- Joint likelihood
- Joint probability
- Joint probability distribution
K
L
- Law of conditional probability
- Law of marginal probability
- Likelihood function
- Likelihood ratio
- Log Bayes factor
- Log family evidence
- Log model evidence
- Log probability scoring rule
- Log-likelihood function
- Log-likelihood ratio
- Log-normal distribution
- Logistic regression
M
- Marginal likelihood
- Marginal probability distribution
- Matrix-normal distribution
- Maximum
- Maximum entropy prior distribution
- Maximum likelihood estimation
- Maximum log-likelihood
- Maximum-a-posteriori estimation
- Mean squared error
- Median
- Method-of-moments estimation
- Minimum
- Mode
- Model evidence
- Moment
- Moment-generating function
- Multinomial distribution
- Multinomial observations
- Multiple linear regression
- Multivariate normal distribution
- Multivariate t-distribution
- Mutual exclusivity
- Mutual information
N
- Non-standardized t-distribution
- Normal distribution
- Normal-gamma distribution
- Normal-Wishart distribution
- Null hypothesis
O
P
- p-value
- Parameter
- Point and set hypothesis
- Poisson distribution
- Poisson distribution with exposure values
- Poisson-distributed data
- Posterior distribution
- Posterior model probability
- Posterior predictive distribution
- Power of a statistical test
- Precision
- Precision matrix
- Prior distribution
- Prior predictive distribution
- Probability
- Probability density function
- Probability distribution
- Probability mass function
- Probability space
- Probability-generating function
- Projection matrix
- Proper scoring rule
Q
R
- Random event
- Random experiment
- Random matrix
- Random variable
- Random vector
- Raw moment
- Reference prior distribution
- Regression line
- Residual sum of squares
- Residual variance
- Residual-forming matrix
S
- Sample correlation coefficient
- Sample correlation matrix
- Sample covariance
- Sample covariance matrix
- Sample mean
- Sample skewness
- Sample space
- Sample variance
- Sampling distribution
- Scoring rule
- Shannon entropy
- Signal-to-noise ratio
- Significance level
- Simple and composite hypothesis
- Simple linear regression
- Size of a statistical test
- Skewness
- Standard deviation
- Standard gamma distribution
- Standard normal distribution
- Standard uniform distribution
- Standardized moment
- Statistic
- Statistical hypothesis
- Statistical hypothesis test
- Statistical independence
- Strictly proper scoring rule
T
- t-contrast for contrast-based inference in multiple linear regression
- t-distribution
- Test statistic
- Total sum of squares
- Transformed general linear model
- Treatment sum of squares
- Two-way analysis of variance
U
- Uniform and non-uniform prior distribution
- Uniform-prior log model evidence
- Univariate and multivariate random variable
- Univariate Gaussian
- Univariate Gaussian with known variance