Proof by Topic
A
- Accuracy and complexity for Bayesian linear regression
- Accuracy and complexity for Bayesian linear regression with known covariance
- Accuracy and complexity for the univariate Gaussian
- Accuracy and complexity for the univariate Gaussian with known variance
- Addition law of probability (see the formula after this list)
- Addition of the differential entropy upon multiplication with a constant
- Addition of the differential entropy upon multiplication with invertible matrix
- Additivity of the Kullback-Leibler divergence for independent distributions
- Additivity of the variance for independent random variables
- Akaike information criterion for multiple linear regression
- Application of Cochran’s theorem to two-way analysis of variance
- Approximation of log family evidences based on log model evidences
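
As a quick orientation for the addition law of probability entry above, here is its standard form; the event symbols A and B are generic and chosen for illustration, not necessarily the proof's own notation:

$$ P(A \cup B) = P(A) + P(B) - P(A \cap B) $$

For disjoint events the intersection term vanishes, which recovers the probability under mutual exclusivity listed further down in this index.
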
B
- Bayes’ rule
- Bayes’ theorem (see the identity after this list)
- Bayesian information criterion for multiple linear regression
- Bayesian model averaging in terms of log model evidences
- Best linear unbiased estimator for the inverse general linear model
- Binomial test
- Brier scoring rule is a strictly proper scoring rule
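
For reference, the Bayes' theorem entry above concerns the familiar identity for two events A and B with p(B) > 0 (generic notation, stated here only for orientation):

$$ p(A|B) = \frac{p(B|A) \, p(A)}{p(B)} $$
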
C
- Characteristic function of a function of a random variable
- Chi-squared distribution is a special case of gamma distribution
- Combined posterior distribution for Bayesian linear regression when analyzing conditionally independent data sets
- Combined posterior distributions in terms of individual posterior distributions obtained from conditionally independent data
- Concavity of the Shannon entropy
- Conditional distributions of the multivariate normal distribution
- Conditional distributions of the normal-gamma distribution
- Conjugate prior distribution for Bayesian linear regression
- Conjugate prior distribution for Bayesian linear regression with known covariance
- Conjugate prior distribution for binomial observations
- Conjugate prior distribution for multinomial observations
- Conjugate prior distribution for multivariate Bayesian linear regression
- Conjugate prior distribution for Poisson-distributed data
- Conjugate prior distribution for the Poisson distribution with exposure values
- Conjugate prior distribution for the univariate Gaussian
- Conjugate prior distribution for the univariate Gaussian with known variance
- Construction of confidence intervals using Wilks’ theorem
- Construction of unbiased estimator for variance
- Construction of unbiased estimator for variance in multiple linear regression
- Continuous uniform distribution maximizes differential entropy for fixed range
- Convexity of the cross-entropy
- Convexity of the Kullback-Leibler divergence
- Corrected Akaike information criterion converges to uncorrected Akaike information criterion when infinite data are available
- Corrected Akaike information criterion for multiple linear regression
- Corrected Akaike information criterion in terms of maximum log-likelihood
- Correlation always falls between -1 and +1
- Correlation coefficient in terms of standard scores
- Covariance and variance of the normal-gamma distribution
- Covariance matrices of the matrix-normal distribution
- Covariance matrix of the categorical distribution
- Covariance matrix of the multinomial distribution
- Covariance matrix of the multivariate normal distribution
- Covariance matrix of the sum of two random vectors
- Covariance of independent random variables
- Cross-validated log Bayes factor for the univariate Gaussian with known variance
- Cross-validated log model evidence for the univariate Gaussian with known variance
- Cumulative distribution function in terms of probability density function of a continuous random variable (see the relation after this list)
- Cumulative distribution function in terms of probability mass function of a discrete random variable
- Cumulative distribution function of a strictly decreasing function of a random variable
- Cumulative distribution function of a strictly increasing function of a random variable
- Cumulative distribution function of a sum of independent random variables
- Cumulative distribution function of the beta distribution
- Cumulative distribution function of the beta-binomial distribution
- Cumulative distribution function of the continuous uniform distribution
- Cumulative distribution function of the discrete uniform distribution
- Cumulative distribution function of the exponential distribution
- Cumulative distribution function of the gamma distribution
- Cumulative distribution function of the log-normal distribution
- Cumulative distribution function of the normal distribution
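
As a pointer for the entry on the cumulative distribution function in terms of the probability density function above: for a continuous random variable X with density f_X, the relation in question is

$$ F_X(x) = \int_{-\infty}^{x} f_X(t) \, \mathrm{d}t $$
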
D
- Derivation of Bayesian model averaging
- Derivation of R² and adjusted R²
- Derivation of the Bayesian information criterion
- Derivation of the family evidence
- Derivation of the log Bayes factor
- Derivation of the log family evidence
- Derivation of the log model evidence
- Derivation of the model evidence
- Derivation of the posterior model probability
- Deviance for multiple linear regression
- Deviance information criterion for multiple linear regression
- Differential entropy can be negative
- Differential entropy for the matrix-normal distribution
- Differential entropy of the continuous uniform distribution
- Differential entropy of the gamma distribution
- Differential entropy of the multivariate normal distribution
- Differential entropy of the normal distribution (see the formula after this list)
- Differential entropy of the normal-gamma distribution
- Discrete uniform distribution maximizes entropy for finite support
- Distribution of parameter estimates for simple linear regression
- Distribution of residual sum of squares in multiple linear regression with weighted least squares
- Distribution of the inverse general linear model
- Distribution of the transformed general linear model
- Distributional transformation using cumulative distribution function
- Distributions of estimated parameters, fitted signal and residuals in multiple linear regression upon ordinary least squares
- Distributions of estimated parameters, fitted signal and residuals in multiple linear regression upon weighted least squares
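
To illustrate the differential entropy of the normal distribution listed above: for X ~ N(μ, σ²), measured in nats, the standard result is

$$ \mathrm{h}(X) = \frac{1}{2} \ln\!\left( 2 \pi e \sigma^2 \right) $$

This is negative whenever σ² < 1/(2πe), consistent with the entry stating that differential entropy can be negative.
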
E
- Effects of mean-centering on parameter estimates for simple linear regression
- Encompassing prior method for computing Bayes factors
- Entropy of the Bernoulli distribution
- Entropy of the binomial distribution
- Entropy of the categorical distribution
- Entropy of the discrete uniform distribution
- Entropy of the multinomial distribution
- Equivalence of log-likelihood ratio and mutual information for the general linear model
- Equivalence of log-likelihood ratios for regular and inverse general linear model
- Equivalence of matrix-normal distribution and multivariate normal distribution
- Equivalence of operations for model evidence and log model evidence
- Equivalence of parameter estimates from the transformed general linear model
- Exceedance probabilities for the Dirichlet distribution
- Exceedance probability for a random variable in terms of cumulative distribution function
- Existence of a corresponding forward model
- Expectation of a quadratic form (see the formula after this list)
- Expectation of parameter estimates for simple linear regression
- Expectation of the cross-validated log Bayes factor for the univariate Gaussian with known variance
- Expectation of the log Bayes factor for the univariate Gaussian with known variance
- Expected value of a non-negative random variable
- Expected value of the trace of a matrix
- Expected value of x times ln(x) for a gamma distribution
- Exponential distribution is a special case of gamma distribution
- Expression of R² in terms of residual variances
- Expression of the cumulative distribution function of the normal distribution without the error function
- Expression of the noise precision posterior for Bayesian linear regression using prediction and parameter errors
- Expression of the probability mass function of the beta-binomial distribution using only the gamma function
- Extreme points of the probability density function of the normal distribution
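
For the expectation of a quadratic form listed above, the standard result is the following (generic notation): if x is a random vector with mean μ and covariance matrix Σ and A is a square matrix of matching dimension, then

$$ \mathrm{E}\!\left[ x^\mathrm{T} A \, x \right] = \mu^\mathrm{T} A \, \mu + \mathrm{tr}(A \Sigma) $$
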
F
- F-statistic in terms of ordinary least squares estimates in one-way analysis of variance
- F-statistics in terms of ordinary least squares estimates in two-way analysis of variance
- F-test for equality of variances in two independent samples
- F-test for grand mean in two-way analysis of variance
- F-test for interaction in two-way analysis of variance
- F-test for main effect in one-way analysis of variance
- F-test for main effect in two-way analysis of variance
- F-test for multiple linear regression using contrast-based inference
- First central moment is zero
- First raw moment is mean
- Full width at half maximum for the normal distribution (see the value right below)
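
The full width at half maximum entry above refers to the well-known value for the normal distribution N(μ, σ²):

$$ \mathrm{FWHM} = 2 \sqrt{2 \ln 2} \, \sigma \approx 2.3548 \, \sigma $$
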
G
I
- Independence of estimated parameters and residuals in multiple linear regression
- Independence of products of multivariate normal random vector
- Independent random variables are uncorrelated
- Inflection points of the probability density function of the normal distribution
- Invariance of the covariance matrix under addition of constant vector
- Invariance of the differential entropy under addition of a constant
- Invariance of the Kullback-Leibler divergence under parameter transformation
- Invariance of the variance under addition of a constant
- Inverse transformation method using cumulative distribution function (see the statement after this list)
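
The inverse transformation method entry above rests on the following fact, stated here generically for a continuous, strictly increasing cumulative distribution function F_X:

$$ U \sim \mathcal{U}(0,1), \; X := F_X^{-1}(U) \quad \Rightarrow \quad \mathrm{P}(X \leq x) = F_X(x) $$

i.e. the transformed variable X has cumulative distribution function F_X.
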
J
K
- Kullback-Leibler divergence for the Bernoulli distribution
- Kullback-Leibler divergence for the binomial distribution
- Kullback-Leibler divergence for the continuous uniform distribution
- Kullback-Leibler divergence for the Dirichlet distribution
- Kullback-Leibler divergence for the discrete uniform distribution
- Kullback-Leibler divergence for the gamma distribution
- Kullback-Leibler divergence for the matrix-normal distribution
- Kullback-Leibler divergence for the multivariate normal distribution
- Kullback-Leibler divergence for the normal distribution (see the closed form after this list)
- Kullback-Leibler divergence for the normal-gamma distribution
- Kullback-Leibler divergence for the Wishart distribution
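
As a concrete instance from the list above, the Kullback-Leibler divergence for the normal distribution has a simple closed form; with generic notation P = N(μ₁, σ₁²) and Q = N(μ₂, σ₂²), it reads

$$ \mathrm{KL}[P \,||\, Q] = \frac{1}{2} \left[ \frac{(\mu_1 - \mu_2)^2 + \sigma_1^2}{\sigma_2^2} - \ln \frac{\sigma_1^2}{\sigma_2^2} - 1 \right] $$
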
L
- Law of the unconscious statistician
- Law of total covariance
- Law of total expectation
- Law of total probability
- Law of total variance (see the decomposition after this list)
- Linear combination of bivariate normal random variables
- Linear combination of independent normal random variables
- Linear transformation theorem for the matrix-normal distribution
- Linear transformation theorem for the moment-generating function
- Linear transformation theorem for the multivariate normal distribution
- Linearity of the expected value
- Log Bayes factor for Bayesian linear regression
- Log Bayes factor for binomial observations
- Log Bayes factor for multinomial observations
- Log Bayes factor for the univariate Gaussian with known variance
- Log Bayes factor in terms of log model evidences
- Log family evidences in terms of log model evidences
- Log model evidence for Bayesian linear regression
- Log model evidence for Bayesian linear regression with known covariance
- Log model evidence for binomial observations
- Log model evidence for multinomial observations
- Log model evidence for multivariate Bayesian linear regression
- Log model evidence for Poisson-distributed data
- Log model evidence for the Poisson distribution with exposure values
- Log model evidence for the univariate Gaussian
- Log model evidence for the univariate Gaussian with known variance
- Log model evidence in terms of prior and posterior distribution
- Log sum inequality
- Log-likelihood ratio for multiple linear regression
- Log-likelihood ratio for the general linear model
- Log-odds and probability in logistic regression
- Logarithmic expectation of the gamma distribution
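
For the law of total variance listed above, the standard decomposition is, for random variables X and Y with finite variance,

$$ \mathrm{Var}(Y) = \mathrm{E}\!\left[ \mathrm{Var}(Y|X) \right] + \mathrm{Var}\!\left[ \mathrm{E}(Y|X) \right] $$
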
M
- Marginal distribution of a conditional binomial distribution
- Marginal distributions for the matrix-normal distribution
- Marginal distributions of the multivariate normal distribution
- Marginal distributions of the normal-gamma distribution
- Marginal likelihood is a definite integral of the joint likelihood
- Marginally normal does not imply jointly normal
- Maximum likelihood estimation can result in biased estimates
- Maximum likelihood estimation for binomial observations
- Maximum likelihood estimation for Dirichlet-distributed data
- Maximum likelihood estimation for multinomial observations
- Maximum likelihood estimation for multiple linear regression
- Maximum likelihood estimation for Poisson-distributed data
- Maximum likelihood estimation for simple linear regression
- Maximum likelihood estimation for simple linear regression
- Maximum likelihood estimation for the general linear model
- Maximum likelihood estimation for the Poisson distribution with exposure values
- Maximum likelihood estimation for the univariate Gaussian (see the estimates after this list)
- Maximum likelihood estimation for the univariate Gaussian with known variance
- Maximum likelihood estimator of variance in multiple linear regression is biased
- Maximum likelihood estimator of variance is biased
- Maximum log-likelihood for binomial observations
- Maximum log-likelihood for multinomial observations
- Maximum log-likelihood for multiple linear regression
- Maximum log-likelihood for the general linear model
- Maximum-a-posteriori estimation for Bayesian linear regression
- Maximum-a-posteriori estimation for binomial observations
- Maximum-a-posteriori estimation for multinomial observations
- Mean of the Bernoulli distribution
- Mean of the beta distribution
- Mean of the binomial distribution
- Mean of the categorical distribution
- Mean of the continuous uniform distribution
- Mean of the ex-Gaussian distribution
- Mean of the exponential distribution
- Mean of the gamma distribution
- Mean of the log-normal distribution
- Mean of the matrix-normal distribution
- Mean of the multinomial distribution
- Mean of the multivariate normal distribution
- Mean of the normal distribution
- Mean of the normal-gamma distribution
- Mean of the normal-Wishart distribution
- Mean of the Poisson distribution
- Mean of the Wald distribution
- Median of the continuous uniform distribution
- Median of the exponential distribution
- Median of the log-normal distribution
- Median of the normal distribution
- Method of moments for beta-binomial data
- Method of moments for beta-distributed data
- Method of moments for ex-Gaussian-distributed data
- Method of moments for Wald-distributed data
- Mode of the continuous uniform distribution
- Mode of the exponential distribution
- Mode of the log-normal distribution
- Mode of the normal distribution
- Moment in terms of moment-generating function
- Moment-generating function of a function of a random variable
- Moment-generating function of a sum of independent random variables
- Moment-generating function of linear combination of independent random variables
- Moment-generating function of the beta distribution
- Moment-generating function of the ex-Gaussian distribution
- Moment-generating function of the exponential distribution
- Moment-generating function of the gamma distribution
- Moment-generating function of the multivariate normal distribution
- Moment-generating function of the normal distribution
- Moment-generating function of the Wald distribution
- Moments of the chi-squared distribution
- Monotonicity of probability
- Monotonicity of probability
- Monotonicity of the expected value
- Multinomial test
- Multiple linear regression is a special case of the general linear model
- Multivariate normal distribution is a special case of matrix-normal distribution
- Mutual information of dependent and independent variables in the general linear model
- Mutual information of the bivariate normal distribution
- Mutual information of the multivariate normal distribution
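
To make one of the entries above concrete: maximum likelihood estimation for the univariate Gaussian, given independent observations y₁, …, yₙ assumed to follow N(μ, σ²), yields the familiar estimates

$$ \hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} y_i \; , \quad \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{\mu} \right)^2 $$

The second of these is biased, as noted in the entry stating that the maximum likelihood estimator of variance is biased.
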
N
- Necessary and sufficient condition for independence of multivariate normal random variables
- Non-invariance of the differential entropy under change of variables
- (Non-)Multiplicativity of the expected value
- Non-negativity of the expected value
- Non-negativity of the Kullback-Leibler divergence (see the statement after this list)
- Non-negativity of the Kullback-Leibler divergence
- Non-negativity of the Shannon entropy
- Non-negativity of the variance
- Non-symmetry of the Kullback-Leibler divergence
- Normal distribution is a special case of multivariate normal distribution
- Normal distribution maximizes differential entropy for fixed variance
- Normal-gamma distribution is a special case of normal-Wishart distribution
- Normally distributed and uncorrelated does not imply independent
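
The non-negativity of the Kullback-Leibler divergence listed above is the statement that, for any two probability distributions P and Q over the same space,

$$ \mathrm{KL}[P \,||\, Q] \geq 0 $$

with equality exactly when P and Q coincide (almost everywhere).
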
O
- Omnibus F-test for multiple regressors in multiple linear regression
- One-sample t-test for independent observations
- One-sample z-test for independent observations
- Ordinary least squares for multiple linear regression (see the estimator after this list)
- Ordinary least squares for multiple linear regression
- Ordinary least squares for multiple linear regression
- Ordinary least squares for multiple linear regression with two regressors
- Ordinary least squares for one-way analysis of variance
- Ordinary least squares for simple linear regression
- Ordinary least squares for simple linear regression
- Ordinary least squares for the general linear model
- Ordinary least squares for two-way analysis of variance
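
As a reference point for the ordinary least squares entries above: for the multiple linear regression model y = Xβ + ε with a full-rank n×p design matrix X (generic notation), the OLS estimate is the normal-equations solution

$$ \hat{\beta} = \left( X^\mathrm{T} X \right)^{-1} X^\mathrm{T} y $$
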
P
- Paired t-test for dependent observations
- Paired z-test for dependent observations
- Parameter estimates for simple linear regression are uncorrelated after mean-centering
- Parameters of the corresponding forward model
- Partition of a covariance matrix into expected values
- Partition of covariance into expected values
- Partition of skewness into expected values
- Partition of sums of squares for multiple linear regression
- Partition of sums of squares for simple linear regression
- Partition of sums of squares in one-way analysis of variance
- Partition of sums of squares in two-way analysis of variance
- Partition of the log model evidence into accuracy and complexity
- Partition of the mean squared error into bias and variance
- Partition of variance into expected values (see the identity after this list)
- Positive semi-definiteness of the covariance matrix
- Posterior credibility region against the omnibus null hypothesis for Bayesian linear regression
- Posterior density is proportional to joint likelihood
- Posterior distribution for Bayesian linear regression
- Posterior distribution for Bayesian linear regression with known covariance
- Posterior distribution for binomial observations
- Posterior distribution for multinomial observations
- Posterior distribution for multivariate Bayesian linear regression
- Posterior distribution for Poisson-distributed data
- Posterior distribution for the Poisson distribution with exposure values
- Posterior distribution for the univariate Gaussian
- Posterior distribution for the univariate Gaussian with known variance
- Posterior model probabilities in terms of Bayes factors
- Posterior model probabilities in terms of log model evidences
- Posterior model probability in terms of log Bayes factor
- Posterior predictive distribution is a marginal distribution of the joint likelihood
- Posterior probability of the alternative hypothesis for Bayesian linear regression
- Posterior probability of the alternative model for binomial observations
- Posterior probability of the alternative model for multinomial observations
- Probability and log-odds in logistic regression
- Probability density function is first derivative of cumulative distribution function
- Probability density function of a linear function of a continuous random vector
- Probability density function of a strictly decreasing function of a continuous random variable
- Probability density function of a strictly increasing function of a continuous random variable
- Probability density function of a sum of independent continuous random variables
- Probability density function of an invertible function of a continuous random vector
- Probability density function of the beta distribution
- Probability density function of the bivariate normal distribution
- Probability density function of the bivariate normal distribution in terms of correlation coefficient
- Probability density function of the chi-squared distribution
- Probability density function of the continuous uniform distribution
- Probability density function of the Dirichlet distribution
- Probability density function of the ex-Gaussian distribution
- Probability density function of the exponential distribution
- Probability density function of the F-distribution
- Probability density function of the gamma distribution
- Probability density function of the log-normal distribution
- Probability density function of the matrix-normal distribution
- Probability density function of the multivariate normal distribution
- Probability density function of the multivariate t-distribution
- Probability density function of the normal distribution
- Probability density function of the normal-gamma distribution
- Probability density function of the normal-Wishart distribution
- Probability density function of the t-distribution
- Probability density function of the Wald distribution
- Probability integral transform using cumulative distribution function
- Probability mass function of a strictly decreasing function of a discrete random variable
- Probability mass function of a strictly increasing function of a discrete random variable
- Probability mass function of a sum of independent discrete random variables
- Probability mass function of an invertible function of a random vector
- Probability mass function of the Bernoulli distribution
- Probability mass function of the beta-binomial distribution
- Probability mass function of the binomial distribution
- Probability mass function of the categorical distribution
- Probability mass function of the discrete uniform distribution
- Probability mass function of the multinomial distribution
- Probability mass function of the Poisson distribution
- Probability of exhaustive events
- Probability of exhaustive events
- Probability of normal random variable being within standard deviations from its mean
- Probability of the complement
- Probability of the empty set
- Probability of the empty set
- Probability under mutual exclusivity
- Probability under statistical independence
- Probability-generating function is expectation of function of random variable
- Probability-generating function of the binomial distribution
- Projection matrix and residual-forming matrix are idempotent
- Projection matrix and residual-forming matrix are symmetric
- Projection of a data point to the regression line
- Proof Template
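
The partition of variance into expected values mentioned above is the elementary identity

$$ \mathrm{Var}(X) = \mathrm{E}(X^2) - \mathrm{E}(X)^2 $$

which underlies several of the variance results collected in this index.
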
Q
- Quantile function is inverse of strictly monotonically increasing cumulative distribution function
- Quantile function of the continuous uniform distribution
- Quantile function of the discrete uniform distribution
- Quantile function of the exponential distribution (see the formula after this list)
- Quantile function of the gamma distribution
- Quantile function of the log-normal distribution
- Quantile function of the normal distribution
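
For the quantile function of the exponential distribution listed above: with rate parameter λ > 0 and cumulative distribution function F(x) = 1 − e^(−λx), inversion gives

$$ Q(p) = -\frac{\ln(1-p)}{\lambda} \; , \quad p \in [0,1) $$
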
R
- Range of probability
- Range of the variance of the Bernoulli distribution
- Range of the variance of the binomial distribution
- Relation of continuous Kullback-Leibler divergence to differential entropy
- Relation of continuous mutual information to joint and conditional differential entropy
- Relation of continuous mutual information to marginal and conditional differential entropy
- Relation of continuous mutual information to marginal and joint differential entropy
- Relation of discrete Kullback-Leibler divergence to Shannon entropy
- Relation of mutual information to joint and conditional entropy
- Relation of mutual information to marginal and conditional entropy
- Relation of mutual information to marginal and joint entropy
- Relationship between chi-squared distribution and beta distribution
- Relationship between coefficient of determination and correlation coefficient in simple linear regression
- Relationship between correlation coefficient and slope estimate in simple linear regression
- Relationship between covariance and correlation (see the relation after this list)
- Relationship between covariance matrix and correlation matrix
- Relationship between F-statistic and maximum log-likelihood
- Relationship between F-statistic and R²
- Relationship between gamma distribution and standard gamma distribution
- Relationship between gamma distribution and standard gamma distribution
- Relationship between multivariate normal distribution and chi-squared distribution
- Relationship between multivariate t-distribution and F-distribution
- Relationship between non-standardized t-distribution and t-distribution
- Relationship between normal distribution and chi-squared distribution
- Relationship between normal distribution and standard normal distribution
- Relationship between normal distribution and standard normal distribution
- Relationship between normal distribution and standard normal distribution
- Relationship between normal distribution and t-distribution
- Relationship between precision matrix and correlation matrix
- Relationship between R² and maximum log-likelihood
- Relationship between residual variance and sample variance in simple linear regression
- Relationship between second raw moment, variance and mean
- Relationship between signal-to-noise ratio and maximum log-likelihood
- Relationship between signal-to-noise ratio and R²
- Reparametrization for one-way analysis of variance
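
The relationship between covariance and correlation listed above is, for random variables X and Y with standard deviations σ_X and σ_Y (generic notation),

$$ \mathrm{Corr}(X,Y) = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \, \sigma_Y} $$
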
S
- Sampling from the matrix-normal distribution
- Sampling from the normal-gamma distribution
- Savage-Dickey density ratio for computing Bayes factors
- Scaling of a random variable following the gamma distribution
- Scaling of the covariance matrix upon multiplication with constant matrix
- Scaling of the variance upon multiplication with a constant (see the rule after this list)
- Second central moment is variance
- Self-covariance equals variance
- Self-independence of random event
- Simple linear regression is a special case of multiple linear regression
- Skewness of the ex-Gaussian distribution
- Skewness of the exponential distribution
- Skewness of the Wald distribution
- Specific t-test for single regressor in multiple linear regression
- Square of expectation of product is less than or equal to product of expectation of squares
- Statistical significance test for the coefficient of determination based on an omnibus F-test
- Statistical test for comparing simple linear regression models with and without slope parameter
- Statistical test for intercept parameter in simple linear regression model
- Statistical test for slope parameter in simple linear regression model
- Sums of squares for simple linear regression
- Symmetry of the covariance
- Symmetry of the covariance matrix
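
The scaling of the variance upon multiplication with a constant, listed above, states that for any constant a and random variable X with finite variance

$$ \mathrm{Var}(a X) = a^2 \, \mathrm{Var}(X) $$
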
T
- t-distribution is a special case of multivariate t-distribution
- t-test for multiple linear regression using contrast-based inference
- The expected value minimizes the mean squared error
- The log probability scoring rule is a strictly proper scoring rule
- The median minimizes the mean absolute error
- The p-value follows a uniform distribution under the null hypothesis
- The product of independent log-normal random variables is a log-normal random variable
- The regression line goes through the center of mass point
- The residuals and the covariate are uncorrelated in simple linear regression
- The sum of residuals is zero in simple linear regression
- Transformation matrices for ordinary least squares
- Transformation matrices for simple linear regression
- Transitivity of Bayes factors
- Transposition of a matrix-normal random variable
- Two-sample t-test for independent observations (see the statistic after this list)
- Two-sample z-test for independent observations
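
For the two-sample t-test for independent observations listed above, the classical test statistic, assuming independent, normally distributed samples with equal variances, is

$$ t = \frac{\bar{x}_1 - \bar{x}_2}{s_p \sqrt{\tfrac{1}{n_1} + \tfrac{1}{n_2}}} \; , \quad s_p^2 = \frac{(n_1 - 1) s_1^2 + (n_2 - 1) s_2^2}{n_1 + n_2 - 2} $$

which follows a t-distribution with n₁ + n₂ − 2 degrees of freedom under the null hypothesis of equal means.
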
V
- Value of the probability-generating function for argument one
- Value of the probability-generating function for argument zero
- Variance of constant is zero
- Variance of parameter estimates for simple linear regression
- Variance of the Bernoulli distribution (see the formula after this list)
- Variance of the beta distribution
- Variance of the binomial distribution
- Variance of the continuous uniform distribution
- Variance of the ex-Gaussian distribution
- Variance of the exponential distribution
- Variance of the gamma distribution
- Variance of the linear combination of two random variables
- Variance of the log-normal distribution
- Variance of the normal distribution
- Variance of the Poisson distribution
- Variance of the sum of two random variables
- Variance of the Wald distribution
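
Finally, as an example of the variance results above: for X ~ Bern(p), the variance is

$$ \mathrm{Var}(X) = p \, (1 - p) $$

which is maximal at p = 1/2 with value 1/4, in line with the entry on the range of the variance of the Bernoulli distribution.
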