# Proof by Topic

### A

- Accuracy and complexity for the univariate Gaussian
- Accuracy and complexity for the univariate Gaussian with known variance
- Addition law of probability
- Addition of the differential entropy upon multiplication with a constant
- Additivity of the Kullback-Leibler divergence for independent distributions
- Additivity of the variance for independent random variables
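One of the entries above, "Addition of the differential entropy upon multiplication with a constant" (h(aX) = h(X) + ln|a|), can be checked numerically via the closed-form entropy of a normal variable. A minimal sketch (the values of `sigma` and `a` are arbitrary, not taken from the index):

```python
import math

def normal_diff_entropy(sigma: float) -> float:
    """Differential entropy of N(mu, sigma^2): (1/2) * ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

sigma, a = 2.0, 3.0
h_X = normal_diff_entropy(sigma)             # h(X)
h_aX = normal_diff_entropy(abs(a) * sigma)   # h(aX), since aX ~ N(0, a^2 sigma^2)
# h(aX) should equal h(X) + ln|a|
```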

### B

### C

- Chi-squared distribution is a special case of gamma distribution
- Concavity of the Shannon entropy
- Conditional distributions of the multivariate normal distribution
- Conditional distributions of the normal-gamma distribution
- Conjugate prior distribution for Bayesian linear regression
- Conjugate prior distribution for Poisson-distributed data
- Conjugate prior distribution for binomial observations
- Conjugate prior distribution for multinomial observations
- Conjugate prior distribution for multivariate Bayesian linear regression
- Conjugate prior distribution for the Poisson distribution with exposure values
- Conjugate prior distribution for the univariate Gaussian
- Conjugate prior distribution for the univariate Gaussian with known variance
- Construction of confidence intervals using Wilks’ theorem
- Construction of unbiased estimator for variance
- Convexity of the Kullback-Leibler divergence
- Convexity of the cross-entropy
- Covariance of independent random variables
- Cross-validated log Bayes factor for the univariate Gaussian with known variance
- Cross-validated log model evidence for the univariate Gaussian with known variance
- Cumulative distribution function in terms of probability density function of a continuous random variable
- Cumulative distribution function in terms of probability mass function of a discrete random variable
- Cumulative distribution function of a strictly decreasing function of a random variable
- Cumulative distribution function of a strictly increasing function of a random variable
- Cumulative distribution function of the beta distribution
- Cumulative distribution function of the continuous uniform distribution
- Cumulative distribution function of the discrete uniform distribution
- Cumulative distribution function of the exponential distribution
- Cumulative distribution function of the gamma distribution
- Cumulative distribution function of the normal distribution
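"Chi-squared distribution is a special case of gamma distribution" states that χ²(k) = Gam(k/2, 1/2); a minimal sketch comparing the two densities pointwise (the evaluation point `x` and degrees of freedom `k` are arbitrary):

```python
import math

def chi2_pdf(x: float, k: int) -> float:
    """PDF of the chi-squared distribution with k degrees of freedom."""
    return x ** (k / 2 - 1) * math.exp(-x / 2) / (2 ** (k / 2) * math.gamma(k / 2))

def gamma_pdf(x: float, a: float, b: float) -> float:
    """PDF of the gamma distribution with shape a and rate b."""
    return b ** a * x ** (a - 1) * math.exp(-b * x) / math.gamma(a)

# chi^2(k) = Gam(k/2, 1/2): the densities should agree pointwise
x, k = 2.7, 5
p_chi2 = chi2_pdf(x, k)
p_gam = gamma_pdf(x, k / 2, 1 / 2)
```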

### D

- Derivation of Bayesian model averaging
- Derivation of R² and adjusted R²
- Derivation of the Bayesian information criterion
- Derivation of the log Bayes factor
- Derivation of the log family evidence
- Derivation of the log model evidence
- Derivation of the posterior model probability
- Differential entropy can be negative
- Differential entropy of the gamma distribution
- Differential entropy of the multivariate normal distribution
- Differential entropy of the normal distribution
- Differential entropy of the normal-gamma distribution
- Distributional transformation using cumulative distribution function
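"Differential entropy can be negative" admits a one-line numerical example: for X ~ N(0, σ²), h(X) = ½ ln(2πeσ²), which drops below zero once σ < 1/√(2πe). A sketch with an arbitrary small `sigma`:

```python
import math

# differential entropy of N(0, sigma^2) is (1/2) * ln(2*pi*e*sigma^2),
# which is negative for sufficiently small sigma
sigma = 0.05
h = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
```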

### E

- Encompassing prior method for computing Bayes factors
- Equivalence of matrix-normal distribution and multivariate normal distribution
- Exceedance probabilities for the Dirichlet distribution
- Expectation of a quadratic form
- Expectation of the cross-validated log Bayes factor for the univariate Gaussian with known variance
- Expectation of the log Bayes factor for the univariate Gaussian with known variance
- Expected value of a non-negative random variable
- Expected value of x times ln(x) for a gamma distribution
- Exponential distribution is a special case of gamma distribution
- Expression of the cumulative distribution function of the normal distribution without the error function

### F

- First central moment is zero
- First raw moment is mean
- Full width at half maximum for the normal distribution
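"Full width at half maximum for the normal distribution" gives FWHM = 2·√(2 ln 2)·σ; a minimal check that the density at μ ± FWHM/2 equals half its peak (the values of `mu` and `sigma` are arbitrary):

```python
import math

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """PDF of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 1.0, 2.5
fwhm = 2 * math.sqrt(2 * math.log(2)) * sigma   # claimed full width at half maximum
half_max = normal_pdf(mu, mu, sigma) / 2        # half the peak density
```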

### G

### I

- Invariance of the Kullback-Leibler divergence under parameter transformation
- Invariance of the differential entropy under addition of a constant
- Invariance of the variance under addition of a constant
- Inverse transformation method using cumulative distribution function
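The entry "Inverse transformation method using cumulative distribution function" can be illustrated with the exponential distribution, whose quantile function F⁻¹(p) = −ln(1−p)/λ inverts the CDF exactly. A sketch (the rate and the probabilities `ps` are arbitrary):

```python
import math

rate = 1.5  # lambda parameter of the exponential distribution

def exp_cdf(x: float) -> float:
    """CDF of Exp(rate): F(x) = 1 - exp(-rate * x)."""
    return 1 - math.exp(-rate * x)

def exp_quantile(p: float) -> float:
    """Inverse CDF: F^-1(p) = -ln(1 - p) / rate."""
    return -math.log(1 - p) / rate

# the round trip F(F^-1(p)) should return p exactly
ps = [0.1, 0.5, 0.9]
roundtrip = [exp_cdf(exp_quantile(p)) for p in ps]
```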

### J

### K

- Kullback-Leibler divergence for the gamma distribution
- Kullback-Leibler divergence for the multivariate normal distribution
- Kullback-Leibler divergence for the normal distribution
- Kullback-Leibler divergence for the normal-gamma distribution

### L

- Law of the unconscious statistician
- Linear combination of independent normal random variables
- Linear transformation theorem for the matrix-normal distribution
- Linear transformation theorem for the moment-generating function
- Linear transformation theorem for the multivariate normal distribution
- Linearity of the expected value
- Log Bayes factor for the univariate Gaussian with known variance
- Log Bayes factor in terms of log model evidences
- Log family evidences in terms of log model evidences
- Log model evidence for Bayesian linear regression
- Log model evidence for Poisson-distributed data
- Log model evidence for binomial observations
- Log model evidence for multinomial observations
- Log model evidence for multivariate Bayesian linear regression
- Log model evidence for the Poisson distribution with exposure values
- Log model evidence for the univariate Gaussian
- Log model evidence for the univariate Gaussian with known variance
- Log sum inequality
- Log-odds and probability in logistic regression
- Logarithmic expectation of the gamma distribution
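The "Log sum inequality", Σᵢ aᵢ ln(aᵢ/bᵢ) ≥ (Σᵢ aᵢ) ln(Σᵢ aᵢ / Σᵢ bᵢ), can be checked on arbitrary positive vectors (the values of `a` and `b` below are illustrative, with equality only when all aᵢ/bᵢ are equal):

```python
import math

a = [0.2, 0.5, 0.3]
b = [0.4, 0.4, 0.2]

lhs = sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))   # sum of a_i * ln(a_i / b_i)
rhs = sum(a) * math.log(sum(a) / sum(b))                    # (sum a) * ln(sum a / sum b)
```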

### M

- Marginal distributions of the multivariate normal distribution
- Marginal distributions of the normal-gamma distribution
- Marginal likelihood is a definite integral of joint likelihood
- Maximum likelihood estimation for Dirichlet-distributed data
- Maximum likelihood estimation for Poisson-distributed data
- Maximum likelihood estimation for multiple linear regression
- Maximum likelihood estimation for the Poisson distribution with exposure values
- Maximum likelihood estimation for the general linear model
- Maximum likelihood estimation for the univariate Gaussian
- Maximum likelihood estimation for the univariate Gaussian with known variance
- Maximum likelihood estimator of variance is biased
- Mean of the Bernoulli distribution
- Mean of the Poisson distribution
- Mean of the Wald distribution
- Mean of the beta distribution
- Mean of the binomial distribution
- Mean of the categorical distribution
- Mean of the continuous uniform distribution
- Mean of the exponential distribution
- Mean of the gamma distribution
- Mean of the multinomial distribution
- Mean of the normal distribution
- Mean of the normal-gamma distribution
- Median of the continuous uniform distribution
- Median of the exponential distribution
- Median of the normal distribution
- Method of moments for beta-distributed data
- Mode of the continuous uniform distribution
- Mode of the exponential distribution
- Mode of the normal distribution
- Moment in terms of moment-generating function
- Moment-generating function of linear combination of independent random variables
- Moment-generating function of the Wald distribution
- Moment-generating function of the beta distribution
- Moment-generating function of the normal distribution
- Moments of the chi-squared distribution
- Monotonicity of probability
- Monotonicity of the expected value
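For "Maximum likelihood estimation for the univariate Gaussian", the ML estimates are the sample mean and the biased variance (dividing by n rather than n − 1); a sketch confirming that perturbing either estimate lowers the log-likelihood (the `data` values are made up):

```python
import math

data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1]
n = len(data)
mu_hat = sum(data) / n                               # ML estimate of the mean
var_hat = sum((x - mu_hat) ** 2 for x in data) / n   # ML estimate of the variance (biased)

def log_lik(mu: float, var: float) -> float:
    """Gaussian log-likelihood of the data under N(mu, var)."""
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
               for x in data)

ll_mle = log_lik(mu_hat, var_hat)
```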

### N

- Necessary and sufficient condition for independence of multivariate normal random variables
- Non-negativity of the Kullback-Leibler divergence
- Non-negativity of the Shannon entropy
- Non-negativity of the expected value
- Non-negativity of the variance
- Non-symmetry of the Kullback-Leibler divergence

### O

- One-sample t-test for independent observations
- One-sample z-test for independent observations
- Ordinary least squares for multiple linear regression
- Ordinary least squares for the general linear model
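"Ordinary least squares for multiple linear regression" reduces, in the one-predictor case, to the familiar normal equations; a sketch with made-up data, checking afterwards that the residuals sum to zero and are orthogonal to the regressor:

```python
# simple linear regression y = b0 + b1 * x via the OLS normal equations
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

b1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # OLS slope
b0 = (sy - b1 * sx) / n                          # OLS intercept

residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
```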

### P

- Paired t-test for dependent observations
- Paired z-test for dependent observations
- Partition of a covariance matrix into expected values
- Partition of covariance into expected values
- Partition of sums of squares in ordinary least squares
- Partition of the log model evidence into accuracy and complexity
- Partition of the mean squared error into bias and variance
- Partition of variance into expected values
- Posterior credibility region against the omnibus null hypothesis for Bayesian linear regression
- Posterior density is proportional to joint likelihood
- Posterior distribution for Bayesian linear regression
- Posterior distribution for Poisson-distributed data
- Posterior distribution for binomial observations
- Posterior distribution for multinomial observations
- Posterior distribution for multivariate Bayesian linear regression
- Posterior distribution for the Poisson distribution with exposure values
- Posterior distribution for the univariate Gaussian
- Posterior distribution for the univariate Gaussian with known variance
- Posterior model probabilities in terms of Bayes factors
- Posterior model probabilities in terms of log model evidences
- Posterior model probability in terms of log Bayes factor
- Posterior probability of the alternative hypothesis for Bayesian linear regression
- Probability and log-odds in logistic regression
- Probability density function is first derivative of cumulative distribution function
- Probability density function of a strictly decreasing function of a continuous random variable
- Probability density function of a strictly increasing function of a continuous random variable
- Probability density function of the Dirichlet distribution
- Probability density function of the Wald distribution
- Probability density function of the beta distribution
- Probability density function of the chi-squared distribution
- Probability density function of the continuous uniform distribution
- Probability density function of the exponential distribution
- Probability density function of the gamma distribution
- Probability density function of the matrix-normal distribution
- Probability density function of the multivariate normal distribution
- Probability density function of the normal distribution
- Probability density function of the normal-gamma distribution
- Probability integral transform using cumulative distribution function
- Probability mass function of a strictly decreasing function of a discrete random variable
- Probability mass function of a strictly increasing function of a discrete random variable
- Probability mass function of the Bernoulli distribution
- Probability mass function of the Poisson distribution
- Probability mass function of the binomial distribution
- Probability mass function of the categorical distribution
- Probability mass function of the discrete uniform distribution
- Probability mass function of the multinomial distribution
- Probability of the complement
- Probability of the empty set
- Probability under mutual exclusivity
- Probability under statistical independence
- Projection matrix and residual-forming matrix are idempotent
- Proof Template
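As an example from this section, "Posterior distribution for binomial observations" states that a Bet(α₀, β₀) prior combined with k successes in n trials yields a Bet(α₀ + k, β₀ + n − k) posterior; a sketch verifying that likelihood × prior is proportional to that density (all numbers are illustrative):

```python
import math

def beta_pdf(x: float, a: float, b: float) -> float:
    """PDF of the beta distribution Bet(a, b)."""
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / B

# binomial data: k successes in n trials; conjugate prior Bet(a0, b0)
n, k = 10, 7
a0, b0 = 2.0, 2.0

def unnorm_posterior(p: float) -> float:
    """Binomial likelihood times beta prior (unnormalized posterior)."""
    likelihood = math.comb(n, k) * p ** k * (1 - p) ** (n - k)
    return likelihood * beta_pdf(p, a0, b0)

# claimed posterior: Bet(a0 + k, b0 + n - k); the ratio should be constant in p
ratios = [unnorm_posterior(p) / beta_pdf(p, a0 + k, b0 + n - k)
          for p in (0.3, 0.5, 0.8)]
```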

### Q

- Quantile function is inverse of strictly monotonically increasing cumulative distribution function
- Quantile function of the continuous uniform distribution
- Quantile function of the discrete uniform distribution
- Quantile function of the exponential distribution
- Quantile function of the gamma distribution
- Quantile function of the normal distribution

### R

- Range of probability
- Relation of Kullback-Leibler divergence to entropy
- Relation of continuous Kullback-Leibler divergence to differential entropy
- Relation of continuous mutual information to joint and conditional differential entropy
- Relation of continuous mutual information to marginal and conditional differential entropy
- Relation of continuous mutual information to marginal and joint differential entropy
- Relation of mutual information to joint and conditional entropy
- Relation of mutual information to marginal and conditional entropy
- Relation of mutual information to marginal and joint entropy
- Relationship between R² and maximum log-likelihood
- Relationship between covariance and correlation
- Relationship between covariance matrix and correlation matrix
- Relationship between gamma distribution and standard gamma distribution
- Relationship between multivariate t-distribution and F-distribution
- Relationship between non-standardized t-distribution and t-distribution
- Relationship between normal distribution and chi-squared distribution
- Relationship between normal distribution and standard normal distribution
- Relationship between normal distribution and t-distribution
- Relationship between precision matrix and correlation matrix
- Relationship between second raw moment, variance and mean
- Relationship between signal-to-noise ratio and R²
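"Relationship between second raw moment, variance and mean", E[X²] = Var(X) + E[X]², can be verified on a small discrete distribution (the values and probabilities are arbitrary):

```python
vals = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

mean = sum(v * p for v, p in zip(vals, probs))                 # E[X]
second_raw = sum(v ** 2 * p for v, p in zip(vals, probs))      # E[X^2]
variance = sum((v - mean) ** 2 * p for v, p in zip(vals, probs))  # Var(X)
```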

### S

- Savage-Dickey density ratio for computing Bayes factors
- Scaling of the variance upon multiplication with a constant
- Second central moment is variance

### T

- Transformation matrices for ordinary least squares
- Transitivity of Bayes factors
- Transposition of a matrix-normal random variable
- Two-sample t-test for independent observations
- Two-sample z-test for independent observations
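"Transitivity of Bayes factors" says BF₁₃ = BF₁₂ · BF₂₃, which follows from writing each Bayes factor as a ratio of model evidences; a sketch with hypothetical log model evidences:

```python
import math

# hypothetical log model evidences of three models (illustrative values)
lme = {"m1": -120.3, "m2": -123.8, "m3": -119.5}

def bayes_factor(mi: str, mj: str) -> float:
    """BF_ij = exp(LME_i - LME_j)."""
    return math.exp(lme[mi] - lme[mj])

bf_12 = bayes_factor("m1", "m2")
bf_23 = bayes_factor("m2", "m3")
bf_13 = bayes_factor("m1", "m3")   # should equal bf_12 * bf_23
```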

### V

- Variance of constant is zero
- Variance of the Poisson distribution
- Variance of the Wald distribution
- Variance of the beta distribution
- Variance of the gamma distribution
- Variance of the linear combination of two random variables
- Variance of the normal distribution
- Variance of the sum of two random variables
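The last two entries combine into Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y); for independent X and Y the covariance term vanishes, recovering additivity of the variance. A sketch on empirical (population-style) moments with made-up paired data:

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 1.0, 4.0, 3.0]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

var_x = sum((x - mx) ** 2 for x in xs) / n
var_y = sum((y - my) ** 2 for y in ys) / n
cov_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
var_sum = sum(((x + y) - (mx + my)) ** 2 for x, y in zip(xs, ys)) / n
```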