Quadratic form (statistics)

In multivariate statistics, if $\varepsilon$ is a vector of $n$ random variables, and $\Lambda$ is an $n$-dimensional symmetric matrix, then the scalar quantity $\varepsilon^T\Lambda\varepsilon$ is known as a quadratic form in $\varepsilon$.

Expectation

It can be shown that[1]

$$\operatorname{E}\left[\varepsilon^T\Lambda\varepsilon\right] = \operatorname{tr}\left[\Lambda\Sigma\right] + \mu^T\Lambda\mu,$$

where $\mu$ and $\Sigma$ are the expected value and variance-covariance matrix of $\varepsilon$, respectively, and tr denotes the trace of a matrix. This result only depends on the existence of $\mu$ and $\Sigma$; in particular, normality of $\varepsilon$ is not required.

A book-length treatment of quadratic forms in random variables is given by Mathai and Provost.[2]
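As a quick numerical illustration (not taken from the cited sources), the sketch below checks the expectation identity by Monte Carlo in Python/NumPy. The symmetric matrix $\Lambda$, the shifted-exponential (deliberately non-Gaussian) distribution chosen for $\varepsilon$, and the sample size are all arbitrary assumptions made for the demonstration; the point is that only $\mu$ and $\Sigma$ enter the formula.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Arbitrary symmetric matrix Lambda (illustrative choice)
A = rng.standard_normal((n, n))
Lam = (A + A.T) / 2

# Non-Gaussian epsilon: independent shifted exponential components.
# For this construction E[eps] = mu and cov[eps] = diag(scale**2).
mu = np.array([1.0, -2.0, 0.5])
scale = np.array([0.5, 1.0, 2.0])
Sigma = np.diag(scale**2)

eps = mu + scale * (rng.exponential(1.0, size=(1_000_000, n)) - 1.0)

# eps_i^T Lam eps_i for every draw, then the sample mean
qf = np.einsum('ij,jk,ik->i', eps, Lam, eps)
print("Monte Carlo E[eps^T Lam eps]:", qf.mean())

# Formula: tr(Lam Sigma) + mu^T Lam mu
print("tr(Lam Sigma) + mu^T Lam mu :", np.trace(Lam @ Sigma) + mu @ Lam @ mu)
```

With a million draws the two printed values should agree up to ordinary Monte Carlo error.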

Proof

Since the quadratic form is a scalar quantity, $\varepsilon^T\Lambda\varepsilon = \operatorname{tr}(\varepsilon^T\Lambda\varepsilon)$.

Next, by the cyclic property of the trace operator,

$$\operatorname{E}\left[\operatorname{tr}(\varepsilon^T\Lambda\varepsilon)\right] = \operatorname{E}\left[\operatorname{tr}(\Lambda\varepsilon\varepsilon^T)\right].$$

Since the trace operator is a linear combination of the components of the matrix, it follows from the linearity of the expectation operator that

$$\operatorname{E}\left[\operatorname{tr}(\Lambda\varepsilon\varepsilon^T)\right] = \operatorname{tr}\left(\Lambda\operatorname{E}(\varepsilon\varepsilon^T)\right).$$

A standard property of variances, $\operatorname{E}(\varepsilon\varepsilon^T) = \Sigma + \mu\mu^T$, then tells us that this is

$$\operatorname{tr}\left(\Lambda(\Sigma + \mu\mu^T)\right) = \operatorname{tr}(\Lambda\Sigma) + \operatorname{tr}(\Lambda\mu\mu^T).$$

Applying the cyclic property of the trace operator again, we get

$$\operatorname{tr}(\Lambda\Sigma) + \operatorname{tr}(\mu^T\Lambda\mu) = \operatorname{tr}(\Lambda\Sigma) + \mu^T\Lambda\mu.$$

Variance in the Gaussian case

In general, the variance of a quadratic form depends greatly on the distribution of $\varepsilon$. However, if $\varepsilon$ does follow a multivariate normal distribution, the variance of the quadratic form becomes particularly tractable. Assume for the moment that $\Lambda$ is a symmetric matrix. Then

$$\operatorname{var}\left[\varepsilon^T\Lambda\varepsilon\right] = 2\operatorname{tr}\left[\Lambda\Sigma\Lambda\Sigma\right] + 4\mu^T\Lambda\Sigma\Lambda\mu.$$[3]

In fact, this can be generalized to find the covariance between two quadratic forms on the same $\varepsilon$ (once again, $\Lambda_1$ and $\Lambda_2$ must both be symmetric):

$$\operatorname{cov}\left[\varepsilon^T\Lambda_1\varepsilon,\ \varepsilon^T\Lambda_2\varepsilon\right] = 2\operatorname{tr}\left[\Lambda_1\Sigma\Lambda_2\Sigma\right] + 4\mu^T\Lambda_1\Sigma\Lambda_2\mu.$$[4]

In addition, a quadratic form such as this follows a generalized chi-squared distribution.
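A hedged numerical sketch of these two Gaussian-case formulas follows, again in Python/NumPy; the symmetric matrices $\Lambda_1$ and $\Lambda_2$, the mean $\mu$, the covariance $\Sigma$ and the sample size are arbitrary choices made only for this check, not values from the sources.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

def random_symmetric(rng, n):
    """Arbitrary symmetric matrix for the demonstration."""
    A = rng.standard_normal((n, n))
    return (A + A.T) / 2

Lam1 = random_symmetric(rng, n)
Lam2 = random_symmetric(rng, n)

mu = np.array([1.0, 0.0, -1.0])
B = rng.standard_normal((n, n))
Sigma = B @ B.T + n * np.eye(n)   # positive definite covariance matrix

# Draw multivariate normal samples and form both quadratic forms per draw
eps = rng.multivariate_normal(mu, Sigma, size=2_000_000)
q1 = np.einsum('ij,jk,ik->i', eps, Lam1, eps)
q2 = np.einsum('ij,jk,ik->i', eps, Lam2, eps)

print("MC var :", q1.var(),
      " formula:", 2 * np.trace(Lam1 @ Sigma @ Lam1 @ Sigma) + 4 * mu @ Lam1 @ Sigma @ Lam1 @ mu)
print("MC cov :", np.cov(q1, q2)[0, 1],
      " formula:", 2 * np.trace(Lam1 @ Sigma @ Lam2 @ Sigma) + 4 * mu @ Lam1 @ Sigma @ Lam2 @ mu)
```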

Computing the variance in the non-symmetric case

The case for general $\Lambda$ can be derived by noting that

$$\varepsilon^T\Lambda^T\varepsilon = \varepsilon^T\Lambda\varepsilon,$$

so

$$\varepsilon^T\Lambda\varepsilon = \varepsilon^T\left(\Lambda + \Lambda^T\right)\varepsilon/2$$

is a quadratic form in the symmetric matrix $\tilde{\Lambda} = \left(\Lambda + \Lambda^T\right)/2$, so the mean and variance expressions are the same, provided $\Lambda$ is replaced by $\tilde{\Lambda}$ therein.
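A minimal sketch of this symmetrization step, assuming nothing beyond NumPy and an arbitrarily chosen non-symmetric $\Lambda$: it confirms numerically that $\Lambda$ and $\tilde{\Lambda}$ produce the same quadratic form, which is why the symmetric-case mean and variance formulas can be reused.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

Lam = rng.standard_normal((n, n))   # general (non-symmetric) matrix
Lam_sym = (Lam + Lam.T) / 2         # symmetrized version, Lam-tilde

eps = rng.standard_normal(n)

# Both matrices give the same value of the quadratic form
print(eps @ Lam @ eps, eps @ Lam_sym @ eps)
```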

Examples of quadratic forms

In the setting where one has a set of observations $y$ and an operator matrix $H$, the residual sum of squares can be written as a quadratic form in $y$:

$$\textrm{RSS} = y^T(I - H)^T(I - H)y.$$

For procedures where the matrix $H$ is symmetric and idempotent, and the errors are Gaussian with covariance matrix $\sigma^2 I$, $\textrm{RSS}/\sigma^2$ has a chi-squared distribution with $k$ degrees of freedom and noncentrality parameter $\lambda$, where

$$k = \operatorname{tr}\left[(I - H)^T(I - H)\right]$$

and

$$\lambda = \mu^T(I - H)^T(I - H)\mu/2$$

may be found by matching the first two central moments of a noncentral chi-squared random variable to the expressions given in the first two sections. If $Hy$ estimates $\mu$ with no bias, then the noncentrality $\lambda$ is zero and $\textrm{RSS}/\sigma^2$ follows a central chi-squared distribution.
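As an illustrative sketch, the ordinary-least-squares hat matrix is assumed below as one standard example of a symmetric, idempotent $H$; the design matrix, noise level and replication count are arbitrary. The code checks that $k = \operatorname{tr}\left[(I-H)^T(I-H)\right]$ equals $n - p$ for this choice, and that the simulated $\textrm{RSS}/\sigma^2$ matches the mean and variance of a central chi-squared distribution with $k$ degrees of freedom (the unbiased case, since $y = X\beta + \text{noise}$).

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, p, sigma = 50, 3, 2.0

X = rng.standard_normal((n_obs, p))       # arbitrary design matrix
H = X @ np.linalg.solve(X.T @ X, X.T)     # hat matrix: symmetric and idempotent
M = np.eye(n_obs) - H                     # I - H (also symmetric and idempotent)

# Degrees of freedom: tr[(I-H)^T (I-H)] = tr(I-H) = n_obs - p by idempotence
k = np.trace(M.T @ M)
print("k =", k, "(n_obs - p =", n_obs - p, ")")

# Simulate y = X beta + Gaussian noise; Hy is then unbiased for mu = X beta,
# so RSS / sigma^2 should follow a central chi-squared distribution with k dof.
beta = rng.standard_normal(p)
reps = 20_000
y = X @ beta + sigma * rng.standard_normal((reps, n_obs))

# Quadratic form y^T (I-H)^T (I-H) y for every replication
rss = np.einsum('ri,ij,rj->r', y, M.T @ M, y)

print("mean of RSS/sigma^2:", (rss / sigma**2).mean(), " chi2 mean:", n_obs - p)
print("var  of RSS/sigma^2:", (rss / sigma**2).var(),  " chi2 var :", 2 * (n_obs - p))
```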

References

  1. ^ Bates, Douglas. "Quadratic Forms of Random Variables" (PDF). STAT 849 lectures. Retrieved August 21, 2011.
  2. ^ Mathai, A. M. & Provost, Serge B. (1992). Quadratic Forms in Random Variables. CRC Press. p. 424. ISBN 978-0824786915.
  3. ^ Rencher, Alvin C.; Schaalje, G. Bruce (2008). Linear Models in Statistics (2nd ed.). Hoboken, N.J.: Wiley-Interscience. ISBN 9780471754985. OCLC 212120778.
  4. ^ Graybill, Franklin A. Matrices with Applications in Statistics (2nd ed.). Belmont, Calif.: Wadsworth. p. 367. ISBN 0534980384.