Introduction to Regression Procedures

Multivariate Tests

Multivariate hypotheses involve several dependent variables in the form

H_0 \colon L \beta M = d

where L is a linear function on the regressor side, \beta is a matrix of parameters, M is a linear function on the dependent side, and d is a matrix of constants. The special case (handled by PROC REG) in which the constants are the same for each dependent variable is written

(L \beta - cj) M = 0

where c is a column vector of constants and j is a row vector of 1s. The special case in which the constants are 0 is

L \beta M = 0

These multivariate tests are covered in detail in Morrison (1976); Timm (1975); Mardia, Kent, and Bibby (1979); Bock (1975); and other works cited in Chapter 6, "Introduction to Multivariate Procedures."

To test this hypothesis, construct two matrices, H and E, that correspond to the numerator and denominator of a univariate F test:

H & = & M^' (LB - cj)^' (L (X^' X)^- L^')^{-1} (LB - cj) M \\
E & = & M^' ( Y^' Y - B^' (X^' X) B ) M

where B is the matrix of least squares estimates of \beta.
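As a sketch of how H and E can be formed numerically, here is a hypothetical NumPy example on simulated data. The names X, Y, L, M, and B mirror the notation above; the hypothesis tested (all slope coefficients zero) and all data values are illustrative assumptions, and cj is the zero matrix in this case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 30 observations, an intercept plus 3 regressors, 2 responses.
nobs = 30
X = np.column_stack([np.ones(nobs), rng.normal(size=(nobs, 3))])
Y = rng.normal(size=(nobs, 2))

# Least squares estimates B, using a generalized inverse of X'X.
XtX_ginv = np.linalg.pinv(X.T @ X)
B = XtX_ginv @ X.T @ Y

# Hypothesis L beta M = 0: the three slope rows of beta are zero.
L = np.hstack([np.zeros((3, 1)), np.eye(3)])  # selects the slope rows
M = np.eye(2)                                 # all dependent variables
cj = np.zeros((3, 2))                         # constants are zero here

LBc = L @ B - cj
H = M.T @ LBc.T @ np.linalg.inv(L @ XtX_ginv @ L.T) @ LBc @ M
E = M.T @ (Y.T @ Y - B.T @ (X.T @ X) @ B) @ M
```

Both matrices come out symmetric, with E positive definite (it is the residual sum-of-squares-and-crossproducts matrix) and H positive semidefinite.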

Four test statistics, based on the eigenvalues of E^{-1}H or (E+H)^{-1}H, are formed. Let \lambda_i be the ordered eigenvalues of E^{-1}H (if the inverse exists), and let \xi_i be the ordered eigenvalues of (E+H)^{-1}H. It happens that \xi_i = \lambda_i / (1+\lambda_i) and \lambda_i = \xi_i / (1-\xi_i), and it turns out that \rho_i = \sqrt{\xi_i} is the ith canonical correlation.
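The eigenvalue relationships above can be checked directly in a minimal NumPy sketch; the 2x2 matrices here are made-up illustrative values, not from any particular data set:

```python
import numpy as np

# Illustrative symmetric matrices; E is positive definite so E^{-1} exists.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])
E = np.array([[3.0, 0.2],
              [0.2, 2.0]])

# Ordered (descending) eigenvalues of E^{-1}H and (E+H)^{-1}H.
lam = np.sort(np.linalg.eigvals(np.linalg.inv(E) @ H).real)[::-1]
xi = np.sort(np.linalg.eigvals(np.linalg.inv(E + H) @ H).real)[::-1]

# xi_i = lam_i / (1 + lam_i) and lam_i = xi_i / (1 - xi_i)
assert np.allclose(xi, lam / (1 + lam))
assert np.allclose(lam, xi / (1 - xi))

rho = np.sqrt(xi)  # the canonical correlations
```

The two sets of eigenvalues share eigenvectors, since Hv = \lambda Ev implies (E+H)^{-1}Hv = [\lambda/(1+\lambda)]v, which is why the orderings line up.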

Let p be the rank of (H+E), which is less than or equal to the number of columns of M. Let q be the rank of L(X^' X)^- L^'. Let v be the error degrees of freedom, and let s = \min(p,q). Let m = (|p-q|-1)/2 and n = (v-p-1)/2. Then each of the following statistics has the approximate F distribution shown.

Wilks' Lambda

If

\Lambda = \frac{\det(E)}{\det(H+E)}
 = \prod_{i=1}^n \frac{1}{1+\lambda_i}
 = \prod_{i=1}^n (1 - \xi_i)

then

F = \frac{1- \Lambda^{1/t}}{\Lambda^{1/t}} \cdot \frac{rt-2u}{pq}

is approximately F, where

r & = & v - \frac{p-q+1}{2} \\
u & = & \frac{pq-2}{4} \\
t & = & \begin{cases} \sqrt{\frac{p^2 q^2 - 4}{p^2 + q^2 - 5}} & \text{if } p^2 + q^2 - 5 > 0 \\ 1 & \text{otherwise} \end{cases}

The degrees of freedom are pq and rt-2u. The distribution is exact if \min(p,q) \leq 2 (refer to Rao 1973, p. 556).
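A minimal sketch of the F approximation for Wilks' Lambda, written as a hypothetical Python helper (the function name is an assumption; H, E, p, q, and v are as defined above and must be supplied by the caller):

```python
import numpy as np

def wilks_lambda_F(H, E, p, q, v):
    """Wilks' Lambda and its approximate F, per the formulas above."""
    lam_stat = np.linalg.det(E) / np.linalg.det(H + E)
    r = v - (p - q + 1) / 2
    u = (p * q - 2) / 4
    if p**2 + q**2 - 5 > 0:
        t = np.sqrt((p**2 * q**2 - 4) / (p**2 + q**2 - 5))
    else:
        t = 1.0
    F = (1 - lam_stat**(1 / t)) / lam_stat**(1 / t) * (r * t - 2 * u) / (p * q)
    # Return the statistic, the approximate F, and (numerator df, denominator df).
    return lam_stat, F, (p * q, r * t - 2 * u)
```

For example, with p = 2, q = 3, and v = 20, one gets t = 2 and degrees of freedom (6, 38); the statistic always lies in (0, 1) when E and H+E are positive definite.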

Pillai's Trace

If

V = \mathrm{trace}( H(H+E)^{-1} )
 = \sum_{i=1}^n \frac{\lambda_i}{1+\lambda_i}
 = \sum_{i=1}^n \xi_i

then

F = \frac{2n+s+1}{2m+s+1} \cdot \frac{V}{s-V}

is approximately F with s(2m+s+1) and s(2n+s+1) degrees of freedom.
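The same computation for Pillai's trace, again as a hypothetical helper under the definitions of s, m, and n given earlier:

```python
import numpy as np

def pillai_trace_F(H, E, p, q, v):
    """Pillai's trace V and its approximate F, per the formulas above."""
    s = min(p, q)
    m = (abs(p - q) - 1) / 2
    n = (v - p - 1) / 2
    V = np.trace(H @ np.linalg.inv(H + E))
    F = (2 * n + s + 1) / (2 * m + s + 1) * V / (s - V)
    # Return the statistic, the approximate F, and (numerator df, denominator df).
    return V, F, (s * (2 * m + s + 1), s * (2 * n + s + 1))
```

Since each \xi_i lies in [0, 1), V is bounded above by s, so the ratio V/(s-V) is well defined.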

Hotelling-Lawley Trace

If

U = \mathrm{trace}( E^{-1} H )
 = \sum_{i=1}^n \lambda_i
 = \sum_{i=1}^n \frac{\xi_i}{1 - \xi_i}

then

F = \frac{2(sn+1)U}{s^2(2m+s+1)}

is approximately F with s(2m+s+1) and 2(sn+1) degrees of freedom.
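The Hotelling-Lawley trace follows the same pattern; as before, this is a sketch, not a definitive implementation:

```python
import numpy as np

def hotelling_lawley_F(H, E, p, q, v):
    """Hotelling-Lawley trace U and its approximate F, per the formulas above."""
    s = min(p, q)
    m = (abs(p - q) - 1) / 2
    n = (v - p - 1) / 2
    U = np.trace(np.linalg.inv(E) @ H)
    F = 2 * (s * n + 1) * U / (s**2 * (2 * m + s + 1))
    # Return the statistic, the approximate F, and (numerator df, denominator df).
    return U, F, (s * (2 * m + s + 1), 2 * (s * n + 1))
```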

Roy's Maximum Root

If

\Theta = \lambda_1

then

F = \Theta \frac{v-r+q}{r}

where r = \max(p,q). This statistic is an upper bound on F that yields a lower bound on the significance level. Degrees of freedom are r for the numerator and v-r+q for the denominator.
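Finally, Roy's maximum root as a hypothetical helper, using the largest eigenvalue of E^{-1}H:

```python
import numpy as np

def roy_max_root_F(H, E, p, q, v):
    """Roy's maximum root Theta and its F upper bound, per the formulas above."""
    theta = np.max(np.linalg.eigvals(np.linalg.inv(E) @ H).real)
    r = max(p, q)
    F = theta * (v - r + q) / r
    # Return the statistic, the F bound, and (numerator df, denominator df).
    return theta, F, (r, v - r + q)
```

Because this F is an upper bound, the p-value computed from it is a lower bound: a nonsignificant result is trustworthy, but a significant one may be anticonservative.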

Tables of critical values for these statistics are found in Pillai (1960).


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.