**1. Introduction**

In this chapter, we review statistical inference in the multivariate linear model, with emphasis on the likelihood ratio criterion (LRC) and the underlying matrix theory. Consider a multivariate linear model with *p* response variables *y*<sub>1</sub>, …, *y<sub>p</sub>* and *k* explanatory or dummy variables *x*<sub>1</sub>, …, *x<sub>k</sub>*. Suppose that **y** = (*y*<sub>1</sub>, …, *y<sub>p</sub>*)′ and **x** = (*x*<sub>1</sub>, …, *x<sub>k</sub>*)′ are measured on *n* subjects, and let the observations for the *i*th subject be denoted by **y**<sub>*i*</sub> and **x**<sub>*i*</sub>. Then, we have the observation matrices given by

$$\mathbf{Y} = (y\_1, y\_2, \dots, y\_n)', \quad \mathbf{X} = (\mathbf{x}\_1, \mathbf{x}\_2, \dots, \mathbf{x}\_n)'. \tag{1.1}$$

It is assumed that **y**<sub>1</sub>, …, **y**<sub>*n*</sub> are independent and have the same covariance matrix **Σ**. We express the mean of **Y** as follows:

$$\mathbf{E}(\mathbf{Y}) = \boldsymbol{\eta} = (\boldsymbol{\eta}\_1, \dots, \boldsymbol{\eta}\_p). \tag{1.2}$$

© 2016 The Author(s). Licensee InTech. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A multivariate linear model is defined by requiring that

$$\boldsymbol{\eta}\_i \in \Omega \text{ for all } i = 1, \dots, p, \tag{1.3}$$

where *Ω* is a given subspace of the *n*-dimensional Euclidean space R*<sup>n</sup>*. A typical *Ω* is given by

$$\Omega = \mathcal{R}[\mathbf{X}] = \{ \boldsymbol{\eta} = \mathbf{X}\boldsymbol{\theta} \; ; \; \boldsymbol{\theta} = (\theta\_1, \dots, \theta\_k)', \; -\infty < \theta\_i < \infty, \; i = 1, \dots, k \}. \tag{1.4}$$

Here, ℛ[**X**] denotes the space spanned by the column vectors of **X**. A general theory of statistical inference on the regression parameter **Θ** can be found in texts on multivariate analysis; see, e.g., [1–8]. In this chapter, we take an algebraic approach to the multivariate linear model.
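To make the role of ℛ[**X**] concrete, the following sketch (in Python with NumPy; the variable names and simulated data are illustrative, not from the text) builds the orthogonal projection matrix **P** = **X**(**X**′**X**)<sup>−1</sup>**X**′ onto ℛ[**X**] and checks that it is idempotent and that **PY** equals the fitted mean **X**Θ̂ with Θ̂ = (**X**′**X**)<sup>−1</sup>**X**′**Y**:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, p = 20, 3, 2

# Design matrix X (n x k) and an arbitrary true coefficient matrix Theta (k x p).
X = rng.standard_normal((n, k))
Theta = np.array([[1.0, 0.5],
                  [-0.3, 2.0],
                  [0.7, -1.0]])
Y = X @ Theta + 0.1 * rng.standard_normal((n, p))

# Orthogonal projection matrix onto the column space R[X].
P = X @ np.linalg.inv(X.T @ X) @ X.T

# MLE of the regression parameter: Theta_hat = (X'X)^{-1} X'Y.
Theta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# P is symmetric and idempotent, and P Y = X Theta_hat (the fitted mean).
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)
assert np.allclose(P @ Y, X @ Theta_hat)
```

The assertions mirror the defining properties of a projection matrix that are used repeatedly in the chapter.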

In Section 2, we consider a multivariate regression model in which the *x<sub>i</sub>*'s are explanatory variables and *Ω* = ℛ[**X**]. The maximum likelihood estimators (MLEs) and the likelihood ratio criterion (LRC) for the hypothesis **Θ**<sub>2</sub> = **O** are derived by using projection matrices, where **Θ**′ = (**Θ**<sub>1</sub>′, **Θ**<sub>2</sub>′). The distribution of the LRC is discussed by the multivariate Cochran theorem. It is pointed out that projection matrices play an important role. In Section 3, we give a summary of projection matrices. In Section 4, we consider testing an additional information hypothesis on **y**<sub>2</sub> in the presence of **y**<sub>1</sub>, where **y**<sub>1</sub> = (*y*<sub>1</sub>, …, *y<sub>q</sub>*)′ and **y**<sub>2</sub> = (*y*<sub>*q*+1</sub>, …, *y<sub>p</sub>*)′. In Section 5, we consider testing problems in discriminant analysis. Section 6 deals with a generalized multivariate linear model, which is also called the growth curve model. Some related problems are discussed in Section 7.
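As a preview of the Section 2 material, the LRC for a hypothesis of the form **Θ**<sub>2</sub> = **O** can be expressed through Wilks' Lambda, Λ = |**S**<sub>e</sub>| / |**S**<sub>e</sub> + **S**<sub>h</sub>|, where both sums-of-products matrices are built from projection matrices. The following is a minimal numerical sketch (the partition of **X**, the helper `proj`, and the simulated data are illustrative assumptions, not notation from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k1, k2, p = 30, 2, 2, 3

# Partition the design matrix as X = (X1, X2); the hypothesis Theta2 = O
# says the X2 block carries no regression information.
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
X = np.hstack([X1, X2])
Y = X1 @ rng.standard_normal((k1, p)) + rng.standard_normal((n, p))

def proj(A):
    # Orthogonal projection matrix onto R[A].
    return A @ np.linalg.solve(A.T @ A, A.T)

P, P1 = proj(X), proj(X1)
Se = Y.T @ (np.eye(n) - P) @ Y   # residual SSP matrix under the full model
Sh = Y.T @ (P - P1) @ Y          # extra SSP matrix attributable to Theta2

# Wilks' Lambda lies in (0, 1]; values near 1 favor Theta2 = O.
Lambda = np.linalg.det(Se) / np.linalg.det(Se + Sh)
assert 0 < Lambda <= 1
```

Note that **I** − **P** and **P** − **P**<sub>1</sub> are themselves projection matrices, which is what makes the multivariate Cochran theorem applicable to the distribution of Λ.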
