🔬 Tutorial problems eta \(\eta\)#

Note

These problems are designed to help you practice the concepts covered in the lectures. Not all problems may be covered in the tutorial; those left out are for additional practice on your own.

\(\eta\).1#

Let \(I_{2}\) be the \((2 \times 2)\) identity matrix, and consider the following three matrices:

\[\begin{split} A=\left(\begin{array}{cc} 1 & -4 \\ 0 & 9 \end{array}\right), B=\left(\begin{array}{cc} 4 & 3 \\ -7 & 0 \end{array}\right) \text {, and } C=\left(\begin{array}{ccc} 5 & -1 & -1 \\ 12 & 0 & 2 \end{array}\right) \text {. } \end{split}\]

(a) If possible, find \(A+B\).

(b) If possible, find \(A-B\).

(c) If possible, find \(A+4 B\).

(d) If possible, find \(A+I_{2}\).

(e) If possible, find \(A I_{2}\).

(f) If possible, find \(A+C\).

(g) If possible, find \(A+B^{T}\).

(h) If possible, find \(B C\).

(i) If possible, find \(C B\).

(j) If possible, find \(C B^{T}\).

(k) If possible, find \((A B)^{T}\).

(l) If possible, find \(C+5 I_{2}\).

(m) If possible, find \(C^{T} A\).

(n) If possible, find \((B C)^{T}\).

(o) If possible, find \(A C+B\).

[Bradley, 2013] Progress Exercises 9.2
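After working the parts by hand, you can check your answers numerically. The sketch below (using NumPy, which is not part of the original question) verifies a few of the parts; operations whose shapes are not conformable, such as (f), raise a `ValueError`.

```python
import numpy as np

# Matrices from the problem; operations that are not defined raise a
# ValueError in NumPy because the shapes are not conformable.
A = np.array([[1, -4], [0, 9]])
B = np.array([[4, 3], [-7, 0]])
C = np.array([[5, -1, -1], [12, 0, 2]])
I2 = np.eye(2, dtype=int)

print("(a)", A + B)            # defined: both matrices are 2x2
print("(e)", A @ I2)           # multiplying by the identity returns A
print("(h)", B @ C)            # (2x2)(2x3) gives a 2x3 matrix
print("(k)", (A @ B).T)
print("(m)", C.T @ A)          # (3x2)(2x2) gives a 3x2 matrix

# (f) A + C is NOT defined: shapes (2, 2) and (2, 3) differ.
try:
    A + C
except ValueError as err:
    print("(f) not possible:", err)
```

The same shape reasoning settles the remaining parts, e.g. \(CB\) and \(CB^{T}\) fail because a \((2 \times 3)\) matrix cannot premultiply a \((2 \times 2)\) one.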

\(\eta\).2#

The Real Estate Institute wants to develop a model which explains the relationship between the price of land and the distance from the central business district. The price per square metre of the last five blocks of land sold is shown in the following vector:

\[\begin{split} y=\left(\begin{array}{l} 6 \\ 4 \\ 7 \\ 5 \\ 9 \end{array}\right) \end{split}\]

The distances of these blocks from the central business district are shown in the second column of the following matrix:

\[\begin{split} X=\left(\begin{array}{cc} 1 & 15 \\ 1 & 20 \\ 1 & 5 \\ 1 & 16 \\ 1 & 1 \end{array}\right) \end{split}\]

It can be shown that

\[\begin{split} \left(X^{T} X\right)^{-1}=\left(\begin{array}{cc} \frac{4,535}{6,430} & \frac{-57}{1,286} \\ \frac{-57}{1,286} & \frac{5}{1,286} \end{array}\right) \end{split}\]

(a) Find \(X^{T} y\).

(b) Find \(X^{T} X\).

(c) Find \(\left(X^{T} X\right)^{-1} X^{T} y\). (Note that this is the formula for the ordinary least squares (and maximum likelihood) estimator of the coefficient parameter vector in the classical linear regression model.)

(d) Find the hat matrix, \(P=X\left(X^{T} X\right)^{-1} X^{T}\).

(e) Calculate \(P^{T}\). Is the hat matrix symmetric?

(f) Calculate \(P P\). Is the hat matrix idempotent?

(g) Find the residual-making matrix, \(M=I-P\).

(h) Calculate \(M^{T}\). Is the residual-making matrix symmetric?

(i) Calculate \(M M\). Is the residual-making matrix idempotent?

[Shannon, 1995] p. 228, Question 12. Some additional parts have been added to that question here.

Review the classical linear regression model (CLRM) in the lecture notes.

\[ b=\left(X^{T} X\right)^{-1} X^{T} y \]
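As a sanity check on your hand calculations, the whole exercise can be reproduced numerically. The sketch below (using NumPy, which the question does not require) builds \(X^{T}y\), \(X^{T}X\), the OLS estimator, and the \(P\) and \(M\) matrices, then tests symmetry and idempotency with `np.allclose`.

```python
import numpy as np

# Data from the problem: prices y, and design matrix X whose first
# column is a constant and whose second column holds the distances.
y = np.array([6., 4., 7., 5., 9.])
X = np.column_stack([np.ones(5), [15., 20., 5., 16., 1.]])

Xty = X.T @ y                       # (a)
XtX = X.T @ X                       # (b)
b = np.linalg.solve(XtX, Xty)       # (c) OLS estimator (X'X)^{-1} X'y
P = X @ np.linalg.inv(XtX) @ X.T    # (d) hat matrix
M = np.eye(5) - P                   # (g) residual-making matrix

print("b =", b)
print("P symmetric:", np.allclose(P, P.T))      # (e)
print("P idempotent:", np.allclose(P @ P, P))   # (f)
print("M symmetric:", np.allclose(M, M.T))      # (h)
print("M idempotent:", np.allclose(M @ M, M))   # (i)
```

Note that `np.linalg.solve` is used for the estimator rather than forming the inverse explicitly; the explicit inverse is only needed here to construct \(P\).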

\(\eta\).3#

Compute the following determinants

(a) \(\mathrm{det} \left( \begin{array}{cc} 5 & 1 \\ 0 & 1 \end{array} \right)\)

(b) \(\mathrm{det} \left( \begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array} \right)\)

(c) \(\mathrm{det} \left( \begin{array}{ccc} 1 & 5 & 8 \\ 0 & 2 & 1 \\ 0 & -1 & 2 \end{array} \right)\)

(d) \(\mathrm{det} \left( \begin{array}{ccc} 1 & 0 & 3 \\ 1 & 1 & 0 \\ 0 & 0 & 8 \end{array} \right)\)

(e) \(\mathrm{det} \left( \begin{array}{cccc} 1 & 5 & 8 & 17 \\ 0 & -2 & 13 & 0 \\ 0 & 0 & 1 & 2 \\ 0 & 0 & 0 & 2 \end{array} \right)\)

(f) \(\mathrm{det} \left( \begin{array}{cccc} 2 & 1 & 0 & 0 \\ 1 & 2 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 2 \end{array} \right)\)
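Once you have computed these by cofactor expansion (exploiting the zeros and the triangular or block structure), a quick numerical check is possible. The sketch below, using NumPy (an assumption; the exercise is meant to be done by hand), evaluates all six determinants.

```python
import numpy as np

# numpy.linalg.det works via an LU factorisation and returns a float,
# so each result is rounded to the nearest integer for display.
mats = {
    "a": [[5, 1], [0, 1]],
    "b": [[2, 1], [1, 2]],
    "c": [[1, 5, 8], [0, 2, 1], [0, -1, 2]],
    "d": [[1, 0, 3], [1, 1, 0], [0, 0, 8]],
    "e": [[1, 5, 8, 17], [0, -2, 13, 0], [0, 0, 1, 2], [0, 0, 0, 2]],
    "f": [[2, 1, 0, 0], [1, 2, 0, 0], [0, 0, 2, 0], [0, 0, 0, 2]],
}
dets = {k: int(round(np.linalg.det(np.array(m, dtype=float))))
        for k, m in mats.items()}
print(dets)  # -> {'a': 5, 'b': 3, 'c': 5, 'd': 8, 'e': -4, 'f': 12}
```

Notice that (e) is upper triangular, so its determinant is just the product of the diagonal entries, and (f) is block diagonal, so its determinant is the product of the block determinants.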

\(\eta\).4#

Consider an \((n \times n)\) Vandermonde matrix [this one can be named :)] of the form

\[\begin{split} V = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-1} & x_2^{n-1} & \cdots & x_n^{n-1} \end{bmatrix} \end{split}\]

Show that the determinant of \(V\) is given by

\[ \det(V) = \prod_{1 \leqslant j < i \leqslant n}(x_i-x_j) \]

for the cases \(n=2\), \(n=3\) and \(n=4\).

Properties of determinants help in finding an elegant solution. In particular, you may find it useful that the determinant does not change when a multiple of one row/column is added to or subtracted from another; see for example here for an explanation.
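The proof itself should use the row/column operations above, but a numerical spot check can confirm the formula for the required cases. The sketch below (using NumPy; the choice of sample points is arbitrary and mine, not the question's) compares \(\det(V)\) with the pairwise product.

```python
import numpy as np
from itertools import combinations

def vandermonde_check(xs):
    """Compare det(V) with the product formula prod_{j<i}(x_i - x_j)."""
    n = len(xs)
    # Row k of V holds the k-th powers of the points, as in the problem.
    V = np.array([[x ** k for x in xs] for k in range(n)], dtype=float)
    prod = 1.0
    for j, i in combinations(range(n), 2):   # all index pairs with j < i
        prod *= xs[i] - xs[j]
    return np.linalg.det(V), prod

# Spot checks for n = 2, 3, 4 at arbitrarily chosen points
for xs in ([2.0, 5.0], [1.0, 3.0, 4.0], [1.0, 2.0, 3.0, 5.0]):
    d, p = vandermonde_check(xs)
    print(f"n={len(xs)}: det={d:.4f}, product={p:.4f}")
```

Agreement at a few sample points is of course not a proof, only a check that your algebra for each case is on track.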