## Monthly Archives: February 2012

### Linear algebra II, week 4

Here are some brief remarks on the questions.

For the question on the determinant of

$A=\left( \begin{array}{cc} U & 0 \\ W & X\end{array} \right)$

it is probably best to write at once

$A=\left( \begin{array}{cc} U & 0 \\ W & I_{n_2}\end{array} \right)\left( \begin{array}{cc} I_{n_1} & 0 \\ 0 & X\end{array} \right)$

so that

$\det(A)=\det \left( \begin{array}{cc} U & 0 \\ W & I_{n_2}\end{array} \right) \det \left( \begin{array}{cc} I_{n_1} & 0 \\ 0 & X\end{array} \right)$

by the multiplicativity of the determinant. Then

$\det \left( \begin{array}{cc} U & 0 \\ W & I_{n_2}\end{array} \right)=\det(U)$

and

$\det \left( \begin{array}{cc} I_{n_1} & 0 \\ 0 & X\end{array} \right)=\det(X)$

are immediate, by induction on $n_1$ and $n_2$, for example, using a column expansion. [A word on induction: I’m sure you can see these identities right away by now and you might not need to write down the inductive proof explicitly. But even so, since you’re essentially beginning mathematics, it’s good to be aware that a careful proof of something of this sort often requires induction.]
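As a quick numerical sanity check (not part of the proof, and with arbitrarily chosen sizes and random entries), one can verify the identity $\det(A)=\det(U)\det(X)$ for a block lower-triangular matrix with NumPy:

```python
import numpy as np

# Illustrative sizes n1, n2 and random blocks U, W, X (arbitrary choices).
rng = np.random.default_rng(0)
n1, n2 = 2, 3
U = rng.standard_normal((n1, n1))
W = rng.standard_normal((n2, n1))
X = rng.standard_normal((n2, n2))

# Assemble A = [[U, 0], [W, X]] as in the question.
A = np.block([[U, np.zeros((n1, n2))],
              [W, X]])

# det(A) agrees with det(U) * det(X) up to floating-point error.
assert np.isclose(np.linalg.det(A), np.linalg.det(U) * np.linalg.det(X))
```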

There are three important properties of the determinant that you need to internalize once and for all:

(1) $\det(A)\neq 0$ iff the rows of $A$ are linearly independent iff the columns of $A$ are linearly independent. A clear application of this property permeates many of the problems in sheets 2 and 3.

(2) Multiplicativity: $\det(AB)=\det(A)\det(B).$

(3) An $n\times n$ matrix $A$ can be viewed as $n$ column vectors

$A=(A_1, A_2, \ldots, A_n).$

Viewed as a function of those vectors, the determinant is multilinear, that is, linear in each argument separately when the others are kept fixed. For example,

$\det (cA_1+c'A_1', A_2, A_3, \ldots, A_n) = c\det (A_1, A_2, A_3, \ldots, A_n)+c'\det (A_1', A_2, A_3, \ldots, A_n).$

One warning:

$\det(A+B)\neq \det(A)+\det(B)$

in general. That is, the determinant is definitely not linear as a function of the whole matrix. It is just linear as a function of any given column.
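Both points can be checked numerically. The sketch below (random matrices and coefficients, chosen only for illustration) confirms linearity in the first column while showing that additivity in the whole matrix fails:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
A1p = rng.standard_normal(n)   # a replacement first column A_1'
c, cp = 2.0, -3.0

def det_with_first_column(col):
    """Determinant of A with its first column swapped for `col`."""
    B = A.copy()
    B[:, 0] = col
    return np.linalg.det(B)

# Multilinearity: linear in the first column, other columns fixed.
lhs = det_with_first_column(c * A[:, 0] + cp * A1p)
rhs = c * np.linalg.det(A) + cp * det_with_first_column(A1p)
assert np.isclose(lhs, rhs)

# But the determinant is NOT additive in the whole matrix.
B = rng.standard_normal((n, n))
assert not np.isclose(np.linalg.det(A + B),
                      np.linalg.det(A) + np.linalg.det(B))
```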

These three properties need to be proved once, more or less using the definitions. However, it’s not very useful to go back to the complicated definition too often. You will see this phenomenon quite frequently in mathematics: a rather difficult definition that then needs to be used to establish basic foundational properties. It is then these properties that end up being the most serviceable.

We discussed at the meeting the expression

$\det (xI-A)=\det(xe_1-A_1, xe_2-A_2, \ldots, xe_n-A_n)$

for the characteristic polynomial. I realized afterwards that actually both parts of problem 6 in sheet 3 can be done fairly easily using the multilinearity. The multilinearity of the determinant means that we can expand this expression essentially as if it were a product of the $n$ arguments, giving rise to $2^n$ terms. Now try to collect those terms in powers of $x$, and you will find

$\binom{n}{i}$

terms that make up the coefficient of

$x^i$.

I leave it to you to compute those terms. You will see the various $(n-i)\times (n-i)$ minors emerging naturally. The only subtle point ends up being the sign in front of various determinants.
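If you want to check your answer numerically, here is a sketch (with an arbitrary random matrix, and a helper name `char_poly_coeff` introduced here for illustration). Collecting the $\binom{n}{i}$ terms as above gives, for the coefficient of $x^i$, the sum of all $(n-i)\times(n-i)$ principal minors of $A$ with sign $(-1)^{n-i}$; this is compared against NumPy's characteristic polynomial:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))

def char_poly_coeff(A, i):
    """Coefficient of x^i in det(xI - A), via principal minors."""
    n = A.shape[0]
    k = n - i  # size of the minors; there are C(n, k) = C(n, i) of them
    if k == 0:
        return 1.0  # leading coefficient: the polynomial is monic
    total = sum(np.linalg.det(A[np.ix_(S, S)])
                for S in combinations(range(n), k))
    return (-1) ** k * total

# np.poly(A) returns the coefficients of det(xI - A), highest degree first,
# so coeffs[n - i] is the coefficient of x^i.
coeffs = np.poly(A)
for i in range(n + 1):
    assert np.isclose(coeffs[n - i], char_poly_coeff(A, i))
```

Note the sign $(-1)^{n-i}$, which is exactly the subtle point mentioned above: each column contributing $-A_j$ rather than $xe_j$ carries a factor of $-1$.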


### Yet more remarks on Probability A, sheet 1

I am adding yet more remarks on sheet 1 of Probability A, this time on question 6.

### Probability, more remarks

I am adding now some comments on problem 5 of the probability sheet 1.

### Probability tutorials, week 3

Here are some remarks on the probability tutorials last week. I didn’t have time to comment on any but the most elementary problems, but I thought it might be good to review the basics anyway. I’ll try to write more this weekend.