This is a problem on linearly dependent columns of the data matrix $X$
used in multiple regression.
Assume the row dimension $n$ of $X$ is greater than the
column dimension $p$ of $X$.
$X$ has full column rank if and only if the columns of $X$
are linearly independent.
Let $x_1, \ldots, x_p$ denote the columns of $X$.
The columns of $X$ are linearly independent if $Xc = 0_n$ implies
that $c = 0_p$.
The columns of $X$ are linearly dependent if $Xc = 0_n$
for at least one vector $c$
that is not the zero vector.
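As a quick numerical illustration of this definition (my own example, not part of the problem), the following NumPy sketch builds a matrix whose third column is the sum of the first two; the nonzero coefficient vector $c = (1, 1, -1)$ then sends the columns to the zero vector, so the columns are linearly dependent:

```python
import numpy as np

# A 4x3 matrix whose third column equals the sum of the first two,
# so its columns are linearly dependent (illustrative data only).
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0]])

# c is nonzero, yet X @ c = 0_n -- exactly the definition of
# linear dependence given above.
c = np.array([1.0, 1.0, -1.0])
print(np.allclose(X @ c, np.zeros(4)))  # True
```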
Let $v_1, \ldots, v_p$ denote the columns of
the matrix $X^\top X$.
The columns of $X^\top X$ are linearly independent if $X^\top X c = 0_p$ implies
that $c = 0_p$.
The columns of $X^\top X$ are linearly dependent if $X^\top X c = 0_p$
for at least one vector $c$
that is not the zero vector.
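The two definitions are closely linked: any $c$ with $Xc = 0_n$ also satisfies $X^\top X c = 0_p$, and conversely $X^\top X c = 0_p$ forces $\|Xc\|^2 = c^\top X^\top X c = 0$, hence $Xc = 0_n$. A small NumPy check of both directions, using an illustrative rank-deficient matrix of my own (not from the problem):

```python
import numpy as np

# Illustrative matrix with dependent columns: col3 = col1 + col2.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0]])
c = np.array([1.0, 1.0, -1.0])  # a nonzero vector with X @ c = 0_n

# Multiplying X c = 0_n on the left by X^T gives X^T X c = 0_p,
# so c also witnesses linear dependence of the columns of X^T X.
XtX = X.T @ X
print(np.allclose(XtX @ c, np.zeros(3)))  # True

# Conversely, c^T X^T X c equals ||X c||^2, so X^T X c = 0_p
# forces X c = 0_n.
print(np.isclose(c @ XtX @ c, np.linalg.norm(X @ c) ** 2))  # True
```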
Please check your linear algebra text for the definition of column rank.
Notation: $0_k$ denotes the zero vector in $k$-dimensional
space.
Part a)
Suppose the columns of $X$ are linearly dependent based on the
coefficient vector $c$,
that is,
$Xc = 0_n$.
Which of the following are
correct statements and implications?
Possibly more than one item is correct.
Part b)
Suppose the columns of $X^\top X$ are linearly dependent based on the
coefficient vector $c$,
that is, $X^\top X c = 0_p$.
Which of the following are
correct statements and implications?
Possibly more than one item is correct.
Part c)
What is the column rank of $X$?
Part d)
What is the column rank of $X^\top X$?
Part e)
What is the row rank of $X$?
Part f)
Which of the following are ways to determine the column rank of $X$?
There might be more than one correct answer.
From the above, write at least two proofs for the statement:
"the rank of $X$
and $X^\top X$ are the same".
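As a numerical sanity check of the statement (an illustrative sketch with randomly generated matrices of my own choosing, assuming NumPy; not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random tall matrix of full column rank, plus a copy made
# rank-deficient by forcing a dependent column.
A = rng.standard_normal((6, 3))
B = A.copy()
B[:, 2] = B[:, 0] + B[:, 1]

for M in (A, B):
    # rank(M) and rank(M^T M) agree, as the statement asserts.
    print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(M.T @ M))
```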
Hint:
You can earn partial credit on this problem.