Explain why the columns of an $n \times n$ matrix $A$ are linearly independent when $A$ is invertible.


The proof that I thought of was:

If $A$ is invertible, then $A \sim I$ ($A$ is row equivalent to the identity matrix). Because of this, $A$ has $n$ pivots, one in each column, which means that the columns of $A$ are linearly independent.

The proof that was provided was:

Suppose $A$ is invertible. Then the equation $Ax = 0$ has only one solution, namely, the zero solution. This implies that the columns of $A$ are linearly independent.

I am not sure whether or not my proof is correct. If it is, would there be a reason to prefer one proof over the other?

As seen in the Wikipedia article and in Linear Algebra and Its Applications, $\sim$ denotes row equivalence between matrices.


linear-algebra matrices proof-confirmation
edited Sep 13 '16 at 11:51; asked Sep 13 '16 at 11:03 by Bernardo Sulzbach

6 Answers


I would say that the textbook's proof is better because it proves what needs to be proven without using facts about row operations along the way. To see that this is the case, it may help to write out all of the definitions at work here, and all the facts that get used along the way.

Definitions:

- $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$.
- The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_nv_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.

Textbook Proof:

Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_nv_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication.)

Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).

Multiplying both sides by $A^{-1}$ gives us
$$Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0$$
So we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
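As a quick numerical sanity check of this argument (a sketch using NumPy; the matrix below is an arbitrary invertible example I chose for illustration, not one from the question):

```python
import numpy as np

# An arbitrary invertible 3x3 matrix, chosen for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# Invertibility: det(A) != 0, so A^{-1} exists.
det = np.linalg.det(A)

# The unique solution of Ax = 0 is x = A^{-1} 0 = 0.
x = np.linalg.solve(A, np.zeros(3))

# Full column rank is equivalent to the columns being linearly independent.
rank = np.linalg.matrix_rank(A)
```

Here `det` is nonzero, `x` comes back as the zero vector, and `rank` equals 3, matching the proof: invertibility forces the homogeneous system to have only the trivial solution.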

Your Proof:

Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_nv_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication.)

Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.


Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.

From the above facts, we conclude that if $A$ is invertible, then $A$ is row-equivalent to $I$. Since the only solution to $Ix = 0$ is $x = 0$, the same holds for $Ax = 0$, and therefore the columns of $A$ must be linearly independent.
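The row-reduction route can likewise be checked symbolically (a sketch using SymPy's `rref`; the matrix is an arbitrary invertible example chosen for illustration):

```python
import sympy as sp

# An arbitrary invertible matrix, chosen for illustration.
A = sp.Matrix([[2, 1, 0],
               [0, 1, 3],
               [1, 0, 1]])

# Row-reduce: for an invertible A, the RREF is the identity matrix,
# with a pivot in every one of the n columns.
R, pivots = A.rref()

# Same nullspace as A: an empty nullspace means the only solution
# to Ax = 0 is x = 0, i.e. the columns are linearly independent.
null = A.nullspace()
```

Seeing `R` come out as the identity with pivots in all columns, and `null` come out empty, mirrors the two facts the proof chains together.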