This is preposterous... What's the easiest way to do this question?
Writing out the components takes way too long

Expand using the distributive law and the given fact that AB = BA.
Oh wow this is what happens when I forget that matrices satisfy other laws too...

Ok, another question. Using just a paragraph I can explain it. But if I wanted my algebra to supplement it, how would I do this?
Let A be m x n. Since AB is defined, B is n x p for some p.
Since BA is defined, n x p is compatible with m x n, which implies p = m.
Now, AB is m x p (since A is m x n and B is n x p), i.e. m x m.
Similarly, BA is n x n.
(m, n, p positive integers)

Alright fair enough, that's what I did
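A quick numerical sanity check of the shape argument above (a sketch with NumPy; the concrete sizes are my own choice for illustration):

```python
import numpy as np

# If AB and BA are both defined, an m x n matrix A forces B to be n x m,
# so AB is m x m and BA is n x n.
m, n = 2, 3
A = np.ones((m, n))   # A is m x n
B = np.ones((n, m))   # B must be n x m for both products to exist

print((A @ B).shape)  # (2, 2): AB is m x m
print((B @ A).shape)  # (3, 3): BA is n x n
```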
The question: Prove the commutative law of matrix addition for matrices A and B of the same size.
Can't I just break up the matrix (A+B) into (A) + (B) and then use the commutative law for R before reverting?

Yeah pretty much.
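Written out entrywise, that is exactly the appeal to commutativity in R:

```latex
\[
(A+B)_{ij} \;=\; a_{ij} + b_{ij} \;=\; b_{ij} + a_{ij} \;=\; (B+A)_{ij},
\]
```

so A + B = B + A, since corresponding entries agree.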
Since AI = A, can we just assume that I^n = I? (n in Z)

I^n = I can be proved by induction using that fact, yes.
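For positive n, the induction is one line per step, using AI = A with A = I (for negative n one also uses that I is its own inverse):

```latex
\[
I^{1} = I; \qquad
I^{k+1} \;=\; I^{k}\, I \;=\; I \, I \;=\; I
\quad \text{(assuming } I^{k} = I \text{)}.
\]
```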
Let A and B be square matrices such that A^2 = I, B^2 = I and (AB)^2 = I. Prove that AB = BA.

So keeping in mind that I^2 = I, I let A^2 B^2 = I.
This means that AABB = I
The second statement also implies ABAB = I
So equating, AABB = ABAB, which is not trivial as matrix multiplication is not commutative.
Can I therefore argue that AB = BA because the sequence of multiplication matters? I feel that this is flawed because you can't cancel out matrices: AB = AC does not imply B = C
Basically, I think I gave a wrong proof
In the order given, you have AABB = ABAB.
The first and last matrices are identical, and appear in the same position, so they can be removed, leaving behind the central matrices.
The equivalence follows.

This essentially assumes that A and B are invertible (which is indeed true, as can be proved by considering determinants or the definition of inverses; and if we are allowed to use knowledge of matrix inverses, we can do the proof that way, or in other ways using inverses).
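Spelled out with inverses, the cancellation uses that A^2 = I gives A^{-1} = A, and similarly B^{-1} = B:

```latex
\begin{aligned}
AABB &= ABAB \\
A^{-1}(AABB)B^{-1} &= A^{-1}(ABAB)B^{-1} \\
AB &= BA.
\end{aligned}
```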
Note
AB = A I B
   = A (A^2 AB AB B^2) B (using the given equalities involving I)
   = A A A A (B A) B B B B
   = I I BA I I (as A^2 = I = B^2)
   = BA.
(Of course we have used associativity of matrix multiplication a lot here.)
Ok I'm inclined to say that's magic, even though it's really reuse of associative law...
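As a numerical sanity check, here is a pair of involutions whose product is also an involution (the example matrices are my own choice, not from the thread), plus a pair where the (AB)^2 = I hypothesis fails and commutativity fails with it:

```python
import numpy as np

I = np.eye(2)

# A^2 = I, B^2 = I, and (AB)^2 = I all hold for this pair:
A = np.diag([1.0, -1.0])
B = np.diag([-1.0, 1.0])
assert np.array_equal(A @ A, I) and np.array_equal(B @ B, I)
assert np.array_equal((A @ B) @ (A @ B), I)
print(np.array_equal(A @ B, B @ A))   # True: AB = BA, as the proof guarantees

# Drop the (AB)^2 = I hypothesis and commutativity can fail:
C = np.array([[0.0, 1.0], [1.0, 0.0]])  # C^2 = I, but (AC)^2 != I
print(np.array_equal(A @ C, C @ A))     # False
```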
_______________________________
Let A be a 2x2 real matrix such that AX = XA for all 2x2 real matrices X. Show that A is a scalar multiple of the identity matrix.
Hint please?
_______________________________
Confession: Wow, I've felt dumb about maths since uni but THIS has taken me aback on new levels

Put some in a spoiler because I realised you only wanted a hint. (You aren't wrong so far.)

Spoilers don't even work properly on this forum.

One more question here and then if I have more I'm making a new thread. Either hint or full answer please, thanks. Editing in what I have so far.
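For what it's worth, one standard way in (sketched here as a guess at the intended approach, not necessarily the spoilered hint) is to test AX = XA on the matrix units, writing A = (a_{ij}):

```latex
\[
X = E_{11}:\quad
AE_{11} = \begin{pmatrix} a_{11} & 0 \\ a_{21} & 0 \end{pmatrix}
= E_{11}A = \begin{pmatrix} a_{11} & a_{12} \\ 0 & 0 \end{pmatrix}
\;\Rightarrow\; a_{12} = a_{21} = 0,
\]
\[
X = E_{12}:\quad AE_{12} = E_{12}A \;\Rightarrow\; a_{11} = a_{22},
\]
```

so A = a_{11} I.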
_______________________________
Suppose A is an m x n matrix and B is an n x p matrix. Then AB is an m x p matrix.
Suppose C is a p x q matrix. Then BC is an n x q matrix.
Redundancies
If A(BC) exists, then col.(A) = row.(BC) => n = n
If (AB)C exists, then col.(AB) = row.(C) => p = p
= leehuan thinks he is wrong again
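That reasoning checks out: once AB and BC exist, the compatibility conditions for (AB)C and A(BC) are automatic. A quick check (random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p, q = 2, 3, 4, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))
C = rng.standard_normal((p, q))

# col(AB) = p = row(C) and col(A) = n = row(BC) hold automatically,
# so both groupings are defined, and they agree (associativity).
print(((A @ B) @ C).shape)                     # (2, 5)
print(np.allclose((A @ B) @ C, A @ (B @ C)))   # True
```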