
Matrix Multiplication - What You Don't Know!!

Hello Readers!!

 

Your frustration with the current lockdown situation due to ‘Corona’ is clearly evident: you have just clicked on a blog post about a concept as trivial as “Matrix Multiplication”.

 

BUT WAIT!


Don’t go anywhere!


 

I hope you read this post: you will truly enjoy revisiting this trivial but important tool of Linear Algebra, and you will get introduced to some new concepts in 5 minutes.


 

Let us start with the problem through which most of us were introduced to the world of Matrix Algebra.

 

Solving a System of Linear Equations

 

Consider the following system with 3 equations and 3 unknowns:
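
For concreteness, here is one such system (the particular coefficients are only illustrative):

\[
\begin{aligned}
x + 2y + z &= 4 \\
2x - y + 3z &= 7 \\
3x + y - 2z &= 1
\end{aligned}
\]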


 

We learned to represent the above set of equations in Matrix form as follows:
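
With the illustrative coefficients above, the matrix form reads:

\[
\underbrace{\begin{bmatrix} 1 & 2 & 1 \\ 2 & -1 & 3 \\ 3 & 1 & -2 \end{bmatrix}}_{A}
\underbrace{\begin{bmatrix} x \\ y \\ z \end{bmatrix}}_{\mathbf{x}}
=
\underbrace{\begin{bmatrix} 4 \\ 7 \\ 1 \end{bmatrix}}_{\mathbf{b}},
\qquad \text{i.e.} \qquad A\mathbf{x} = \mathbf{b}
\]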

 

Most of the Matrix Algebra we have learned to date is centred, directly or indirectly, around this specific application. Hence, the basic matrix operations of addition, subtraction and multiplication are defined with it in mind. Among these, addition and subtraction are performed element-wise, which is consistent with their scalar counterparts and logically so. But there is something odd about the multiplication operator.

 

We do not multiply matrices element-wise. Instead, we use the ‘row times column’ method, which was difficult to understand when we first learned it. From this definition follow all the familiar conditions on the sizes of the matrices, the lack of commutativity, and so on.
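
In symbols, for an m x n matrix A and an n x p matrix B, the (i, j) entry of the product is:

\[
(AB)_{ij} = \sum_{k=1}^{n} a_{ik}\, b_{kj}
\]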

 

This special rule of matrix multiplication is defined precisely so that it works for our use-case of linear equations. One can easily verify that this method of multiplication recovers the original set of equations from their matrix representation. This is a very plausible justification for this way of multiplying matrices, alongside other scholarly explanations such as viewing the product as linear combinations of column vectors.[1][2]
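
For example, the column-vector view rewrites the product A\mathbf{x} as a linear combination of the columns of A (using the illustrative coefficient matrix from above):

\[
A\mathbf{x} =
x \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} +
y \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix} +
z \begin{bmatrix} 1 \\ 3 \\ -2 \end{bmatrix}
\]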

 

However, this is not the only way of multiplying two matrices. Methods other than the ordinary matrix multiplication have emerged from various use-cases of Linear Algebra.

 

Namely:


I. Hadamard Product / Schur Product / Element-wise Product / Entrywise Product

II. Kronecker Product / Direct Product / Tensor Product (a generalisation of the Outer Product)

 

Let us briefly see how these two products are defined.

 

I. Hadamard / Schur Product:


This product is named after either the French mathematician Jacques Hadamard[3] or the German mathematician Issai Schur[4].

 

It operates on two matrices in exactly the same way as matrix addition does, i.e. element-wise.

 

For two 2x2 matrices A and B, the Hadamard / Schur product is defined as follows:
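
\[
A \circ B =
\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
\circ
\begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}
=
\begin{bmatrix} a_{11} b_{11} & a_{12} b_{12} \\ a_{21} b_{21} & a_{22} b_{22} \end{bmatrix}
\]

Each entry of the result is simply the product of the corresponding entries of A and B.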


Example:
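
With illustrative numbers (any two matrices of the same size would do):

\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}
\circ
\begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}
=
\begin{bmatrix} 1 \cdot 5 & 2 \cdot 6 \\ 3 \cdot 7 & 4 \cdot 8 \end{bmatrix}
=
\begin{bmatrix} 5 & 12 \\ 21 & 32 \end{bmatrix}
\]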


Properties of Hadamard product:


1. It is commutative

2. It requires both matrices to have the same dimensions

 

Applications:


1. JPEG - Lossy Image Compression

2. LSTM and RNN algorithms in Machine Learning

3. Image Kernel Convolution

4. Fuzzy Logic
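
In code, the Hadamard product is just element-wise multiplication of arrays. A minimal sketch, assuming NumPy is available, with the same illustrative matrices as above:

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Element-wise (Hadamard / Schur) product: '*' on NumPy arrays multiplies entry by entry
hadamard = A * B              # equivalently np.multiply(A, B)
print(hadamard)               # [[ 5 12]
                              #  [21 32]]

# For contrast, the ordinary 'row times column' product is A @ B (np.matmul)
print(A @ B)                  # [[19 22]
                              #  [43 50]]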

 



II. Kronecker Product:


This product is named after the German mathematician Leopold Kronecker[5].

 

For two 2x2 matrices A and B, the Kronecker / Direct product is defined as follows:
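
\[
A \otimes B =
\begin{bmatrix} a_{11} B & a_{12} B \\ a_{21} B & a_{22} B \end{bmatrix}
=
\begin{bmatrix}
a_{11} b_{11} & a_{11} b_{12} & a_{12} b_{11} & a_{12} b_{12} \\
a_{11} b_{21} & a_{11} b_{22} & a_{12} b_{21} & a_{12} b_{22} \\
a_{21} b_{11} & a_{21} b_{12} & a_{22} b_{11} & a_{22} b_{12} \\
a_{21} b_{21} & a_{21} b_{22} & a_{22} b_{21} & a_{22} b_{22}
\end{bmatrix}
\]

Each entry a_{ij} of A is replaced by the whole block a_{ij} B, so two 2x2 matrices produce a 4x4 result.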


 

Let us apply it to the same example matrices from the previous case.
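
\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}
\otimes
\begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}
=
\begin{bmatrix} 1 \cdot B & 2 \cdot B \\ 3 \cdot B & 4 \cdot B \end{bmatrix}
=
\begin{bmatrix}
5 & 6 & 10 & 12 \\
7 & 8 & 14 & 16 \\
15 & 18 & 20 & 24 \\
21 & 24 & 28 & 32
\end{bmatrix}
\]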

 

Properties of Kronecker product:


1. It is non-commutative

2. It places no restriction on the dimensions of the input matrices

3. It combines the two input matrices into a matrix of higher dimension: an m x n matrix and a p x q matrix yield an (mp) x (nq) matrix

 

Applications:


1. Image Processing - Hadamard / Walsh Transform[6]

2. Quantum Mechanics

3. Generating Basis Vectors in higher dimensions
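
NumPy also provides the Kronecker product directly, via np.kron. A minimal sketch, assuming NumPy is available and reusing the same illustrative matrices:

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

K1 = np.kron(A, B)   # each entry a_ij of A is replaced by the block a_ij * B
K2 = np.kron(B, A)

print(K1.shape)                  # (4, 4): an (mp) x (nq) matrix from m x n and p x q inputs
print(np.array_equal(K1, K2))    # False: the Kronecker product is not commutative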

 


This is the end of this short blog post on Matrix Multiplication. I hope you learned something new and will use Google to find the various texts that explain these concepts in much more detail.

 

If you are interested in Linear Algebra in general, do watch the Linear Algebra lectures by Dr. Gilbert Strang of MIT, and go through the ‘Essence of Linear Algebra’ series by Grant Sanderson on his channel 3Blue1Brown.

 

Last but not least, I would like to thank Dr. Faruk Kazi for teaching me this valuable concept. I would not have known it without attending his highly informative lectures at VJTI.

 

References:


[1] Matrix Multiplication as Composition - 3Blue1Brown

[2] Matrix Multiplication by Gilbert Strang - MIT OCW

[3] About Jacques Hadamard

[4] About Issai Schur

[5] About Leopold Kronecker

[6] Hadamard/Walsh Transform

 

— Saurabh Gupta
