What is a Matrix?

In Machine Learning and Deep Learning you are going to face matrices a lot. It would not be an exaggeration to say that we could not solve a single problem in Machine Learning without a matrix.
However, a matrix is not just a bunch of numbers arranged in rows and columns. It carries a much deeper meaning. Here we will discuss some high-level intuition.
Since you are already here on matrices, I am assuming that you are familiar with vectors. If you are not, go through this great video.
In this article we will restrict ourselves to two dimensions. It is easier to build the intuition in 2D and then extend it to higher dimensions.
A matrix is a representation of a linear transformation of space.
Didn't get it? Okay, let me first explain what a linear transformation is.
Suppose we have a 2D space (a plane) and we stretch it, rotate it, shear it, flip it, or do all of those things at once, while keeping the origin fixed and keeping the grid lines parallel and evenly spaced. Such a transformation is called a linear transformation.
The following is a 2D space with basis vectors i and j in the directions of the x and y axes respectively.

We can represent the basis vectors as i = [1, 0], j = [0, 1]. Now stack them together as columns.
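As a quick sketch (using NumPy, with the column-stacking convention assumed throughout this article), stacking i and j as columns gives the identity matrix of the untransformed plane:

```python
import numpy as np

# Basis vectors of the untransformed 2D plane
i = np.array([1, 0])
j = np.array([0, 1])

# Stack them as columns: this is the identity matrix
O = np.column_stack([i, j])
print(O)
# [[1 0]
#  [0 1]]
```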

If we rotate the above space 90 degrees anticlockwise, it will look like the following.

Now our i will be [0, 1] and j will be [-1, 0]. Stack this new i and j.

This matrix A is a record of where our basis vectors i and j land when we rotate the space 90 degrees anticlockwise.
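A minimal sketch of building A from the landed basis vectors, again assuming the column-stacking convention:

```python
import numpy as np

# Where the basis vectors land after a 90-degree anticlockwise rotation
i_new = np.array([0, 1])
j_new = np.array([-1, 0])

# Stack the landed basis vectors as columns to get the rotation matrix A
A = np.column_stack([i_new, j_new])
print(A)
# [[ 0 -1]
#  [ 1  0]]
```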
When we multiply any vector by a matrix, the resulting vector shows us where our vector lands in the linearly transformed space.
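For example (the vector [2, 1] here is an arbitrary choice for illustration), multiplying a vector by the rotation matrix A tells us where that vector lands:

```python
import numpy as np

A = np.array([[0, -1],
              [1,  0]])   # 90-degree anticlockwise rotation
v = np.array([2, 1])      # an arbitrary vector in the original space

# A @ v is where v lands in the rotated space
print(A @ v)
# [-1  2]
```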
Inverse Matrix
An inverse is something that undoes our prior action and brings things back to normal, e.g., the inverse of 10 is 1/10. Let's connect this to our matrix definition.
According to our definition, a matrix is a representation of a linear transformation. Then the inverse of that matrix must be the exact opposite transformation of the space. So, if matrix A is a 90-degree anticlockwise rotation, then its inverse will be a 90-degree clockwise rotation. Right!
Let’s draw it.

We will stack together the basis vectors i and j of this clockwise rotation into matrix A`.

As per the definition, the multiplication of A and A` should give us our original matrix O.

We have shown that the matrix product of A and A` is O, i.e. the identity matrix of our original space.
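The check above can be sketched in NumPy: composing the anticlockwise rotation A with the clockwise rotation A` gives back the identity, and A` agrees with the inverse NumPy computes directly:

```python
import numpy as np

A = np.array([[0, -1],
              [1,  0]])      # 90 degrees anticlockwise
A_inv = np.array([[ 0, 1],
                  [-1, 0]])  # 90 degrees clockwise

# Undoing the rotation returns the identity matrix O
print(A @ A_inv)
# [[1 0]
#  [0 1]]

# Sanity check against NumPy's own inverse
print(np.allclose(A_inv, np.linalg.inv(A)))
# True
```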
At a high level, we can say that a matrix is a representation of where the basis vectors land when we linearly transform the space, and that we can read the columns of a matrix as the basis vectors of the new space.
This is just high-level intuition. Please go through the references for more.
References
[1] 3Blue1Brown, Essence of Linear Algebra, https://www.3blue1brown.com/
[2] Gilbert Strang, Introduction to Linear Algebra