Matrix
My math knowledge from freshman calculus and linear algebra has mostly faded. Refreshing it here alongside numpy notation.
Notation
Addition and subtraction

Vectors of the same shape can be added and subtracted element-wise. In numpy, the + and - operators do this directly.
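A quick check of element-wise addition and subtraction; the vector values here are made up for the demo.

```python
import numpy as np

# illustrative vectors of the same shape
x = np.array([1.0, 7.0, 2.0])
y = np.array([5.0, 2.0, 1.0])

# + and - operate element-wise on same-shaped arrays
print(x + y)  # [6. 9. 3.]
print(x - y)  # [-4.  5.  1.]
```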
Element-wise product

Hadamard product: the element-wise product of identically shaped vectors, written X ⊙ Y.
X * Y

Norm

A norm measures the distance from the origin to a vector. The L1 norm is the sum of the absolute values of the components; the L2 norm is the Euclidean distance.
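A minimal sketch of both norms; the helper names l1_norm and l2_norm and the sample vector are my own choices for the demo.

```python
import numpy as np

def l1_norm(x):
    # L1 norm: sum of the absolute values of the components
    return np.sum(np.abs(x))

def l2_norm(x):
    # L2 norm: Euclidean distance from the origin
    return np.sqrt(np.sum(x * x))

x = np.array([3.0, -4.0])
print(l1_norm(x))  # 7.0
print(l2_norm(x))  # 5.0
```

np.linalg.norm(x) computes the L2 norm directly, and np.linalg.norm(x, ord=1) the L1 norm.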
Angle between vectors
Using the law of cosines (equivalently, the inner product), we can compute the angle between two vectors.

def angle(x, y):
    v = np.inner(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    theta = np.arccos(v)
    return theta

Matrix multiplication
XY

X @ Y

Through matrix multiplication, a matrix can be understood as an operator on vector spaces: multiplying by a matrix can send a vector to a space of a different dimension. In this sense, matrix multiplication can be used for pattern extraction and data compression.
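A small sketch of the "matrix as operator" idea: a 2x3 matrix maps vectors from R^3 to R^2. The matrix and vector values are illustrative.

```python
import numpy as np

# a 2x3 matrix sends a 3-D vector to a 2-D vector
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
x = np.array([3.0, 4.0, 5.0])

y = A @ x
print(y.shape)  # (2,)
print(y)        # [3. 4.]
```

Here the third component is simply dropped; a learned matrix would instead mix the components into the most useful low-dimensional combination, which is the data-compression view mentioned above.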
inner product


inner in numpy
np.inner computes the inner product between vectors. To express the inner product of vectors in matrix form, a transpose is typically used: <x, y> = x^T y.
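A quick check of np.inner with illustrative values, including the 2-D caveat: for 2-D arrays, np.inner sums over the last axes, so it corresponds to X @ Y.T rather than the usual matrix product X @ Y.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# for 1-D vectors, np.inner matches the dot product x^T y
print(np.inner(x, y))  # 32.0
print(x @ y)           # 32.0

# for 2-D arrays, np.inner(X, Y) sums over the last axes, i.e. X @ Y.T
X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[5.0, 6.0], [7.0, 8.0]])
print(np.allclose(np.inner(X, Y), X @ Y.T))  # True
```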
np.inner(X, Y)

Inverse matrix

The inverse exists only for a square matrix whose determinant is nonzero.

np.linalg.inv(X)

Pseudo-inverse (Moore-Penrose inverse)
- Unlike the regular inverse, the number of rows and columns does not need to match.
- It still plays a role similar to the inverse. With n = rows and m = columns:
  - if n >= m: A^+ = (A^T A)^{-1} A^T, so A^+ A = I
  - if n <= m: A^+ = A^T (A A^T)^{-1}, so A A^+ = I
np.linalg.pinv(X)

Solving systems of equations
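A minimal sketch of using the pseudo-inverse to solve a system Ax = b that has more equations than unknowns; the matrix and right-hand side are illustrative values chosen so an exact solution exists.

```python
import numpy as np

# over-determined system: 3 equations, 2 unknowns
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

# the pseudo-inverse gives the least-squares solution
x = np.linalg.pinv(A) @ b
print(x)  # [1. 2.]
```

When the system has no exact solution, the same expression returns the x that minimizes ||Ax - b||.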

Linear regression

Given how real data is distributed, linear regression cannot generally be solved exactly as a system of equations (there are typically more data points than unknowns). The general approach is therefore to find the coefficients beta that minimize the L2 norm of the residual, ||X beta - y||.
# using sklearn for linear regression
from sklearn.linear_model import LinearRegression

model = LinearRegression()
model.fit(X, y)
y_test = model.predict(x_test)
# Moore-Penrose inverse matrix
X_ = np.array([np.append(x, [1]) for x in X])  # add intercept column
beta = np.linalg.pinv(X_) @ y
y_test = np.append(x_test, [1]) @ beta

sklearn estimates the intercept automatically when fitting a linear regression. When doing linear regression via the Moore-Penrose inverse, you need to append the intercept term to X manually.
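A self-contained run of the pseudo-inverse approach on synthetic data. The data is generated from y = 2x + 1, so we can check that the recovered slope and intercept match; all values and shapes are illustrative.

```python
import numpy as np

# synthetic 1-D data generated from y = 2x + 1
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# append a column of ones so the intercept is estimated too
X_ = np.hstack([X, np.ones((X.shape[0], 1))])
beta = np.linalg.pinv(X_) @ y
print(beta)  # [2. 1.]  (slope, intercept)

# predict for a new point, again appending the intercept term
x_test = np.array([4.0])
y_pred = np.append(x_test, [1]) @ beta
print(y_pred)  # 9.0
```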