Matrices

Overview


One of the primary tools of linear algebra is the matrix. A matrix is a rectangular arrangement of numbers, as in
{% \begin{bmatrix} a & b & c \\ d & e & f\\ g & h & i \\ \end{bmatrix} %}


Matrices need not be square. When a matrix has a single row or a single column, we refer to it as a vector (see matrix vectors).

A row vector is defined to be a matrix that has a single row (a {% 1 \times n %} matrix):
{% \begin{bmatrix} a & b & c \\ \end{bmatrix} %}
A column vector is defined to be a matrix with a single column (an {% n \times 1 %} matrix):
{% \begin{bmatrix} a \\ c \\ e \\ \end{bmatrix} %}


Linear algebra defines how to add and multiply these objects. Matrix algebra finds its utility in simplifying large sets of equations into fewer equations, sometimes into a single one.
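
For example, the pair of equations {% 2x + 2y = 1 %} and {% 3x + y = 2 %} (the coefficients are arbitrary) collapses into the single matrix equation
{% \begin{bmatrix} 2 & 2 \\ 3 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} %}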

Matrices are representations of linear transformations.
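
As a standard example, the matrix below represents a quarter-turn (90 degree) rotation of the plane:
{% \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} -y \\ x \end{bmatrix} %}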

Topics


  • The operations of addition and multiplication can be defined for matrices (a small worked example appears after this list).
    • Matrix Addition
    • Multiplication

    Addition and scalar multiplication turn the space of {% n \times m %} matrices into a vector space.
  • Matrix Inverse - The inverse of a given matrix is the matrix that, when multiplied by the original matrix, returns the identity matrix. Not all matrices have inverses (see the example after this list).
    • Pseudo Inverse:
    • Generalized Inverse:
  • Determinant - a number computed from a square matrix. Geometrically, the determinant is the signed volume of the parallelepiped spanned by the column vectors of the matrix (the {% 2 \times 2 %} formula is given after this list).
  • Eigenvalues and eigenvectors - An eigenvector of a matrix is a nonzero column vector that, when left-multiplied by the matrix, returns the same column vector multiplied by a scalar; that scalar is called the eigenvalue (a small example is given after this list).
  • Matrices and Linear Spaces
    • Change of Basis
    • Column Space
    • Rank
    • Projection Matrix
  • Transpose
  • Trace
  • Matrix Types
    • Orthogonal
    • Unitary
    • Hermitian
    • Normal
  • Matrix Calculus
    • Ordinary Differential Equations:
    • Matrix Exponential
  • Matrix Inner Product
  • Vectorized Matrices
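
A few of the topics above can be illustrated with small worked examples; the numbers throughout are arbitrary. Matrix addition is entrywise, and matrix multiplication combines the rows of the first matrix with the columns of the second:
{% \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} + \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix} = \begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix} \qquad \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix} = \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix} %}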
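
Multiplying a matrix by its inverse returns the identity, as in this arbitrary {% 2 \times 2 %} case:
{% \begin{bmatrix} 2 & 2 \\ 3 & 1 \end{bmatrix} \begin{bmatrix} -0.25 & 0.5 \\ 0.75 & -0.5 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} %}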
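
For a {% 2 \times 2 %} matrix the determinant reduces to a simple formula:
{% \det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc %}
so the matrix in the inverse example above has determinant {% (2)(1) - (2)(3) = -4 %}.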
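
An eigenvector satisfies {% A v = \lambda v %}; as a simple illustration, the standard basis vectors are eigenvectors of a diagonal matrix:
{% \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} %}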

Matrix Decompositions


A matrix decomposition is a method for finding a set of other matrices that, when multiplied together, result in the original matrix. Matrix decompositions are useful both for writing efficient numeric algorithms and for building many machine learning algorithms.

  • LU: decomposes a matrix into the product of a lower triangular matrix and an upper triangular matrix (see the worked example after this list).
  • Singular Value Decomposition, SVD:
  • Cholesky :
  • QR Decomposition
  • Spectral Decomposition (Diagonalization)
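
As an illustration of the LU item above, one possible factorization of an arbitrary {% 2 \times 2 %} matrix is
{% \begin{bmatrix} 2 & 2 \\ 3 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 1.5 & 1 \end{bmatrix} \begin{bmatrix} 2 & 2 \\ 0 & -2 \end{bmatrix} %}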

Matrix Algorithms


Matrix algorithms are common algorithms that compute some derived matrix or use matrices to accomplish some other goal.

  • Gauss-Seidel: an iterative method for solving linear systems (a sketch of the iteration appears below).
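
Below is a minimal sketch of the Gauss-Seidel iteration for solving {% A x = b %}, written against plain nested arrays rather than the matrix wrapper library described in the next section; the function name, iteration count, and example system are illustrative choices rather than part of any existing API.

function gaussSeidel(A, b, iterations = 25) {
  const n = b.length;
  const x = new Array(n).fill(0);           // initial guess of all zeros
  for (let k = 0; k < iterations; k++) {
    for (let i = 0; i < n; i++) {
      let sum = b[i];
      for (let j = 0; j < n; j++) {
        if (j !== i) sum -= A[i][j] * x[j]; // already-updated entries are reused immediately
      }
      x[i] = sum / A[i][i];                 // assumes the diagonal entry is nonzero
    }
  }
  return x;
}

// Example: a small diagonally dominant system, for which the iteration converges.
// gaussSeidel([[4, 1], [2, 5]], [9, 12]) is approximately [1.8333, 1.6667].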

Matrix API


A matrix wrapper library is available for the numeric library; it assists with basic matrix manipulations.


let mt = await import("/lib/linear-algebra/v1.0.0/matrix.mjs");

let answer = mt.matrix([[2,2],[3,1]])       // start from a 2x2 matrix
               .multiply([[1,0.3],[2,2.1]]) // multiply by another 2x2 matrix
               .inverse()                   // invert the square, nonsingular product
               .toArray();                  // convert back to a plain nested array


In addition to simplifying the code, the matrix module will also speed up some calculations because it keeps track of information, such as the form of the matrix, that cannot easily be tracked in the simple array form.