Matrix Multiplication

Motivating Matrix Multiplication

  • Matrix multiplication is an operation that produces a matrix from two matrices
  • Linear transformations are typically performed using matrix multiplication
  • In other words, multiplying any matrix by a vector is equivalent to performing a linear transformation on that vector
  • Therefore, matrices are a convenient way of representing linear transformations
  • In matrix multiplication, each entry in the resultant matrix is the dot product of a row in the first matrix and a column in the second matrix
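The row-by-column rule above can be sketched in plain Python (a minimal illustration with nested lists; the `matmul` helper name is mine, not from the text):

```python
# Minimal sketch: entry (i, j) of the result is the dot product of
# row i of the first matrix with column j of the second matrix.
def matmul(A, B):
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```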

System of Equations and Linear Combinations

  • Up until now, we have only used scalar multiplication and addition to determine a linear combination of a set of vectors
  • However, we can collect those vectors into a matrix A and our scalars into a vector x, and then the linear combination becomes a matrix multiplication Ax
  • In linear algebra, we mostly work with the system of equations defined as the following:
Ax = b
  • If we're given a constant vector x and we want to find a vector b that is a linear combination of a set of vectors A, then we're solving for b in the system of equations Ax = b

    • We're given A and x
    • In this case, we would use matrix multiplication to solve for b
  • If we're finding a vector x that makes b a linear combination of a set of vectors A, then we are solving for x in the system of equations Ax = b

    • We're given A and b
    • In this case, we would use row reduction to solve for x
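Both cases can be sketched in plain Python (a minimal illustration assuming a small invertible system; the helper names are mine): matrix multiplication gives b from A and x, and row reduction on the augmented matrix [A | b] recovers x from A and b.

```python
def matvec(A, x):
    # b = Ax: dot product of each row of A with x
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def solve_by_elimination(A, b):
    # Row-reduce the augmented matrix [A | b] to [I | x] (Gauss-Jordan,
    # no pivoting; assumes the pivots are nonzero).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(n):
        M[i] = [v / M[i][i] for v in M[i]]          # scale pivot row to 1
        for j in range(n):
            if j != i:
                f = M[j][i]
                M[j] = [vj - f * vi for vj, vi in zip(M[j], M[i])]
    return [row[-1] for row in M]

A = [[2.0, 1.0], [1.0, 3.0]]
x = [1.0, 2.0]
b = matvec(A, x)                   # given A and x -> b = [4.0, 7.0]
print(solve_by_elimination(A, b))  # row reduction recovers x
```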

System of Equations and Linear Transformations

  • If we're given a transformation matrix A and a vector x from our initial vector space and we want to find its image b in the transformed vector space, then we're solving for b in the system of equations Ax = b

    • We're given A and x
    • In this case, we would use matrix multiplication to solve for b
  • If we're given a transformation matrix A and a vector b from the transformed vector space and we want to find the vector x from our initial vector space that maps to it, then we're solving for x in the system of equations Ax = b

    • We're given A and b
    • In this case, we would perform the following steps:

      1. Find the determinant of the transformation matrix
      2. Use that determinant to find the inverse of the transformation matrix
      3. Multiply each side of our system of equations Ax = b by that inverse matrix, so we get x = A^{-1}b
      4. Solve for x by performing matrix multiplication of the inverse matrix A^{-1} and vector b
  • Therefore, if we want to find a vector x that makes b a linear combination of a set of vectors A, we can also use matrix multiplication by the inverse matrix instead of manually performing row reduction
  • We can do this because we're solving for the same thing (i.e. vector x) in both scenarios
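For a 2x2 matrix, the four steps above can be sketched directly (a minimal illustration; the helper names are mine, and the adjugate formula used is specific to the 2x2 case):

```python
def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c                 # step 1: determinant
    assert det != 0, "matrix is not invertible"
    return [[d / det, -b / det],        # step 2: inverse = adjugate / det
            [-c / det, a / det]]

def matvec(A, x):
    return [sum(r * v for r, v in zip(row, x)) for row in A]

A = [[2.0, 1.0], [1.0, 3.0]]
b = [4.0, 7.0]
A_inv = inverse_2x2(A)
x = matvec(A_inv, b)                    # steps 3-4: x = A^{-1} b
print(x)  # approximately [1.0, 2.0]
```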

Matrix Multiplication as a Linear Combination

  • When performing matrix multiplication, we iteratively compute the dot product of a row in the first matrix with a column in the second matrix
  • We can also think of the matrix-vector product Ax as a linear combination: the entries of x act as the scalars, and the columns of A act as the vectors being combined
  • In other words, Ax is a linear combination of the columns of matrix A using coefficients from our vector x
  • In this scenario, the linear combination view and the row-by-row dot product view produce the same result
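A quick sketch of this equivalence in plain Python, computing Ax both as row-by-row dot products and as a linear combination of the columns of A weighted by the entries of x:

```python
A = [[1, 2],
     [3, 4]]
x = [5, 6]

# View 1: dot product of each row of A with x
by_dot = [sum(a * v for a, v in zip(row, x)) for row in A]

# View 2: linear combination of the columns of A, with x as coefficients
cols = list(zip(*A))  # columns of A: (1, 3) and (2, 4)
by_combo = [sum(x[k] * cols[k][i] for k in range(len(x)))
            for i in range(len(A))]

print(by_dot, by_combo)  # both give [17, 39]
```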

Summarizing the Relationship of Everything

  • Matrix multiplication involves taking the dot product of a row from one matrix and a column from another matrix
  • In other words, matrix multiplication involves the dot product
  • Linear transformation is an application of matrix multiplication
  • Linear combination is an application of matrix multiplication
