---
author: Alvie Rahman
date: \today
title: "MMME1026 // Systems of Equations and Matrices"
tags:
- uni
- nottingham
- mechanical
- engineering
- mmme1026
- maths
- systems_of_equations
- matrices
---

Systems of Equations (Simultaneous Equations)

Gaussian Elimination

Gaussian elimination can be used when the number of unknown variables you have is equal to the number of equations you are given.

I'm pretty sure it's the name for the method you use to solve simultaneous equations in school.

For example if you have 1 equation and 1 unknown:

\begin{align*} 2x &= 6 \\ x &= 3 \end{align*}

Number of Solutions

Let's generalise the example above to

ax = b

There are 3 possible cases:

\begin{align*} a \ne 0 &\rightarrow x = \frac b a \\ a = 0, b \ne 0 &\rightarrow \text{no solution for $x$} \\ a = 0, b = 0 &\rightarrow \text{infinite solutions for $x$} \end{align*}

2x2 Systems

A 2x2 system is one with 2 equations and 2 unknown variables.

Example 1

\begin{align*} 3x_1 + 4x_2 &= 2 &\text{(1)} \\ x_1 + 2x_2 &= 0 &\text{(2)} \end{align*}

\begin{align*} 3\times\text{(2)} = 3x_1 + 6x_2 &= 0 &\text{(3)} \\ \text{(3)} - \text{(1)} = 0x_1 + 2x_2 &= -2 \\ x_2 &= -1 \end{align*}

We've essentially created a 1x1 system for x_2, and now that it's solved we can back-substitute it into equation (1) (or equation (2), it doesn't matter) to work out the value of x_1:

\begin{align*} 3x_1 + 4x_2 &= 2 \\ 3x_1 - 4 &= 2 \\ 3x_1 &= 6 \\ x_1 &= 2 \end{align*}

You can check the values for x_1 and x_2 are correct by substituting them into equation (2).
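
As a quick numerical sanity check (not part of the original working), the same 2x2 system can be handed to NumPy, which I'm assuming is available:

```python
import numpy as np

# Coefficient matrix and right-hand side of equations (1) and (2)
A = np.array([[3, 4],
              [1, 2]])
b = np.array([2, 0])

print(np.linalg.solve(A, b))  # [ 2. -1.]  i.e. x_1 = 2, x_2 = -1
```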

3x3 Systems

A 3x3 system is one with 3 equations and 3 unknown variables.

Example 1

\begin{align*} 2x_1 + 3x_2 - x_3 &= 5 &\text{(1)} \\ 4x_1 + 4x_2 - 3x_3 &= 3 &\text{(2)} \\ 2x_1 - 3x_2 + x_3 &= -1 &\text{(3)} \end{align*}

The first step is to eliminate x_1 from (2) and (3) using (1):

\begin{align*} \text{(2)}-2\times\text{(1)} = -2x_2 - x_3 &= -7 &\text{(4)} \\ \text{(3)}-\text{(1)} = -6x_2 + 2x_3 &= -6 &\text{(5)} \end{align*}

This has created a 2x2 system of x_2 and x_3. I'm too lazy to type up the working, but it is solved like any other 2x2 system:

\begin{align*} x_2 &= 2 \\ x_3 &= 3 \end{align*}

These values can be back-substituted into any of the first 3 equations to find out x_1:

\begin{align*} 2x_1 + 3x_2 - x_3 = 2x_1 + 6 - 3 = 5 \rightarrow x_1 = 1 \end{align*}
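
The same eliminate-then-back-substitute procedure can be written as a short program. Below is a minimal sketch of my own in Python/NumPy (not from the lecture notes); it does no pivoting, so it assumes a zero never lands on the diagonal during elimination. It is applied to the system above:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Forward elimination followed by back substitution.

    A minimal sketch for systems with a unique solution -- there is no
    pivoting, so it fails if a zero ever appears on the diagonal.
    """
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = len(b)

    # Forward elimination: use row i to remove x_i from every row below it
    for i in range(n):
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]

    # Back substitution: solve for the last unknown first, then work upwards
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = [[2, 3, -1],
     [4, 4, -3],
     [2, -3, 1]]
b = [5, 3, -1]
print(gaussian_elimination(A, b))  # [1. 2. 3.]
```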

Example 2

\begin{align*} x_1 + x_2 + 2x_3 &= 1 &R_1 \\ 2x_1 - x_2 - x_3 &= 1 &R_2 \\ x_1 + 4x_2 + 7x_3 &= 2 &R_3 \end{align*}

  1. Eliminate x_1 from R_2, R_3:

    \begin{align*} x_1 + x_2 + 2x_3 &= 1 &R_1' = R_1 \\ -3x_2 - 5x_3 &= -1 &R_2' = R_2 - 2R_1 \\ 3x_2 + 5x_3 &= 1 &R_3' = R_3 - R_1 \end{align*}

    We've created another 2x2 system of R_2' and R_3'

  2. Eliminate x_2 from R_3' using R_2':

    \begin{align*} x_1 + x_2 + 2x_3 &= 1 &R_1'' = R_1' = R_1 \\ -3x_2 - 5x_3 &= -1 &R_2'' = R_2' = R_2 - 2R_1 \\ 0x_3 &= 0 &R_3'' = R_3' + R_2' \end{align*}

    We can see that x_3 can be any number, so there are infinite solutions. Let:

    x_3 = t

    where t can be any number

  3. Substitute x_3 into R_2'':

    R_2'' = -3x_2 - 5t = -1 \rightarrow x_2 = \frac 1 3 - \frac{5t} 3
  4. Substitute x_2 and x_3 into R_1'':

    R_1'' = x_1 + \frac 1 3 - \frac{5t} 3 + 2t = 1 \rightarrow x_1 = \frac 2 3 - \frac t 3
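
When elimination produces a row of zeros, as it did here, a useful numerical check (my own addition, assuming NumPy) is to compare the rank of A with the rank of the augmented matrix (A|\pmb b): equal ranks that are smaller than the number of unknowns mean infinitely many solutions. The parametric solution above can also be spot-checked for a few values of t:

```python
import numpy as np

A = np.array([[1, 1, 2],
              [2, -1, -1],
              [1, 4, 7]])
b = np.array([1, 1, 2])

rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A, rank_Ab)  # 2 2 -> fewer than 3 unknowns, so infinitely many solutions

# Spot-check x = (2/3 - t/3, 1/3 - 5t/3, t) for a few values of t
for t in (0.0, 1.0, -2.5):
    x = np.array([2/3 - t/3, 1/3 - 5*t/3, t])
    print(np.allclose(A @ x, b))  # True each time
```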

Systems of Equations and Matrices

Many problems in engineering have a very large number of unknowns and equations to solve simultaneously. We can use matrices to solve these efficiently.

Take the following simultaneous equations:

\begin{align*} 3x_1 + 4x_2 &= 2 &\text{(1)} \\ x_1 + 2x_2 &= 0 &\text{(2)} \end{align*}

They can be represented by the following matrices:

\begin{align*} A &= \begin{pmatrix} 3 & 4 \\ 1 & 2 \end{pmatrix} \\ \pmb x &= \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \\ \pmb b &= \begin{pmatrix} 2 \\ 0 \end{pmatrix} \end{align*}

You can then express the system as:

A\pmb x = \pmb b

A 3x3 System as a Matrix

\begin{align*} 2x_1 + 3x_2 - x_3 &= 5 \\ 4x_1 + 4x_2 - 3x_3 &= 3 \\ 2x_1 - 3x_2 + x_3 &= -1 \end{align*}

Could be expressed in the form A\pmb x = \pmb b where:

\begin{align*} A &= \begin{pmatrix} 2 & 3 & -1 \\ 4 & 4 & -3 \\ 2 & -3 & 1 \end{pmatrix} \\ \pmb x &= \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \\ \pmb b &= \begin{pmatrix} 5 \\ 3 \\ -1 \end{pmatrix} \end{align*}

An m\times n System as a Matrix

\begin{align*} a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n &= b_1 \\ a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n &= b_2 \\ &\vdots \\ a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n &= b_m \end{align*}

Could be expressed in the form A\pmb x = \pmb b where:

\begin{align*} A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}, \pmb x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \pmb b = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix} \end{align*}

Matrices

Order of a Matrix

The order of a matrix is its size, e.g. 3\times2 or m\times n.

Column Vectors

  • Column vectors are matrices with only one column:

    \begin{pmatrix} 1 \\ 2 \end{pmatrix} \begin{pmatrix} 4 \\ 45 \\ 12 \end{pmatrix}
  • Column vector variables are written in \pmb{bold} when typed up or printed, and \underline{underlined} when handwritten:

    \pmb x = \begin{pmatrix} -3 \\ 2 \end{pmatrix}

Matrix Algebra

Equality

Two matrices are the same if:

  • Their order is the same
  • Their corresponding elements are the same

Addition and Subtraction

Only possible if their order is the same.

\begin{align*} A + B &= \begin{pmatrix} a_{11} + b_{11} & a_{12} + b_{12} & \cdots & a_{1n} + b_{1n} \\ a_{21} + b_{21} & a_{22} + b_{22} & \cdots & a_{2n} + b_{2n} \\ \vdots & & & \vdots \\ a_{m1} + b_{m1} & a_{m2} + b_{m2} & \cdots & a_{mn} + b_{mn} \end{pmatrix} \\ A - B &= \begin{pmatrix} a_{11} - b_{11} & a_{12} - b_{12} & \cdots & a_{1n} - b_{1n} \\ a_{21} - b_{21} & a_{22} - b_{22} & \cdots & a_{2n} - b_{2n} \\ \vdots & & & \vdots \\ a_{m1} - b_{m1} & a_{m2} - b_{m2} & \cdots & a_{mn} - b_{mn} \end{pmatrix} \end{align*}

Zero Matrix

This is a matrix whose elements are all zeros. For any matrix A,

A + 0 = A

We can only add matrices of the same order, therefore 0 must be of the same order as A.

Multiplication

Let


\begin{matrix}
A & m\times n \\
B & p\times q
\end{matrix}

To be able to multiply A by B, we require n = p.

If n \ne p, then AB does not exist.


\begin{matrix}
A & B & = & C \\
m\times n & p \times q & & m\times q
\end{matrix}

When C = AB exists,

C_{ij} = \sum_r a_{ir}b_{rj}

That is, C_{ij} is the 'product' of the $i$th row of A and $j$th column of B.
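
A direct translation of this sum into code makes the row-times-column pattern explicit. This is an illustrative sketch of mine (not from the notes), assuming NumPy only for array storage and for checking against the built-in @ operator; it uses the matrices from Example 2 below:

```python
import numpy as np

def matmul(A, B):
    """Multiply A (m x n) by B (p x q) using C_ij = sum over r of a_ir * b_rj."""
    m, n = A.shape
    p, q = B.shape
    if n != p:
        raise ValueError("AB does not exist: inner dimensions differ")
    C = np.zeros((m, q))
    for i in range(m):          # row of A
        for j in range(q):      # column of B
            for r in range(n):  # sum over the shared index
                C[i, j] += A[i, r] * B[r, j]
    return C

A = np.array([[4, 1, 6], [3, 2, 1]])    # 2 x 3
B = np.array([[1, 1], [1, 2], [1, 0]])  # 3 x 2
print(matmul(A, B))                      # [[11. 6.] [ 6. 7.]]
print(np.allclose(matmul(A, B), A @ B))  # True
```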

Multiplication of a Matrix by a Scalar

If \lambda is a scalar, we define


\lambda A = \begin{pmatrix} \lambda a_{11} & \lambda a_{12} & \cdots & \lambda a_{1n} \\
                \lambda a_{21} & \lambda a_{22} & \cdots & \lambda a_{2n} \\
                \vdots &        &     & \vdots \\
                \lambda a_{m1} & \lambda a_{m2} & \cdots & \lambda a_{mn} 
                \end{pmatrix}, 

Example 1


\begin{pmatrix} 1 & -1 \\ 2 & 1 \end{pmatrix}
\begin{pmatrix} 0 & 1 \\ 3 & 2 \end{pmatrix} = 
\begin{pmatrix} -3 & -1 \\ 3 & 4 \end{pmatrix}

\begin{pmatrix} 0 & 1 \\ 3 & 2 \end{pmatrix}
\begin{pmatrix} 1 & -1 \\ 2 & 1 \end{pmatrix} =
\begin{pmatrix} 2 & 1 \\ 7 & -1 \end{pmatrix}

Example 2


A = \begin{pmatrix} 4 & 1 & 6 \\ 3 & 2 & 1 \end{pmatrix},\,
B = \begin{pmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 0 \end{pmatrix}

AB = \begin{pmatrix} 11 & 6 \\ 6 & 7 \end{pmatrix},\,
BA = \begin{pmatrix} 7 & 3 & 7 \\ 10 & 5 & 8 \\ 4 & 1 & 6 \end{pmatrix}

Other Properties of Matrix Algebra

  • (\lambda A)B = \lambda(AB) = A(\lambda B)

  • A(BC) = (AB)C = ABC

  • (A+B)C = AC + BC

  • C(A+B) = CA + CB

  • In general, AB \ne BA even if both exist

  • AB = 0 does not always mean A = 0 or B = 0:

    $$\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix}3 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} = 0$$

    It follows that AB = AC does not imply that B=C as

    AB = AC \leftrightarrow A(B - C) = 0

    and as A and (B-C) are not necessarily 0, B is not necessarily equal to C:

    $$AB = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$$

    and

    $$AC = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix}1 & 2 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} = AB$$

    but B \ne C
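
All of these properties are easy to confirm numerically. Here is a small check of my own (assuming NumPy) covering non-commutativity, AB = 0 with A, B \ne 0, and AB = AC with B \ne C, using the matrices above:

```python
import numpy as np

A = np.array([[0, 1], [0, 0]])

# AB != BA in general (matrices from multiplication Example 1)
P = np.array([[1, -1], [2, 1]])
Q = np.array([[0, 1], [3, 2]])
print(np.array_equal(P @ Q, Q @ P))   # False

# AB = 0 even though A != 0 and B != 0
B0 = np.array([[3, 0], [0, 0]])
print(A @ B0)                          # [[0 0] [0 0]]

# AB = AC even though B != C
B = np.array([[0, 0], [1, 0]])
C = np.array([[1, 2], [1, 0]])
print(np.array_equal(A @ B, A @ C))   # True
print(np.array_equal(B, C))           # False
```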

Special Matrices

Square Matrix

Where m = n

Example 1

A 3\times3 matrix.

\begin{pmatrix}1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9\end{pmatrix}

Example 2

A 2\times2 matrix.

\begin{pmatrix}1 & 2 \\ 4 & 5 \end{pmatrix}

Identity Matrix

The identity matrix is a square matrix whose elements are all 0, except those on the leading diagonal, which are all 1. The leading diagonal runs from the top left to the bottom right corner.

It is usually denoted by I or I_n.

The identity matrix has the properties that

AI = IA = A

for any square matrix A of the same order as I, and

Ix = x

for any vector x.

Example 1

The 3\times3 identity matrix.

\begin{pmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{pmatrix}

Example 2

The 2\times2 identity matrix.

\begin{pmatrix}1 & 0 \\ 0 & 1 \end{pmatrix}

Transposed Matrix

The transpose of matrix A of order m\times n is matrix A^T which has the order n\times m. It is found by reflecting it along the leading diagonal, or interchanging the rows and columns of A.

(Figure: animation of matrix transposition, by Lucas Vieira)

Let matrix D = EF, then D^T = (EF)^T = F^TE^T (note the reversal of order).
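
A quick numerical check of the reversal of order (my own sketch, assuming NumPy, not from the notes):

```python
import numpy as np

E = np.array([[4, 1, 6], [3, 2, 1]])    # 2 x 3
F = np.array([[1, 1], [1, 2], [1, 0]])  # 3 x 2
D = E @ F                                # 2 x 2

print(np.array_equal(D.T, F.T @ E.T))    # True: (EF)^T = F^T E^T
# E.T @ F.T is a (3 x 2)(2 x 3) = 3 x 3 matrix, so it cannot even equal D.T
```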

Example 1


A = \begin{pmatrix}3 & 2 & 1 \\ 4 & 5 & 6 \end{pmatrix},\,
A^T = \begin{pmatrix}3 & 4 \\ 2 & 5 \\ 1 & 6\end{pmatrix}

Example 2


B = \begin{pmatrix}1 \\ 4\end{pmatrix},\,
B^T = \begin{pmatrix}1 & 4\end{pmatrix}

Example 3


C = \begin{pmatrix}1 & 2 & 3 \\ 0 & 5 & 1 \\ 2 & 3 & 7\end{pmatrix},\,
C^T = \begin{pmatrix}1 & 0 & 2 \\ 2 & 5 & 3 \\ 3 & 1 & 7\end{pmatrix}

Orthogonal Matrices

A matrix, A, such that

A^{-1} = A^T

is said to be orthogonal.

Another way to say this is

AA^T = A^TA = I
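
Rotation matrices are the classic example of orthogonal matrices. A quick check of my own (assuming NumPy) for a 2D rotation:

```python
import numpy as np

theta = 0.7  # any angle in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(R @ R.T, np.eye(2)))      # True: R R^T = I
print(np.allclose(np.linalg.inv(R), R.T))   # True: R^{-1} = R^T
```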

Symmetric Matrices

A square matrix which is symmetric about its leading diagonal:

A = A^T

You can also express this as the matrix A, where

a_{ij} = a_{ji}

is satisfied for all elements.

Example 1

$$\begin{pmatrix} 1 & 0 & -2 & 3 \\ 0 & 3 & 4 & -7 \\ -2 & 4 & -1 & 6 \\ 3 & -7 & 6 & 2 \end{pmatrix}$$

Anti-Symmetric

A square matrix is anti-symmetric if

A = -A^T

This can also be expressed as

a_{ij} = -a_{ji}

This means that all elements on the leading diagonal must be 0.

Example 1

$$\begin{pmatrix} 0 & -1 & 5 \\ 1 & 0 & 1 \\ -5 & -1 & 0 \end{pmatrix}$$
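
Both definitions reduce to a one-line check in code. A small sketch of my own (assuming NumPy), using the two example matrices above:

```python
import numpy as np

S = np.array([[1, 0, -2, 3],
              [0, 3, 4, -7],
              [-2, 4, -1, 6],
              [3, -7, 6, 2]])
K = np.array([[0, -1, 5],
              [1, 0, 1],
              [-5, -1, 0]])

print(np.array_equal(S, S.T))   # True:  S is symmetric
print(np.array_equal(K, -K.T))  # True:  K is anti-symmetric
print(np.diag(K))               # [0 0 0] -- the leading diagonal must be zero
```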

The Determinant

Determinant of a 2x2 System

The determinant of a 2x2 system is

D = a_{11}a_{22} - a_{12}a_{21}

It is denoted by


\begin{vmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22}
\end{vmatrix}
\text{ or }
\det
\begin{pmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22}
\end{pmatrix}
  • A system of equations has a unique solution if D \ne 0

  • If D = 0, then there are either

    • no solutions (the equations are inconsistent)
    • infinitely many solutions

Determinant of a 3x3 System

Let


A = \begin{pmatrix}
a_{11} & a_{12} & a_{13} \\
a_{21} & a_{22} & a_{23} \\
a_{31} & a_{32} & a_{33}
\end{pmatrix}

\begin{align*} \det A = &a_{11} \times \det \begin{pmatrix}a_{22} & a_{23} \\ a_{32} & a_{33} \end{pmatrix} \\ &-a_{12} \times \det \begin{pmatrix}a_{21} & a_{23} \\ a_{31} & a_{33} \end{pmatrix} \\ &+a_{13} \times \det \begin{pmatrix}a_{21} & a_{22} \\ a_{31} & a_{32} \end{pmatrix} \end{align*}

The 2x2 matrices above are created by removing the elements in the same row or column as the corresponding coefficient.

Chessboard Determinant

\det A may be obtained by expanding out any row or column. To figure out which coefficients should be subtracted and which ones added, use the chessboard pattern of signs:

\begin{pmatrix} + & - & + \\ - & + & - \\ + & - & + \end{pmatrix}
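
The cofactor expansion can also be written out as a short program. Here is an illustrative sketch of mine (assuming NumPy, used only for array slicing and the comparison against np.linalg.det), applied to the matrix A from the 3x3 system earlier:

```python
import numpy as np

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]

def det3(A):
    """Expand along the first row with the + - + chessboard signs."""
    total = 0.0
    for j, sign in zip(range(3), (+1, -1, +1)):
        # Minor: delete row 0 and column j to leave a 2x2 matrix
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += sign * A[0, j] * det2(minor)
    return total

A = np.array([[2, 3, -1],
              [4, 4, -3],
              [2, -3, 1]])
print(det3(A))           # -20.0
print(np.linalg.det(A))  # about -20.0 (floating point)
```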

Properties of Determinants

  • \det A = \det A^T
  • If all elements of one row of a matrix are multiplied by a constant z, the determinant of the new matrix is z times the determinant of the original matrix:

    \begin{align*} \begin{vmatrix} za & zb \\ c & d \end{vmatrix} &= zad - zbc \\ &= z(ad-bc) \\ &= z\begin{vmatrix} a & b \\ c & d \end{vmatrix} \end{align*}

    This is also true if a column of a matrix is multiplied by a constant.

    Application: if the factor z appears in each element of a row or column of a determinant, it can be factored out:

    $$\begin{vmatrix}2 & 12 \\ 1 & 3 \end{vmatrix} = 2\begin{vmatrix}1 & 6 \\ 1 & 3 \end{vmatrix} = 2 \times 3 \begin{vmatrix} 1 & 2 \\ 1 & 1 \end{vmatrix}$$

    Application: if all elements in one row or column of a matrix are zero, the value of the determinant is 0.

    \begin{vmatrix} 0 & 0 \\ c & d \end{vmatrix} = 0\times d - 0\times c = 0

    Application: if A is an n\times n matrix,

    \det(zA) = z^n\det A
  • Swapping any two rows or columns of a matrix changes the sign of the determinant

    \begin{align*} \begin{vmatrix} c & d \\ a & b \end{vmatrix} &= cb - ad \\ &= -(ad - bc) \\ &= -\begin{vmatrix} a & b \\ c & d \end{vmatrix} \end{align*}

    Application: if any two rows or two columns are identical, the determinant is zero.

    Application: if any row is a multiple of another row, or a column a multiple of another column, the determinant is zero.

  • The value of a determinant is unchanged by adding to any row a constant multiple of another row, or adding to any column a constant multiple of another column

  • If A and B are square matrices of the same order then

    \det(AB) = \det A \times \det B
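
A few of these properties checked numerically (my own sketch, assuming NumPy and using arbitrary non-singular matrices):

```python
import numpy as np

A = np.array([[2.0, 3, -1], [4, 4, -3], [2, -3, 1]])
B = np.array([[1.0, 0, 2], [0, 5, 1], [2, 3, 7]])
z = 4.0
n = 3

print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))                        # det A = det A^T
print(np.isclose(np.linalg.det(z * A), z**n * np.linalg.det(A)))               # det(zA) = z^n det A
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # det(AB) = det A det B

# Swapping two rows changes the sign of the determinant
A_swapped = A[[1, 0, 2], :]
print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A)))                 # True
```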

Inverse of a Matrix

If A is a square matrix, then its inverse matrix is A^{-1} and is defined by the property that:

A^{-1}A = AA^{-1} = I
  • Not every matrix has an inverse

  • If the inverse exists, then it is very useful for solving systems of equations:

    \begin{align*} A\pmb{x} = \pmb b \rightarrow A^{-1}A\pmb x &= A^{-1}\pmb b \\ I\pmb x &= A^{-1}\pmb b \\ \pmb x &= A^{-1}\pmb b \end{align*}

    Therefore there must be a unique solution to A\pmb x = \pmb b: \pmb x = A^{-1}\pmb b.

  • If D = EF then

    D^{-1} = (EF)^{-1} = F^{-1}E^{-1}
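
Both points can be checked numerically. A short sketch of my own (assuming NumPy; in practice np.linalg.solve is preferred over forming the inverse explicitly):

```python
import numpy as np

A = np.array([[2.0, 3, -1], [4, 4, -3], [2, -3, 1]])
b = np.array([5.0, 3, -1])

# Solving A x = b via the inverse
x = np.linalg.inv(A) @ b
print(x)  # [1. 2. 3.]

# (EF)^{-1} = F^{-1} E^{-1}
E = np.array([[1.0, -1], [2, 1]])
F = np.array([[0.0, 1], [3, 2]])
print(np.allclose(np.linalg.inv(E @ F),
                  np.linalg.inv(F) @ np.linalg.inv(E)))  # True
```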

Inverse of a 2x2 Matrix

If A is the 2x2 matrix


A = \begin{pmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22}
\end{pmatrix}

and its determinant, D, satisfies D \ne 0, then A has the inverse A^{-1} given by


A^{-1} = \frac 1 D \begin{pmatrix}
a_{22} & -a_{12} \\
-a_{21} & a_{11}
\end{pmatrix}

If D = 0, then matrix A has no inverse.

Example 1

Find the inverse of matrix A = \begin{pmatrix} -1 & 5 \\ 2 & 3 \end{pmatrix}.

  1. Calculate the determinant

    \det A = -1 \times 3 - 5 \times 2 = -13

    Since \det A \ne 0, the inverse exists.

  2. Calculate A^{-1}

    A^{-1} = \frac 1 {-13} \begin{pmatrix} 3 & -5 \\ -2 & -1\end{pmatrix}
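
The answer can be confirmed by multiplying out A A^{-1} by hand, or numerically (my own addition, assuming NumPy):

```python
import numpy as np

A = np.array([[-1.0, 5], [2, 3]])
D = np.linalg.det(A)
print(D)  # -13.0 (up to floating point error)

# 2x2 inverse formula: swap the diagonal, negate the off-diagonal, divide by D
A_inv = (1 / D) * np.array([[A[1, 1], -A[0, 1]],
                            [-A[1, 0], A[0, 0]]])
print(np.allclose(A @ A_inv, np.eye(2)))     # True
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```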