# Matrices and Simultaneous Equations

Consider the following equations:

${2x+3y =8, \ x - y = -1}$

This system can be written in matrix form as

${\begin {bmatrix} 2 & 3 \\ 1 & -1 \end {bmatrix} \begin {bmatrix} x \\ y \end {bmatrix} = \begin {bmatrix} 8 \\ -1 \end {bmatrix}}$

Let ${\begin {bmatrix} 2 & 3 \\ 1 & -1 \end {bmatrix}}$ be ${A}$, the matrix of coefficients, ${\begin {bmatrix} x \\ y \end {bmatrix}}$ be ${X}$, the matrix of variables and ${\begin {bmatrix} 8 \\ -1 \end {bmatrix}}$ be ${B}$, the matrix of constant terms.

We can then write ${AX =B}$.
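As a numerical check, the example system can be solved directly from ${AX = B}$; this is a sketch assuming NumPy is available.

```python
import numpy as np

# Coefficient matrix A and constant column B for
# 2x + 3y = 8,  x - y = -1
A = np.array([[2.0, 3.0],
              [1.0, -1.0]])
B = np.array([8.0, -1.0])

# Solve A X = B for the variable column X
X = np.linalg.solve(A, B)
print(X)  # x = 1, y = 2
```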

Now consider the matrix formed by appending the column vector ${B}$ to ${A}$ as its last column, i.e.

${\begin {bmatrix} 2 & 3 & 8\\ 1 & -1 & -1 \end {bmatrix}}$

This matrix is known as the augmented matrix. Let’s denote it by ${(A,B)}$. The behavior of the system depends on the relation between the rank of the augmented matrix, ${\rho (A,B)}$, and the rank of the coefficient matrix, ${\rho (A)}$.

Augmenting is usually done so that the same elementary row operations can be applied to ${A}$ and ${B}$ simultaneously.
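The two ranks for the example above can be computed as follows (a sketch using NumPy's `matrix_rank`):

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [1.0, -1.0]])
B = np.array([[8.0],
              [-1.0]])

# Augmented matrix (A, B): B appended as the last column of A
AB = np.hstack([A, B])

rho_A = np.linalg.matrix_rank(A)    # rank of coefficient matrix
rho_AB = np.linalg.matrix_rank(AB)  # rank of augmented matrix
print(rho_A, rho_AB)  # 2 2
```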

Thus, a system of ${n}$ equations in ${m}$ variables can be written in the form

${A_{n \times m} X_{m \times 1} = B_{n \times 1}}$

The augmented matrix will be ${(A,B)_{n \times {(m+1)}}}$.

• ### Homogeneous and non-Homogeneous Systems

When all constant terms of a system of equations are ${0}$, i.e. ${B}$ is a null vector, the system is known as a homogeneous system. When ${B}$ is not a null vector, it is known as a non-homogeneous system.

• ### Consistent and Inconsistent Systems

When a system has one or more solutions, it is said to be consistent. When it has no solution, it is said to be inconsistent.

Thus, a homogeneous system is always consistent, because setting every variable to zero is always a solution. (This solution is known as the trivial solution.)

• ### Condition for Consistency of Non-homogeneous System of Equations

When the rank of the augmented matrix is equal to the rank of the coefficient matrix, the system is consistent, i.e.

${\rho (A,B) = \rho (A)}$

When it isn’t, the system is inconsistent.
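For instance, the system ${x + y = 1}$, ${x + y = 2}$ clearly has no solution; the rank test detects this (sketch assuming NumPy):

```python
import numpy as np

# x + y = 1 and x + y = 2 cannot both hold
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0],
              [2.0]])
AB = np.hstack([A, B])

rho_A = np.linalg.matrix_rank(A)    # 1
rho_AB = np.linalg.matrix_rank(AB)  # 2
assert rho_A != rho_AB  # rho(A,B) != rho(A): inconsistent
```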

• ### Finitely Many and Infinitely Many Solutions of Non-homogeneous System of Equations

Once the condition for consistency is satisfied, we can check for the possibility of infinitely many solutions. When the common rank ${\rho}$ is less than the total number of unknowns, ${m}$, the system possesses infinitely many solutions. When ${\rho}$ is equal to the total number of unknowns, the system possesses a unique solution.
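The system ${x + y = 2}$, ${2x + 2y = 4}$ illustrates the infinite-solutions case, since the second equation is a multiple of the first (sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
B = np.array([[2.0],
              [4.0]])
AB = np.hstack([A, B])

rho = np.linalg.matrix_rank(A)
assert rho == np.linalg.matrix_rank(AB)  # consistent
assert rho < 2  # rho < m unknowns: infinitely many solutions, e.g. (t, 2 - t)
```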

• ### A sub-condition, provided that ranks are equal

When the number of unknowns is equal to the number of equations, the coefficient matrix ${A}$ is a square matrix. In such cases, if ${|A| \ne 0}$, the system possesses a unique solution given by ${X = A^{-1}B}$. (Discussed in Matrices III)
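For the original example, ${|A| = -5 \ne 0}$, so the unique solution can be obtained as ${X = A^{-1}B}$ (sketch assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [1.0, -1.0]])
B = np.array([[8.0],
              [-1.0]])

det_A = np.linalg.det(A)
assert not np.isclose(det_A, 0.0)  # |A| != 0 -> unique solution

X = np.linalg.inv(A) @ B  # X = A^{-1} B
print(X.ravel())  # x = 1, y = 2
```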

When ${|A| = 0}$, we have ${\rho(A) < m}$; if the consistency condition ${\rho(A,B) = \rho(A)}$ still holds, it is a case of infinitely many solutions.

• ### Condition for Only Unique (Trivial) Solution of Homogeneous System

The homogeneous system always possesses the trivial solution. Here the ranks are always equal, since the appended column of ${(A,B)}$ is zero; when this common rank is less than the number of unknowns, there will be infinitely many solutions.

For a homogeneous system where the number of equations equals the number of unknowns, if ${|A| \ne 0}$, the system possesses only the trivial solution.
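Both homogeneous cases can be checked numerically (sketch assuming NumPy):

```python
import numpy as np

# Case 1: |A| != 0 -> only the trivial solution x = y = 0
A1 = np.array([[2.0, 3.0],
               [1.0, -1.0]])
assert not np.isclose(np.linalg.det(A1), 0.0)

# Case 2: |A| = 0 and rho < m -> infinitely many solutions,
# e.g. every (t, -t) solves x + y = 0, 2x + 2y = 0
A2 = np.array([[1.0, 1.0],
               [2.0, 2.0]])
assert np.isclose(np.linalg.det(A2), 0.0)
assert np.linalg.matrix_rank(A2) < 2
```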

• ### Linear Dependence of Vectors

Recall: a vector is either a row matrix or a column matrix.

A set of ${n}$ vectors ${x_1, x_2, \ldots, x_n}$ is said to be linearly dependent when there exist ${n}$ scalars ${c_1, c_2, \ldots, c_n}$, not all zero, such that

${\sum_{i=1}^{n} c_i x_i = 0 \qquad (1)}$

When this condition is not satisfied, the vectors are linearly independent. For example, consider ${\hat i, \hat j}$ and ${\hat k}$. They can be written as

${\hat i = [1 \ 0 \ 0], \hat j = [0 \ 1 \ 0], \hat k = [0 \ 0 \ 1]}$

There exist no scalars ${c_1, c_2, c_3}$, not all zero, s.t. ${c_1 \hat i + c_2 \hat j + c_3 \hat k = 0}$. Hence these vectors are linearly independent.

Equation ${1}$ gives us a homogeneous system of equations: stacking the vectors as the columns of the coefficient matrix, ${X}$ becomes the column of the unknown scalars ${c_i}$.
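Linear dependence can therefore be tested by comparing the rank of the matrix of stacked vectors with the number of vectors (sketch assuming NumPy; the second set is a hypothetical example):

```python
import numpy as np

# i, j, k stacked as rows: rank 3 = number of vectors -> independent
V1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])
assert np.linalg.matrix_rank(V1) == 3

# [1 2 3] = [1 0 1] + [0 2 2]: rank 2 < 3 vectors -> dependent
V2 = np.array([[1.0, 0.0, 1.0],
               [0.0, 2.0, 2.0],
               [1.0, 2.0, 3.0]])
assert np.linalg.matrix_rank(V2) == 2
```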

• ### Orthogonality

A square matrix ${A}$ is orthogonal when ${A^T = A^{-1}}$ (equivalently, ${A^T A = I}$). Its determinant is always equal to ${\pm 1}$. Consider a system

${Y = AX}$

such that

${Y = \begin {bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end {bmatrix} , \ X = \begin {bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end {bmatrix}, \ A = \begin {bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn}\end {bmatrix}}$

${A}$ is orthogonal precisely when ${\sum \limits_i x_i^2 = \sum \limits_i y_i^2}$ for every ${X}$, i.e. the transformation preserves the sum of squares.
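A rotation matrix is a standard example: it satisfies ${A^T = A^{-1}}$ and preserves the sum of squares under ${Y = AX}$ (sketch assuming NumPy):

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(A.T @ A, np.eye(2))         # A^T = A^{-1}
assert np.isclose(abs(np.linalg.det(A)), 1.0)  # |A| = +/- 1

X = np.array([3.0, 4.0])
Y = A @ X
assert np.isclose(np.sum(X**2), np.sum(Y**2))  # sum of squares preserved
```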