
Linear independence

Example:

(1)   \begin{align*}x_1 = \begin{bmatrix} 1 \\ 2 \\ -1 \\ -1 \\ -1 \end{bmatrix}, x_2 = \begin{bmatrix} 2 \\ -1 \\ 1 \\ 2 \\ -2 \end{bmatrix},x_3 = \begin{bmatrix} 3 \\ -4 \\ 3 \\ 5 \\ -3 \end{bmatrix},x_4 = \begin{bmatrix} -1 \\ 8 \\ -5 \\ -6 \\ 1 \end{bmatrix},\end{align*}

To test for linear independence, we must check whether

\sum_{i=1}^{k} \lambda_i x_i = 0 \quad \text{if and only if} \quad \lambda_1 = \cdots = \lambda_k = 0.

Writing \{x_1, x_2, x_3, x_4\} as the columns of a matrix yields a homogeneous system, which we solve with Gaussian elimination (row reduction to reduced row-echelon form).

(2)   \begin{align*}\begin{bmatrix} x_1 & x_2 & x_3 & x_4 \end{bmatrix} & = \begin{bmatrix} 1 & 2 & 3 & -1 \\ 2 & -1 & -4 & 8 \\ -1 & 1 & 3 & -5 \\ -1 & 2 & 5 & -6 \\ -1 & -2 & -3 & 1 \end{bmatrix} \\ & \rightsquigarrow \begin{bmatrix} 1 & 0 & -1 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \end{align*}

The pivot columns are 1, 2, and 4, so x_1, x_2, x_4 are linearly independent. The third column has no pivot, and its entries show that x_3 = -x_1 + 2 x_2. Hence the set \{x_1, x_2, x_3, x_4\} is linearly dependent.
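As a cross-check, this can be verified numerically. The sketch below (assuming NumPy is available) stacks the four vectors as columns and computes the matrix rank; a rank smaller than the number of columns means the vectors are linearly dependent.

```python
import numpy as np

# Columns are x1, x2, x3, x4 from equation (1).
A = np.array([
    [ 1,  2,  3, -1],
    [ 2, -1, -4,  8],
    [-1,  1,  3, -5],
    [-1,  2,  5, -6],
    [-1, -2, -3,  1],
])

rank = np.linalg.matrix_rank(A)
print(rank)  # 3 < 4 columns, so the set is linearly dependent

# Confirm the dependence relation x3 = -x1 + 2*x2 read off the RREF:
print(np.allclose(A[:, 2], -A[:, 0] + 2 * A[:, 1]))  # True
```

The rank of 3 matches the three pivot rows in the reduced matrix above, and the `allclose` check confirms the explicit dependence relation.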