Suppose that \(\{ v_1, v_2, \dots,
v_n \}\) is a linearly dependent set of vectors. Then there exist scalars \(\alpha_1, \ldots, \alpha_n\) such that
\begin{equation*}
\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = {\mathbf 0 }\text{,}
\end{equation*}
with at least one of the \(\alpha_i\) not equal to zero. Suppose that \(\alpha_k \neq 0\text{.}\) Then
\begin{equation*}
v_k = - \frac{\alpha_1}{\alpha_k} v_1 - \cdots - \frac{\alpha_{k - 1}}{\alpha_k} v_{k-1} - \frac{\alpha_{k + 1}}{\alpha_k} v_{k + 1} - \cdots - \frac{\alpha_n}{\alpha_k} v_n\text{.}
\end{equation*}
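As a concrete illustration of this step, take in \(\mathbb{R}^2\) the vectors \(v_1 = (1, 0)\text{,}\) \(v_2 = (0, 1)\text{,}\) and \(v_3 = (1, 1)\text{.}\) These satisfy
\begin{equation*}
1 \cdot v_1 + 1 \cdot v_2 + (-1) \cdot v_3 = {\mathbf 0}\text{,}
\end{equation*}
so taking \(\alpha_3 = -1 \neq 0\) and solving for \(v_3\) as above gives
\begin{equation*}
v_3 = -\frac{1}{-1} v_1 - \frac{1}{-1} v_2 = v_1 + v_2\text{.}
\end{equation*}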
Conversely, suppose that some \(v_k\) is a linear combination of the remaining vectors, say
\begin{equation*}
v_k = \beta_1 v_1 + \cdots + \beta_{k - 1} v_{k - 1} + \beta_{k + 1} v_{k + 1} + \cdots + \beta_n v_n\text{.}
\end{equation*}
Then
\begin{equation*}
\beta_1 v_1 + \cdots + \beta_{k - 1} v_{k - 1} - v_k + \beta_{k + 1} v_{k + 1} + \cdots + \beta_n v_n = {\mathbf 0}\text{.}
\end{equation*}
Since the coefficient of \(v_k\) is \(-1 \neq 0\text{,}\) this is a nontrivial dependence relation, and the set \(\{ v_1, v_2, \dots, v_n \}\) is linearly dependent.