Here is the proof; it is almost the same as the one in the textbook (it is in TeX format):
\parbox{14cm}{Alternatively, for an $n\times m$ determinant with $m > n$, one may choose to write it as a sum of $n\times n$ determinants as follows:}
\paragraph{}
\parbox{14cm}{1. Write it as a sum of $n\times n$ determinants, one for each combination of deleted columns leaving $n$ columns, with the deleted columns marked explicitly.}
\paragraph{}
\parbox{14cm}{2. Swap the remaining columns with deleted columns until all remaining columns are leftmost.}
\paragraph{}
\parbox{14cm}{3. Negate a term if an odd number of swaps was required.}
\paragraph{}
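The three numbered steps can be sketched in code. This is a minimal Python sketch of my reading of the rule (the names \texttt{det} and \texttt{det\_R} are my own); the swap count uses the fact that moving kept columns at positions $c_0 < c_1 < \dots$ to the leftmost positions, one at a time past the deleted columns, takes $\sum_i (c_i - i)$ adjacent swaps:

```python
from itertools import combinations

def det(M):
    # ordinary determinant of a small square matrix, by Laplace expansion
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def det_R(A):
    # "rectangular determinant" of an n x m matrix A with m >= n:
    # sum over all choices of n kept columns (step 1), each n x n minor
    # signed by the parity of the swaps needed to bring the kept columns
    # to the leftmost positions (steps 2 and 3)
    n, m = len(A), len(A[0])
    total = 0
    for cols in combinations(range(m), n):
        swaps = sum(c - i for i, c in enumerate(cols))
        minor = [[A[r][c] for c in cols] for r in range(n)]
        total += (-1) ** swaps * det(minor)
    return total
```

For a square matrix this reduces to the ordinary determinant, since the only term keeps all columns with zero swaps.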
\parbox{14cm}{3.1 Theorem}
\paragraph{}
\parbox{14cm}{For an $m\times n$ matrix $\mathbf{A}$ with $n>m$, the matrix has a right inverse if $\det_R(A) \neq 0$, namely $A^{-1}= \frac{1}{\det_R(A)}A_{CF}^T$,}
\paragraph{}
\parbox{14cm}{where $A_{CF}^T$ is the transpose of the cofactor matrix of $A$.}
\paragraph{}
\parbox{14cm}{Proof:}
\paragraph{}
\parbox{14cm}{We must prove: $G = AA^{-1} = A\frac{1}{\det_R(A)}A_{CF}^T = \frac{1}{\det_R(A)}AA_{CF}^T = I_{m\times m}$.}
\paragraph{}
\parbox{14cm}{We have, by definition of matrix multiplication:}
\paragraph{}
\parbox{14cm}{$g_{kl}= \frac{1}{\det_R(A)}\sum_{s=1}^n a_{ls}A_{ks}$.}
\paragraph{}
\parbox{14cm}{For $l = k$ the last sum is the development of $D = \det_R(A)$ by the $k$'th row. Hence:}
\paragraph{}
\parbox{14cm}{$g_{kk} = \frac{1}{\det_R(A)}\sum_{s=1}^n a_{ks}A_{ks} = 1$,}
\paragraph{}
\parbox{14cm}{provided that we develop the determinants in $\det_R(A)$ and the cofactors $A_{ks}$ by rows as well.}
\paragraph{}
\parbox{14cm}{For $l \neq k$ this sum is the development by the $k$'th row of the determinant $D'$ obtained from $D$ by replacing the $k$'th column of $D$ with the $l$'th column of $D$. $D'$ has two identical columns and is zero, because we can write it as a sum of $(m-1)\times(m-1)$ determinants with two identical columns and $n-m$ deleted columns, where the deletion avoids both of the repeated columns, plus pairs of determinants from $D'$ with one of the two repeated columns deleted. Such a determinant looks like:}
\paragraph{}
\parbox{14cm}{$\begin{vmatrix}
A_{11} & A_{11} & A_{31} & \cdots & A_{n-m,1} & [] & \cdots & []\\
A_{12} & A_{12} & A_{32} & \cdots & A_{n-m,2} & [] & \cdots & []\\
\vdots & \vdots & \vdots & & \vdots & & &
\end{vmatrix}$}
\paragraph{}
\parbox{14cm}{where each such determinant gives rise to two determinants:}
\paragraph{}
\parbox{14cm}{$\begin{vmatrix}
A_{11} & [] & A_{31} & \cdots & A_{n-m,1} & [] & \cdots & []\\
A_{12} & [] & A_{32} & \cdots & A_{n-m,2} & [] & \cdots & []\\
\vdots & & \vdots & & \vdots & & &
\end{vmatrix}$}
\paragraph{}
\parbox{14cm}{$+
\begin{vmatrix}
{[]} & A_{11} & A_{31} & \cdots & A_{n-m,1} & [] & \cdots & []\\
{[]} & A_{12} & A_{32} & \cdots & A_{n-m,2} & [] & \cdots & []\\
 & \vdots & \vdots & & \vdots & & &
\end{vmatrix}$}
\paragraph{}
\parbox{14cm}{and by our rule for changing a term's sign, these produce the same terms with opposite signs. This takes care of the case where the two identical columns have an even number of columns between them. When the two identical columns have an odd number of columns between them, the signs are not opposite after shifting all the deleted columns to the rightmost positions, but then an odd number of column swaps is needed to make the two determinants identical, which again produces opposite signs. QED.}
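The theorem can also be checked numerically. Below is a self-contained Python sketch under my reading of the definitions: $\det_R$ is the signed sum of maximal minors from the expansion above, and each cofactor $A_{ks}$ is taken as $(-1)^{k+s}$ times the rectangular determinant of $A$ with row $k$ and column $s$ removed. All function names are my own:

```python
from itertools import combinations

def det(M):
    # ordinary determinant of a square matrix, by Laplace expansion
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def det_R(A):
    # signed sum of maximal n x n minors of an n x m matrix (m >= n)
    n, m = len(A), len(A[0])
    return sum((-1) ** sum(c - i for i, c in enumerate(cols))
               * det([[A[r][c] for c in cols] for r in range(n)])
               for cols in combinations(range(m), n))

def cofactor_matrix(A):
    # A_CF[k][s] = (-1)^(k+s) * det_R of A with row k and column s removed
    m, n = len(A), len(A[0])
    return [[(-1) ** (k + s)
             * det_R([row[:s] + row[s + 1:] for i, row in enumerate(A) if i != k])
             for s in range(n)] for k in range(m)]

def right_inverse(A):
    # the claimed right inverse: transpose of the cofactor matrix over det_R(A)
    d = det_R(A)
    ACF = cofactor_matrix(A)
    return [[ACF[k][s] / d for k in range(len(A))] for s in range(len(A[0]))]
```

For example, with $A = \begin{pmatrix}1&2&3\\4&5&7\end{pmatrix}$ one gets $\det_R(A)=1$ and $A\,\cdot\,$\texttt{right\_inverse(A)}$\,= I_{2\times 2}$.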
Here is how we develop cofactors:
$\begin{vmatrix}
a_{11} & a_{12} & a_{13} & a_{14}\\
a_{21} & a_{22} & a_{23} & a_{24}\\
a_{31} & a_{32} & a_{33} & a_{34}
\end{vmatrix}$
Then the cofactor $A_{12}$ equals
$\begin{vmatrix}
a_{21} & a_{23} & a_{24}\\
a_{31} & a_{33} & a_{34}
\end{vmatrix}$
This expands into a sum of $2\times 2$ determinants:
$\begin{vmatrix}
a_{21} & a_{23} & []\\
a_{31} & a_{33} & []
\end{vmatrix}+
\begin{vmatrix}
a_{21} & [] & a_{24}\\
a_{31} & [] & a_{34}
\end{vmatrix}+
\begin{vmatrix}
{[]} & a_{23} & a_{24}\\
{[]} & a_{33} & a_{34}
\end{vmatrix}$
and the second term's sign must be inverted, since one swap with the deleted column is needed to bring its kept columns leftmost.
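With hypothetical numeric entries, the signed three-term expansion above can be checked directly (the entry values below are made up for illustration):

```python
# hypothetical numeric values for the surviving rows of the cofactor A_12
r1 = [1, 2, 3]   # a_21, a_23, a_24
r2 = [4, 5, 7]   # a_31, a_33, a_34

def d2(a, b, c, d):
    # 2x2 determinant | a b ; c d |
    return a * d - b * c

# which column of the 2x3 block is deleted decides the sign of each term
cofactor = (d2(r1[0], r1[1], r2[0], r2[1])     # delete column 3: zero swaps, +
            - d2(r1[0], r1[2], r2[0], r2[2])   # delete column 2: one swap, -
            + d2(r1[1], r1[2], r2[1], r2[2]))  # delete column 1: two swaps, +
```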