The two-phase simplex method will automatically detect and resolve redundancy in the system of equations.
For completeness, let me first mention the case of a system with inequality constraints (something like $Ax \le b$ with $x \ge 0$). Here, the full row rank assumption is satisfied automatically: we add slack variables to turn the system into $Ax + Is = b$ with $x, s \ge 0$, and the augmented matrix $[A \mid I]$ always has full row rank, since its columns include an identity matrix.
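If you want a quick numerical sanity check of that rank claim, here is a minimal sketch; the numpy tooling and the particular matrix are my own additions for illustration, not part of the argument:

```python
# Sketch: appending an identity block to a rank-deficient A gives [A | I]
# full row rank. The matrix here is just an arbitrary illustration.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])            # rank 1: the second row is twice the first
aug = np.hstack([A, np.eye(2)])       # [A | I]

print(np.linalg.matrix_rank(A))       # 1
print(np.linalg.matrix_rank(aug))     # 2, i.e. full row rank
```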
In the case of a system with equality constraints (something like $Ax = b$ with $x \ge 0$), we'll have other problems to deal with before we worry about rank. Namely, we don't have an initial basic feasible solution to start from. In this scenario, we use a two-phase simplex method. The most straightforward approach here is to add artificial slack variables in the first phase: again, turn the system into $Ax + Is = b$ with $x, s \ge 0$ (first multiplying any equation with $b_i < 0$ by $-1$, so that $s = b$ gives a feasible starting point). Then minimize the sum $s_1 + \dots + s_m$: the original system has a solution if and only if we can get $s_1 = \dots = s_m = 0$.
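To make phase one concrete, here is a hedged sketch that sets it up as a linear program and hands it to `scipy.optimize.linprog`, an off-the-shelf solver, rather than running the tableau by hand as we do below; the placeholder data is the example system from later in this answer:

```python
# Phase one as an LP: minimize s_1 + ... + s_m subject to Ax + Is = b, x, s >= 0.
# Uses scipy's generic LP solver rather than a hand-rolled tableau.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([2.0, 4.0])
m, n = A.shape

c = np.concatenate([np.zeros(n), np.ones(m)])   # objective: sum of artificial slack variables
A_eq = np.hstack([A, np.eye(m)])                # constraint matrix [A | I]

res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (n + m))
x, s = res.x[:n], res.x[n:]
print("phase-one optimum:", res.fun)   # 0 means the original system Ax = b, x >= 0 is feasible
print("x =", x, "artificials =", s)
```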
At this point, if the system $Ax = b$ does not have full row rank, one of two things can happen.
It could end up being inconsistent. Then we will not be able to get the sum of the artificial slack variables to $0$ in the first phase. In this case, we just report that the linear program is infeasible.
It could be consistent, but with redundant equations. When the artificial slack variables all hit $0$, some of them will remain basic (at value $0$), and we'll have one or more rows in the tableau that involve only the artificial slack variables. In this case, we drop all such rows as we go into the second phase.
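Here is a small numerical illustration of the two outcomes, using a rank comparison rather than the tableau bookkeeping; note that this only tests whether the equations themselves are consistent, while phase one also catches the case where the equations are consistent but have no nonnegative solution. The matrices are made up for illustration:

```python
# Two outcomes when A lacks full row rank: Ax = b is either inconsistent
# (rank [A | b] > rank A) or consistent with redundant equations to drop.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
rank_A = np.linalg.matrix_rank(A)

for b in (np.array([2.0, 5.0]),      # inconsistent: phase one cannot reach 0
          np.array([2.0, 4.0])):     # consistent, with one redundant equation
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rank_Ab > rank_A:
        print(b, "-> infeasible; report and stop")
    else:
        print(b, "->", A.shape[0] - rank_A, "redundant row(s) to drop before phase two")
```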
For example, if the system is something like
\begin{align}
x_1 + 2x_2 &=2 \\
2x_1 + 4x_2 &= 4
\end{align}
then in phase one, we will be minimizing $s_1 + s_2$ subject to the constraints
\begin{align}
x_1 + 2x_2 + s_1 &=2 \\
2x_1 + 4x_2 + s_2 &= 4.
\end{align}
The initial basic solution has basic variables $(s_1, s_2)$. One possible basic solution at the end of phase one has basic variables $(x_1, s_2)$, with the dictionary
\begin{align}
x_1 &= 2 - s_1 - 2x_2 \\
s_2 &= 0 + 2s_1
\end{align}
Going into phase two, we will drop the second equation (it involves only the artificial variables) and, setting the nonbasic artificial variable $s_1$ to $0$, simplify the first equation to $$x_1 = 2 - 2x_2.$$ The point $(x_1,x_2) = (2,0)$ is now our basic solution (with basic variable $x_1$), and we can go on to optimize whatever it is we originally wanted to optimize.
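For concreteness, here is a small sketch of that single pivot carried out on the phase-one tableau; the column order $x_1, x_2, s_1, s_2$ and the numpy representation are my own choices:

```python
# Phase-one tableau for the example; columns x1, x2, s1, s2 | rhs.
import numpy as np

T = np.array([[1.0, 2.0, 1.0, 0.0, 2.0],
              [2.0, 4.0, 0.0, 1.0, 4.0]])

# Pivot on row 0, column 0: bring x1 into the basis in place of s1.
T[0] /= T[0, 0]
T[1] -= T[1, 0] * T[0]
print(T)
# Row 0 reads x1 + 2x2 + s1 = 2, i.e. x1 = 2 - s1 - 2x2.
# Row 1 reads -2s1 + s2 = 0, i.e. s2 = 2s1: only artificials remain, so drop it.
```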
To summarize: we will not have to worry about rank in the first phase, because the augmented matrix $[A \mid I]$ always has full row rank. We will not have to worry about rank in the second phase, because we will delete redundant rows after finishing the first phase.