I think this can be shown easily (without assuming the concept of diagonalizability) by using the result that the algebraic multiplicity of an eigenvalue is greater than or equal to its geometric multiplicity, and by determining the characteristic equation and the minimal polynomial case by case.
Proof: Suppose $T:V\to V$ is the linear map corresponding to the $n\times n$ matrix $A$, where $V$ is an $n$-dimensional vector space and $\beta$ is the standard basis, i.e. $[T]_{\beta}=A$. Since $\dim(V)=n$ and $\dim(\operatorname{Im} T)=\operatorname{rank}(A)=1$ (given), the rank-nullity theorem gives $\dim(\ker T)=n-1$.
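(As a quick numerical sanity check of this step, not part of the proof: one way to produce a rank-$1$ matrix is as an outer product $uv^{T}$, an illustrative choice only, and one can confirm that its kernel has dimension $n-1$, as rank-nullity predicts.)

```python
# Sanity check (not part of the proof): a rank-1 matrix built as an outer
# product u v^T has nullity n - 1, as the rank-nullity theorem predicts.
import numpy as np

rng = np.random.default_rng(0)
n = 5
u = rng.standard_normal(n)
v = rng.standard_normal(n)
A = np.outer(u, v)                 # an n x n matrix of rank 1 (illustrative construction)

rank = np.linalg.matrix_rank(A)
nullity = n - rank                 # dim(Ker T) by rank-nullity
print(rank, nullity)               # expected: 1 and n - 1 = 4
```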
Now, $A$ is an $n\times n$ matrix of rank $1$ with $n\geq 2$, so $\det(A)=0$. Since $A$ has order $n$, the characteristic equation $\det(xI-A)=0$ has degree $n$. Because $\det(A)=0$, the value $x=0$ satisfies the characteristic equation, so $0$ is an eigenvalue. Its geometric multiplicity is the dimension of the solution space of $(A-0\cdot I)x=0$, i.e. of $Ax=0$, which is the same as $\dim(\ker T)=n-1$. So the eigenvalue $0$ has geometric multiplicity $n-1$, and therefore algebraic multiplicity $\geq n-1$. Hence the characteristic equation of $A$, being of degree $n$, must be one of
i) $x^n=0$, or
ii) $x^{n-1}(x-c)=0$, where $c\neq 0$ is the other eigenvalue.
Now $x^n=0$ cannot be the characteristic equation for every such matrix $A$; here is a counterexample: take $A=\begin{bmatrix} 1 & 2\\ 2 & 4 \end{bmatrix}$, a $2\times 2$ matrix of rank $1$. Its characteristic equation is $x(x-5)=0$, which is of the second type, so $x^2=0$ is not its characteristic equation. As we have only these two possibilities, we conclude that the characteristic equation of such a matrix $A$ (as given) is $x^{n-1}(x-c)=0$ with $c\neq 0$.
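(Again only as a numerical sanity check, not part of the proof: one can compute the characteristic polynomial of the $2\times 2$ example and, for a rank-$1$ matrix built as an outer product, confirm that $0$ appears as an eigenvalue $n-1$ times alongside a single nonzero eigenvalue $c$.)

```python
# Sanity check of the case analysis: the 2 x 2 example has characteristic
# polynomial x^2 - 5x = x(x - 5), and a generic rank-1 matrix has eigenvalue 0
# with multiplicity n - 1 plus one nonzero eigenvalue c.
import numpy as np

A2 = np.array([[1.0, 2.0],
               [2.0, 4.0]])
print(np.poly(A2))          # characteristic poly coefficients, ~[1, -5, 0] -> x^2 - 5x

rng = np.random.default_rng(1)
n = 5
A = np.outer(rng.standard_normal(n), rng.standard_normal(n))   # rank-1 example
eigvals = np.linalg.eigvals(A)
near_zero = np.isclose(eigvals, 0.0, atol=1e-9)
print(np.sum(near_zero))    # expected: n - 1 = 4 eigenvalues equal to 0
print(eigvals[~near_zero])  # the single nonzero eigenvalue c
```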
Now $(x-c)=0$ cannot be the minimal polynomial: if $A-cI=0$, then $A=cI$ would have rank $n\geq 2$, contradicting $\operatorname{rank}(A)=1$. For instance, for $A=\begin{bmatrix} 1 & 2\\ 2 & 4 \end{bmatrix}$ (rank $1$, order $2$) we have $A-5I\neq 0$. Also, since $\operatorname{rank}(A)=1$ we have $A\neq 0$, so $x=0$ is not the minimal polynomial either.
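(A small check of this elimination step, not part of the proof, using the same $2\times 2$ example with $c=5$:)

```python
# Sanity check: neither x nor x - 5 annihilates the example matrix,
# so no degree-1 polynomial can be its minimal polynomial.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
c = 5.0
print(np.allclose(A, 0.0))                    # False: A != 0, so x is not the minimal polynomial
print(np.allclose(A - c * np.eye(2), 0.0))    # False: A - 5I != 0, so x - 5 is not it either
```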
Now we check whether $A$ satisfies $x(x-c)=0$. Here $c$ is a nonzero eigenvalue with algebraic and geometric multiplicity $1$, so the solution space of $(A-cI)x=0$, $x\in V$, has dimension $1$. Thus $\dim(\ker(A-cI))=1$, and hence $\dim(\operatorname{Im}(A-cI))=n-1$ [since $A-cI$ is an $n\times n$ matrix, corresponding to a map $T':V\to V$]. So $\operatorname{rank}(A-cI)=n-1$, and since $\operatorname{rank}(A)=1$, we get $\operatorname{rank}(A(A-cI))=1$ or $0$ [as $\operatorname{rank}(AB)\leq \min(\operatorname{rank}(A),\operatorname{rank}(B))$].
If $\operatorname{rank}(A(A-cI))=0$, then $A(A-cI)=0$.
If $\operatorname{rank}(A(A-cI))=1$, then $A(A-cI)$ is an $n\times n$ matrix corresponding to a map $T'':V\to V$ with $\dim(\ker T'')=n-1$. One can easily check that $x\in\ker(T)$ implies $x\in\ker(T'')$, and since the dimensions are equal, $\ker(T)=\ker(T'')$. Now there exists a nonzero $x\in\ker(T')$ [as $\dim(\ker T')=\dim(\ker(A-cI))=1$]; for such an $x$, $(A-cI)x=0$, i.e. $Ax=cx\neq 0$, so $x\notin\ker(T)$. But $A(A-cI)x=A^2x-cAx=A(cx)-c(cx)=c^2x-c^2x=0$, so $x\in\ker(T'')$. Hence $\ker(T)\neq\ker(T'')$, a contradiction. Therefore $A(A-cI)=0$, and $x(x-c)=0$ is the minimal polynomial, of degree $2$, for every such $A$ (as given). [Hence proved]
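(Finally, a numerical sanity check of the conclusion, not part of the proof: for the $2\times 2$ example and for a rank-$1$ matrix built as an outer product, $A(A-cI)=0$ when $c$ is taken to be the single nonzero eigenvalue. The outer-product construction and the tolerance are assumptions of the illustration only.)

```python
# Sanity check of the conclusion: A(A - cI) = 0, so x(x - c) annihilates A.
# Here c is taken to be the nonzero eigenvalue of the rank-1 matrix.
import numpy as np

def annihilated_by_x_times_x_minus_c(A, tol=1e-9):
    eigvals = np.linalg.eigvals(A)
    c = eigvals[np.argmax(np.abs(eigvals))]        # the nonzero eigenvalue
    n = A.shape[0]
    return np.allclose(A @ (A - c * np.eye(n)), 0.0, atol=tol)

A2 = np.array([[1.0, 2.0],
               [2.0, 4.0]])
print(annihilated_by_x_times_x_minus_c(A2))        # expected: True (c = 5)

rng = np.random.default_rng(2)
n = 6
A = np.outer(rng.standard_normal(n), rng.standard_normal(n))   # a random rank-1 matrix
print(annihilated_by_x_times_x_minus_c(A))                     # expected: True
```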
I have tried to do this in a new way, using only basic concepts.