There are several explanations of the SVD all over the place, including on this forum. Here is a construction that gets the ordering you asked for.
Suppose $A$ is an $m\times n$ matrix over $\mathbb{C}$. We use $A^*$ to denote the conjugate transpose of $A$ (an $n\times m$ matrix). In terms of operators, viewing $A\in L(\mathbb{C}^n,\mathbb{C}^m)$, $A^*$ is the operator that satisfies $$y^*Ax=\langle Ax,y\rangle = \langle x,A^*y\rangle=(A^*y)^*x$$
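As a quick numerical sanity check (my own sketch, not part of the argument), one can verify the adjoint identity above with NumPy, using the convention $\langle u,v\rangle=v^*u$ from the answer:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
x = rng.normal(size=n) + 1j * rng.normal(size=n)
y = rng.normal(size=m) + 1j * rng.normal(size=m)

lhs = np.vdot(y, A @ x)            # <Ax, y> = y* (A x)
rhs = np.vdot(A.conj().T @ y, x)   # <x, A* y> = (A* y)* x
assert np.isclose(lhs, rhs)        # the two inner products agree
```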
- $A^*A$ is an $n\times n$ matrix and satisfies $x^*A^*Ax=\langle Ax,Ax\rangle=\|Ax\|^2_2\geq0$.
- By known facts of linear algebra, $A^*A$ has $n$ eigenvalues (counted with multiplicity), all real and non-negative, which can then be ordered decreasingly as $\sigma^2_1\geq \sigma^2_2\geq\ldots\geq\sigma^2_n$. Eigenvectors corresponding to different eigenvalues are orthogonal, and so we can find an orthonormal basis for $\mathbb{C}^n$ consisting of eigenvectors.
- Suppose $r=\operatorname{rank}(A^*A)$. Then $r\leq \min(m,n)$, and so $\sigma^2_1\geq\ldots\geq\sigma^2_r>0=\sigma^2_{r+1}=\ldots=\sigma^2_n$.
- We choose eigenvectors $u_j$ such that $$A^*Au_j=\sigma^2_j u_j,\quad 1\leq j\leq n$$ and $\langle u_i,u_j\rangle=u^*_ju_i=\delta_{ij}$. That is, $\{u_j:1\leq j\leq n\}$ forms an orthonormal basis of eigenvectors.
- In particular $$ \|Au_j\|^2=\langle Au_j,Au_j\rangle =\langle u_j,A^*Au_j\rangle =\sigma^2_j\langle u_j,u_j\rangle =\sigma^2_j$$ and so $Au_j\neq 0$ for $1\leq j\leq r$, and $Au_j=0$ otherwise.
- Define $Q$ as the $n\times n$ matrix whose $j$-th row is $u^*_j$. Clearly $Q$ is a unitary matrix, since $QQ^*=I_n$, which in turn means that $Q^*Q=I_n$.
- For $i=1,\ldots ,r$ define $$v_i=\frac{1}{\sigma_i}Au_i$$
- Notice that if $1\leq i,j\leq r$, $$\langle v_i,v_j\rangle =\frac{1}{\sigma_i\sigma_j}\langle Au_i,Au_j\rangle=\frac{1}{\sigma_i\sigma_j}\langle u_i,A^*Au_j\rangle =\frac{\sigma_j}{\sigma_i}\delta_{ij}=\delta_{ij}$$
That is, $\{v_j:1\leq j\leq r\}$ are orthonormal vectors in $\mathbb{C}^m$.
- Complete $\{v_1,\ldots,v_r\}$ with vectors $\{v_{r+1},\ldots,v_m\}$ (if needed) to form an orthonormal basis for $\mathbb{C}^m$. Define $P$ as the $m\times m$ matrix whose $i$-th column is $v_i$. Clearly, $P$ is a unitary matrix, for $P^*P=I_m$ and so $PP^*=I_m$.
- Notice that $D:=P^*AQ^*$ is an $m\times n$ matrix with main diagonal $(\sigma_1,\ldots,\sigma_r,0,\ldots,0)$ and zeros everywhere else, for
$$(P^*AQ^*)_{ij}=v^*_iAu_j=\sigma_jv^*_iv_j=\sigma_j\delta_{ij}$$
for $1\leq j\leq r$, while for $j>r$ we have $Au_j=0$, and so $$(P^*AQ^*)_{ij}=v^*_iAu_j=0=\sigma_j\delta_{ij}$$
- Putting things together, one obtains $A=PDQ$, with the desired decreasing ordering on the main diagonal of $D$.
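The steps above can be sketched numerically (my own illustration, not part of the proof): eigendecompose $A^*A$, sort the eigenvalues decreasingly, build $Q$ from the $u_j^*$, set $v_i=Au_i/\sigma_i$, complete to a basis for $P$, and check $A=PDQ$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 3
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))

# Hermitian eigendecomposition of A*A (eigh returns eigenvalues in
# ascending order, so we reverse to get the decreasing ordering).
w, U = np.linalg.eigh(A.conj().T @ A)
order = np.argsort(w)[::-1]
sig2, U = w[order], U[:, order]          # columns of U are the u_j
sigma = np.sqrt(np.clip(sig2, 0, None))
r = int(np.sum(sigma > 1e-12 * sigma[0]))  # numerical rank

Q = U.conj().T                           # j-th row of Q is u_j*

# v_i = A u_i / sigma_i for i <= r, then complete to an orthonormal
# basis of C^m via QR (keeping the computed v_i as the first columns).
V = (A @ U[:, :r]) / sigma[:r]
P, _ = np.linalg.qr(np.hstack([V, rng.normal(size=(m, m - r))]))
P[:, :r] = V

D = np.zeros((m, n))
D[:r, :r] = np.diag(sigma[:r])

assert np.allclose(P @ D @ Q, A)         # A = P D Q
```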
Some final remarks:
Matrices $Q$ and $P$ in the SVD of $A$, even when the main diagonal of $D$ is ordered decreasingly, are not unique: there is a choice in ordering eigenvectors corresponding to an eigenvalue of multiplicity $>1$, another choice in completing an orthonormal basis to construct $P$, and multiplying eigenvectors by scalars of modulus one also produces different $Q$s and $P$s.
If a particular SVD $P,D,Q$ of $A$ is given, a permutation on the main diagonal of $D$ (interchanging $\sigma_i$ and $\sigma_j$) must be accompanied by interchanging the $i$-th and $j$-th rows of $Q$ and the $i$-th and $j$-th columns of $P$ in order to keep an identity of the form $A=P'D'Q'$.
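A small numerical illustration of this remark (a sketch of mine, using NumPy's SVD as the starting factorization):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 3))
U, s, Vh = np.linalg.svd(A)      # A = U D Vh; here P = U, Q = Vh
D = np.zeros((4, 3))
D[:3, :3] = np.diag(s)

i, j = 0, 2
D[[i, j], :] = D[[j, i], :]      # swap rows i,j of D ...
D[:, [i, j]] = D[:, [j, i]]      # ... and columns i,j: sigma_i <-> sigma_j
U[:, [i, j]] = U[:, [j, i]]      # swap columns i,j of P
Vh[[i, j], :] = Vh[[j, i], :]    # swap rows i,j of Q

assert np.allclose(U @ D @ Vh, A)  # identity A = P' D' Q' is preserved
```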
There are efficient numerical algorithms to find the SVD already implemented in many libraries (LAPACK, for instance) that can be called from Fortran, C, C++, etc. All of them, to my knowledge, produce an $m\times n$ diagonal matrix $D$ whose main diagonal is ordered decreasingly.
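For instance (a quick sketch), `numpy.linalg.svd`, which wraps LAPACK, returns the singular values in decreasing order:

```python
import numpy as np

A = np.random.default_rng(3).normal(size=(5, 3))
U, s, Vh = np.linalg.svd(A)        # full SVD: U is 5x5, Vh is 3x3
assert np.all(s[:-1] >= s[1:])     # sigma_1 >= sigma_2 >= ... >= sigma_n
D = np.zeros((5, 3))
D[:3, :3] = np.diag(s)
assert np.allclose(U @ D @ Vh, A)  # A = P D Q
```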
Edit:
To address the comment of @Ian, define $P^{>}$ as the $m\times r$ matrix whose $j$-th column is $v_j=\frac{1}{\sigma_j}Au_j$ ($1\leq j\leq r$) and $P^0$ as the $m\times (m-r)$ matrix whose columns are the $m-r$ vectors $v_j$ used to complete a basis for $\mathbb{C}^m$, so that $P=[P^>|P^0]$. Let $D^{>}$ be the $r\times r$ diagonal matrix $\operatorname{diag}(\sigma_1,\ldots,\sigma_r)$. Let $Q^>$ be the $r\times n$ matrix whose $i$-th row ($1\leq i\leq r$) is $u^*_i$, and $Q^0$ the $(n-r)\times n$ matrix whose rows are the $u^*_i$, $r< i\leq n$, corresponding to the zero eigenvalues of $A^*A$; that is, $Q=\begin{pmatrix} Q^>\\Q^0\end{pmatrix}$. Then
\begin{align}
A&=PDQ=\begin{pmatrix}P^>&P^0\end{pmatrix}\begin{pmatrix}D^>&0_{r\times(n-r)}\\
0_{(m-r)\times r} &0_{(m-r)\times(n-r)}\end{pmatrix}
\begin{pmatrix} Q^>\\Q^0\end{pmatrix}\\
&=P^>D^>Q^>
\end{align}
Typical reduced ("economy-size") implementations of the SVD algorithm have outputs $U=P^>_{m\times r}$, $\Sigma=D^>_{r\times r}$ and $V=Q^>_{r\times n}$.
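This reduced factorization corresponds, for a full-rank matrix, to the "thin" SVD, e.g. `full_matrices=False` in `numpy.linalg.svd` (a sketch, assuming $r=\min(m,n)$):

```python
import numpy as np

A = np.random.default_rng(4).normal(size=(6, 3))
# Thin SVD: U is 6x3 (= P^>), s has length 3 (diagonal of D^>),
# Vh is 3x3 (= Q^>), assuming A has full column rank.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vh, A)  # A = P^> D^> Q^>
```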