If the matrix $\bf X$ were fixed and had full row rank, we would have a classical least-squares problem
$$ \hat{{\bf H}} := \arg\min_{{\bf H}} \| {\bf H} {\bf X} - {\bf Y} \|_{\text{F}}^2 = \cdots = {\bf Y} {\bf X}^*\left( {\bf X} {\bf X}^*\right)^{-1}$$
where the closed form follows from setting the gradient of the cost with respect to $\bf H$ to zero, which gives the normal equations $\hat{{\bf H}} {\bf X} {\bf X}^* = {\bf Y} {\bf X}^*$ (see the related link below). The minimal least-squares cost is then
$$ \begin{aligned} \left\| \hat{{\bf H}} {\bf X} - {\bf Y} \right\|_{\text{F}}^2 &= \left\| {\bf Y} {\bf X}^*\left( {\bf X} {\bf X}^*\right)^{-1} {\bf X} - {\bf Y} \right\|_{\text{F}}^2 \\ &= \left\| {\bf Y} \left( {\bf X}^*\left( {\bf X} {\bf X}^*\right)^{-1} {\bf X} - {\bf I} \right) \right\|_{\text{F}}^2 \\ &= \left\| {\bf Y} \left( {\bf I} - {\bf X}^*\left( {\bf X} {\bf X}^*\right)^{-1} {\bf X} \right) \right\|_{\text{F}}^2 \end{aligned} $$
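For concreteness, here is a minimal NumPy sketch (an illustration of mine, not part of the original problem) that checks the closed-form minimizer and the projector form of the minimal cost on random complex data; the dimensions and variable names are assumptions, and ${}^*$ is taken to be the conjugate transpose.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, T = 3, 4, 10  # H is m x n, X is n x T; full row rank needs T >= n
X = rng.standard_normal((n, T)) + 1j * rng.standard_normal((n, T))
Y = rng.standard_normal((m, T)) + 1j * rng.standard_normal((m, T))

Xh = X.conj().T                          # X^* (conjugate transpose)
H_hat = Y @ Xh @ np.linalg.inv(X @ Xh)   # \hat{H} = Y X^* (X X^*)^{-1}

# Minimal cost computed directly and via the projector I - X^*(X X^*)^{-1} X
P = np.eye(T) - Xh @ np.linalg.inv(X @ Xh) @ X
cost_direct = np.linalg.norm(H_hat @ X - Y, 'fro') ** 2
cost_proj = np.linalg.norm(Y @ P, 'fro') ** 2
assert np.isclose(cost_direct, cost_proj)
```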
where ${\bf I} - {\bf X}^*\left( {\bf X} {\bf X}^*\right)^{-1} {\bf X}$ is the orthogonal projector onto the null space of $\bf X$. Since we have $N$ matrices, all assumed to have full row rank, we fix ${\bf X} = {\bf X}_i$, solve the corresponding least-squares problem, and record the corresponding minimal cost. At the end, we find the index $i$ for which this cost is smallest:
$$ \hat{i} := \color{blue}{\arg\min_{i \in [N]} \left\| {\bf Y} \left( {\bf I} - {\bf X}_i^*\left( {\bf X}_i {\bf X}_i^*\right)^{-1} {\bf X}_i \right) \right\|_{\text{F}}^2} $$
where $[N] := \{ 1, 2, \dots, N \}$.
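The whole selection rule fits in a few lines. A minimal NumPy sketch, assuming the candidate matrices are supplied as a Python list `Xs` (a name chosen here for illustration), each with full row rank; `np.linalg.solve` replaces the explicit inverse for numerical stability:

```python
import numpy as np

def best_index(Y, Xs):
    """Return the (zero-based) index i minimizing
    || Y (I - X_i^* (X_i X_i^*)^{-1} X_i) ||_F^2."""
    costs = []
    for X in Xs:
        Xh = X.conj().T
        # Residual Y - Y X^* (X X^*)^{-1} X, without forming the inverse
        R = Y - (Y @ Xh) @ np.linalg.solve(X @ Xh, X)
        costs.append(np.linalg.norm(R, 'fro') ** 2)
    return int(np.argmin(costs))
```

Calling `best_index(Y, [X1, ..., XN])` then returns $\hat{i}$ (zero-based, so add $1$ to match $[N]$).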
Related: Gradient of squared Frobenius norm of a matrix