I am considering a weighted least squares problem with data $X \in \mathbb{R}^{n \times p}$, a (diagonal) weight matrix $W \in \mathbb{R}^{n \times n}$, and responses $y \in \mathbb{R}^n$, i.e. finding $$\beta^*= \text{argmin}_{\beta \in \mathbb{R}^p} \Vert W^{1/2} \cdot ( y - X \cdot \beta )\Vert^2_2.$$ It is well known that the solution can be written as $$\beta^* = (X^T W X)^{-1} X^T W y.$$
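For reference, here is a minimal NumPy sketch of how I compute this solution (the names `X`, `w`, `y` match the notation above, with `w` the diagonal of $W$; I solve the normal equations via `np.linalg.solve` rather than forming the inverse explicitly):

```python
import numpy as np

def wls(X, w, y):
    """Weighted least squares: beta = (X^T W X)^{-1} X^T W y
    with W = diag(w), solved without forming the inverse."""
    XtW = X.T * w                      # same as X.T @ np.diag(w)
    return np.linalg.solve(XtW @ X, XtW @ y)
```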
I am currently investigating a case where multiple weights in $W$ tend to infinity. Numerically, I approximate the solution by assigning a very large positive value $w_\infty \gg 1$ to those weights.
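Concretely, the large-weight approximation looks like this (reusing the `wls` helper from above; the value `w_inf = 1e8`, the index set `idx_inf`, and the random data are arbitrary choices for illustration):

```python
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

w = np.ones(n)
idx_inf = [0, 1]     # observations whose weights tend to infinity
w_inf = 1e8          # large but finite surrogate for the infinite weights
w[idx_inf] = w_inf

beta_approx = wls(X, w, y)
# The residuals of the heavily weighted rows are driven (almost) to zero:
print(y[idx_inf] - X[idx_inf] @ beta_approx)
```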
As far as I understand, this solves an equality-constrained weighted least squares problem, provided the residuals of the infinitely weighted observations can be driven exactly to zero. I was wondering whether there is a corresponding optimization problem when the residuals of those observations cannot be made zero, i.e. remain strictly positive?
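To make my understanding explicit: writing $x_i^T$ for the $i$-th row of $X$ and $I_\infty$ for the index set of the infinite weights, I believe that (when the constraints are feasible) the limit solves the equality-constrained problem $$\min_{\beta \in \mathbb{R}^p} \sum_{i \notin I_\infty} w_i \, (y_i - x_i^T \beta)^2 \quad \text{subject to} \quad x_i^T \beta = y_i \quad \text{for all } i \in I_\infty.$$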
In particular, is there literature on this kind of "two-level" optimization problem, or an alternative characterization of what I am computing with the infinite weights?