
I am considering a weighted least squares problem with data $X \in \mathbb{R}^{n \times p}$, (diagonal) weight matrix $W \in \mathbb{R}^{n \times n}$ and responses $y \in \mathbb{R}^n$, i.e. finding $$\beta^*= \text{argmin}_{\beta \in \mathbb{R}^p} \Vert W^{1/2} \cdot ( y - X \cdot \beta )\Vert^2_2.$$ It is well known that the solution can be written as $$\beta^* = (X^T W X)^{-1} X^T W y.$$
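For concreteness, here is a minimal NumPy sketch of this closed form (the data is synthetic and only fixes notation; none of the numbers below come from my actual problem):

```python
import numpy as np

# Synthetic stand-ins for X, W (stored as its diagonal w) and y.
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
w = rng.uniform(0.5, 2.0, size=n)  # diagonal of W, all weights positive

# beta* = (X^T W X)^{-1} X^T W y, via solve() rather than an explicit inverse.
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

# Sanity check: rescaling rows by sqrt(w) reduces WLS to ordinary least squares.
beta_ols = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)[0]
assert np.allclose(beta, beta_ols)
```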

I am currently investigating a case where multiple weights in $W$ tend to infinity. I can approximate the solution numerically by setting those weights to a very large positive value $w_\infty \gg 1$.
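Numerically this behaves as expected. Continuing the synthetic sketch above (with the finite weights set to $1$ for simplicity; the index set `idx` of "infinite" rows is hypothetical):

```python
import numpy as np

# Same synthetic data as above; push the weights of two rows toward infinity.
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
idx = np.array([0, 1])  # hypothetical rows whose weights are sent to infinity

w = np.ones(n)
for w_inf in (1e3, 1e6, 1e9):
    w[idx] = w_inf
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    # The residuals on the heavily weighted rows shrink as w_inf grows
    # (empirically like 1/w_inf), i.e. X[idx] @ beta approaches y[idx].
    print(w_inf, np.abs(X[idx] @ beta - y[idx]).max())
```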

As far as I understand, this computes the solution of a constrained weighted least squares problem, provided the residuals on the infinitely weighted rows can be driven to zero. I was wondering: is there a corresponding optimization problem when the residuals on those rows must remain nonzero?
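To make the feasible case explicit (the index set $I$ for the infinitely weighted rows and $F$ for the remaining ones is my notation): I expect the limiting solution to solve $$\min_{\beta \in \mathbb{R}^p} \Vert W_F^{1/2} \cdot ( y_F - X_F \cdot \beta )\Vert^2_2 \quad \text{subject to} \quad X_I \cdot \beta = y_I,$$ provided these constraints are consistent.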

In particular, is there literature on this kind of "two-level" optimization problem, or an alternative characterization of what I am computing with the infinite weights?

Fabi
  • You can get $\gg$ using \gg. And don't use math mode to simulate italics – as you can see, that messes up the spacing – text is rendered in italics when it's enclosed in asterisks. – joriki Jan 05 '24 at 22:43
  • I’ve seen some talks on hierarchical optimization that seem close to what you want. Here’s a reference that I found quickly: https://link.springer.com/chapter/10.1007/978-3-642-46473-7_28 – NicNic8 Jan 06 '24 at 02:57

0 Answers