Recently, the triangle inequality for Euclidean space was generalized in probabilistic terms. Let $(a,b,c)$ be the sides of a randomly chosen triangle with $a \le b \le c$. For $x \ge 1$, we have the exponential form
$$ P\left(a^x + b^x \ge c^x\right) = \frac{1}{x^2}. \tag 1 $$
Similarly, and without any conditions, we have the linear form $$ P(ax + by \ge c) = \frac{4}{\pi^2}\chi_2(x) + \frac{4}{\pi^2}\chi_2(y) \tag 2 $$ where $\chi_2$ is the Legendre chi function. Both of the above results were conjectured on MSE and later proved on MO. These results give a probabilistic interpretation of the triangle inequality.
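For readers who want to experiment with $(1)$ numerically, here is a minimal Monte Carlo sketch. It assumes (my assumption, since the sampling model is not restated here) that a random triangle is generated by drawing its angles uniformly on the simplex $\alpha + \beta + \gamma = \pi$ and taking side lengths proportional to the sines of the angles; under that model the empirical frequencies can be compared against $1/x^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_triangle_sides(n, rng):
    """Sample n random triangles by drawing their angles uniformly on the
    simplex alpha + beta + gamma = pi, then taking sides proportional to the
    sines of the angles (law of sines).  This sampling model is an assumption
    about what "random triangle" means in equation (1)."""
    u = rng.random((n, 2))
    flip = u.sum(axis=1) > 1.0
    u[flip] = 1.0 - u[flip]                  # fold back into the valid region
    alpha, beta = np.pi * u[:, 0], np.pi * u[:, 1]
    gamma = np.pi - alpha - beta
    sides = np.sin(np.stack([alpha, beta, gamma], axis=1))
    return np.sort(sides, axis=1)            # columns ordered so that a <= b <= c

def empirical_probability(x, n=200_000):
    a, b, c = random_triangle_sides(n, rng).T
    return np.mean(a**x + b**x >= c**x)

for x in (1.0, 2.0, 3.0, 5.0):
    print(f"x = {x}: empirical P = {empirical_probability(x):.4f}, 1/x^2 = {1 / x**2:.4f}")
```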
The triangle inequality, simple as it may seem, has several applications in science and engineering. For example, a few not-so-well-known but important applications in my field of machine learning are:
- Clustering: The $k$-means algorithm, one of the most popular clustering algorithms in machine learning, uses the triangle inequality to skip unnecessary distance computations and thereby accelerate the assignment step (see the sketch after this list).
- Image recognition: Reducing violations of the triangle inequality helps improve image recognition using neural networks.
- Deep learning architectures: There are deep learning architectures that are specifically built to satisfy the triangle inequality.
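As an illustration of the first bullet, here is a simplified sketch of the triangle-inequality pruning used in Elkan-style accelerated $k$-means: if the distance between the current best center and another center is at least twice the distance from the point to its current best center, the triangle inequality guarantees that the other center cannot be closer, so that distance never has to be computed. This is only the pruning idea, not the full algorithm, and the function name and toy data are illustrative.

```python
import numpy as np

def assign_with_triangle_pruning(X, centers):
    """Assign each point to its nearest center, skipping distance computations
    that the triangle inequality proves unnecessary (the pruning idea behind
    Elkan-style accelerated k-means; a simplified sketch, not the full method)."""
    n, k = X.shape[0], centers.shape[0]
    # Pairwise center-to-center distances, computed once per assignment pass.
    cc = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    labels = np.zeros(n, dtype=int)
    skipped = 0
    for i, x in enumerate(X):
        best, d_best = 0, np.linalg.norm(x - centers[0])
        for j in range(1, k):
            # If d(c_best, c_j) >= 2 * d(x, c_best), then by the triangle
            # inequality d(x, c_j) >= d(x, c_best): c_j cannot win, skip it.
            if cc[best, j] >= 2.0 * d_best:
                skipped += 1
                continue
            d_j = np.linalg.norm(x - centers[j])
            if d_j < d_best:
                best, d_best = j, d_j
        labels[i] = best
    print(f"skipped {skipped} of {n * (k - 1)} candidate distance evaluations")
    return labels

# Toy usage with synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2))
centers = rng.normal(size=(20, 2)) * 5.0
labels = assign_with_triangle_pruning(X, centers)
```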
Since machine learning is largely built on probability and linear algebra, and since there are several places where the ordinary triangle inequality is used, I am curious whether the probabilistic interpretations above could find applications in this field. For example, instead of architectures or algorithms that are constrained to satisfy the triangle inequality exactly, perhaps we could search a solution space that satisfies it with, say, 95% probability, using far less computation, i.e. an accuracy-cost trade-off (a rough version of such a check is sketched below). I would like to ask:
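To make the proposed trade-off concrete, the sketch below estimates, by sampling random triples of points, how often a given (possibly non-metric) dissimilarity matrix satisfies the triangle inequality; one could then accept a learned dissimilarity whenever this rate exceeds a target such as 95%. The function name, the noise model, and the tolerance are my own illustrative choices, not part of any published method.

```python
import numpy as np

def triangle_satisfaction_rate(D, n_triples=100_000, rng=None):
    """Estimate P(D[i, k] <= D[i, j] + D[j, k]) over uniformly random index
    triples (i, j, k), where D is a square dissimilarity matrix."""
    rng = np.random.default_rng() if rng is None else rng
    n = D.shape[0]
    i, j, k = rng.integers(0, n, size=(3, n_triples))
    # Small tolerance so an exact metric is not penalized by round-off.
    return float(np.mean(D[i, k] <= D[i, j] + D[j, k] + 1e-12))

# Toy usage: perturb a genuine Euclidean distance matrix so that it is only
# approximately a metric, then measure how often the inequality still holds.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
noise = rng.uniform(0.8, 1.2, size=D.shape)
D_noisy = D * (noise + noise.T) / 2.0        # hypothetical "learned" dissimilarity
print(f"exact metric    : {triangle_satisfaction_rate(D, rng=rng):.3f}")
print(f"perturbed metric: {triangle_satisfaction_rate(D_noisy, rng=rng):.3f}")
```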
Question: What are some applications of the triangle inequality in the field of classical and quantum machine learning?