
Suppose we have a set of linearly independent vectors $S = \{v_1, v_2, \cdots, v_n\}$. Then we can find some $\epsilon > 0$ such that $T = \{v_1+\epsilon, v_2+\epsilon, \cdots, v_n+\epsilon\}$ is still linearly independent.

Note that by $v + \epsilon$ we mean adding $\epsilon$ to each coordinate of $v$.

This result is probably known as the stability of linearly independent vectors, but I am not sure. Can someone provide a reference that includes a proof?
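For a quick numerical sanity check (the vectors below are an arbitrary example, not tied to any particular reference), here is a minimal NumPy sketch comparing matrix ranks before and after the shift:

    import numpy as np

    # Columns are v_1, v_2, v_3 in R^4; they are linearly independent.
    V = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 3.0],
                  [1.0, 1.0, 0.0]])

    eps = 1e-3
    T = V + eps * np.ones_like(V)      # add eps to every coordinate of every column

    print(np.linalg.matrix_rank(V))    # 3 -> the v_i are independent
    print(np.linalg.matrix_rank(T))    # 3 -> the shifted vectors are still independent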

User8976
  • Yes, if they are linearly independent, the matrix formed by those vectors as columns has a nonzero determinant; the determinant map is continuous, so it is nonzero for all matrices in a neighborhood of the original one; thus taking $\epsilon$ sufficiently small will still give an invertible matrix and hence a linearly independent set (see the sketch after these comments). – User8128 Sep 13 '20 at 03:46
  • @User8128 Is there any similar result for orthogonal stability? – User8976 Sep 13 '20 at 03:59
  • Close but not quite: this may not be a square matrix. – markvs Sep 13 '20 at 03:59
  • Orthogonal stability does not exist: take the basis $\{1\}$ of the 1-dimensional vector space. – markvs Sep 13 '20 at 04:04
  • @ShiveringSoldier Can you please provide the link? – User8976 Sep 13 '20 at 04:41
  • @JCAA Can you explain a bit more... it is not clear to me. – User8976 Sep 13 '20 at 04:42
  • https://math.stackexchange.com/questions/1685682/regarding-linear-independence-on-a-normed-linear-space-given-a-condition – cqfd Sep 13 '20 at 04:52
  • @User8128 To talk about the determinant requires two things which are not assumed here: first, that the dimension is finite; second, that the number $n$ of vectors is equal to the dimension. – TheSilverDoe Sep 13 '20 at 09:58
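Following the determinant argument in the first comment (and keeping in mind the caveat from the later comments that it only applies when $n$ equals the dimension), here is a minimal sketch in the square case; the matrix is an arbitrary example:

    import numpy as np

    # det is continuous: if det(V) != 0, the determinant stays nonzero for all
    # sufficiently small perturbations of V, so the shifted columns stay independent.
    V = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 3.0]])      # columns v_1, v_2, v_3, det(V) = 3

    for eps in [1e-1, 1e-3, 1e-6]:
        T = V + eps * np.ones_like(V)    # add eps to every coordinate
        print(eps, np.linalg.det(T))     # stays close to 3, hence nonzero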

1 Answer


Let us denote by $u$ the vector all of whose coordinates are $1$. We will prove that if $(v_1, ..., v_n)$ are linearly independent, then there is at most one value of $\varepsilon$ such that $(v_1 + \varepsilon u, ..., v_n + \varepsilon u)$ is not linearly independent. Of course, this is a stronger result than the one you want to prove.

Suppose that there exist $\varepsilon \neq \varepsilon'$ such that both families $(v_1 + \varepsilon u, ..., v_n + \varepsilon u)$ and $(v_1 + \varepsilon' u, ..., v_n + \varepsilon' u)$ are not linearly independent. Then there exist $(\lambda_1, ...,\lambda_n) \neq (0, ..., 0)$ and $(\mu_1, ..., \mu_n) \neq (0, ..., 0)$ such that $$\lambda_1(v_1 + \varepsilon u) + ... + \lambda_n(v_n + \varepsilon u) = 0$$ $$\mu_1(v_1 + \varepsilon' u) + ... + \mu_n(v_n + \varepsilon' u) = 0$$

i.e. $$\lambda_1v_1 + ... + \lambda_nv_n= -(\lambda_1 + ... + \lambda_n)\varepsilon u $$ $$\mu_1v_1 + ... + \mu_nv_n= -(\mu_1 + ... + \mu_n)\varepsilon'u $$

Notice that $\lambda_1 + ... + \lambda_n \neq 0$ and $\varepsilon \neq 0$, because otherwise the first equation would give a nontrivial linear relation among the $(v_1, ..., v_n)$, which are linearly independent. You then deduce that $$\mu_1v_1 + ... + \mu_nv_n = \frac{(\mu_1 + ... + \mu_n)\varepsilon'}{(\lambda_1 + ... + \lambda_n)\varepsilon}\left(\lambda_1v_1 + ... + \lambda_nv_n \right)$$

Because the $(v_1, ..., v_n)$ are linearly independent, you deduce that for all $k=1, ..., n$, $$\mu_k = \frac{(\mu_1 + ... + \mu_n)\varepsilon'}{(\lambda_1 + ... + \lambda_n)\varepsilon} \lambda_k$$

and summing for $k=1, ..., n$, you get $$\mu_1 + ... + \mu_n = \frac{(\mu_1 + ... + \mu_n)\varepsilon'}{\varepsilon}$$

But $\mu_1 + ... + \mu_n$ cannot be $0$, because otherwise the second equation would give a nontrivial linear relation among the linearly independent $(v_1, ..., v_n)$. So you deduce $\varepsilon = \varepsilon'$, which is a contradiction.
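A quick symbolic check of this "at most one bad $\varepsilon$" claim in the square case, with an arbitrary example and SymPy (not part of the argument above):

    import sympy as sp

    eps = sp.symbols('epsilon')
    # Columns of V are linearly independent vectors v_1, v_2, v_3.
    V = sp.Matrix([[1, 0, 2],
                   [0, 1, 1],
                   [0, 0, 3]])

    # Columns of M are v_i + epsilon * u, where u = (1, 1, 1).
    M = V + eps * sp.ones(3, 3)
    print(sp.solve(sp.det(M), eps))    # [-3/4]: a single value of epsilon gives dependence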

TheSilverDoe