
I'm learning linear algebra and I have some trouble understanding the proofs about linear independence and dependence of functions.

Consider this problem, which states:

Question: Prove that $e^x,xe^x, $ and $ x^2e^x$ are linearly independent over $\mathbb{R}$.

Almost all the answers (like this one) use some type of technique that consists of evaluating at values of $x$ to determine the scalars of the linear combination.

I want to know the name of this technique and why it works.

My thinking about why it works was:

If I want to prove that $f,g$ are linearly independent, then I want to determine the scalar solutions of the linear combination $\alpha f(x)+\beta g(x)=0$ for all $x$.

And the key phrase here is *for all $x$*: if the equation holds for all $x$, then it holds for every particular $x_i$, so I can run the process in reverse. That is, I can evaluate at specific values $x_i$ chosen to "force" the scalars to be zero, since if the equation failed for some particular $x_i$, it could not hold for all $x$.

In the case above, I have that:

$e^x(\alpha+\beta x + \gamma x^2) = 0\implies \alpha+\beta x + \gamma x^2 = 0$ (since $e^x \neq 0$ for all $x$),

and the particular value $x = 0$ "forces" $\alpha = 0$, and similarly other values force the remaining scalars.
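For completeness, the remaining scalars can be forced by two more evaluations. With $\alpha = 0$ already established:

$x = 1:\ \alpha + \beta + \gamma = 0 \implies \beta + \gamma = 0$

$x = -1:\ \alpha - \beta + \gamma = 0 \implies -\beta + \gamma = 0$

Adding the two equations gives $2\gamma = 0$, so $\gamma = 0$, and then $\beta = 0$.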

Is my idea about this correct? And what is the name of this "technique"?

ESCM

1 Answer


I don't know if there's a name for it, other than proof by contradiction.

For two functions, linear dependence requires that one be a nonzero scalar multiple of the other; for three or more, it requires that some nontrivial linear combination vanish identically. That's it! So, finding a contradiction only requires the proposed relation to fail at a single value, in the case of functions $\mathbb{R}\rightarrow \mathbb{R}$.
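As a sanity check on the evaluation technique (a sketch, not part of the original argument): each sample point gives one linear equation in $\alpha, \beta, \gamma$, and if the resulting coefficient matrix has a nonzero determinant, the only solution is the trivial one. Using the sample points $x = 0, 1, -1$:

```python
from fractions import Fraction

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Rows: evaluate the coefficient polynomial alpha + beta*x + gamma*x^2
# at x = 0, 1, -1 (the factor e^x is never zero, so it cancels out).
points = [0, 1, -1]
M = [[Fraction(1), Fraction(x), Fraction(x * x)] for x in points]

# A nonzero determinant means alpha = beta = gamma = 0 is the only solution,
# i.e. the three functions are linearly independent.
print(det3(M))  # → 2 (nonzero)
```

This matrix is a Vandermonde matrix, whose determinant is nonzero whenever the sample points are distinct, which is why evaluating at enough distinct points pins down all the scalars.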