
Prove that the union of three subspaces of V is a subspace iff one of the subspaces contains the other two.

I can do this problem when I am working with only two subspaces of $V$, but I don't know how to do it with three.

What I tried: if one of the subspaces contains the other two, then their union is obviously a subspace, because it is equal to the containing subspace. (Is this sufficient?)

If the union of three subspaces is a subspace, how do I prove from there that one of the subspaces must contain the other two?

When proving this for two subspaces, I took an element of one subspace that is not in the other and argued by contradiction that one of the subspaces must be contained in the other. How would I do this for three?


7 Answers


The statement is false. Consider the following counterexample:

Consider the vector space $V=(\mathbb{Z}/2\mathbb{Z})^{2}$ over $F=\mathbb{Z}/2\mathbb{Z}$. Let $V_{1}$ be spanned by $(1,0)$, let $V_{2}$ be spanned by $(0,1)$, and let $V_{3}$ be spanned by $(1,1)$. Then $V=V_{1}\cup V_{2}\cup V_{3}$, but none of $V_{1},V_{2},V_{3}$ is a subspace of another.

You can usually count on fields of characteristic $2$ to give you counterexamples, and there are many similar counterexamples. In finite dimensions, I think all counterexamples can be constructed this way. My intuition tells me that there are infinite-dimensional counterexamples of other forms, but I have not checked carefully.
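A quick brute-force check of this counterexample (the code and the helper name `is_subspace` are mine, not part of the answer):

```python
from itertools import product

q = 2  # working over F = Z/2Z
V = set(product(range(q), repeat=2))    # all vectors of (Z/2Z)^2

# The three one-dimensional subspaces of the counterexample
V1 = {(0, 0), (1, 0)}   # span of (1,0)
V2 = {(0, 0), (0, 1)}   # span of (0,1)
V3 = {(0, 0), (1, 1)}   # span of (1,1)

def is_subspace(S):
    """Check closure of S under vector addition and scalar multiplication mod q."""
    add = all(tuple((a + b) % q for a, b in zip(u, w)) in S
              for u in S for w in S)
    scal = all(tuple((c * a) % q for a in u) in S
               for u in S for c in range(q))
    return (0, 0) in S and add and scal

union = V1 | V2 | V3
assert union == V and is_subspace(union)      # the union is all of V, hence a subspace
assert not any(A <= B for A in (V1, V2, V3)   # yet no V_i is contained in another
               for B in (V1, V2, V3) if A is not B)
print("counterexample verified")
```

The four vectors of $V$ are exactly the union of the three lines, so no larger search is needed.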

EDIT. Here is a proof of the statement with the restriction $F\not=\mathbb{Z}/2\mathbb{Z}$:

Without loss of generality, we can assume the whole space $V$ is in fact $V_{1}+V_{2}+V_{3}$. Since the union is a subspace containing each $V_{i}$, it is closed under addition and hence contains $V_{1}+V_{2}+V_{3}$; therefore we must in fact have $V=V_{1}\cup V_{2}\cup V_{3}$.

There exist $a,b\in F$ such that $a,b\not=0$ and $a-b=1$ (take $a$ to be anything except $0,1$, and take $b=a-1$).

Assume that neither of $V_{1}$ and $V_{2}$ contains the other (otherwise this reduces to the 2-subspace case). For any $u\in V_{1}\setminus(V_{1}\cap V_{2})$, take an arbitrary $w\in V_{2}\setminus(V_{1}\cap V_{2})$ (it exists because neither of $V_{1},V_{2}$ contains the other). Then $au+w$ is in neither $V_{1}$ nor $V_{2}$: if it were in $V_{1}$, then since $au\in V_{1}$ we would have $w\in V_{1}$, so $w\in V_{1}\cap V_{2}$, a contradiction; the other case is the same, now using the fact that $a\not=0$. So $au+w\in V_{3}$. The same argument shows $bu+w\in V_{3}$. Hence $u=(au+w)-(bu+w)\in V_{3}$ (recall $a-b=1$), and therefore $V_{1}\setminus(V_{1}\cap V_{2})\subset V_{3}$. The same argument shows $V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$.

Now for any $v\in V_{1}\cap V_{2}$, pick a $w\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$. Then $w+v\notin V_{1}\cap V_{2}$ (otherwise $w\in V_{1}\cap V_{2}$). But $w+v\in V_{2}$, hence $w+v\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$. Thus $v=(w+v)-w\in V_{3}$. Hence $V_{1}\cap V_{2}\subset V_{3}$. Therefore $V_{1},V_{2}\subset V_{3}$.

    Why can we assume that the whole space $V$ is in fact $V_{1}+V_{2}+V_{3}$? – ubadub Mar 04 '19 at 01:51
  • Why does $V_1\cap V_2\subset V_3$ imply $V_1\subset V_3$ and $V_2\subset V_3$? – perimasu Mar 09 '21 at 19:28
  • Gina showed that $(V_{1} \setminus (V_{1} \cap V_{2})) \subset V_{3}$ and $(V_{1} \cap V_{2}) \subset V_{3}$. Since $V_1 = (V_{1} \setminus (V_{1} \cap V_{2})) \cup (V_{1} \cap V_{2})$, then $V_{1} \subset V_{3}$. Because the argument for $V_{2}$ is essentially identical, they didn't bother showing that part. – GhostyOcean Dec 31 '22 at 15:59

Gina's answer is great, but I think we can clean it up a bit.

Let $U_1,U_2,U_3$ be subspaces of $V$ over a field $k\neq \mathbb{F}_2$.

$(\Leftarrow)$ Suppose that one of the subspaces contains the other two. Without loss of generality, assume $U_1\subset U_3$ and $U_2\subset U_3$. Then $U_1\cup U_2\cup U_3 = U_3$, and so $U_1\cup U_2\cup U_3$ is indeed a subspace of $V$.

$(\Rightarrow)$ Now suppose $U_1\cup U_2\cup U_3$ is a subspace. If $U_2$ contains $U_3$ (or conversely), let $W = U_2 \cup U_3$. Then applying the case of the union of two subspaces (you need to prove this case first) to the union $U_1\cup W$, we have that either $U_1$ contains $W$ or $W$ contains $U_1$, showing that one of the three subspaces contains the other two, as desired. So assume $U_2$ and $U_3$ are such that neither contains the other. Let \begin{equation*} x\in U_2\setminus U_3 ~~~ \text{and} ~~~ y\in U_3\setminus U_2, \end{equation*} and choose nonzero $a,b\in k$ such that $a-b = 1$ (such $a,b$ exist since we assume $k$ is not $\mathbb{F}_2$).

We claim that $ax + y$ and $bx + y$ are both in $U_1$. To see that $ax + y\in U_1$, suppose not. Then either $ax + y\in U_2$ or $ax + y\in U_3$. If $ax + y\in U_2$, then we have $(ax + y) - ax = y\in U_2$, a contradiction. And if $ax +y \in U_3$, we have $(ax + y) - y = ax \in U_3$, another contradiction, and so $ax+y\in U_1$. Similarly for $bx + y$, suppose $bx + y\in U_2$. Then $(bx + y) - bx = y \in U_2$, a contradiction. And if $bx + y\in U_3$, then $(bx + y) - y = bx \in U_3$, also a contradiction. Thus $bx + y\in U_1$ as well. Therefore \begin{equation*} (ax + y) - (bx + y) = (a-b)x = x \in U_1. \end{equation*} Now, since $x\in U_2\setminus U_3$ implies $x \in U_1$, we have $U_2\setminus U_3\subset U_1$. A similar argument shows that $x + ay$ and $x + by$ must be in $U_1$ as well, and hence \begin{equation*} (x + ay) - (x + by) = (a - b)y = y \in U_1, \end{equation*} and therefore $U_3\setminus U_2\subset U_1$. If $U_2\cap U_3=\{0\}$, we're done, so assume otherwise.

Now for any $u\in U_2\cap U_3$, choose $v \in U_3\setminus U_2\subset U_1$. Then $u+v\not\in U_2\cap U_3$, for otherwise $(u+v)-u=v\in U_2$, a contradiction. But this implies $u+v$ must be in $U_1$, and hence so is $(u+v) - v = u$. In other words, if $u\in U_2\cap U_3$, then $u\in U_1$, and hence $U_2\cap U_3\subset U_1$, as was to be shown. $\tag*{$\square$}$

This problem appears in the first chapter of Linear Algebra Done Right, by Axler. I personally think it's pretty challenging for so early in an introductory linear algebra book, but it's a great exercise. Lots of details to keep straight.
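Since $\mathbb{F}_3$ is the smallest field the theorem applies to, the statement can also be checked exhaustively there. The sketch below is my own code, not part of the exercise; it lists every subspace of $\mathbb{F}_3^2$ ($\{0\}$, the four lines, and the whole plane) and confirms the equivalence for every triple:

```python
from itertools import product, combinations

q = 3  # F_3, the smallest field other than F_2
V = frozenset(product(range(q), repeat=2))

def line(v):
    """The 1-dimensional subspace of F_3^2 spanned by v."""
    return frozenset(tuple((c * a) % q for a in v) for c in range(q))

# Every subspace of F_3^2: the zero subspace, the four lines, the whole plane
subspaces = [frozenset({(0, 0)})] + \
            sorted({line(v) for v in V if v != (0, 0)}, key=sorted) + [V]

def is_subspace(S):
    # Over a prime field, a finite set containing 0 that is closed under
    # addition is automatically closed under scalar multiplication as well.
    return all(tuple((a + b) % q for a, b in zip(u, w)) in S
               for u in S for w in S)

for U1, U2, U3 in combinations(subspaces, 3):
    union_ok = is_subspace(U1 | U2 | U3)
    one_contains = any(A >= B and A >= C for A, B, C in
                       [(U1, U2, U3), (U2, U1, U3), (U3, U1, U2)])
    assert union_ok == one_contains   # the two conditions agree on every triple
print("all triples over F_3 check out")
```

Of course this only tests one small field, but it is a useful sanity check that the $\mathbb{F}_2$ restriction is the only obstruction in the smallest nontrivial case.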


Gina gave an excellent answer. In fact, we can say more: if $V$ is a vector space over the field $F$ and $\{U_1,U_2,U_3,\cdots,U_n\}$ is a finite collection of subspaces of $V$, where $n$ is at most the cardinality of $F$ when $F$ is finite (no restriction on $n$ when $F$ is infinite), then the union $U_1\cup U_2\cup\cdots\cup U_n$ is a subspace of $V$ if and only if one of the subspaces $U_1,U_2,\cdots,U_n$ contains all the others. The proof is similar to the way one proves that "a vector space over an infinite field cannot be a finite union of proper subspaces of its own", and uses proof by contradiction, with the pigeonhole principle to derive the absurdity (imagine the elements of $F$ "flying" into the subspaces $U_1,U_2,\cdots,U_n$).

    Readers who are interested in this may also be interested in the proof of the 'prime avoidance lemma'. – W.Leywon Feb 18 '17 at 14:58
  • I can prove it for the case where $|F| > n$, but is it also true for the case where $|F| = n$? Because in this case, I don't think we have enough elements in $F^{*}$ to apply the pigeonhole principle to gather the fact that some $U_j$ not equal to $U_{1}$ contains at least two elements of the form $x + \alpha y$, where $x \in U_1$ and $y \in \bigcup_{i = 1}^n U_i \setminus U_1$, and $\alpha \in F^{*}$. – jsmith Aug 07 '22 at 01:38
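To see why some bound on $n$ relative to $|F|$ is needed at all, here is a small check (my own code, not from the answer): over $\mathbb{F}_3$, taking $n = 4$ subspaces already breaks the statement, since the four lines of $\mathbb{F}_3^2$ cover the whole plane while none contains another.

```python
from itertools import product

q = 3
V = set(product(range(q), repeat=2))

# The four 1-dimensional subspaces (lines) of F_3^2
lines = [{tuple((c * a) % q for a in v) for c in range(q)}
         for v in [(1, 0), (0, 1), (1, 1), (1, 2)]]

union = set().union(*lines)
assert union == V    # the union is the whole plane, certainly a subspace
assert not any(A <= B for A in lines for B in lines if A is not B)
print("n = 4 > |F| = 3: a union of 4 proper subspaces is a subspace")
```

This is the analogue over $\mathbb{F}_3$ of Gina's $\mathbb{F}_2$ counterexample with three lines.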

With regard to Gina and Jeff's answers, I believe the simplification at the start can be made easier (at least in my opinion).

Suppose that subspaces $U_1,U_2,U_3$ union to form a subspace. We have 2 cases.

Case 1: Suppose $U_i\subseteq \cup_{j\neq i}U_j$ for some $i\in \{1,2,3\}$. WLOG, let $i=1$. Then the problem reduces to the case of the two subspaces $U_2,U_3$: by the two-subspace result, WLOG $U_2\subseteq U_3$, and then $U_1\subseteq U_2 \cup U_3 = U_3$. Hence $U_3$ contains the other two subspaces.

Case 2: $\forall i\in \{1,2,3\},\ U_i\not\subseteq \cup_{j\neq i}U_j$. Then for each $i$, there exists $\mathbf{u}_i\in U_i$ such that $\mathbf{u}_i\notin \cup_{j\neq i}U_j$, i.e. $\mathbf{u}_i\notin U_j$ for $j\neq i$.

After this, we can make a similar argument to that of Gina's. For simplicity, I assume the field in concern to be the real/complex field like in Axler's book.

$\mathbf{u}_1 + \mathbf{u}_2,\ 2\mathbf{u}_1 + \mathbf{u}_2,\ \mathbf{u}_1 + 2\mathbf{u}_2$ must all be in $U_3\setminus (U_1 \cup U_2)$. (Note that all these vectors are in $\cup_{1\leq i\leq 3} U_i$, and if any of them were in $U_1$ or $U_2$, subtracting the appropriate multiple of $\mathbf{u}_1$ or $\mathbf{u}_2$ would force the other $\mathbf{u}_j$ into that subspace.) In particular, these vectors lie in $U_3$. Hence $\mathbf{u}_1 = (2\mathbf{u}_1+\mathbf{u}_2)-(\mathbf{u}_1+\mathbf{u}_2) \in U_3$ and $\mathbf{u}_2 = (\mathbf{u}_1+2\mathbf{u}_2)-(\mathbf{u}_1+\mathbf{u}_2) \in U_3$, a contradiction.

Hence only case 1 is possible and we are done.


This is similar to the above solutions but I'm writing it here for clarity:

$\Rightarrow$ Denote the subspaces by $W_1, W_2, W_3$. Assume for contradiction that no $W_i$ is contained in the union of the other two. Take $x_1 \in W_1\setminus(W_2 \cup W_3)$ and similarly $x_2$ and $x_3$.

Assuming $\mathbb{F} \ne \mathbb{F}_2$, take $\lambda \in \mathbb{F}$ such that $\lambda \ne 0, 1$. Then consider the set $$\{\lambda{x_1}+x_2,\ x_1+\lambda{x_2},\ x_1+x_2,\ x_1\}.$$ All four elements lie in the union, so by Pigeonhole, at least two of them are in the same subspace. If two are in $W_1$, it is easily deduced that $x_2 \in W_1$, which gives a contradiction. If two are in $W_2$, then since the last element cannot be in $W_2$, one of the first two elements is in $W_2$, which with subtraction again gives $x_1 \in W_2$, a contradiction. If two elements are in $W_3$ (the last element cannot be in $W_3$), we get either $$\lambda{x_1}+x_2-\lambda(x_1+\lambda{x_2})=(1-\lambda^2)x_2 \in W_3 \implies x_2 \in W_3$$ or $$\lambda{x_1}+x_2 - \lambda(x_1+x_2)=(1-\lambda)x_2 \in W_3 \implies x_2 \in W_3,$$ with the final case symmetric. (The first of these also needs $\lambda^2 \ne 1$. If $\mathbb{F}=\mathbb{F}_3$ the only choice is $\lambda=-1$, and the pair $\{x_2-x_1,\ x_1-x_2\}$ lying in $W_3$ gives no contradiction directly; in that case note that $x_1+x_2$ lies in the union but, by the arguments above, in neither $W_1$ nor $W_2$, so $x_1+x_2\in W_3$, and then $(x_1+x_2)+(x_2-x_1)=2x_2\in W_3$ forces $x_2\in W_3$, a contradiction.) In all cases we get a contradiction, so in fact one of the subspaces is contained in the union of the other two, and the two-subspace case then finishes the proof.

$\Leftarrow$ If one subspace contains the other two, then WLOG $W_1\cup{W_2}\cup{W_3}=W_1$, which is a subspace. $\blacksquare$


I am reading "Linear Algebra Done Right Fourth Edition" by Sheldon Axler.

This problem is the same problem as Exercise 1C.13 in this book.

1C.13
Prove that the union of three subspaces of $V$ is a subspace of $V$ if and only if one of the subspaces contains the other two.
This exercise is surprisingly harder than Exercise 12, possibly because this exercise is not true if we replace $\mathbb{F}$ with a field containing only two elements.

1C.12
Prove that the union of two subspaces of $V$ is a subspace of $V$ if and only if one of the subspaces is contained in the other.

My solution:

Let $\mathbb{F}$ be a field such that $1+1\neq 0$.
Let $V$ be a vector space over $\mathbb{F}$.
Let $U_1,U_2,U_3$ be subspaces of $V$.
If one of $U_1,U_2,U_3$ contains the other two, then obviously $U_1\cup U_2\cup U_3$ is a subspace of $V$.
We prove that if $U_1\cup U_2\cup U_3$ is a subspace of $V$, then one of $U_1,U_2,U_3$ contains the other two.
Suppose that $U_1\cup U_2\cup U_3$ is a subspace of $V$.
First, we prove by contradiction that $U_i\subset U_j$ for some $i,j\in\{1,2,3\}$ such that $i\neq j$.
Assume that $U_i\not\subset U_j$ for any $i,j\in\{1,2,3\}$ such that $i\neq j$.
Suppose that $\{i,j,k\}=\{1,2,3\}$.
Then, by our assumption, $U_i\not\subset U_j$ and $U_j\not\subset U_i$.
So, $U_i\setminus U_j\neq\emptyset$ and $U_j\setminus U_i\neq\emptyset$.
Let $a\in U_i\setminus U_j$ and $b\in U_j\setminus U_i$.
Then, $a+b\in U_i\cup U_j\cup U_k$ since $a\in U_i\cup U_j\cup U_k$ and $b\in U_i\cup U_j\cup U_k$ and $U_i\cup U_j\cup U_k$ is a subspace.
If $a+b\in U_i$, then $b=(a+b)-a\in U_i$, but $b\in U_j\setminus U_i$.
This is a contradiction.
So, $a+b\notin U_i$.
Similarly, $a+b\notin U_j$.
So, $a+b\in U_k\setminus (U_i\cup U_j)$.
So, $U_k\setminus (U_i\cup U_j)\neq\emptyset$.
Let $a\in U_1\setminus (U_2\cup U_3)\subset U_1\setminus U_2$.
Let $b\in U_2\setminus U_1$.
Then, $a+b\in U_3\setminus (U_1\cup U_2)\subset U_3\setminus U_1$.
And $a\in U_1\setminus (U_2\cup U_3)\subset U_1\setminus U_3$.
So, $(a+b)+a=(1+1)a+b\in U_2\setminus (U_3\cup U_1)$ (by the same argument as above, now applied to $a+b\in U_3\setminus U_1$ and $a\in U_1\setminus U_3$).
So, $(1+1)a=((1+1)a+b)-b\in U_2$ since $(1+1)a+b\in U_2$ and $b\in U_2$.
Since $1+1\neq 0$, $a\in U_2$.
But $a\in U_1\setminus U_2$.
This is a contradiction.
So, $U_i\subset U_j$ for some $i,j\in\{1,2,3\}$ such that $i\neq j$.
Suppose that $U_i\subset U_j$ for $i,j\in\{1,2,3\}$ such that $i\neq j$.
Let $k$ be the element of $\{1,2,3\}\setminus\{i,j\}$.
Then, $U_1\cup U_2\cup U_3=U_i\cup U_j\cup U_k=U_j\cup U_k$ is a subspace of $V$.
By Exercise 1C.12, $U_j\subset U_k$ or $U_k\subset U_j$ holds.
If $U_j\subset U_k$, then $U_i\subset U_j\subset U_k$.
So, one of $U_1,U_2,U_3$ contains the other two.
If $U_k\subset U_j$, then $U_k\subset U_j$ and $U_i\subset U_j$.
So, one of $U_1,U_2,U_3$ contains the other two.

Suppose that $\mathbb{F}$ is a field such that $1+1=0$.
Let $V:=\mathbb{F}^2$.
Let $U_1:=\left\{\pmatrix{0\\0},\pmatrix{1\\0}\right\}$.
Let $U_2:=\left\{\pmatrix{0\\0},\pmatrix{0\\1}\right\}$.
Let $U_3:=\left\{\pmatrix{0\\0},\pmatrix{1\\1}\right\}$.
Then, $U_1,U_2,U_3$ are subspaces of $V$.
And $U_1\cup U_2\cup U_3$ is also a subspace of $V$.
But none of $U_1,U_2,U_3$ is contained in another.


Let $U, V, Z$ be subspaces such that $U \cup V \cup Z$ is a subspace. If we have $U \subset (V \cup Z)$, then $U \cup V \cup Z = V \cup Z$, and therefore, by the case of two subspaces, $V \subset Z$ or $Z \subset V$; it follows that one of the subspaces contains the other two.

Let's suppose by contradiction that we have $u \in U \setminus (V \cup Z)$, $v \in V \setminus (U \cup Z)$ and $z \in Z \setminus (U \cup V)$. In particular, we have $u + v, u - v \in Z$ (if $u+v$ were in $U$ or in $V$, we would get a contradiction, and likewise for $u-v$). But $Z$ is a subspace, so $(u+v) + (u-v) = 2u \in Z$, and hence $\frac{1}{2}\cdot 2u=u \in Z$, which is a contradiction. (This last step uses $\frac{1}{2}$, so it fails if the field has characteristic $2$.)

So at least one subspace is a subset of the union of the other two.