
Let $V$ be a finite dimensional real vector space with inner product $\langle \, , \rangle$ and let $W$ be a subspace of $V$. The orthogonal complement of $W$ is defined as $$ W^\perp= \left\{ v \in V \,:\, \langle v,w \rangle = 0 \text{ for all } w \in W \right\}. $$ Prove the following: $\dim W + \dim W^\perp= \dim V$.

I'm not sure how to find the relationship between the number of basis vectors in $W$ and in $W^\perp$.

Jake

6 Answers


Let $\beta=\{w_1,w_2,\ldots,w_k\}$ and $\gamma=\{x_1,x_2,\ldots,x_m\}$ be bases for $W$ and $W^\perp$, respectively. It suffices to show that $$\beta\cup\gamma=\{w_1,w_2,\ldots,w_k,x_1,x_2,\ldots,x_m\}$$ is a basis for $V$.

Given $v\in V$, it is well known that $v=v_1+v_2$ for some $v_1\in W$ and $v_2\in W^\perp$. Because $\beta$ and $\gamma$ are bases for $W$ and $W^\perp$, respectively, there exist scalars $a_1,a_2,\ldots,a_k,b_1,b_2,\ldots,b_m$ such that $v_1=\displaystyle\sum_{i=1}^ka_iw_i$ and $v_2=\displaystyle\sum_{j=1}^mb_jx_j$. Therefore $$v=v_1+v_2=\sum_{i=1}^ka_iw_i+\sum_{j=1}^mb_jx_j,$$ from which it follows that $\beta\cup\gamma$ generates $V$.

Next, we show that $\beta\cup\gamma$ is linearly independent. Given $c_1,c_2,\ldots,c_k,d_1,d_2,\ldots,d_m$ such that $\displaystyle\sum_{i=1}^kc_iw_i+\sum_{j=1}^md_jx_j={\it 0}$, then $\displaystyle\sum_{i=1}^kc_iw_i=-\sum_{j=1}^md_jx_j$. It follows that $$\sum_{i=1}^kc_iw_i\in W\cap W^\perp\quad\mbox{and}\quad \sum_{j=1}^md_jx_j\in W\cap W^\perp.$$ But since $W\cap W^\perp=\{{\it 0}\,\}$ (given $x\in W\cap W^\perp$, we have $\langle x,x\rangle=0$ and thus $x={\it 0}\,$), we have $\displaystyle\sum_{i=1}^kc_iw_i=\sum_{j=1}^md_jx_j={\it 0}$. Therefore $c_i=0$ and $d_j=0$ for each $i,j$ because $\beta$ and $\gamma$ are bases for $W$ and $W^\perp$, respectively. Hence we conclude that $\beta\cup\gamma$ is linearly independent.
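The counting in this proof can be checked numerically on a concrete example. A minimal sketch in Python with exact rational arithmetic; the subspace $W\subset\mathbb{R}^4$ and the basis of $W^\perp$ below are illustrative choices, not part of the answer above:

```python
from fractions import Fraction

def rank(rows):
    """Pivot count via exact Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0                                    # next pivot row
    for c in range(len(m[0])):
        p = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if p is None:
            continue
        m[r], m[p] = m[p], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# beta: a basis of W < R^4; gamma: a basis of W^perp, found by inspection.
beta = [[1, 0, 1, 0], [0, 1, 0, 1]]
gamma = [[1, 0, -1, 0], [0, 1, 0, -1]]

assert all(dot(w, x) == 0 for w in beta for x in gamma)  # gamma is orthogonal to W
assert rank(beta) == 2 and rank(gamma) == 2              # dim W = dim W^perp = 2
assert rank(beta + gamma) == 4                           # beta U gamma is a basis of R^4
```

The last assertion is exactly the claim of the proof: $\beta\cup\gamma$ has full rank, so $\dim W+\dim W^\perp=2+2=4=\dim V$.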

Solumilkyu
  • I cannot follow this step: $\sum_{i=1}^kc_iw_i=-\sum_{j=1}^md_jx_j$ implies $\sum_{i=1}^kc_iw_i\in W\cap W^\perp\quad\mbox{and}\quad \sum_{j=1}^md_jx_j\in W\cap W^\perp.$ Can you explain this further? – mgus Jan 21 '18 at 02:52
  • By assumption, $\displaystyle\sum_{i=1}^kc_iw_i\in W$ and $\displaystyle-\sum_{j=1}^md_jx_j\in W^\perp$. So if $\displaystyle\sum_{i=1}^kc_iw_i=-\sum_{j=1}^md_jx_j$, we naturally get $\displaystyle\sum_{i=1}^kc_iw_i\in W^\perp$ and $\displaystyle-\sum_{j=1}^md_jx_j\in W$ as well. – Solumilkyu Jan 21 '18 at 12:43

Hint:

Take a basis $w_1,\dots,w_r$ of $W$, and consider the linear forms on $V$ defined by $w_i^*:v\mapsto\langle w_i,v\rangle$.

These linear forms are linearly independent, hence the solution set of the system of equations $w_i^*(v)=0,\ i=1,\dots,r$, has codimension $r$ by the rank-nullity theorem. These solutions are precisely the orthogonal complement $\;W^{\bot}$.
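As a concrete instance of this hint (the subspace below is an illustrative choice, not part of the original answer): take $V=\mathbb{R}^3$ with the standard inner product and $W=\operatorname{span}\{(1,0,1),(0,1,1)\}$, so $r=2$. The system $w_i^*(v)=0$ reads $$v_1+v_3=0,\qquad v_2+v_3=0,$$ whose coefficient matrix has rank $2$, so its solution space has dimension $3-2=1$ by rank-nullity. Indeed $W^\perp=\operatorname{span}\{(1,1,-1)\}$, and $\dim W+\dim W^\perp=2+1=3=\dim V$.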

Bernard
  • This proof is better since it does not require $\langle \, , \rangle$ to be an actual inner product. So this also works in the space $\mathbb{F}_q^n$ for a finite field $\mathbb{F}_q$. – Michal Dvořák May 02 '22 at 18:50
  • For future visitors, https://math.stackexchange.com/a/621166/405572 explains this sketch in more detail. As per the previous comment, this proof proves the stronger result (see https://en.wikipedia.org/wiki/Orthogonal_complement#Properties) that any non-degenerate bilinear form, not just an inner product, on a finite-dimensional vector space leads to orthogonal complements (which could intersect! Weird!) with complementary dimensions. This is strange, since we usually think of the complementary dimensions of orthogonal complements as coming from the direct sum decomposition (geometry); but here it arises from pure algebra. – D.R. Mar 24 '25 at 06:45

It is sufficient to show that $V=W\oplus W^{\perp}$. If $v\in W\cap W^{\perp}$, then $\left\langle v,v\right\rangle=0$, so $v=0$ and $W\cap W^{\perp}=\{0\}$. Hence it remains to show that any vector $v\in V$ can be written as $v=w+w'$ with $w\in W$ and $w'\in W^{\perp}$.

Choose an orthonormal basis $\left\{w_1,\dots , w_k\right\}$ of $W$ and extend it to an orthonormal basis $\left\{w_1,\dots,w_k,v_{k+1},\dots ,v_n\right\}$ of $V$. For $n\geq j\geq k+1$, each $v_j$ is orthogonal to every $w_i$, and every $w\in W$ is a linear combination of the $w_i$, so $v_j\in W^{\perp}$. Hence any $v\in V$ can be decomposed as we needed to show.
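A minimal sketch of this construction in Python, assuming we keep the extended basis merely orthogonal rather than orthonormal: orthogonality is all that is needed for the tail vectors to land in $W^{\perp}$, and it keeps `Fraction` arithmetic exact. The subspace $W\subset\mathbb{R}^3$ below is an illustrative choice:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalize a list of vectors, dropping dependent ones.
    Normalization is skipped so Fraction arithmetic stays exact."""
    basis = []
    for v in vectors:
        v = [Fraction(x) for x in v]
        for u in basis:
            c = dot(v, u) / dot(u, u)           # projection coefficient
            v = [a - c * b for a, b in zip(v, u)]
        if any(x != 0 for x in v):
            basis.append(v)
    return basis

# Basis of W < R^3, extended by the standard basis and orthogonalized.
W_basis = [[1, 1, 0]]
full = gram_schmidt(W_basis + [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
k = len(W_basis)                                # dim W
W_perp_basis = full[k:]                         # tail vectors lie in W^perp

assert len(full) == 3                           # dim W + dim W^perp = dim V
assert all(dot(u, w) == 0 for u in W_perp_basis for w in full[:k])
```

Any $v$ then splits as $v=w+w'$ by expanding it in this basis and grouping the first $k$ coordinates into $W$ and the rest into $W^{\perp}$.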

  • I like Bernard's answer better; also, for my answer you need to know that you can take orthonormal bases of finite-dimensional vector spaces, so you need knowledge of the Gram-Schmidt procedure. – Mathematician 42 May 01 '16 at 11:50
  • This is false as stated. You can't just extend to a basis of $V$ and then say "by definition." – Ted Shifrin May 01 '16 at 12:11
  • Hold on, take any $w\in W$, then $w=\sum_{i=1}^k\lambda_i w_i$. Hence $\left\langle w,v_j \right\rangle=\sum_{i=1}^k\lambda_i \left\langle w_i,v_j\right\rangle=0$, hence $v_j\in W^{\perp}$ for any $j\geq k+1$. Or am I completely wrong here? – Mathematician 42 May 01 '16 at 12:16
  • The $v_j$ are very, very unlikely to be in $W^\perp$. – Ted Shifrin May 01 '16 at 12:25
  • Ah ok, sorry, I meant that you extend to an orthonormal basis of $V$ as well! – Mathematician 42 May 01 '16 at 12:29

There is something missing in @Solumilkyu's answer.

In order to show that $W$ and $W^\perp$ have (nonempty) bases, we must show that $W, W^\perp \neq \{\vec{0}\}$, which need not hold.

In fact, if $W=V$, then from $W \cap W^\perp=\{\vec{0}\}$ (to prove this, let $\vec{x} \in W \cap W^\perp$; then $\langle\vec{x},\vec{x}\rangle=0$, so $\vec{x}=\vec{0}$) we can conclude that $W^\perp=\{\vec{0}\}$, and then $\dim W+\dim W^\perp=\dim W=\dim V$. If $W\neq V$, then from the fact that every $v \in V$ can be written as $v=v_1+v_2$ with $v_1 \in W$ and $v_2 \in W^\perp$, we can conclude that $W^\perp \neq \{\vec{0}\}$: take $v \in V \setminus W$; then in $v=v_1+v_2$ we must have $v_2 \neq \vec{0}$, since otherwise $v=v_1 \in W$.

On the other hand, if $W = \{\vec{0}\}$, then obviously $W^\perp = V$.


I have some qualms with @Solumilkyu's answer. To prove that a set of vectors is indeed a basis, one needs to prove both the spanning property and linear independence. @Solumilkyu has demonstrated that $\beta \cup \gamma$ is linearly independent, but has rather conveniently assumed the spanning property.

To answer the question, though it's quite old, one can use the rank-nullity theorem, coupled with the facts that $\text{rank}(A) = \text{rank}(A^{T})$ and $N(A^{T}) = W^{\bot}$, where $A$ is a matrix whose columns form a basis of $W$.
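This counting argument can be sketched numerically; the matrix $A$ below is an illustrative choice, not from the answer, and $N(A^{T})$ denotes the null space of $A^{T}$:

```python
from fractions import Fraction

def rank(rows):
    """Pivot count via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        p = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if p is None:
            continue
        m[r], m[p] = m[p], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Columns of A form a basis of a 2-dimensional W inside R^4.
A = [[1, 0], [0, 1], [1, 1], [2, 3]]
At = [list(col) for col in zip(*A)]      # transpose of A

n = len(A)                               # dim V = 4
dim_W = rank(A)                          # dim W = rank(A)
dim_W_perp = n - rank(At)                # dim N(A^T), by rank-nullity

assert rank(A) == rank(At)               # rank(A) = rank(A^T)
assert dim_W + dim_W_perp == n           # dim W + dim W^perp = dim V
```

The two asserted facts together give the claim: $\dim W^\perp = \dim N(A^T) = n - \text{rank}(A^T) = n - \dim W$.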


I think that in @Mathematician 42's answer it suffices to prove $W \cap W^{\bot}=\{0\}$ for the sum to be direct. Let $\alpha \in W + W^{\bot}$ and assume \begin{equation} \alpha=\alpha_1+\alpha_2,\quad \alpha_1 \in W, \quad \alpha_2 \in W^{\bot},\qquad \alpha=\beta_1+\beta_2,\quad \beta_1 \in W, \quad \beta_2 \in W^{\bot}. \end{equation} Then $\alpha_1+\alpha_2=\beta_1+\beta_2$, so $\alpha_1-\beta_1=\beta_2-\alpha_2$. The left side lies in $W$ and the right side in $W^{\bot}$, so both lie in $W \cap W^{\bot}=\{0\}$; that is, $\alpha_1=\beta_1$ and $\beta_2=\alpha_2$. This implies that $W + W^{\bot}$ is a direct sum.

Kim