Questions tagged [gram-schmidt]

Questions relating to the Gram–Schmidt process, which takes a set of input vectors and produces an orthonormal set of vectors that spans the same subspace as the input set.

Orthonormal bases in linear algebra are easier to work with than arbitrary bases, especially in vector spaces where a convenient basis is not immediately obvious, such as spaces of modular forms. The Gram–Schmidt process produces, from an arbitrary set of $n$ linearly independent vectors $\{\mathbf v_1,\mathbf v_2,\dots,\mathbf v_n\}$, an orthonormal set of vectors $\{\mathbf e_1,\mathbf e_2,\dots,\mathbf e_n\}$ with $\operatorname{span}(\{\mathbf e_i\})=\operatorname{span}(\{\mathbf v_i\})$. It does so by repeatedly projecting each vector onto the ones already constructed and subtracting those projections.

Defining the projection operator as $$\operatorname{proj}_{\mathbf u}(\mathbf v)=\frac{\langle\mathbf v,\mathbf u\rangle}{\langle\mathbf u,\mathbf u\rangle}\mathbf u,$$ with $\langle\cdot,\cdot\rangle$ the inner product, the classical process first constructs an intermediate set of orthogonal vectors $\{\mathbf u_i\}$ as $$\mathbf u_i=\mathbf v_i-\sum_{j=1}^{i-1}\operatorname{proj}_{\mathbf u_j}(\mathbf v_i),$$ starting from $\mathbf u_1$ and proceeding to $\mathbf u_n$. Normalising these vectors ($\mathbf e_i=\mathbf u_i/\Vert\mathbf u_i\Vert$) yields the orthonormal basis $\{\mathbf e_i\}$.
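
For example, with $\mathbf v_1=(1,1)$ and $\mathbf v_2=(1,0)$ in $\mathbb R^2$ under the standard dot product, $\mathbf u_1=\mathbf v_1=(1,1)$ and $$\mathbf u_2=\mathbf v_2-\operatorname{proj}_{\mathbf u_1}(\mathbf v_2)=(1,0)-\tfrac12(1,1)=\left(\tfrac12,-\tfrac12\right),$$ so normalising gives $\mathbf e_1=\tfrac1{\sqrt2}(1,1)$ and $\mathbf e_2=\tfrac1{\sqrt2}(1,-1)$.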

This classical process is numerically unstable; the modified Gram–Schmidt process stabilises it by subtracting each projection from all remaining vectors as soon as the corresponding $\mathbf u_i$ is available, rather than computing each $\mathbf u_i$ from the original $\mathbf v_i$ in one step. Applications of the process include the QR decomposition of matrices and the construction of the Legendre polynomials.
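
A minimal sketch of both variants in Python/NumPy (the function names are illustrative only, not from any particular library):

```python
import numpy as np

def classical_gram_schmidt(V):
    """Orthonormalise the columns of V (assumed linearly independent)."""
    V = np.asarray(V, dtype=float)
    Q = np.zeros_like(V)
    for i in range(V.shape[1]):
        u = V[:, i].copy()
        # classical: subtract projections of the *original* vector v_i onto each e_j
        for j in range(i):
            u -= (V[:, i] @ Q[:, j]) * Q[:, j]
        Q[:, i] = u / np.linalg.norm(u)
    return Q

def modified_gram_schmidt(V):
    """Same result in exact arithmetic, but much more stable in floating point."""
    Q = np.array(V, dtype=float)
    for i in range(Q.shape[1]):
        Q[:, i] /= np.linalg.norm(Q[:, i])
        # modified: remove the e_i component from every remaining vector right away
        for j in range(i + 1, Q.shape[1]):
            Q[:, j] -= (Q[:, j] @ Q[:, i]) * Q[:, i]
    return Q
```

In exact arithmetic both functions return the same matrix; its columns form the orthonormal factor of a (thin) QR decomposition of `V`, with `R = Q.T @ V` as the upper-triangular factor.
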

287 questions
33
votes
6 answers

The need for the Gram–Schmidt process

As far as I understand, Gram–Schmidt orthogonalization starts with a set of linearly independent vectors and produces a set of mutually orthonormal vectors that spans the same space as the starting vectors. I have no problem understanding the…
16
votes
5 answers

Understanding the Gram-Schmidt process

I would like to better understand the Gram-Schmidt process. The statement of the theorem in my textbook is the following: The Gram-Schmidt sequence $[u_1, u_2,\ldots]$ has the property that $\{u_1, u_2,\ldots, u_n\}$ is an orthonormal basis for the…
14
votes
1 answer

Intuitive explanation of why the modified Gram-Schmidt is more stable than the classical one?

This may be an old question, and there are certainly some related posts, which I will mention below. However, there seems to be no clear answer yet. The question is: is there an intuitive way to explain why the modified Gram-Schmidt (MGS) process for…
12
votes
1 answer

Gram-Schmidt process in Minkowski space $\Bbb L^n$.

I'm trying to prove a version of the Gram-Schmidt orthogonalization process in Minkowski space $\Bbb L^n$ (for concreteness, I'll put the sign last). I am not interested in the existence of orthonormal bases, but instead in the algorithm. Namely,…
9
votes
1 answer

Gram-Schmidt process on complex space

Let $\mathbb{C}^3$ be equipped with the standard complex inner product. Apply the Gram-Schmidt process to the basis: $v_1=(1,0,i)^t$, $v_2=(-1,i,1)^t$, $v_3=(0,-1,i+1)^t$ to find an orthonormal basis $\{u_1,u_2,u_3\}$. I have found $u_1 =…
8
votes
1 answer

Understanding properties of sequential orthogonalization methods like Gram-Schmidt.

We all know that Gram-Schmidt orthogonalization is done recursively, taking the linearly independent set of vectors one by one. This distinguishes it from democratic orthogonalization methods such as Löwdin and Wigner, which handle all the given…
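
For contrast, a rough sketch of Löwdin's symmetric ("democratic") orthonormalisation, which treats all columns of $V$ at once instead of one by one (Python/NumPy, illustrative only):

```python
import numpy as np

def lowdin_orthonormalise(V):
    """Symmetric (Loewdin) orthonormalisation: W = V (V^T V)^(-1/2)."""
    V = np.asarray(V, dtype=float)
    S = V.T @ V                              # overlap (Gram) matrix
    w, U = np.linalg.eigh(S)                 # S is symmetric positive definite
    S_inv_sqrt = U @ np.diag(w ** -0.5) @ U.T
    return V @ S_inv_sqrt                    # closest orthonormal set to V in Frobenius norm
```

Unlike Gram-Schmidt, no input vector is privileged: permuting the columns of `V` simply permutes the columns of the result.
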
8
votes
1 answer

Symplectic version of "Gram-Schmidt"

Let $w$ be a symplectic form on a vector space $V$ of dimension $2g$. Suppose we already have a free family $(a_1, \dots, a_g)$ such that $w(a_i, a_j) = 0$. I also have a family $(b_1, \dots, b_g)$ which verifies that $(a_1, \dots, a_g, b_1, \dots,…
8
votes
3 answers

Gram-Schmidt method to get a basis for $P_3$

Let $P_3$ be the vector space of polynomials of degree at most three. It is known that a basis for $P_3$ is $\{1, x, x^2, x^3\}$ and that $\langle p, q\rangle = \int_{0}^{1} p(x)q(x)\, dx$ is a valid inner product on $P_3$. I am trying to use the Gram-Schmidt method to get…
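
A sketch of that computation in Python with SymPy, using the inner product stated in the question (the helper names are only for illustration):

```python
import sympy as sp

x = sp.symbols('x')

def inner(p, q):
    # <p, q> = integral from 0 to 1 of p(x) q(x) dx, as in the question
    return sp.integrate(p * q, (x, 0, 1))

def gram_schmidt(polys):
    """Classical Gram-Schmidt on a list of polynomials, returning orthonormal ones."""
    basis = []
    for v in polys:
        u = v - sum(inner(v, e) * e for e in basis)
        basis.append(sp.expand(u / sp.sqrt(inner(u, u))))
    return basis

orthonormal = gram_schmidt([1, x, x**2, x**3])
# e.g. orthonormal[1] simplifies to sqrt(3)*(2*x - 1); up to scaling these are
# the shifted Legendre polynomials on [0, 1]
```
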
7
votes
4 answers

Generation of Hermite polynomials with Gram-Schmidt procedure

I want to use the Gram-Schmidt procedure to generate the first three Hermite polynomials. Given the set of linearly independent vectors $\{1,x,x^2,\dots\}$ in the Hilbert space $L^2(\mathbb{R},e^{-x^2}dx)$, I apply the orthogonalisation procedure as…
7
votes
1 answer

Gram-Schmidt over $\operatorname{GF}(2)$

I am reading the paper The Steganographic File System by Ross Anderson, Roger Needham, and Adi Shamir. On page 4, paragraph 2, the authors write: Finally, we use the Gram-Schmidt method to orthonormalise all the vectors from $i$ onwards by…
6
votes
2 answers

Help with understanding the Gram-Schmidt Process

Let $U=\langle x_1,x_2,x_3\rangle \subseteq \mathbb{R^4},$ where $$x_1=\begin {pmatrix} 3\\4 \\0\\0 \end {pmatrix}, \ x_2=\begin {pmatrix} 1\\3 \\1\\1 \end {pmatrix},\ x_3=\begin {pmatrix} 0\\5 \\5\\7 \end {pmatrix}.$$ Use the Gram-Schmidt Process…
6
votes
1 answer

How to decompose a bivector into a sum of _orthogonal_ blades?

In Geometric Algebra, any bivector $B\in\Lambda^2\mathbb R^n$ is a sum of blades: $$B = B_1 + B_2 + \cdots$$ $$= \vec v_1\wedge\vec w_1 + \vec v_2\wedge\vec w_2 + \cdots$$ Each blade's component vectors $\vec v$ and $\vec w$, if they're not already…
6
votes
1 answer

Every separable Hilbert space has an orthonormal basis

Prove the following: Every nontrivial separable Hilbert space $H$ has an orthonormal basis, i.e., an orthonormal set whose linear span is dense in $H$. My attempt: Let $V = (v_n)_{n\geq1}$ be a dense countable set of vectors in $H$. Remove all …
5
votes
1 answer

How to express a Gaussian as a series of exponentials? $\displaystyle e^{-x^2}=\sum_{n=1}^{\infty}c_n e^{-nx}$

Context: I would like to express the Gaussian function as a series of exponentials: $$e^{-x^2}=\sum_{n=1}^{\infty}c_ne^{-n|x|}\qquad\forall x\in\mathbb{R}$$ For simplicity (the absolute value is added later), I consider only the positive…
5
votes
1 answer

Is there a formula for the derivative of Gram-Schmidt orthonormalization process?

Let $X\in \mathbb{R}^{m\times n}$ be a matrix with linearly independent columns and let $\mathcal{N}$ be a neighborhood of $X$ such that every $Y\in \mathcal{N}$ has linearly independent columns as well. Now let $F\colon \mathcal{N}\to…