Questions tagged [conjugate-gradient]

For questions related to the conjugate gradient method, an algorithm for the numerical solution of particular systems of linear equations, namely those whose matrix is positive-definite.

In mathematics, the conjugate gradient method is an algorithm for the numerical solution of particular systems of linear equations, namely those whose matrix is positive-definite. The conjugate gradient method is often implemented as an iterative algorithm, applicable to sparse systems that are too large to be handled by a direct implementation or other direct methods such as the Cholesky decomposition. Large sparse systems often arise when numerically solving partial differential equations or optimization problems.

The conjugate gradient method can also be used to solve unconstrained optimization problems such as energy minimization.
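
As a concrete illustration of the iteration, here is a minimal sketch of linear conjugate gradient in Python; the function name, tolerance, and iteration cap below are illustrative choices rather than any particular library's API, and it assumes `A` is symmetric positive-definite.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = b - A @ x            # residual
    p = r.copy()             # first search direction is the residual
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next direction, A-conjugate to the previous ones
        rs_old = rs_new
    return x
```

For example, `conjugate_gradient(np.array([[4., 1.], [1., 3.]]), np.array([1., 2.]))` returns approximately `[0.0909, 0.6364]`, the exact solution of that 2-by-2 system, after at most two iterations.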


33 questions
6
votes
2 answers

Newton conjugate gradient algorithm

In this video, the professor describes an algorithm that can be used to find the minimum value of the cost function for linear regression. Here, the cost function is $f$, the gradient is $g_k$, where $k$ is the $k$-th step of the algorithm, $\theta$…
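
In a Newton–conjugate-gradient method, CG is used at each step to approximately solve the Newton system $H_k d_k = -g_k$ using only Hessian-vector products. A minimal sketch on a least-squares (linear regression) cost via `scipy.optimize.minimize`; the data, cost, and variable names below are illustrative assumptions, not taken from the video in the question:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                     # illustrative design matrix
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

def cost(theta):                                  # f(theta) = 0.5/m * ||X theta - y||^2
    r = X @ theta - y
    return 0.5 * (r @ r) / len(y)

def grad(theta):                                  # g = X^T (X theta - y) / m
    return X.T @ (X @ theta - y) / len(y)

def hessp(theta, v):                              # Hessian-vector product: (X^T X v) / m
    return X.T @ (X @ v) / len(y)

res = minimize(cost, np.zeros(3), jac=grad, hessp=hessp, method="Newton-CG")
print(res.x)                                      # close to the true coefficients
```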
5
votes
1 answer

Convex optimization using constraint projection matrices

I have a convex optimization of the form $$ \min_x \frac{1}{2} x^TAx-x^Tb \\ \text{s.t.}\ (I-P)x=0 $$ where $A$ is an $n$ by $n$ positive definite matrix, and $P$ is an $n$ by $n$ projection matrix (it has $p$ eigenvalues equal to zero, and $n-p$…
5
votes
2 answers

What is the difference between line search and gradient descent?

I understand the gradient descent algorithm, but I am having trouble understanding how it relates to line search. Is gradient descent a type of line search?
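
One way to frame the relationship: gradient descent fixes the search direction (the negative gradient), while a line search is the one-dimensional sub-problem of choosing how far to move along a given direction; "gradient descent with exact line search" combines the two. A small sketch on a quadratic $f(x) = \tfrac12 x^T Q x - b^T x$, where the matrix, right-hand side, and fixed step size are illustrative:

```python
import numpy as np

Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # illustrative symmetric positive-definite matrix
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b

x_fixed, x_exact = np.zeros(2), np.zeros(2)
for _ in range(20):
    # Fixed-step gradient descent: direction AND step size are prescribed.
    g = grad(x_fixed)
    x_fixed -= 0.1 * g

    # Gradient descent with exact line search: same direction, but the step
    # alpha = (g^T g) / (g^T Q g) minimizes f along that direction.
    g = grad(x_exact)
    x_exact -= (g @ g) / (g @ Q @ g) * g

print(x_fixed, x_exact, np.linalg.solve(Q, b))   # both approach the exact minimizer
```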
4
votes
1 answer

Do there exist Conjugate-Gradient algorithms for "nested" inverses?

The Conjugate Gradient algorithm (CG) is an iterative, matrix-vector-multiplication-based scheme for solving equations like $$Ax = b$$ without having to calculate an explicit inverse $A^{-1}$ or some factorization of $A$. I wonder if we have more…
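
Because CG only ever touches $A$ through products $Av$, it can be driven by a black-box matrix-vector routine with no explicit matrix, inverse, or factorization in sight. A sketch using SciPy's `LinearOperator`; the diagonal operator below is just an illustrative stand-in for whatever the matvec really is:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 1000
diag = np.linspace(1.0, 10.0, n)         # spectrum of an illustrative SPD operator

def matvec(v):
    # A is never formed explicitly; CG only needs its action on a vector.
    return diag * v

A = LinearOperator((n, n), matvec=matvec)
b = np.ones(n)
x, info = cg(A, b)                        # info == 0 signals convergence
print(info, np.linalg.norm(diag * x - b))
```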
2
votes
0 answers

Conjugate gradient with two preconditioners of different dimensions.

I need to estimate $\mathbf{y}$ defined as: $$ \mathbf{y} = \mathbf{P}\mathbf{B}^T(\mathbf{B}\mathbf{P}\mathbf{B}^T+\mathbf{R})^{-1} \mathbf{x}, $$ with $\mathbf{P}$ and $\mathbf{R}$ two diagonal matrices with positive entries. I want to implement…
1
vote
0 answers

Complex-valued gradient involving Kronecker product

Consider $\mathbf X \in \mathbb C^{M_A M_S \times M_S T}$, $\mathbf W \in \mathbb C^{M_d \times T}$, and $\mathbf F \in \mathbb C^{M_A \times M_d}$. I am stuck on the following: \begin{equation} \nabla_{\mathbf F} \, \operatorname{tr} \{\mathbf X…
1
vote
0 answers

How to find a preconditioner for Riemann-Newton method on Grassmann manifold (Projector representation)

I have been working on Riemann-Newton optimization on a Grassmannian manifold with the orthogonal projector representation. This manifold is denoted by $$\operatorname{Gr} (n,k) = \left\{ P \in \mathbb{R}^{n \times n} : P=P^\top, P^2=P,…
1
vote
0 answers

Proximal operator of conjugate of Euclidean norm

I am trying to solve the following problem: Given $\tau \in \mathbb{R}_{++}, \rho \in \mathbb{R}_{++},~y_n \in \mathbb{R}^n,~z_n \in \mathbb{R}^n$. Derive $$\text{prox}_{\frac{1}{\tau} g^{*}} (y_n + \frac{z_n}{\tau})$$, where $g=\rho\|\cdot\|$, and…
1
vote
0 answers

Implementing gradient optimization methods for non-linear functionals

I'm trying to write a code solving the Dirichlet boundary problem for $p$-Laplacian on an arbitrary planar domain $\Omega$: $$\begin{cases} -\Delta_p (u) = f \text{ on } \Omega, \\ u\big|_{\partial \Omega} \equiv 0 \end{cases} $$ using…
1
vote
1 answer

Trivial inequality regarding complexity of conjugate gradient

I am reading An Introduction to the Conjugate Gradient Method Without the Agonizing Pain. On page 20, the author wrote: The convergence results for Steepest Descent are $$ \lVert e_{(i)} \rVert_A \le \left(\frac{\kappa-1}{\kappa+1}\right)^i \lVert…
1
vote
0 answers

Solving simple quadratic function using conjugate gradient method - why are my directions linearly dependent?

I have a simple function: $$ f(x)=x_1^2 + 2 x_2^2 -2 x_1 x_2 - 2 x_2 + 2 x_1$$ and I'd like to minimize it using the conjugate gradient method from the initial point $x^0=(2,2)$. I found the Hessian $$ Q= \begin{bmatrix} 1 & -1 \\ -1 & 2 \\…
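
One way to sanity-check the hand computation in the question above: differentiating the stated $f$ gives $\nabla f(x) = (2x_1 - 2x_2 + 2,\; -2x_1 + 4x_2 - 2)$, so the minimizer solves $Hx = c$ with $H = \begin{bmatrix} 2 & -2 \\ -2 & 4 \end{bmatrix}$ and $c = (-2, 2)$, and linear CG started at $(2,2)$ reaches it in at most two steps. A short sketch, with illustrative variable names:

```python
import numpy as np

# Gradient of f(x) = x1^2 + 2*x2^2 - 2*x1*x2 - 2*x2 + 2*x1 is H x - c with:
H = np.array([[2.0, -2.0], [-2.0, 4.0]])
c = np.array([-2.0, 2.0])

x = np.array([2.0, 2.0])                 # initial point from the question
r = c - H @ x                            # r = -grad f(x)
p = r.copy()
for k in range(2):                       # at most n = 2 CG steps in exact arithmetic
    alpha = (r @ r) / (p @ H @ p)
    x = x + alpha * p
    r_new = r - alpha * (H @ p)
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    print(k, x, r)                       # second iterate is (-1, 0), the exact minimizer
```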
1
vote
0 answers

A question related to the $p$-Laplacian and the conjugate gradient method.

I have the following energy functional of the $p$-Laplacian equation: $$ E(u) = \frac{1}{p} \int_{\Omega} |\nabla u|^p dx $$ for $2.8 \leq p \leq 5$. My goal is to minimize the energy functional by using the nonlinear conjugate gradient method (…
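
For a smooth energy like this one (after discretization), nonlinear CG only needs the value and gradient of the functional. A generic Fletcher–Reeves sketch with a backtracking line search; the method variant, line-search constants, and restart rule below are illustrative choices, not specific to the $p$-Laplacian problem:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Fletcher-Reeves nonlinear CG with Armijo backtracking (illustrative sketch)."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if g @ d >= 0:          # restart with steepest descent if d is not a descent direction
            d = -g
        t = 1.0                 # backtracking (Armijo) line search along d
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

For instance, `nonlinear_cg(lambda x: np.sum(x**4), lambda x: 4 * x**3, np.ones(5))` drives the gradient of a simple quartic to zero; a discretized $E(u)$ and its gradient would be plugged in the same way.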
1
vote
1 answer

Norm and conjugate of operator on $l^p$

I want to solve the following problem: Show that for every sequence $(\alpha_n) \in l^{\infty}$ the formula $$A(x_n) = (\alpha_n x_n), \quad (x_n) \in l^p,$$ defines a bounded linear operator on $l^p$; find its norm and the operator $A^*: (l^{p})^* \to (l^p)^*$,…
1
vote
0 answers

Is the Rayleigh quotient guaranteed to monotonically decrease in the conjugate gradient algorithm?

I am using Linear Conjugate Gradient to solve for $x$ in $Ax=b$, where $A$ is PSD. I noticed experimentally that the Rayleigh quotient $\frac{x_k^T A x_k}{x_k^T x_k}$ of the accumulated solution $x_k$ is always monotonically decreasing with more…
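
One way to probe this empirically is to record the quotient with CG's per-iteration callback. A small sketch with `scipy.sparse.linalg.cg`; the random SPD matrix and diagonal shift are illustrative, and this only gathers evidence, it proves nothing about monotonicity in general:

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(1)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50.0 * np.eye(50)     # illustrative symmetric positive-definite matrix
b = rng.normal(size=50)

quotients = []
def record(xk):                     # called by cg after each iteration with the current iterate
    quotients.append((xk @ A @ xk) / (xk @ xk))

x, info = cg(A, b, callback=record)
print(info)                                                      # 0 means converged
print(all(a >= c for a, c in zip(quotients, quotients[1:])))     # non-increasing so far?
```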
1
vote
0 answers

Unclear point: how do we know the gradient at time step $t+1$ in conjugate gradient descent?

Below is the conjugate gradient descent algorithm. At time step $t$, how do we know the gradient at time step $t+1$, $g(t+1)$, in conjugate gradient descent? Reference: page 7-35, enter as password Paraskavedekatriaphobia:…