# Biconjugate gradient method

In mathematics, more specifically in numerical analysis, the biconjugate gradient method (BiCG) is an algorithm for solving systems of linear equations

$A x = b.$

Unlike the conjugate gradient method, this algorithm does not require the matrix $A$ to be self-adjoint, but instead one needs to perform multiplications by the conjugate transpose $A^*$.

## The algorithm

1. Choose $x_0$, $y_0$, a regular preconditioner $M$ (frequently $M^{-1} = 1$ is used) and $c$;
2. $r_0 \leftarrow b - A x_0$, $s_0 \leftarrow c - A^* y_0$;
3. $d_0 \leftarrow M^{-1} r_0$, $f_0 \leftarrow M^{-*} s_0$;
4. for $k = 0, 1, \dots$ do
5. $\alpha_k \leftarrow \frac{s_k^* M^{-1} r_k}{f_k^* A d_k}$;
6. $x_{k+1} \leftarrow x_k + \alpha_k d_k$ $\left( y_{k+1} \leftarrow y_k + \overline{\alpha_k} f_k \right)$;
7. $r_{k+1} \leftarrow r_k - \alpha_k A d_k$, $s_{k+1} \leftarrow s_k - \overline{\alpha_k} A^* f_k$ ($r_k = b - A x_k$ and $s_k = c - A^* y_k$ are the residuals);
8. $\beta_k \leftarrow \frac{s_{k+1}^* M^{-1} r_{k+1}}{s_k^* M^{-1} r_k}$;
9. $d_{k+1} \leftarrow M^{-1} r_{k+1} + \beta_k d_k$, $f_{k+1} \leftarrow M^{-*} s_{k+1} + \overline{\beta_k} f_k$.
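The steps above can be sketched in code. This is a minimal, unpreconditioned variant ($M = 1$), written with NumPy; the default shadow right-hand side $c = b$ and the zero starting vectors are choices made here for illustration, not prescribed by the algorithm:

```python
import numpy as np

def bicg(A, b, c=None, tol=1e-10, max_iter=None):
    """Unpreconditioned BiCG (M = 1) for A x = b; a sketch, not production code.

    c is the shadow right-hand side; by default we simply reuse b.
    """
    n = A.shape[0]
    max_iter = n if max_iter is None else max_iter
    c = b if c is None else c
    x = np.zeros_like(b)            # x_0 = 0
    y = np.zeros_like(c)            # y_0 = 0
    r = b - A @ x                   # r_0 = b - A x_0
    s = c - A.conj().T @ y          # s_0 = c - A* y_0
    d, f = r.copy(), s.copy()       # d_0 = r_0, f_0 = s_0 since M = 1
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (s.conj() @ r) / (f.conj() @ Ad)        # alpha_k
        x = x + alpha * d
        y = y + np.conj(alpha) * f
        r_new = r - alpha * Ad                          # r_{k+1}
        s_new = s - np.conj(alpha) * (A.conj().T @ f)   # s_{k+1}
        beta = (s_new.conj() @ r_new) / (s.conj() @ r)  # beta_k
        d = r_new + beta * d                            # d_{k+1}
        f = s_new + np.conj(beta) * f                   # f_{k+1}
        r, s = r_new, s_new
    return x
```

Note that the divisions computing $\alpha_k$ and $\beta_k$ break down when $s_k^* r_k$ or $f_k^* A d_k$ vanishes; practical solvers guard against this, which is one motivation for stabilized variants.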

## Discussion

The BiCG method is numerically unstable, but very important from a theoretical point of view: define the iteration steps by

$x_k := x_j + P_k A^{-1} \left( b - A x_j \right)$ and $y_k := y_j + A^{-*} P_k^* \left( c - A^* y_j \right)$,

using the projection

$P_k := \mathbf{u}_k \left( \mathbf{v}_k^* A \mathbf{u}_k \right)^{-1} \mathbf{v}_k^* A$,

where $\mathbf{u}_k = \left( u_0, u_1, \dots, u_{k-1} \right)$ and $\mathbf{v}_k = \left( v_0, v_1, \dots, v_{k-1} \right)$. These related projections may be iterated themselves, as

$P_{k+1} = P_k + \left( 1 - P_k \right) u_k \otimes \frac{v_k^* A \left( 1 - P_k \right)}{v_k^* A \left( 1 - P_k \right) u_k}.$

The new directions $d_k := \left( 1 - P_k \right) u_k$ and $f_k := \left( A \left( 1 - P_k \right) A^{-1} \right)^* v_k$ then are orthogonal to the residuals: $v_i^* r_k = f_i^* r_k = 0$ and $s_k^* u_j = s_k^* d_j = 0$, which themselves satisfy $r_k = A \left( 1 - P_k \right) A^{-1} r_j$ and $s_k = \left( 1 - P_k \right)^* s_j$.
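Both the definition of $P_k$ and its rank-one update can be sanity-checked numerically. The sketch below uses generic random real bases (so $*$ is just the transpose); the dimensions and seed are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # generic, nonsymmetric
U = rng.standard_normal((n, k))   # columns u_0, ..., u_{k-1}
V = rng.standard_normal((n, k))   # columns v_0, ..., v_{k-1}

# P_k = u_k (v_k^* A u_k)^{-1} v_k^* A   (real case: * is transpose)
P = U @ np.linalg.inv(V.T @ A @ U) @ V.T @ A
assert np.allclose(P @ P, P)      # P_k is an (oblique) projection

# Rank-one update to P_{k+1} using fresh directions u_k, v_k
u = rng.standard_normal(n)
v = rng.standard_normal(n)
w = v @ A @ (np.eye(n) - P)       # row vector v_k^* A (1 - P_k)
P_next = P + np.outer((np.eye(n) - P) @ u, w) / (w @ u)

# ... agrees with building P_{k+1} from scratch on the enlarged bases.
U2, V2 = np.column_stack([U, u]), np.column_stack([V, v])
P_direct = U2 @ np.linalg.inv(V2.T @ A @ U2) @ V2.T @ A
assert np.allclose(P_next, P_direct)
```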

The biconjugate gradient method now makes a special choice and uses the setting

$u_k := M^{-1} r_k$ and $v_k := M^{-*} s_k$.

This special choice allows one to avoid direct evaluations of $P_k$ and $A^{-1}$, and subsequently leads to the algorithm as stated above.

## Properties

• If $A = A^*$ is self-adjoint, $y_0 = x_0$ and $c = b$, then $r_k = s_k$, $d_k = f_k$, and the method coincides with the conjugate gradient method, producing the same sequence $x_k = y_k$.
• In finite dimensions $x_n = A^{-1} b$, at the latest when $P_n = 1$: the biconjugate gradient method returns the exact solution after iterating the full space and thus is a direct method.
• The sequences produced by the algorithm are biorthogonal: $f_i^* A d_j = 0$ and $s_i^* M^{-1} r_j = 0$ for $i \ne j$.
• if $p_{j'}$ is a polynomial with $\deg\left(p_{j'}\right) + j < k$, then $s_k^* p_{j'}\left(M^{-1} A\right) u_j = 0$; the algorithm thus produces projections onto the Krylov subspace;
• if $p_{i'}$ is a polynomial with $i + \deg\left(p_{i'}\right) < k$, then $v_i^* p_{i'}\left(A M^{-1}\right) r_k = 0$.
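The biorthogonality property $f_i^* A d_j = 0$ for $i \ne j$ can be observed directly by running a few BiCG steps and collecting the directions. This sketch uses the unpreconditioned real case ($M = 1$, $c = b$, zero starting vectors, so $*$ is the transpose and the scalars are real); the matrix and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned, nonsymmetric
b = rng.standard_normal(n)

# Unpreconditioned BiCG (M = 1, c = b, x_0 = y_0 = 0), keeping all directions.
r = b.copy()
s = b.copy()
d, f = r.copy(), s.copy()
D, F = [d.copy()], [f.copy()]
for _ in range(n - 1):
    rho = s @ r                      # s_k^* r_k
    alpha = rho / (f @ (A @ d))      # alpha_k
    r = r - alpha * (A @ d)          # r_{k+1}
    s = s - alpha * (A.T @ f)        # s_{k+1}
    beta = (s @ r) / rho             # beta_k
    d = r + beta * d                 # d_{k+1}
    f = s + beta * f                 # f_{k+1}
    D.append(d.copy())
    F.append(f.copy())

# G[i, j] = f_i^* A d_j should be (numerically) diagonal.
G = np.array(F) @ A @ np.array(D).T
off = G - np.diag(np.diag(G))
assert np.max(np.abs(off)) < 1e-8 * np.max(np.abs(np.diag(G)))
```

In exact arithmetic the off-diagonal entries of $G$ are exactly zero; in floating point they drift away from zero as the iteration proceeds, which is one concrete face of the instability mentioned above.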
