Motivation
I know that in a finite dimensional Euclidean space $\Bbb{E}^n$, for every basis $G=\{g_1,g_2,...,g_n\} \subset \Bbb{E}^n$ we can define a dual basis $G'=\{g^1,g^2,...,g^n\} \subset \Bbb{E}^n$ such that
$$g_i \cdot g^j = \delta_{i}^{j} \tag{1}$$
Also, it can be proved that such a basis exists and is unique. The main advantage of a dual basis is that when we write an arbitrary vector as a linear combination of the original basis $G$, we can obtain the coefficients of the linear combination simply by using the duality relation $(1)$ between $G$ and $G'$, as shown below
$$\begin{align} x &= \sum_{i=1}^{n}x^i g_i \\ x \cdot g^j &= \sum_{i=1}^{n} x^i g_i \cdot g^j = \sum_{i=1}^{n} x^i \delta_{i}^{j} = x^j \\ x &= \sum_{i=1}^{n}(x \cdot g^i) g_i \end{align} \tag{2}$$
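Just to make the finite-dimensional picture concrete, here is a minimal numerical sketch (the particular basis below is an arbitrary illustrative choice): with the basis vectors stored as the columns of a matrix $G$, the dual basis vectors are the columns of $(G^T)^{-1}$, and the coefficients of any $x$ then follow from $(2)$.

```
import numpy as np

# A (non-orthogonal) basis of R^3: the columns of G are g_1, g_2, g_3.
G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Dual basis: the columns of D are g^1, g^2, g^3, chosen so that g_i . g^j = delta_i^j,
# i.e. G^T D = I, hence D = (G^T)^{-1}.
D = np.linalg.inv(G.T)
print(np.allclose(G.T @ D, np.eye(3)))   # True: property (1)

# Recover the coefficients of an arbitrary vector x, as in (2): x^i = x . g^i
x = np.array([2.0, -1.0, 3.0])
coeffs = D.T @ x
print(np.allclose(G @ coeffs, x))        # True: x = sum_i x^i g_i
```

Of course, when the original basis is orthonormal we have $(G^T)^{-1}=G$, so the dual basis coincides with the basis itself.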
Now, let us move to the infinite-dimensional space of infinitely differentiable functions $f(x)$ on the interval $[-a,a]$. We all know that the eigenfunctions of a Sturm-Liouville operator form an orthonormal basis for such functions under suitable boundary conditions at the endpoints. However, this is a particularly nice operator: it is self-adjoint, and that is what produces orthogonal eigenfunctions. In some partial differential equations (PDEs), we encounter non-self-adjoint operators, and we must expand our solution and boundary data in terms of their eigenfunctions, which unfortunately are not orthogonal anymore!
Just to give an example, consider the following biharmonic boundary value problem (BVP)
$$\begin{array}{lll} \Delta^2 \Phi=0 & -a \le x \le a & -b \le y \le b \\ \Phi(a,y)=0 & \Phi_x(a,y)=0 & \\ \Phi(-a,y)=0 & \Phi_x(-a,y)=0 & \\ \Phi(x,b)=f(x) & \Phi_y(x,b)=0 & \\ \Phi(x,-b)=f(x) & \Phi_y(x,-b)=0 & \\ \end{array} \tag{3}$$
where we have the symmetry $f(-x)=f(x)$. Also, for the boundary conditions to be continuous at the corners, we require that
$$f(a)=f(-a)=f'(a)=f'(-a)=0 \tag{4}$$
Then solving this BVP leads to the following eigenvalue problem
$$ \left( \frac{d^4}{dx^4}+2\omega^2\frac{d^2}{dx^2}+\omega^4 \right)X(x)=0 \\ X(a)=X(-a)=X'(a)=X'(-a)=0 \tag{5}$$
whose operator is not self-adjoint. The eigenfunctions are known as the Papkovich-Fadle eigenfunctions. They form a basis for the infinite-dimensional space of infinitely differentiable functions $f(x)$ satisfying $(4)$ on the interval $[-a,a]$. As I mentioned before, these eigenfunctions are not orthogonal, and this makes finding the coefficients $c_i$ of the expansion
$$f(x)= \sum_{i=1}^{\infty} c_i X(\omega_i;x) \tag{6}$$
really difficult: one is led to solving an infinite system of linear algebraic equations for the unknown coefficients $c_i$!
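For what it is worth, the eigenvalues $\omega_i$ appearing in $(6)$ can at least be located numerically. Assuming (this is my own quick calculation, so please double-check) that the even eigenfunctions of $(5)$ can be taken in the form $X(\omega;x)=A\cos(\omega x)+B\,x\sin(\omega x)$, the clamped conditions $X(a)=X'(a)=0$ force a $2\times 2$ determinant to vanish, which reduces to $\sin(2\omega a)+2\omega a=0$; its nonzero roots, and hence the $\omega_i$, are complex. A minimal sketch with `mpmath` (the value of $a$ and the initial guesses are ad hoc):

```
import mpmath as mp

a = 1.0  # half-width of the interval [-a, a]; illustrative value

def det_bc(w):
    # Boundary-condition determinant for the even ansatz X = A cos(wx) + B x sin(wx):
    # the rows correspond to X(a) = 0 and X'(a) = 0.
    r1 = (mp.cos(w * a), a * mp.sin(w * a))
    r2 = (-w * mp.sin(w * a), mp.sin(w * a) + w * a * mp.cos(w * a))
    return r1[0] * r2[1] - r1[1] * r2[0]   # equals sin(2wa)/2 + wa

# Ad-hoc complex initial guesses near the first two roots of sin(z) + z = 0, with z = 2wa
for guess in [(4.2 + 2.25j) / (2 * a), (10.7 + 3.1j) / (2 * a)]:
    w = mp.findroot(det_bc, mp.mpc(guess))
    print(w, abs(det_bc(w)))               # complex eigenvalue and residual
```

If I have not slipped, with $a=1$ the first root comes out near $\omega_1 \approx 2.11 + 1.13\,i$, which already shows that the eigenvalues are genuinely complex.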
Questions
$1.$ Is there an analogue of the dual basis for the $X(\omega_i;x)$'s that can make the computation of the $c_i$'s easier? To be more specific, is there some basis $Y(\omega_j;x)$ such that
$$\int_{-a}^{a} X(\omega_i;x) Y(\omega_j;x) dx =\delta_{ij} \tag{7}$$
which can be considered to play a role similar to that of the $g^j$? If such a basis existed, then we could easily compute the $c_i$'s by using $(6)$ and the biorthogonality relation $(7)$:
$$\int_{-a}^{a} f(x) Y(\omega_j;x) dx = \sum_{i=1}^{\infty} c_i \int_{-a}^{a} X(\omega_i;x) Y(\omega_j;x) dx = \sum_{i=1}^{\infty} c_i \delta_{ij} = c_j \tag{8}$$
$2.$ If the answer to question $1$ is YES, how can it be computed? If NO, why not?