Let $f(x)$ be an integrable real-valued function with respect to the positive measure $\rho(x)dx$ in a real vector space, which is spanned by the complete set of orthonormal basis functions $\{\phi_n(x)\}_{n=0}^{\infty}$, where $\phi_n(x)=\sqrt{\rho(x)}\,P_n(x)$. Orthogonality is defined as

$\int \phi_n(x)\,\phi_m(x)\,dx = \int \rho(x)\,P_n(x)\,P_m(x)\,dx = \delta_{n,m}$.  (1)

This suggests that we could take $P_n(x)$ to be an orthogonal polynomial of degree $n$ in $x$ with pure continuous spectrum and where $\rho(x)$ is the associated positive weight function [6,7,8]. The spectral theorem (a.k.a. Favard's theorem) dictates that such polynomials satisfy the following symmetric three-term recursion relation

$x\,P_n(x) = a_n P_n(x) + b_{n-1} P_{n-1}(x) + b_n P_{n+1}(x)$,  (2)
for $n=1,2,3,\ldots$, and where the “recursion coefficients” $\{a_n,b_n\}$ are real constants such that $b_n\neq 0$ for all $n$. This recursion gives all the polynomials of any degree starting with the two seed values $P_0(x)=1$ and $P_1(x)=(x-a_0)/b_0$ (i.e., the recursion (2) evaluated at $n=0$ with $P_{-1}(x)\equiv 0$). Associated with this space is an infinite dimensional real tridiagonal symmetric matrix (Jacobi matrix) whose elements are

$J_{n,m} = a_n\,\delta_{n,m} + b_n\,\delta_{n,m-1} + b_{n-1}\,\delta_{n,m+1}$.  (3)
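As a concrete illustration of the recursion (2) and the matrix (3), the short sketch below uses the recursion coefficients of the orthonormal Legendre polynomials on $[-1,1]$ with weight $\rho(x)=1/2$, namely $a_n=0$ and $b_n=(n+1)/\sqrt{(2n+1)(2n+3)}$; this particular choice is only an assumption made here for illustration, and any other valid set $\{a_n,b_n\}$ would be handled identically.

```python
import numpy as np

def legendre_coeffs(nmax):
    """Recursion coefficients of the orthonormal Legendre polynomials
    (illustrative choice): a_n = 0, b_n = (n+1)/sqrt((2n+1)(2n+3))."""
    n = np.arange(nmax + 1)
    return np.zeros(nmax + 1), (n + 1) / np.sqrt((2 * n + 1) * (2 * n + 3))

def eval_polynomials(x, nmax, a, b):
    """Evaluate P_0(x), ..., P_nmax(x) through the recursion (2),
    x P_n = a_n P_n + b_{n-1} P_{n-1} + b_n P_{n+1},
    started from the seeds P_0 = 1 and P_1 = (x - a_0)/b_0."""
    P = np.zeros(nmax + 1)
    P[0] = 1.0
    P[1] = (x - a[0]) / b[0]
    for n in range(1, nmax):
        P[n + 1] = ((x - a[n]) * P[n] - b[n - 1] * P[n - 1]) / b[n]
    return P

def jacobi_matrix(N, a, b):
    """Assemble the N x N truncation of the Jacobi matrix (3):
    a_n on the diagonal and b_n on the two neighbouring diagonals."""
    return np.diag(a[:N]) + np.diag(b[:N - 1], 1) + np.diag(b[:N - 1], -1)

a, b = legendre_coeffs(10)
print(eval_polynomials(0.3, 5, a, b))   # P_0(0.3), ..., P_5(0.3)
print(jacobi_matrix(5, a, b))           # 5 x 5 truncated Jacobi matrix
```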
For numerical computations, however, the space is truncated to a finite $N$-dimensional subspace spanned by $\{\phi_n(x)\}_{n=0}^{N-1}$. The tridiagonal matrix (3) becomes a finite $N\times N$ matrix $J$. The $N$ real and distinct eigenvalues of $J$, which we designate as the set $\{\varepsilon_k\}_{k=0}^{N-1}$, are the zeros of the polynomial $P_N(x)$ [i.e., $P_N(\varepsilon_k)=0$]. Let $\Lambda_k$ be the normalized eigenvector of $J$ associated with the eigenvalue $\varepsilon_k$. In this setting, Gauss quadrature integral approximation states that [1,2,3]

$\int \rho(x)\,f(x)\,dx \approx \sum_{k=0}^{N-1}\lambda_k\,f(\varepsilon_k)$,  (4)
where the “numerical weights” could be evaluated as $\lambda_k=\left[(\Lambda_k)_0\right]^2$, the square of the zeroth component of $\Lambda_k$. Due to the lower numerical cost in computing matrix eigenvalues instead of eigenvectors, we can also write these numerical weights in terms of $\{\varepsilon_k\}_{k=0}^{N-1}$ and another set of eigenvalues $\{\tilde{\varepsilon}_j\}_{j=0}^{N-2}$ as

$\lambda_k = \prod_{j=0}^{N-2}\left(\varepsilon_k-\tilde{\varepsilon}_j\right)\Big/\prod_{j=0,\,j\neq k}^{N-1}\left(\varepsilon_k-\varepsilon_j\right)$,  (5)
where $\{\tilde{\varepsilon}_j\}_{j=0}^{N-2}$ is the set of eigenvalues of the submatrix of $J$ obtained by deleting the first (zeroth) row and first column. If sorted, these eigenvalues interlace as $\varepsilon_0<\tilde{\varepsilon}_0<\varepsilon_1<\tilde{\varepsilon}_1<\cdots<\tilde{\varepsilon}_{N-2}<\varepsilon_{N-1}$. The integral approximation (4) becomes exact if $f(x)$ is a polynomial in $x$ of degree less than or equal to $2N-1$.
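The following minimal sketch puts the statements (4), (5), and the exactness property together numerically, again under the illustrative Legendre assumption above ($a_n=0$, $b_n=(n+1)/\sqrt{(2n+1)(2n+3)}$, $\rho(x)=1/2$ on $[-1,1]$): the nodes $\varepsilon_k$ and weights $\lambda_k$ are read off the eigen-decomposition of the $N\times N$ matrix $J$, the weights are re-derived from the two eigenvalue sets as in (5), the interlacing is checked, and a polynomial of degree $2N-1$ is integrated exactly.

```python
import numpy as np

N = 8

# Illustrative (assumed) coefficients: orthonormal Legendre, rho(x) = 1/2 on [-1, 1].
n = np.arange(N)
a = np.zeros(N)
b = (n + 1) / np.sqrt((2 * n + 1) * (2 * n + 3))

# Truncated N x N Jacobi matrix (3) and its eigen-decomposition.
J = np.diag(a) + np.diag(b[:N - 1], 1) + np.diag(b[:N - 1], -1)
eps, Lam = np.linalg.eigh(J)      # nodes eps_k and eigenvector matrix (columns Lambda_k)
lam = Lam[0, :] ** 2              # numerical weights: lambda_k = [(Lambda_k)_0]^2

# Weights from eigenvalues only, equation (5): the second set comes from the
# submatrix of J with its first (zeroth) row and column deleted.
eps_sub = np.linalg.eigvalsh(J[1:, 1:])
lam_eigs = np.array([np.prod(eps[k] - eps_sub) / np.prod(np.delete(eps[k] - eps, k))
                     for k in range(N)])
print(np.max(np.abs(lam - lam_eigs)))                            # agreement to roundoff

# Interlacing of the two sorted eigenvalue sets.
print(np.all(eps[:-1] < eps_sub) and np.all(eps_sub < eps[1:]))  # True

# Exactness: the rule (4) integrates rho(x) f(x) exactly for deg f <= 2N - 1.
f = lambda x: x ** (2 * N - 1) + 3.0 * x ** 2
exact = 1.0                        # = int_{-1}^{1} (1/2)(x^15 + 3 x^2) dx
print(np.sum(lam * f(eps)) - exact)                              # ~ machine precision
```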
If, instead of the integral (4), we have an integral that involves a derivative, then it could be approximated by an analogous quadrature sum whose coefficients are named the “derivative weights”.
Very often, one encounters integrals that represent matrix elements of functions in the basis set $\{\phi_n(x)\}$ (for example, the matrix elements of a potential function in quantum mechanics) of the form

$f_{n,m} := \int \phi_n(x)\,f(x)\,\phi_m(x)\,dx = \int \rho(x)\,P_n(x)\,f(x)\,P_m(x)\,dx$.  (6)

Now, it is well established that $(\Lambda_k)_n=\sqrt{\lambda_k}\,P_n(\varepsilon_k)$ for $n=0,1,\ldots,N-1$. Therefore, applying the quadrature (4) to the integrand $P_n(x)f(x)P_m(x)$ gives $f_{n,m}\approx\sum_{k}\lambda_k P_n(\varepsilon_k)f(\varepsilon_k)P_m(\varepsilon_k)=\sum_k(\Lambda_k)_n f(\varepsilon_k)(\Lambda_k)_m$, and with $\Lambda$ denoting the $N\times N$ matrix whose elements are $\Lambda_{n,k}=(\Lambda_k)_n$, this could be rewritten in matrix form as

$\mathbf{f} \approx \Lambda\,F\,\Lambda^{\mathrm{T}}$,  (7)

where $\mathbf{f}$ is the matrix of elements (6) and $F$ is a diagonal matrix with elements $F_{k,k'}=f(\varepsilon_k)\,\delta_{k,k'}$.
Therefore, to obtain an approximate evaluation of integrals using Gauss quadrature associated with orthogonal polynomials satisfying (1) and (2), one needs only the tridiagonal symmetric matrix $J$, which is constructed using the recursion coefficients $\{a_n,b_n\}$, and possibly the weight function $\rho(x)$ for integrals of the type (6).
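As a final sketch, the matrix-element evaluation (6)-(7) can be checked numerically under the same illustrative assumptions (orthonormal Legendre coefficients, $\rho(x)=1/2$ on $[-1,1]$) and an arbitrarily chosen sample function $f(x)=e^{-x^2}$. The approximation $\Lambda F\Lambda^{\mathrm{T}}$ is built from $J$ alone and compared against a direct high-order evaluation of the integrals in (6); consistent with the exactness criterion above, the agreement is excellent for low-order entries and deteriorates toward $n,m\approx N-1$, where the degree of $P_n(x)P_m(x)$ leaves little room below the $2N-1$ limit.

```python
import numpy as np

N = 12
f = lambda x: np.exp(-x ** 2)      # arbitrary sample function

# Illustrative (assumed) coefficients: orthonormal Legendre, rho(x) = 1/2 on [-1, 1].
n = np.arange(N)
a = np.zeros(N)
b = (n + 1) / np.sqrt((2 * n + 1) * (2 * n + 3))

# Nodes, weights, and eigenvectors from the truncated Jacobi matrix J alone.
J = np.diag(a) + np.diag(b[:N - 1], 1) + np.diag(b[:N - 1], -1)
eps, Lam = np.linalg.eigh(J)

# Equation (7): the matrix of elements f_{n,m} approximated by Lam F Lam^T.
F = np.diag(f(eps))
f_approx = Lam @ F @ Lam.T

# Reference: direct evaluation of the integrals (6) with an independent,
# much finer Gauss-Legendre rule; P_n(x) is generated by the recursion (2).
x, w = np.polynomial.legendre.leggauss(100)
P = np.zeros((N, x.size))
P[0] = 1.0
P[1] = (x - a[0]) / b[0]
for k in range(1, N - 1):
    P[k + 1] = ((x - a[k]) * P[k] - b[k - 1] * P[k - 1]) / b[k]
f_direct = (P * (0.5 * w * f(x))) @ P.T    # f_{n,m} = int (1/2) P_n f P_m dx

err = np.abs(f_approx - f_direct)
# The low-order block agrees to high accuracy; the largest deviation sits in the
# highest-order corner, where P_n f P_m exceeds the exactness limit 2N - 1.
print(err[:N // 2, :N // 2].max(), err.max())
```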