Finite differences and modelling

This notebook explains a few of the subtleties of the common assumptions behind finite difference schemes.

It also outlines one of the key advantages of probfindiff over competing packages: it makes modelling explicit.

[2]:
import jax.numpy as jnp

from probfindiff import central

Whenever you use probfindiff, remember that you are essentially building a Gaussian process model. The computation of probabilistic-numerical (PN) finite difference schemes assumes that the to-be-differentiated function \(f\) is

\[f \sim \text{GP}\,(0, k)\]

for some covariance kernel function \(k\). (This assumption is also implicit in non-probabilistic schemes – more on this below.) The assumption is built into the probfindiff code: central, forward, backward, and custom schemes are automatically tailored to Gaussian covariance kernel functions.

[3]:
# Exponentiated quadratic ("Gaussian") covariance kernel
k_exp_quad = lambda x, y: jnp.exp(-jnp.dot(x - y, x - y) / 2.0)

scheme, xs = central(dx=1.0, kernel=k_exp_quad)
print(scheme)
FiniteDifferenceScheme(weights=DeviceArray([-7.014634e-01,  4.714658e-08,  7.014634e-01], dtype=float32), covs_marginal=DeviceArray(0.14908189, dtype=float32), order_derivative=DeviceArray(1, dtype=int32, weak_type=True))
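
The returned weights act on function evaluations at the stencil points. As a minimal sketch of how to apply the scheme by hand (assuming, as in the output above, that the stencil xs is centred at zero, so that the weighted sum estimates the derivative at \(x=0\); the function \(f=\sin\) is only a placeholder):

[ ]:
# A placeholder example function; any differentiable f works.
f = jnp.sin

# The derivative estimate is the weighted sum of the function values at
# the stencil points; covs_marginal quantifies the remaining uncertainty.
dfx_estimate = scheme.weights @ f(xs)
print(dfx_estimate, jnp.cos(0.0))  # estimate vs. true derivative at x=0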

Did you know that the traditional finite difference coefficients \(c=(1, -2, 1)\) for the second derivative implicitly assume that the function to be differentiated is a polynomial?

[4]:
# Polynomial covariance kernel: k(x, y) = (x.y)^2 + (x.y) + 1
k_poly = lambda x, y: jnp.polyval(x=jnp.dot(x, y), p=jnp.ones((3,)))
scheme, xs = central(dx=1.0, order_derivative=2, kernel=k_poly)
print(scheme.weights, jnp.allclose(scheme.weights, jnp.array([1.0, -2.0, 1.0])))
[ 1. -2.  1.] True
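
For contrast, the same second-derivative scheme under the exponentiated quadratic kernel from above yields different weights. A minimal sketch illustrating that the kernel, not only the stencil, determines the coefficients:

[ ]:
# Same stencil and derivative order, different model: under the
# exponentiated quadratic kernel, the weights generally deviate from
# the classical polynomial coefficients (1, -2, 1).
scheme_gauss, _ = central(dx=1.0, order_derivative=2, kernel=k_exp_quad)
print(scheme_gauss.weights)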

Whether this is right or wrong for your application is for you to decide. So next time you choose a finite difference scheme, remember that you do not have to live like this: you can compute finite difference formulas that are a perfect fit for your model, instead of building a model around some magic finite difference scheme.