Fractional Differentiation and Memory

Traditionally, differentiation to an integer degree is used to make a series stationary. Fractional differentiation generalizes this by allowing the exponent to be any real number, which makes it possible to achieve stationarity while preserving much of the series' memory.

Fractional differentiation is expressed using the backshift operator $B$ and a real exponent $d$ as:

$$(1-B)^{d} = \sum_{k=0}^{\infty}(-B)^{k}\,\frac{\prod_{i=0}^{k-1}(d-i)}{k!}$$
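Expanding the first few terms makes the binomial pattern explicit:

$$(1-B)^{d} = 1 - dB + \frac{d(d-1)}{2!}B^{2} - \frac{d(d-1)(d-2)}{3!}B^{3} + \cdots$$

For a positive integer $d$ the series terminates, recovering ordinary differencing; for non-integer $d$ it extends indefinitely, which is the source of the preserved memory.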

The differentiated series is then a dot product of weights and lagged values:

$$\tilde{X}_t = \sum_{k=0}^{\infty} \omega_{k} X_{t-k}$$

Here, $\omega_k$ are weights computed from the degree $d$. They shrink toward zero as $k$ grows, so distant observations contribute less and less while still retaining some influence, balancing stationarity against memory preservation.
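To make the dot product concrete, here is a small illustrative computation in Python with the first four weights for $d = 0.5$ and a made-up window of lagged values:

import numpy as np

# First four weights for d = 0.5: omega_0, ..., omega_3
w = np.array([1.0, -0.5, -0.125, -0.0625])
# Hypothetical window [X_t, X_{t-1}, X_{t-2}, X_{t-3}]
x_window = np.array([4.0, 3.0, 2.0, 1.0])
x_tilde = w @ x_window  # 4.0 - 1.5 - 0.25 - 0.0625 = 2.1875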


Weight Sequence and Code Snippets

The sequence of weights can be generated iteratively as:

$$\omega_{0} = 1, \qquad \omega_{k} = -\omega_{k-1}\,\frac{d-k+1}{k}$$
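A minimal Python sketch of this recursion; the name frac_diff_weights is illustrative, not the RiskLabAI API:

def frac_diff_weights(degree: float, size: int) -> list:
    """Return the first `size` weights of the (1 - B)^degree expansion."""
    weights = [1.0]  # omega_0 = 1
    for k in range(1, size):
        weights.append(-weights[-1] * (degree - k + 1) / k)
    return weights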

Python

def weighting(
    degree: float,
    size: int
) -> list:

def plotWeights(
    degreeRange: tuple,
    numberDegrees: int,
    size: int
) -> None:

Julia

function weighting(
    degree::Float64,
    size::Int
)::Vector{Float64}

function plotWeights(
    degreeRange::Tuple,
    numberDegrees::Int,
    size::Int
)::Nothing

View More: Python | Julia

Both functions are available in Python and Julia in the RiskLabAI library.

Convergence of Weights

The weights $\omega_k$ tend to zero as $k$ increases. Their signs also follow a regular pattern: for a positive integer $d$ the expansion terminates and the nonzero weights alternate in sign (the binomial coefficients), while for $0 < d < 1$ the initial weight is $\omega_0 = 1$ and every subsequent weight is negative, decaying slowly toward zero. This slow decay is what balances stationarity and memory in the series.
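A quick numeric check with the frac_diff_weights sketch above, for $d = 0.5$:

print(frac_diff_weights(0.5, 5))
# [1.0, -0.5, -0.125, -0.0625, -0.0390625]

Note the single positive initial weight followed by slowly decaying negative weights.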

In summary, fractional differentiation offers a powerful method for transforming non-stationary financial time series into stationary ones while preserving as much memory as possible. This is essential for both inferential statistics and machine learning models in finance.

Fractional Differentiation Methods for Time Series Analysis

In this blog, we focus on two techniques for fractionally differentiating a finite time series, both from De Prado's Advances in Financial Machine Learning [1]: the "expanding window" method and the "fixed-width window fracdiff" (FFD) method. Both are essential for handling non-stationary time series in financial analytics.

Expanding Window Method

Given a time series $\{X_t\},\ t = 1, \ldots, T$, this approach uses varying weights $\omega_k$ to compute the fractionally differentiated value $\tilde{X}_T$. The weights depend on the degree of differentiation $d$ and a tolerance level $\tau$, which bounds the relative weight loss tolerated for the initial points:

$$\lambda_l = \frac{\sum_{j=T-l}^{T}\left|\omega_{j}\right|}{\sum_{i=0}^{T-1}\left|\omega_{i}\right|}$$

One then selects $l^*$ such that $\lambda_{l^*} \leq \tau$ and $\lambda_{l^*+1} > \tau$, ensuring that the relative weight loss from the dropped initial points never exceeds the threshold $\tau$.
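As a sketch of this cutoff selection in Python (select_cutoff is an illustrative name; it assumes weights holds $\omega_0, \ldots, \omega_{T-1}$ in order):

import numpy as np

def select_cutoff(weights: np.ndarray, tau: float = 0.01) -> int:
    """Largest l with lambda_l <= tau; losses[l-1] equals lambda_l,
    the share of absolute weight mass in the last l weights."""
    abs_w = np.abs(weights)
    losses = np.cumsum(abs_w[::-1]) / abs_w.sum()
    return int(np.searchsorted(losses, tau, side="right"))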

Python

def fracDiff(series: DataFrame,
             degree: float,
             threshold: float = 0.01) -> DataFrame:

Julia

function fracDiff(series::DataFrame,
                  degree::Float64,
                  threshold::Float64 = 0.01)::DataFrame

View More: Python | Julia

Fixed-Width Window Fracdiff (FFD) Method

The fixed-width window fracdiff (FFD) method instead uses a constant vector of weights $\tilde{\omega}_k$, dropping every weight whose modulus falls below a given threshold $\tau$:

$$\tilde{\omega}_{k} = \begin{cases} \omega_{k} & \text{if } k \leq l^{*} \\ 0 & \text{if } k > l^{*} \end{cases}$$

The differentiated series $\tilde{X}_t$ is then computed as:

$$\tilde{X}_t = \sum_{k=0}^{l^{*}} \tilde{\omega}_{k} X_{t-k}$$
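As an end-to-end illustration, here is a NumPy sketch of FFD on a plain 1-D array (ffd_sketch is a hypothetical name and deliberately simpler than the library's DataFrame-based fracDiffFixed):

import numpy as np

def ffd_sketch(x: np.ndarray, d: float, threshold: float = 1e-5) -> np.ndarray:
    """Fixed-width window fractional differentiation of a 1-D series."""
    # Build weights until |omega_k| first drops below the threshold.
    w = [1.0]
    k = 1
    while True:
        w_k = -w[-1] * (d - k + 1) / k
        if abs(w_k) < threshold:
            break
        w.append(w_k)
        k += 1
    w = np.asarray(w)   # omega_0, ..., omega_{l*}
    width = len(w)
    # Dot each window [X_{t-l*}, ..., X_t] with the reversed weights.
    return np.array([w[::-1] @ x[t - width + 1 : t + 1]
                     for t in range(width - 1, len(x))])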

Python

def fracDiffFixed(series: DataFrame,
                  degree: float,
                  threshold: float = 1e-5) -> DataFrame:

Julia

function fracDiffFixed(series::DataFrame,
                       degree::Float64,
                       threshold::Float64 = 1e-5)::DataFrame

View More: Python | Julia

Achieving Stationarity

Both methods let us quantify how much memory (autocorrelation) must be removed to achieve stationarity through the coefficient $d^*$: the smallest degree of differentiation for which the transformed series passes a stationarity test such as the augmented Dickey-Fuller (ADF) test. Many finance studies have traditionally used integer differentiation, often removing more memory than necessary and sacrificing predictive power. A sketch of the $d^*$ search follows the listing below.

Python

def minFFD(input: DataFrame) -> float:

Julia

function minFFD(input::DataFrame)::Float64

View More: Python | Julia
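A sketch of what this search might look like, using statsmodels' adfuller together with the ffd_sketch helper from above (the grid of candidate degrees and the p-value cutoff are illustrative choices):

import numpy as np
from statsmodels.tsa.stattools import adfuller

def min_ffd_sketch(x: np.ndarray,
                   grid=np.linspace(0.0, 1.0, 11),
                   threshold: float = 1e-5,
                   p_cutoff: float = 0.05):
    """Smallest degree d in the grid whose FFD series passes the ADF test."""
    for d in grid:
        series = ffd_sketch(x, d, threshold)
        p_value = adfuller(series, maxlag=1, regression="c", autolag=None)[1]
        if p_value < p_cutoff:
            return d  # stationary at this degree
    return None  # no degree in the grid achieved stationarity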


References

  1. López de Prado, M. (2018). Advances in Financial Machine Learning. John Wiley & Sons.
  2. López de Prado, M. (2020). Machine Learning for Asset Managers. Cambridge University Press.