Demystifying EnbPI: Mastering Conformal Prediction Forecasting

Valeriy Manokhin, PhD, MBA, CQF
6 min read · Aug 11, 2023

In my article “Conformal Prediction Forecasting with MAPIE,” we explored how Conformal Prediction enables us to construct powerful probabilistic forecasting models. Unlike other probabilistic forecasting methods, Conformal Prediction forecasting models can generate prediction intervals that encapsulate ground truth values at user-determined coverage levels. In this article, we will look under the hood of EnbPI.

make every forecast count

EnbPI is a Conformal Prediction method that can construct distribution-free prediction intervals for time series; the technique wraps around any bootstrap ensemble estimator. Unlike classical Conformal Prediction methods, it does not require the data exchangeability assumption.

In simpler terms, the data exchangeability assumption implies that the order in which observations appear in the dataset is irrelevant: permuting the observations does not alter the joint statistical properties of the data.

However, in time series analysis, this exchangeability assumption does not hold, as the order of the data points often carries significant information. This necessitates the development of a particular class of Conformal Prediction methods designed to handle time series data, where the order is crucial.

Conformal Prediction for time series has become a popular 🔥🔥🔥🔥🔥 and rapidly growing 🚀🚀🚀🚀🚀 area in both research and practical applications, leading to the creation of many robust models in academia and industry.

Because EnbPI does not require exchangeability, it can wrap around any bootstrap ensemble estimator to construct sequential prediction intervals. It is generally easy to implement, computationally efficient, and well-suited to a wide range of regression functions.

EnbPI was the first Conformal Prediction model developed explicitly for time series, created by researchers from Georgia Tech and presented at ICML 2021. Since then, the model has been implemented in multiple Conformal Prediction libraries, including MAPIE and Amazon Fortuna.

EnbPI is suitable for non-stationary time series: the prediction intervals it produces enjoy approximately valid marginal coverage under mild assumptions on the time-series stochastic errors and the regression estimators, and the method may also attain approximately valid conditional coverage.

In the EnbPI paper, the authors extensively evaluated the method on renewable-energy estimation applications using solar and wind data, finding that EnbPI maintains coverage where competing methods fail to do so. The authors also demonstrated broad applicability on time series from other domains, where EnbPI intervals were often shorter than those of competing methods.

So, how does EnbPI work under the hood?

Here are the steps for constructing an EnbPI predictor:

  1. Choose a base estimator: EnbPI can wrap around any regression model used as a bootstrap ensemble estimator.
  2. Fit B ensemble estimators (in practice 10–20): sample with replacement from the training dataset B times and fit one estimator on each bootstrap sample. The result is B different fitted models.
  3. For each training point t = 1, …, T, compute the residual using only the estimators whose bootstrap samples did not include point t. The idea is to use out-of-sample errors as a nonconformity measure indicating the variance of the predictions. Append all such out-of-sample errors into one array.
  4. Generate point predictions: aggregate the bootstrap ensemble predictions on the test data (for example, using their median or mean).
  5. Construct prediction intervals: combine the point predictions with a quantile of the distribution of out-of-sample errors from Step 3, chosen according to the significance level. If the user sets the confidence level at 95%, one chooses the 95% quantile of the error distribution as the interval width (see the sketch after this list).
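To make Steps 3–5 concrete, here is a minimal numpy sketch (a hypothetical helper for illustration only; it simplifies the paper by aggregating all B models at test time rather than using the paper's leave-one-out aggregation):

import numpy as np

def enbpi_intervals(bs_indices, bs_train_preds, bs_test_preds, y_train, alpha=0.05):
    # bs_indices:     (B, n_train) training positions drawn for each bootstrap model
    # bs_train_preds: (B, n_train) each model's predictions on the full training set
    # bs_test_preds:  (B, n_test)  each model's predictions on the test set
    B, n_train = bs_train_preds.shape

    # Step 3: out-of-sample residuals -- for each training point, aggregate only
    # the models whose bootstrap sample did not contain that point.
    residuals = np.empty(n_train)
    for t in range(n_train):
        oob = np.array([t not in bs_indices[b] for b in range(B)])
        # Fall back to all models in the rare case every sample contains t.
        preds = bs_train_preds[oob, t] if oob.any() else bs_train_preds[:, t]
        residuals[t] = np.abs(y_train[t] - preds.mean())

    # Step 4: point predictions by aggregating the ensemble (here: the mean).
    point_preds = bs_test_preds.mean(axis=0)

    # Step 5: interval width = (1 - alpha) quantile of the out-of-sample errors.
    width = np.quantile(residuals, 1 - alpha)
    return point_preds - width, point_preds + width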

EnbPI is available in several libraries, including MAPIE and Amazon Fortuna. We will demonstrate how to use EnbPI on the bike-sharing dataset, following the EnbPI example from Amazon Fortuna.

The objective is to predict demand in the column “count”.

Average hourly bike demand

We can plot the bike demand; the pattern shows clear daily seasonality.
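A quick way to see it (a sketch only; it assumes the frame has an "hour" column, which may differ from the notebook's exact preprocessing):

import matplotlib.pyplot as plt

# Average demand per hour of day reveals the daily seasonal pattern.
df.groupby("hour")["count"].mean().plot(xlabel="hour of day", ylabel="average count")
plt.show()

As is standard, we then split the dataset into training and test sets: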

from sklearn.model_selection import train_test_split

# Subsample 500 rows, normalise the target, and split without shuffling so the
# hold-out set preserves temporal order.
df = df.sample(500)
y = df["count"] / df["count"].max()
X = df.drop("count", axis="columns")
X_train, X_test = train_test_split(X, test_size=0.2, shuffle=False)
y_train, y_test = train_test_split(y, test_size=0.2, shuffle=False)

EnbPI requires bootstrapping the data, i.e., sampling random subsets of the time series with replacement and training a model on each sample.

import numpy as np


class DataFrameBootstrapper:
    def __init__(self, n_samples: int):
        self.n_samples = n_samples

    def __call__(
        self, X: np.ndarray, y: np.ndarray
    ) -> tuple[np.ndarray, list[tuple[np.ndarray, np.ndarray]]]:
        # Draw n_samples index arrays, each of length len(y), with replacement.
        indices = np.random.choice(y.shape[0], size=(self.n_samples, y.shape[0]))
        return indices, [(X.iloc[idx], y.iloc[idx]) for idx in indices]


n_bs_samples = 10
bs_indices, bs_train_data = DataFrameBootstrapper(n_samples=n_bs_samples)(
    X_train, y_train
)

In this example, we have created ten bootstrap samples using DataFrameBootstrapper(). bs_indices is a 10×400 array of positional indices into the training dataset: one row of 400 indices per bootstrap sample, matching the 400 rows of the training data.

We can look at the first bootstrap sample of X

First bootstrap sample of features X

We can look at the first bootstrap sample of y

First bootstrap sample of y

The sampling is with replacement; we can confirm this by noting that the first index (1306) appears more than once in the first bootstrap sample.

The first duplicated index in the first bootstrap sample
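We can also verify this programmatically (assuming bs_train_data from the bootstrapper above):

X_bs0, y_bs0 = bs_train_data[0]

# Rows drawn more than once carry the same original DataFrame index label.
print(X_bs0.index.duplicated().any())             # True: sampling is with replacement
print(X_bs0.index[X_bs0.index.duplicated()][:5])  # e.g. 1306 appears repeatedly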

Now that we understand how the bootstrap samples were created, we can fit a Histogram-based Gradient Boosting Regression Tree on each bootstrap sample and produce predictions from each of the 10 ensemble models.

Fit the model on each of the 10 ensemble models and compute predictions
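In line with the Fortuna example, fitting one model per bootstrap sample might look like this (a sketch; it assumes the features are numeric and uses scikit-learn's HistGradientBoostingRegressor):

import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

bs_train_preds, bs_test_preds = [], []
for X_bs, y_bs in bs_train_data:
    # One model per bootstrap sample (Step 2 of the algorithm).
    model = HistGradientBoostingRegressor().fit(X_bs, y_bs)
    bs_train_preds.append(model.predict(X_train))  # used for out-of-sample residuals
    bs_test_preds.append(model.predict(X_test))

bs_train_preds = np.array(bs_train_preds)  # shape (10, 400)
bs_test_preds = np.array(bs_test_preds)    # shape (10, 100)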

Now that we have predictions from each of the 10 models, we can delegate the rest to the EnbPI implementation in Amazon Fortuna to compute conformal prediction intervals.

EnbPI in Amazon Fortuna
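In code, the Fortuna call looks roughly like this (the import path and argument names follow the Fortuna EnbPI example as I recall it; consult the library documentation for your version):

from fortuna.conformal import EnbPI

error = 0.05  # miscoverage rate, i.e. 1 minus the 95% target coverage
enbpi = EnbPI()
conformal_intervals = enbpi.conformal_interval(
    bootstrap_indices=bs_indices,
    bootstrap_train_preds=bs_train_preds,
    bootstrap_test_preds=bs_test_preds,
    train_targets=y_train.values,
    error=error,
)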

We can compute the actual coverage using Fortuna functionality: the coverage is 96.99%, exceeding the user-specified confidence level of 95%. Excellent!

As a bonus, we also notice that the prediction intervals contain the point predictions in 100% of cases.
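Fortuna provides metrics for this, but both checks are easy to reproduce by hand with numpy (assuming conformal_intervals comes back as an (n_test, 2) array of lower/upper bounds):

import numpy as np

lower, upper = conformal_intervals[:, 0], conformal_intervals[:, 1]
point_preds = bs_test_preds.mean(axis=0)

coverage = np.mean((y_test.values >= lower) & (y_test.values <= upper))
inside = np.mean((point_preds >= lower) & (point_preds <= upper))
print(f"Coverage: {coverage:.4f}")                         # ~0.9699 in this run
print(f"Point predictions inside intervals: {inside:.2f}")  # 1.00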

Prediction intervals produced by EnbPI

Inspecting the prediction intervals produced by EnbPI, we see that they have constant width. This is expected in the baseline version without online feedback.

Can we do better? Indeed we can: we can use EnbPI with online feedback to recompute the prediction intervals after observing every new point in the test dataset. A great advantage of EnbPI is that this does not require retraining the B ensemble models; they are trained only once, and all we do is recompute the residuals and adjust the width of the prediction intervals accordingly. As we will see, the prediction intervals become adaptive.
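Conceptually, the online update is just a rolling refresh of the residual pool. A minimal numpy sketch (not Fortuna's API, which wraps this logic for you; train_residuals is the hypothetical array of out-of-sample training residuals from Step 3):

# After each new observation, append its residual to the pool, drop the oldest
# one, and recompute the interval width -- no model is retrained.
residuals = list(train_residuals)
intervals = []
for i in range(len(y_test)):
    width = np.quantile(residuals, 1 - error)
    pred = bs_test_preds[:, i].mean()
    intervals.append((pred - width, pred + width))
    residuals.append(abs(y_test.values[i] - pred))  # online feedback
    residuals.pop(0)  # sliding window keeps the pool size fixed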

Prediction intervals produced by EnbPI with online feedback

We can compute coverage the same way as we did in offline mode. In online mode, we still have coverage of 95.99%, exceeding the user-specified level of 95%, and again all point predictions fall inside the prediction intervals.

Percentage of intervals containing average bootstrap predictions: 1.0. Percentage of intervals containing true targets: 0.9599999785423279.

We have seen how the Conformal Prediction model EnbPI produces excellent prediction intervals for time series.

If you want to learn more about Conformal Prediction, consider my book “Practical Guide to Applied Conformal Prediction: Learn and apply the best uncertainty frameworks to your industry applications”

References:

  1. Manokhin, V. “Practical Guide to Applied Conformal Prediction: Learn and apply the best uncertainty frameworks to your industry applications.”
  2. Manokhin, V. Awesome Conformal Prediction.
  3. Manokhin, V. “Conformal Prediction Forecasting with MAPIE.”
  4. Amazon Fortuna. “Time series regression with EnbPI, a conformal prediction method.”
  5. Xu, C. and Xie, Y. “Conformal Prediction Interval for Dynamic Time-Series.” ICML 2021.
