Recurrent neural architectures for nonlinear volatility modeling


Details
We invite you to the fifty-seventh seminar in the monthly series of meetings conducted jointly by QFRG (Quantitative Finance Research Group) and DSLab (Data Science Lab). The meeting will be devoted to the topic Recurrent neural architectures for nonlinear volatility modeling, in which Josip Arnerić (University of Zagreb) and Mateusz Buczyński (FES UW) will discuss two advances in volatility modeling – the Jordan neural network and the GARCHNet model – critically analyzing their relative strengths, limitations and potential synergies.
Presentation abstract:
We bring together two advances in volatility modeling – the Jordan Neural Network (Arnerić et al.) and GARCHNet (Buczyński & Chlebus) – and critically examine their relative strengths, limitations and potential synergies.
We first introduce the JNN(1,1,1) approach, a parsimonious Jordan-type recurrent network designed as a semi-parametric analog to GARCH(1,1). Applied to daily CROBEX returns, the JNN outperforms the standard GARCH(1,1) in out-of-sample conditional variance forecasts, preserving the interpretability of key parameters while capturing time-varying nonlinearity.
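To make the parallel concrete, a minimal sketch in our own notation (the exact JNN(1,1,1) specification is the one given by Arnerić et al.): the GARCH(1,1) recursion

\sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2

is replaced by a single hidden unit with nonlinear activation g and a Jordan-style feedback of the previous conditional variance,

\sigma_t^2 = \beta_0 + \beta_1 \, g\left(\gamma_0 + \gamma_1 \varepsilon_{t-1}^2 + \gamma_2 \sigma_{t-1}^2\right),

so the fitted weights retain a GARCH-like reading while the activation supplies the nonlinearity.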
Next, we present GARCHNet, which embeds an LSTM architecture within a classical GARCH framework to capture richer nonlinear dynamics in the conditional variance. By combining maximum-likelihood GARCH estimation with an LSTM module, the model flexibly accommodates normal, t and skewed-t innovations. An empirical study of WIG20, S&P 500 and FTSE 100 returns (2005–2021) demonstrates improved in-sample fit and value-at-risk (VaR) performance, and points to extensions toward alternative error distributions and longer-memory architectures.
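As a rough illustration of the mechanism (our own sketch, not the published implementation), the Python snippet below trains an LSTM to output the conditional variance of returns by minimizing a Gaussian negative log-likelihood in PyTorch; the class name, lag structure and hyperparameters are illustrative assumptions, and the actual GARCHNet additionally handles t and skewed-t innovations.

# Minimal GARCHNet-style sketch (illustrative only):
# an LSTM maps the previous squared return to the log conditional variance,
# and parameters are fit by maximizing a Gaussian likelihood.
import torch
import torch.nn as nn

class GARCHNetSketch(nn.Module):
    def __init__(self, hidden_size: int = 8):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                       # x: (batch, time, 1) lagged squared returns
        h, _ = self.lstm(x)
        log_var = self.head(h).squeeze(-1)      # (batch, time)
        return log_var.exp()                    # conditional variance sigma_t^2 > 0

def gaussian_nll(returns, variance):
    # average negative log-likelihood of returns under N(0, sigma_t^2) innovations
    return 0.5 * (torch.log(2 * torch.pi * variance) + returns ** 2 / variance).mean()

torch.manual_seed(0)
r = 0.01 * torch.randn(1, 500)                  # toy daily returns, shape (batch, time)
x = (r[:, :-1] ** 2).unsqueeze(-1)              # feature at t: squared return at t-1
y = r[:, 1:]                                    # target at t: return at t

model = GARCHNetSketch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):                            # short illustrative training loop
    optimizer.zero_grad()
    loss = gaussian_nll(y, model(x))
    loss.backward()
    optimizer.step()

In the actual model, the normal density above would be replaced by t or skewed-t densities, and the inputs could include past conditional variances in the spirit of the GARCH recursion.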
A debate will follow to contrast these approaches and explore the landscape for their integration.
