Sparse Identification of Nonlinear Dynamics with SHallow REcurrent Decoder Networks
SINDy-SHRED combines sparse dynamics identification with shallow recurrent decoder networks to reconstruct full spatiotemporal fields from sparse sensor measurements while discovering interpretable governing equations in the latent space.
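Conceptually, training couples a field-reconstruction loss with a SINDy consistency penalty on the latent trajectory. The sketch below is illustrative only (not the repository's implementation); all names, shapes, and the finite-difference derivative are assumptions for exposition:

```python
import torch

def sindy_shred_loss(decoded, target, z, z_dot_sindy, dt, reg=10.0):
    """Reconstruction loss plus a penalty tying finite-difference latent
    derivatives to the SINDy model's predicted derivatives."""
    recon = torch.mean((decoded - target) ** 2)
    z_dot_fd = (z[1:] - z[:-1]) / dt  # finite-difference latent derivative
    sindy = torch.mean((z_dot_fd - z_dot_sindy[:-1]) ** 2)
    return recon + reg * sindy

# Illustrative shapes: 10 time steps, latent dim 3, state dim 50.
decoded, target = torch.zeros(10, 50), torch.zeros(10, 50)
z, z_dot = torch.zeros(10, 3), torch.zeros(10, 3)
loss = sindy_shred_loss(decoded, target, z, z_dot, dt=0.1)
print(loss.item())  # 0.0 for all-zero inputs
```

The regularization weight plays the role of `sindy_regularization` in the training interfaces below: larger values push the latent dynamics toward the sparse model at some cost in reconstruction accuracy.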
Preprint: [arXiv:2501.13329](https://arxiv.org/abs/2501.13329)
```bash
git clone https://github.com/gaoliyao/sindy-shred.git
cd sindy-shred
pip install .                # Core dependencies
pip install ".[notebooks]"   # Include JupyterLab for notebooks
```

Verify the installation:

```bash
python -c "import torch; import pysindy; import sindy_shred; print('Installation successful!')"
```
Download the dataset and place it in the `Data/` directory.
A ready-to-run notebook is available on Google Colab.
The `SINDySHRED` class provides an end-to-end interface:

```python
from sindy_shred import SINDySHRED

# Initialize the model
model = SINDySHRED(
    latent_dim=3,
    poly_order=1,
    hidden_layers=2,
    l1=350,
    l2=400,
    dropout=0.1,
    batch_size=128,
    num_epochs=200,
    lr=1e-3,
    threshold=0.05,
    sindy_regularization=10.0,
)

# Fit to data
model.fit(
    num_sensors=3,
    dt=1 / 52.0,
    x_to_fit=data,  # shape: (time, space)
    lags=52,
    train_length=1000,
    validate_length=30,
    sensor_locations=sensor_locations,
)

# Discover governing equations
model.sindy_identify(threshold=0.05, plot_result=True)

# Predict latent dynamics and decode to physical space
z_predict = model.sindy_predict()
forecast = model.shred_decode(z_predict)

# Or use the convenience method
forecast = model.forecast(n_steps=100)

# Automatic threshold tuning
best_threshold, results = model.auto_tune_threshold(adaptive=True)
```
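The `data` array and `sensor_locations` above must be supplied by the user. A minimal sketch of one common choice, random sensor placement over the flattened spatial grid (the synthetic data shape here is purely illustrative):

```python
import numpy as np

# Hypothetical dataset: 1,040 time steps over a 2,500-point spatial grid,
# flattened to shape (time, space) as expected by x_to_fit.
rng = np.random.default_rng(0)
data = rng.standard_normal((1040, 2500)).astype(np.float32)

# Randomly sample 3 distinct sensor locations (spatial indices).
num_sensors = 3
sensor_locations = rng.choice(data.shape[1], size=num_sensors, replace=False)

print(sensor_locations.shape)  # (3,)
```

In practice, sensors can also be placed at fixed physical locations; any set of valid column indices into `data` works.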
For finer control over training and inference:

```python
import torch
import pysindy as ps

from sindy_shred_net import SINDy_SHRED_net, fit
import sindy

# Calculate the SINDy library dimension
library_dim = sindy.library_size(
    latent_dim, poly_order, include_sine=False, include_constant=True
)

# Initialize the network
shred = SINDy_SHRED_net(
    input_size=num_sensors,
    output_size=state_dim,
    hidden_size=latent_dim,
    hidden_layers=2,
    l1=350,
    l2=400,
    dropout=0.1,
    library_dim=library_dim,
    poly_order=poly_order,
    include_sine=False,
    dt=dt,
).to(device)

# Train with custom datasets
validation_errors = fit(
    shred,
    train_dataset,
    valid_dataset,
    batch_size=128,
    num_epochs=600,
    lr=1e-3,
    verbose=True,
    threshold=0.25,
    patience=5,
    sindy_regularization=10.0,
    thres_epoch=100,
)

# Extract latent trajectories from the GRU
gru_outs, _ = shred.gru_outputs(train_dataset.X, sindy=True)
latent = gru_outs[:, 0, :].detach().cpu().numpy()

# Normalize the latent trajectories before fitting SINDy
latent_mean, latent_std = latent.mean(axis=0), latent.std(axis=0)
latent_normalized = (latent - latent_mean) / latent_std

# Post-hoc SINDy discovery
model = ps.SINDy(
    optimizer=ps.STLSQ(threshold=0.1),
    feature_library=ps.PolynomialLibrary(degree=poly_order),
)
model.fit(latent_normalized, t=dt)
model.print()

# Simulate latent dynamics, undo the normalization, and decode
z_sim = model.simulate(init_cond, t_array)
z_denormalized = z_sim * latent_std + latent_mean
z_tensor = torch.tensor(z_denormalized, dtype=torch.float32).to(device)
physical_pred = shred.decode(z_tensor)  # Decode latent to physical space
```
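For reference, the size of a polynomial SINDy library can be computed by hand: the number of monomials of degree at most `poly_order` in `latent_dim` variables, including the constant term, is C(latent_dim + poly_order, poly_order). A standalone sketch of that count (independent of the `sindy` module's `library_size`, which additionally supports sine terms):

```python
from math import comb

def poly_library_size(latent_dim: int, poly_order: int,
                      include_constant: bool = True) -> int:
    """Count monomials of degree <= poly_order in latent_dim variables."""
    n = comb(latent_dim + poly_order, poly_order)  # includes the constant term
    return n if include_constant else n - 1

# latent_dim=3, poly_order=3: C(6, 3) = 20 candidate library terms.
print(poly_library_size(3, 3))  # 20
```

This grows quickly with `poly_order`, which is one reason low-order libraries (e.g. `poly_order=1` in the high-level example) are often sufficient and easier to sparsify.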
| Notebook | Description |
|---|---|
| `sst_sindy_shred_refactor.ipynb` | Sea Surface Temperature with the high-level API |
| `sst_sindy_shred.ipynb` | Sea Surface Temperature with the low-level API |
| `synthetic_data_sindy_shred_refactor.ipynb` | FitzHugh-Nagumo synthetic data with the high-level API |
| `synthetic_data_sindy_shred.ipynb` | FitzHugh-Nagumo synthetic data with the low-level API |
| `complex_data_sindy_shred_refactor.ipynb` | Complex dynamical systems with the high-level API |
| `complex_data_sindy_shred.ipynb` | Complex dynamical systems with the low-level API |
| Module | Description |
|---|---|
| `sindy_shred.py` | High-level `SINDySHRED` class for end-to-end workflows |
| `sindy_shred_net.py` | Core `SINDy_SHRED_net` neural network and training functions |
| `sindy.py` | SINDy library functions for sparse dynamics identification |
| `plotting.py` | Visualization utilities for latent space and predictions |
| `processdata.py` | Data loading and preprocessing utilities |
| `utils.py` | Helper functions (device selection, datasets) |
SINDy-SHRED achieves state-of-the-art performance compared to existing methods, including Convolutional LSTM, PredRNN, ResNet, and SimVP.
If you find SINDy-SHRED useful in your research, please cite:
```bibtex
@misc{gao2025sparse,
  title={Sparse identification of nonlinear dynamics and Koopman operators with Shallow Recurrent Decoder Networks},
  author={Mars Liyao Gao and Jan P. Williams and J. Nathan Kutz},
  year={2025},
  eprint={2501.13329},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2501.13329},
}
```