An open exploration of viable human-AI systems.
Welcome to the Algoplexity contributor guide for the ADIA Structural Break Detection challenge. This document introduces the architecture of our solution, describes each component’s role, and provides clear entry points for you to contribute meaningfully — without requiring access to proprietary symbolic–latent research.
We model structural breaks in univariate time series using a two-stage deep learning pipeline:

1. **Pre-training** on synthetic symbolic sequences generated from elementary cellular automaton (ECA) rules.
2. **Fine-tuning** on the real labeled challenge data.

This yields a robust model that generalizes well to unseen structural regimes.
The entire solution runs as a single Python notebook executed by the ADIA Challenge Platform. It contains two callable entry points:
- `train(X_train, y_train, model_dir)`
- `infer(X_test, model_dir)`

The platform calls these functions and expects `infer()` to yield a sequence of scalar break predictions.
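A minimal skeleton of that contract might look as follows. The function bodies are placeholders standing in for the actual pipeline; only the two signatures and the yielded scalars are fixed by the platform:

```python
from pathlib import Path

import pandas as pd


def train(X_train: pd.DataFrame, y_train: pd.Series, model_dir: str) -> None:
    """Pre-train on synthetic data, fine-tune on (X_train, y_train),
    then persist weights and config under model_dir."""
    Path(model_dir).mkdir(parents=True, exist_ok=True)
    # ... pre-training and fine-tuning would happen here ...


def infer(X_test, model_dir: str):
    """Yield one scalar break score per test series."""
    # ... the trained encoder would be loaded from model_dir here ...
    for series_df in X_test:
        score = 0.5  # placeholder: real scoring compares pre/post fingerprints
        yield float(score)
```

The generator form of `infer()` lets the platform consume predictions one series at a time without holding all scores in memory.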
| Component | Role |
|---|---|
| Training Pipeline | Controls pre-training on ECA data and fine-tuning on real data. |
| Inference Pipeline | Loads the trained encoder and computes prediction scores from test data. |
| Core Library | Houses reusable modules: symbolic processors, model architectures, and encoders. |
| Model Store | Filesystem layer used to save and load encoder weights and configuration. |
### `train()` Function

**Pre-Training Stage**

- `ECADataGenerator` produces labeled symbolic sequences from chaotic ECA rules.
- `MDLPreTrainer` pre-trains the `DynamicalAutoencoder` on these sequences using a dual-loss signal.

**Fine-Tuning Stage**

- Real series are windowed into symbolic sequences by the `SeriesProcessor`.
- `BreakClassifierFinetuner` fine-tunes the break classifier on the labeled data.
- `EncoderSaver` persists the trained weights and configuration to `model_dir`.

### `infer()` Function

The trained encoder and its configuration are restored via `EncoderLoader`. For each test time series:

1. The `SeriesProcessor` converts the series into windows of symbolic sequences.
2. The `Fingerprinter` generates high-level vector encodings (fingerprints) for the segments before and after the suspected break point.
3. The `BreakScoreCalculator` computes the distance between pre- and post-fingerprints (e.g. cosine distance).

| Class/Module | Purpose |
|---|---|
| `PermutationSymbolizer` | Converts numeric series into symbolic ordinal patterns |
| `SeriesProcessor` | Transforms full time series into windows of symbolic sequences |
| `TransformerEncoder` | Learns vector representations (fingerprints) from sequences |
| `DynamicalAutoencoder` | Encodes + decodes sequences for unsupervised learning |
| `StructuralBreakClassifier` | Binary classifier predicting breaks from two fingerprints |
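The internals of `PermutationSymbolizer` are not shown here, but the ordinal-pattern idea it relies on can be sketched as follows. This is an illustrative stand-alone function, not the library class; it assumes order-3 patterns over sliding windows and ties are ignored:

```python
import numpy as np


def ordinal_symbols(series, order: int = 3):
    """Map each sliding window of length `order` to the rank pattern
    (permutation) of its values, encoded as a single integer symbol."""
    x = np.asarray(series, dtype=float)
    symbols = []
    for i in range(len(x) - order + 1):
        # argsort gives the positions of the window's values in sorted order
        pattern = tuple(np.argsort(x[i:i + order]))
        # encode the permutation as an integer in base `order`
        symbols.append(sum(int(p) * order**k for k, p in enumerate(pattern)))
    return symbols
```

For example, a strictly increasing window maps to one fixed symbol and a strictly decreasing window to another, so the symbolic stream captures local ordering dynamics while discarding amplitude.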
| Class | Description |
|---|---|
| `ECADataGenerator` | Produces labeled symbolic sequences using chaotic ECA rules |
| `MDLPreTrainer` | Trains the `DynamicalAutoencoder` using a dual-loss signal |
| `BreakClassifierFinetuner` | Fine-tunes the break classifier on real labeled data |
| `EncoderSaver` | Persists model weights and hyperparameters |
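As background for `ECADataGenerator`: an elementary cellular automaton evolves a binary row by a fixed 3-cell rule table. A minimal sketch of such an evolution (not the generator class itself, which also handles labeling and rule sampling) might be:

```python
import numpy as np


def eca_evolve(rule: int, width: int = 64, steps: int = 32, seed: int = 0):
    """Evolve an elementary cellular automaton under the given Wolfram rule
    number, returning the binary space-time grid of shape (steps, width)."""
    rng = np.random.default_rng(seed)
    # bit n of the rule number gives the successor of neighborhood value n
    table = [(rule >> n) & 1 for n in range(8)]
    row = rng.integers(0, 2, size=width)
    grid = [row]
    for _ in range(steps - 1):
        left, right = np.roll(row, 1), np.roll(row, -1)  # periodic boundary
        neighborhood = 4 * left + 2 * row + right        # values 0..7
        row = np.array([table[n] for n in neighborhood])
        grid.append(row)
    return np.array(grid)
```

A synthetic structural break can then be simulated by switching the rule number partway through the evolution, which is one plausible way to produce labeled pre-training sequences.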
| Class | Description |
|---|---|
| `EncoderLoader` | Loads model and config from the Model Store |
| `Fingerprinter` | Uses the `SeriesProcessor` + encoder to create symbolic fingerprints |
| `BreakScoreCalculator` | Compares before/after fingerprints to produce the final break score |
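The comparison step can be sketched as a cosine distance between the pre- and post-break fingerprint vectors. This is an illustrative function, not the `BreakScoreCalculator` API, and the fingerprints here stand in for the encoder's output:

```python
import numpy as np


def cosine_break_score(fp_before, fp_after, eps: float = 1e-12):
    """Cosine distance between two fingerprint vectors.
    0.0 means identical direction (no break evidence); 2.0 means opposite."""
    a = np.asarray(fp_before, dtype=float)
    b = np.asarray(fp_after, dtype=float)
    cos_sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)
    return 1.0 - cos_sim
```

Because the score depends only on the angle between fingerprints, it is invariant to their overall magnitude, which is often desirable when encoder outputs are not normalized.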
- `X_train`: `pd.DataFrame` with MultiIndex `[id, time]` and columns `value`, `period`
- `y_train`: `pd.Series` mapping `id` → `bool`, indicating break presence
- `X_test`: `List[pd.DataFrame]`, one per time series

You are welcome to contribute to the following areas without IP conflict:
- **Improve ECA Sampling Logic**
- **Experiment with Alternative Fingerprinting Methods**, e.g. replacing the `TransformerEncoder` with simpler architectures (GRU, CNN)
- **Enhance Preprocessing**, e.g. extending the windowing in `SeriesProcessor`
- **Tune Classifier Heads or Loss Functions**
- **Modularization / Engineering Improvements**
- **Add Test-Time Augmentations**
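As an illustration of the last idea, one hypothetical test-time augmentation is to average the break score over jittered copies of the input series; `score_fn` below is a stand-in for the real scoring pipeline:

```python
import numpy as np


def tta_break_score(series, score_fn, n_aug: int = 8,
                    noise_scale: float = 0.01, seed: int = 0):
    """Average a break-scoring function over noise-perturbed copies
    of the input series to reduce variance in the final prediction."""
    rng = np.random.default_rng(seed)
    x = np.asarray(series, dtype=float)
    sigma = noise_scale * (x.std() + 1e-12)  # jitter scaled to the series
    scores = [score_fn(x + rng.normal(0.0, sigma, size=x.shape))
              for _ in range(n_aug)]
    return float(np.mean(scores))
```

Averaging over perturbed inputs trades a constant factor of extra inference cost for smoother, more stable scores, which can matter when the final metric is sensitive to ranking noise.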
To get started, use the algoplexity.github.io notebook template or request edit access via the internal repo.