# TNML‑Surrogate‑Loss

This repository was originally prepared for a journal club presentation on Stokes & Terilla, Entropy 21(12):1236 (2019), “Probabilistic Modeling with Matrix Product States”. The code and examples were designed to illustrate the exact single-site DMRG projection update and its application to benchmark datasets (P20 parity, DIV7, Bars & Stripes).

Tensor‑Network Machine Learning with Matrix Product States (MPS) trained via a surrogate (projective) loss inspired by exact single‑site DMRG. This repo provides a small, reproducible codebase to model discrete distributions as Born machines, with examples on P20 parity, DIV7, and Bars & Stripes.

## Features

- Matrix product state (MPS) Born machine generative model
- Exact single‑site DMRG‑style update via analytic projection in an effective subspace
- Command‑line interface for the `parity`, `div7`, and `bars_stripes` tasks
- Reproducible experiments: sweeps over the bond dimension χ, with plots and metrics
- Lean dependency footprint: pure NumPy plus opt_einsum
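As background, a Born machine assigns a bitstring x the probability p(x) = |ψ(x)|² / Z, where ψ(x) is the amplitude obtained by contracting the MPS site tensors along the chain, with each observed bit selecting a physical index. A minimal NumPy sketch of that contraction (illustrative only; the function name and tensor layout are assumptions, not this repo's API):

```python
import numpy as np

def born_probability(mps, bits):
    """Unnormalized Born-machine probability |psi(x)|^2 of a bitstring.

    mps  : list of N tensors, each of shape (chi_left, 2, chi_right),
           with bond dimension 1 at both boundaries.
    bits : sequence of N values in {0, 1}.
    """
    v = np.ones(1)
    for A, b in zip(mps, bits):
        # Select the physical index b, then absorb the site into v:
        # (chi_left,) @ (chi_left, chi_right) -> (chi_right,)
        v = v @ A[:, b, :]
    return float(v[0]) ** 2
```

Normalizing by Z = Σₓ |ψ(x)|² (computable in closed form from the MPS) turns these weights into a probability distribution.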

## Install

```bash
# Directly from GitHub
pip install "git+https://github.com/ZekiZeybek/TNML-surrogate-loss.git"

# Or clone the repo and install in editable mode from the repo root
pip install -e .
```

## Quickstart (CLI)

```bash
# P20 parity with 2% / 2% / 2% train/val/test splits sampled from P20
mps_surrogate_loss parity --n-bits 20 --frac 0.02 --seed 7 --max-chi 9 --sweeps 25

# 4x4 Bars & Stripes
mps_surrogate_loss bars_stripes --img-dim 4 --max-chi 10 --sweeps 25
```

Outputs (metrics JSON and NLL plot) are written under results directories in the repo root by default.
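Bars & Stripes is a standard generative-modeling benchmark: each valid d×d image has either every row constant ("stripes") or every column constant ("bars"). A standalone generator of the dataset (an illustration of the pattern family, not necessarily the repo's exact construction):

```python
import itertools
import numpy as np

def bars_and_stripes(d):
    """All d x d Bars & Stripes images, as flattened 0/1 tuples.

    The all-0 and all-1 images belong to both the 'bars' and the
    'stripes' family, so the dataset has 2**(d + 1) - 2 distinct
    patterns after deduplication.
    """
    patterns = set()
    for rows in itertools.product([0, 1], repeat=d):
        img = np.tile(np.array(rows)[:, None], (1, d))  # constant rows
        patterns.add(tuple(img.flatten()))
        patterns.add(tuple(img.T.flatten()))            # constant columns
    return sorted(patterns)
```

For the 4×4 case used in the quickstart this yields 30 distinct images.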

## Common flags

- `--n-bits N`: number of bits (for `parity`/`div7`)
- `--frac F`: fraction per split (train/val/test), e.g. `0.02` or `0.10`
- `--seed S`: random seed
- `--max-chi K`: largest bond dimension χ to sweep over (or pass explicit values with `--chi 2 3 5`)
- `--sweeps S`: number of full left→right→left sweeps
- `--outdir PATH`: custom results folder

## Usage

See `notebooks/example_parity.ipynb` for an end‑to‑end walkthrough. Minimal snippet:

```python
from mps_surrogate_loss.data import even_parity_strings, split_data
from mps_surrogate_loss.training import train_over_chi

N = 20
data = even_parity_strings(N)
train, val, test = split_data(data, frac=0.02, seed=7)

results = train_over_chi(
    n_bits=N,
    train_data=train,
    val_data=val,
    test_data=test,
    max_chi=9,
    sweeps=25,
)
```
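For context, the parity dataset used above is simply every length-N bitstring with an even number of ones, of which there are 2^(N-1). A standalone stand-in for `even_parity_strings` (a sketch; the repo's own implementation may differ):

```python
import itertools

def even_parity_strings(n):
    """All length-n bitstrings with an even number of ones."""
    return [bits for bits in itertools.product([0, 1], repeat=n)
            if sum(bits) % 2 == 0]
```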

## Reference

- J. Stokes and J. Terilla, “Probabilistic Modeling with Matrix Product States,” *Entropy* 21(12):1236, 2019.
