VikVador/shaggy



Shaggy is a lightweight package that implements autoencoder models in PyTorch. It provides modular encoder–decoder architectures, the SOAP optimizer, gradient-checkpointing utilities, and save/load tools: essentially everything needed to go from raw data to a trained latent representation with minimal boilerplate.
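To give a feel for the kind of model the package wraps, here is a minimal encoder–decoder autoencoder in plain PyTorch, with the encoder pass wrapped in gradient checkpointing. All class names, layer sizes, and the checkpointing placement are illustrative assumptions, not Shaggy's actual API:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Minimal encoder-decoder autoencoder in plain PyTorch.
# Names and dimensions are illustrative, not Shaggy's actual API.
class AutoEncoder(nn.Module):
    def __init__(self, in_features: int = 784, latent: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_features, 128), nn.ReLU(), nn.Linear(128, latent)
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent, 128), nn.ReLU(), nn.Linear(128, in_features)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Recompute encoder activations during backward to save memory.
        z = checkpoint(self.encoder, x, use_reentrant=False)
        return self.decoder(z)

model = AutoEncoder()
x = torch.randn(4, 784, requires_grad=True)
recon = model(x)
print(recon.shape)  # torch.Size([4, 784])
```

Checkpointing trades compute for memory: encoder activations are discarded after the forward pass and recomputed during backpropagation.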


C O N T R I B U T O R S


We build on the work of François Rozet, Gerome Andry, and Sacha Lewin, as well as that of the entire Science with AI Laboratory (SAIL) team. Thanks!


T U T O R I A L


A self-contained tutorial is available as a Jupyter notebook. It walks through dataset loading, model configuration, training with a live loss plot, and reconstruction visualization on CIFAR-10.

notebook/demo.ipynb
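At its core, the training stage the notebook walks through reduces to a standard reconstruction objective. The sketch below uses a tiny stand-in model, a plain Adam optimizer, and random data rather than the notebook's actual CIFAR-10 setup, but the loop shape (forward, MSE reconstruction loss, backward, step, loss history for the live plot) is the same:

```python
import torch
import torch.nn as nn

# Illustrative autoencoder training loop: reconstruct the input batch
# under an MSE loss and record the per-epoch loss for a live plot.
# The model and data are placeholders, not the notebook's configuration.
model = nn.Sequential(nn.Linear(8, 3), nn.ReLU(), nn.Linear(3, 8))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
criterion = nn.MSELoss()
data = torch.randn(64, 8)

history = []
for epoch in range(20):
    optimizer.zero_grad()
    loss = criterion(model(data), data)  # reconstruction error
    loss.backward()
    optimizer.step()
    history.append(loss.item())

print(f"loss: {history[0]:.3f} -> {history[-1]:.3f}")
```

The recorded `history` list is what drives the live loss plot in the notebook.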


I N S T A L L A T I O N


  • If you want the latest version, install it directly from GitHub:

    pip install git+https://github.com/VikVador/shaggy
    
  • If you want a local editable install with all optional dependencies (training, notebooks, linting):

    conda create -n shaggy python=3.11
    conda activate shaggy
    

    then

    pip install --editable '.[all]' --extra-index-url https://download.pytorch.org/whl/cu121
    

    Optionally, install the pre-commit hooks to automatically detect code issues before each commit:

    pre-commit install --config pre-commit.yml
    

About

🐕 Autoencoders in PyTorch
