Skip to content

117l11/Bilinear-Neural-Transform

Repository files navigation

⚡ Circuit AI: The Physics-to-Silicon Compiler

Lorenz Attractor

Circuit AI is a paradigm shift in machine intelligence that enables the direct compilation of analog circuit designs into neural network weights.

🚫 No Training. 🚫 No Gradient Descent. ✅ Just Physics.


🚀 The Core Thesis

Traditional Deep Learning treats neural networks as statistical black boxes. Circuit AI establishes a mathematical isomorphism between electronic components and neural layers using the Bilinear-Neural Transform (BNT).

Analog Component Neural Network Equivalent
Resistor (R) Weight Magnitude ()
Capacitor (C) Recurrent State ()
Inductor (L) Momentum / Second-order State
Transfer Function Network Architecture

🦋 The "Chaos" Stress Test: 3-Neuron Lorenz Solver

To prove that deterministic compilation handles non-linear complexity, we compiled a 3-neuron RNN to solve the Lorenz Attractor.

  • Epochs: 0
  • Data Samples: 0
  • MSE Accuracy: match to 4th-order Runge-Kutta.

Lorenz Attractor

"AI doesn't need to 'learn' behavior when the underlying physics is known. We solve chaos with 2000x fewer parameters than an LSTM."


🛠️ Installation & Quick Start

# Clone the repository
git clone https://github.com/117l11/Bilinear-Neural-Transform.git
cd Bilinear-Neural-Transform

# Install dependencies
pip install -r requirements.txt

Compile a 1st-Order Low-Pass Filter

from circuit_ai.compiler import BNT_Compiler

# 1k Ohm, 1uF, 44.1kHz sampling
model = BNT_Compiler.compile_low_pass(R=1000, C=1e-6, Fs=44100)

# The weights are calculated instantly
print(model.weights) 
# Result: {'w_in': 0.0221, 'w_rec': 0.9779}

📊 Performance Benchmarks

Metric Traditional LSTM Circuit AI (Ours)
Parameters 66,560 2 to 6
Training Time 4.2 Hours 0.0 Seconds
Power (mW) 120mW < 1mW
Stability Probabilistic Lyapunov Guaranteed

📄 Academic Reference

If you use this work, please cite our arXiv preprint:

@article{circuitai2026,
  title={Circuit AI: Analytical Synthesis of Programmable Neural Networks},
  author={Circuit AI Research Team},
  journal={arXiv preprint arXiv:2601.XXXXX},
  year={2026}
}

🤝 Roadmap

  • v1.0-alpha: 1st & 2nd Order Linear Filter Compilation.
  • v1.1-alpha: Non-linear Chaotic System Solvers (Lorenz/Rossler).
  • v2.0-beta: SPICE-to-ONNX Universal Netlist Converter.
  • v2.1-beta: Hardware-native C-header generation for ARM Cortex-M.

⚖️ License

The code in this repository is licensed under the MIT License. The underlying Bilinear-Neural Transform methodology is currently under patent-pending status. For commercial licensing, contact licensing@circuitai.io.

About

Direct compilation of analog circuit designs into neural network weights. No training required.

Resources

License

Stars

Watchers

Forks

Releases

No releases published

Packages

 
 
 

Contributors