
Add experimental submodule#1932

Open
chaoming0625 wants to merge 30 commits into lululxvi:master from chaobrain:master

Conversation


@chaoming0625 commented Jan 15, 2025

In response to chaobrain/pinnx#19, #1904, and #1899, I am trying to merge pinnx into DeepXDE.

This pull request includes a submodule deepxde.experimental, which enables explicit variables and physical units in physics-informed neural networks.

Motivation

The experimental module was engineered to address a pivotal challenge in current PINN libraries: the absence of explicit physical semantics.

  • Enhanced variable management with semantic clarity

Existing PINN libraries, such as DeepXDE, require users to manually track the order and meaning of variables. For instance, within these frameworks, variables[0] might denote amplitude, variables[1] frequency, and so forth. This method lacks an intrinsic link between the sequence of variables and their physical meanings, thereby increasing complexity and the potential for errors. In contrast, experimental empowers users to assign clear and meaningful names to variables (e.g., variables["amplitude"] and variables["frequency"]). This approach obviates the need for manual management of variable order, enhancing both code readability and maintainability.
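The contrast between positional and named variables can be sketched in plain Python; the dict below is a stand-in for illustration, not the actual deepxde.experimental API:

```python
# Index-based style: the meaning of each slot lives only in the reader's head.
params = [1.5, 2.0]  # convention: params[0] = amplitude, params[1] = frequency
amplitude, frequency = params[0], params[1]

# Name-based style: the mapping is explicit and self-documenting.
variables = {"amplitude": 1.5, "frequency": 2.0}

assert variables["amplitude"] == amplitude
assert variables["frequency"] == frequency
```

Reordering or extending the named version cannot silently change which value means what, which is the failure mode the index-based convention invites.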

  • Simplified gradient relationship management

Another significant limitation of existing PINN libraries is their reliance on users to manage intricate Jacobian and Hessian relationships between variables. This process is not only cumbersome but also prone to mistakes. experimental revolutionizes this aspect by employing a straightforward dictionary indexing mechanism to track intuitive gradient relationships. For example, users can effortlessly access the Hessian matrix element $\partial^2 y / \partial x^2$ via hessian["y"]["x"] and the Jacobian matrix element $\partial y / \partial t$ via jacobian["y"]["t"]. This simplification streamlines the workflow, allowing users to focus on modeling rather than matrix management.
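The dictionary-indexing idea can be sketched with plain nested dicts and made-up gradient values; this is illustrative only, not the library's internals:

```python
# Hypothetical gradient values stored as nested dicts, so that
# hessian["y"]["x"]["x"] reads directly as d^2 y / dx^2.
jacobian = {"y": {"x": 0.25, "t": -0.5}}  # dy/dx, dy/dt
hessian = {"y": {"x": {"x": 1.0}}}        # d^2 y / dx^2

dy_t = jacobian["y"]["t"]
dy_x = jacobian["y"]["x"]
dy_xx = hessian["y"]["x"]["x"]

# A Burgers-like residual assembled from named lookups, with no
# positional bookkeeping of rows and columns:
y = 0.1
nu = 0.01
residual = dy_t + y * dy_x - nu * dy_xx  # -0.5 + 0.025 - 0.01 = -0.485
```

Every derivative is addressed by the names of the output and input variables, so the code mirrors the mathematical notation instead of a matrix layout.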

  • Integration of unit-aware automatic differentiation

Another prevalent deficiency in current PINN frameworks is the lack of support for physical units, which are crucial for maintaining dimensional consistency in physical equations. Take the Burgers' equation, for example: the left-hand side $\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x}$ and the right-hand side $\nu\frac{\partial^2 u}{\partial x^2}$ must share the same physical units (e.g., meters per second squared). To ensure such consistency, experimental integrates the brainunit.autograd module, enabling unit-aware automatic differentiation. This feature preserves unit information during the computation of first, second, or higher-order derivatives, ensuring that physical unit dimensions remain consistent throughout the differentiation process. Consequently, experimental effectively mitigates errors arising from unit mismatches, thereby enhancing the reliability of simulations.
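The dimensional bookkeeping that unit-aware differentiation enforces can be sketched by tracking (meter, second) exponents by hand for each Burgers term; this is a toy stand-in for brainunit, and all names here are hypothetical:

```python
# Units modeled as (meter_exponent, second_exponent) tuples.
M, S = 0, 1

def mul(a, b):
    # Multiplying quantities adds unit exponents.
    return (a[M] + b[M], a[S] + b[S])

def div(a, b):
    # Dividing quantities subtracts unit exponents.
    return (a[M] - b[M], a[S] - b[S])

u_unit = (1, -1)   # u has units m/s
x_unit = (1, 0)    # x in meters
t_unit = (0, 1)    # t in seconds
nu_unit = (2, -1)  # viscosity nu in m^2/s

du_dt = div(u_unit, t_unit)                           # m/s^2
u_du_dx = mul(u_unit, div(u_unit, x_unit))            # (m/s)*(1/s) = m/s^2
rhs = mul(nu_unit, div(div(u_unit, x_unit), x_unit))  # (m^2/s)*(1/(m*s)) = m/s^2

# All three terms of the Burgers' equation carry the same units.
assert du_dt == u_du_dx == rhs == (1, -2)
```

A unit-aware autodiff system performs exactly this exponent arithmetic alongside the numerical derivatives, so a term with mismatched units fails loudly instead of silently corrupting the residual.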

A quick example

```python
import brainstate as bst
import brainunit as u
from deepxde import experimental as deepxde

# geometry
geometry = deepxde.geometry.GeometryXTime(
    geometry=deepxde.geometry.Interval(-1, 1.),
    timedomain=deepxde.geometry.TimeDomain(0, 0.99)
).to_dict_point(x=u.meter, t=u.second)

uy = u.meter / u.second
v = 0.01 / u.math.pi * u.meter ** 2 / u.second

# boundary conditions
bc = deepxde.icbc.DirichletBC(lambda x: {'y': 0. * uy})
ic = deepxde.icbc.IC(lambda x: {'y': -u.math.sin(u.math.pi * x['x'] / u.meter) * uy})

# PDE equation
def pde(x, y):
    jacobian = approximator.jacobian(x)
    hessian = approximator.hessian(x)
    dy_x = jacobian['y']['x']
    dy_t = jacobian['y']['t']
    dy_xx = hessian['y']['x']['x']
    residual = dy_t + y['y'] * dy_x - v * dy_xx
    return residual

# neural network
approximator = deepxde.nn.Model(
    deepxde.nn.DictToArray(x=u.meter, t=u.second),
    deepxde.nn.FNN(
        [geometry.dim] + [20] * 3 + [1],
        "tanh",
        bst.init.KaimingUniform()
    ),
    deepxde.nn.ArrayToDict(y=uy)
)

# problem
problem = deepxde.problem.TimePDE(
    geometry,
    pde,
    [bc, ic],
    approximator,
    num_domain=2540,
    num_boundary=80,
    num_initial=160,
)

# training
trainer = deepxde.Trainer(problem)
trainer.compile(bst.optim.Adam(1e-3)).train(iterations=15000)
trainer.compile(bst.optim.LBFGS(1e-3)).train(2000, display_every=500)
trainer.saveplot(issave=True, isplot=True)
```

This new submodule supports most of the PINN examples in DeepXDE.

The documentation is hosted in the docs/experimental_docs directory. Ample examples are included in examples/experimental_examples.

Installation

To use the experimental module, install it with the following command:

```
pip install deepxde[experimental]
```

Dependency

The experimental module heavily depends on the following packages:

Low-precision training support

Changing the training precision is very easy in experimental. Simply set

```python
import brainstate as bst

bst.environ.set(precision='b16')
```

and models will be trained using bfloat16.

Here is an example:

```python
import brainstate as bst
import brainunit as u
import optax
from deepxde import experimental as deepxde

bst.environ.set(precision='b16')  # or, 32, 64, 16

geometry = deepxde.geometry.GeometryXTime(
    geometry=deepxde.geometry.Interval(-1, 1.),
    timedomain=deepxde.geometry.TimeDomain(0, 0.99)
).to_dict_point(x=u.meter, t=u.second)

uy = u.meter / u.second
bc = deepxde.icbc.DirichletBC(lambda x: {'y': 0. * uy})
ic = deepxde.icbc.IC(lambda x: {'y': -u.math.sin(u.math.pi * x['x'] / u.meter) * uy})

v = 0.01 / u.math.pi * u.meter ** 2 / u.second


def pde(x, y):
    jacobian = approximator.jacobian(x)
    hessian = approximator.hessian(x)
    dy_x = jacobian['y']['x']
    dy_t = jacobian['y']['t']
    dy_xx = hessian['y']['x']['x']
    residual = dy_t + y['y'] * dy_x - v * dy_xx
    return residual


approximator = deepxde.nn.Model(
    deepxde.nn.DictToArray(x=u.meter, t=u.second),
    deepxde.nn.FNN(
        [geometry.dim] + [20] * 3 + [1],
        "tanh",
    ),
    deepxde.nn.ArrayToDict(y=uy)
)

problem = deepxde.problem.TimePDE(
    geometry,
    pde,
    [bc, ic],
    approximator,
    num_domain=2540,
    num_boundary=80,
    num_initial=160,
)

trainer = deepxde.Trainer(problem)
trainer.compile(bst.optim.OptaxOptimizer(optax.adamw(1e-3)))
trainer.train(iterations=10000)
```

@lululxvi (Owner)

Hi @chaoming0625 , thank you for adding the code. It is nice.

My first suggestion is that, since there is some overlapping code between DeepXDE and PINNx, such as geometry, we should reduce the duplication as much as possible. You can import and reuse the code in DeepXDE.

@chaoming0625 (Author)

Hi @lululxvi, I have made some changes. However, deepxde.pinnx.geometry is necessary, since most of the points are generated using jax.numpy rather than numpy. These changes make the model compatible with bfloat16.

@chaoming0625 (Author)

Therefore, hosting the geometry submodule is essential for the deepxde.pinnx module.

@chaoming0625 (Author)

Hi @lululxvi, going forward my colleague will continue the integration of 'pinnx' into 'deepxde', and he will focus on resolving every remaining issue in the integration.

@lululxvi (Owner)

> Hi, @lululxvi , in the next, my colleague will continue to promote the integration of 'pinnx' into 'deepxde', and he will focus more on solving every problem in the integration.

No problem.

@chaoming0625 (Author)

I have updated the PR introduction to clearly explain the motivation behind the pinnx module. This should help users understand the core purpose of the module.

@chaoming0625 (Author)

I have also added an option to install pinnx as an optional dependency, allowing users to quickly install the necessary components. Specifically, users who wish to use the pinnx module can install it with:

```
pip install deepxde[pinnx]
```

@chaoming0625 (Author)

I updated the PR introduction once again.

@chaoming0625 (Author)

I have maximized function reuse from deepxde to minimize API redundancy.

@lululxvi (Owner)

lululxvi commented Mar 6, 2025

I don't have a good name now. How about "experimental", which tf and torch use to indicate some new features?

@chaoming0625 (Author)

I agree this is a good name. If this name is applied, how can we import this module?

```python
import deepxde.experimental as deepxde
```

@lululxvi (Owner)

lululxvi commented Mar 7, 2025

> `import deepxde.experimental as deepxde`

This looks good to me.

@chaoming0625 (Author)

> `import deepxde.experimental as deepxde`
>
> This looks good to me.

I have made all the necessary updates by renaming deepxde.pinnx to deepxde.experimental, and I have also updated all corresponding scripts.

@chaoming0625 chaoming0625 changed the title Add pinnx submodule Add experimental submodule Mar 13, 2025
@lululxvi (Owner)

> I have made all necessary updates by renaming deepxde.pinnx to deepxde.experimental. Also updated all corresponding scripts.

Sounds good. But I notice that you changed the code without reusing the DeepXDE code. It is still better to reuse existing code as much as possible.

@chaoming0625 (Author)

> I have made all necessary updates by renaming deepxde.pinnx to deepxde.experimental. Also updated all corresponding scripts.
>
> Sounds good. But I realize that you changed the code without reusing the DeepXDE code. It is still better to reuse existing code as many as possible.

I'm unclear on what you're referring to. Code reuse problems were resolved across many previous commits. This commit's sole purpose is to change the module name.

@lululxvi (Owner)

Please format the code using black https://github.com/psf/black

@chaoming0625 (Author)

> Please format the code using black https://github.com/psf/black

Done

@chaoming0625 (Author)

Hi @lululxvi, can this PR be merged now? Or are there any other issues that need to be solved?

@lululxvi (Owner)

Sorry, I was very busy over the past couple of months. This PR is getting better, but it takes time to go over each file. I would suggest splitting this PR into a few independent PRs: 1. code; 2. docs; and 3. demos. What do you think?

@chaoming0625 (Author)

Sorry, it's really hard to make such a split. I think this PR is cohesive and of good quality as it stands.

@lululxvi (Owner)

lululxvi commented Jun 25, 2025

Why is it hard? I think the first PR can be the code inside the folder deepxde; the second, the examples in the folder examples/experimental_examples; and the third, the docs, including README.md, pyproject.toml, and docs.

Routhleck added a commit to chaobrain/deepxde that referenced this pull request Jun 26, 2025
Co-Authored-By: Chaoming Wang <chao.brain@qq.com>
Routhleck added a commit to chaobrain/deepxde that referenced this pull request Jun 26, 2025
Co-Authored-By: Chaoming Wang <chao.brain@qq.com>
Routhleck added a commit to chaobrain/deepxde that referenced this pull request Jun 26, 2025
Co-Authored-By: Chaoming Wang <chao.brain@qq.com>
@Routhleck

Hi @lululxvi, I just split this PR into #2003, #2004, and #2005. Hope this works for you.

@echen5503 (Contributor)

@lululxvi Recommend close this PR in favor of #2003, #2004, #2005

@chaoming0625 (Author)

> @lululxvi Recommend close this PR in favor of #2003, #2004, #2005

OK, we will make the update.

echen5503 pushed a commit to echen5503/deepxde that referenced this pull request Mar 30, 2026
Co-Authored-By: Chaoming Wang <chao.brain@qq.com>
echen5503 pushed a commit to echen5503/deepxde that referenced this pull request Mar 30, 2026
Co-Authored-By: Chaoming Wang <chao.brain@qq.com>
