A lightweight framework for building Physics-Informed Neural Networks (PINNs) with symbolic PDE definitions using SymPy and automatic differentiation in PyTorch.
It provides:
- Flexible neural architectures (`NeuralNetwork` class).
- Wrapper for inverse parameter estimation (`PINN`).
- Factory for PDE-driven loss generation from symbolic equations (`PINNFactory`).
Clone the repository and install the requirements:
```bash
git clone https://github.com/barrosyan/pinnfactory.git
cd pinnfactory
pip install -r requirements.txt
```

Requirements:

- torch
- numpy
- matplotlib
- sympy
Usage:

```python
from pinn_generator import NeuralNetwork, PINN

# Example: 1 input, 1 output, 3 hidden layers with 20 neurons each
net = NeuralNetwork(num_inputs=1, num_outputs=1, num_layers=3, num_neurons=20)
pinn = PINN(net)
```
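The `PINN` wrapper is listed above as handling inverse parameter estimation. Its exact interface is not shown in this quickstart, so the snippet below is only a minimal sketch of the generic PyTorch pattern such a wrapper builds on: an unknown PDE coefficient (here called `k`, a hypothetical name, not part of pinnfactory's API) is registered as a trainable parameter and optimized jointly with the network weights.

```python
import torch
import torch.nn as nn

# Generic PyTorch pattern, not pinnfactory's API: treat an unknown PDE
# coefficient as a trainable parameter so the optimizer estimates it.
k = nn.Parameter(torch.tensor(1.0))  # initial guess for the unknown coefficient

# Optimize the network weights and the coefficient jointly; `net` is the
# NeuralNetwork instance from the example above.
optimizer = torch.optim.Adam(list(net.parameters()) + [k], lr=1e-3)

# During training, k would enter the symbolic residual (e.g. u_xx + k*u = 0),
# so gradients from the physics loss update the estimate of k.
```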
Next, define the PDE and boundary conditions symbolically and generate the loss function:

```python
from pinn_generator import PINNFactory

# PDE: u_xx + u = 0 (example)
pde_residuals = ["Derivative(u(x), (x,2)) + u(x)"]
# Boundary conditions: u(0) = 0, u(pi) = 0
conditions = [
    {"equation": "u(0)"},
    {"equation": "u(pi)"}
]

factory = PINNFactory(
    pde_residuals=pde_residuals,
    conditions=conditions,
    independent_vars=["x"],
    dependent_vars=["u"]
)
loss_fn = factory.generate_loss_function()
```

Train with a standard PyTorch loop:

```python
import torch
optimizer = torch.optim.Adam(pinn.parameters(), lr=1e-3)
# Example training batch
x = torch.linspace(0, torch.pi, 100).view(-1, 1).requires_grad_(True)  # endpoints match the BCs at 0 and pi
for epoch in range(1000):
    optimizer.zero_grad()
    loss, loss_components = loss_fn(pinn, {"collocation": (x,)})
    loss.backward()
    optimizer.step()
    if epoch % 100 == 0:
        print(f"Epoch {epoch} - Total Loss: {loss.item():.6f}")
```
Roadmap:

- Add support for systems of PDEs.
- Implement collocation sampling utilities (see the sketch after this list).
- Add GPU support for large-scale PDE solving.
- Integrate with visualization tools for PINN training.
- Add more neural network architectures.
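As a rough illustration of the collocation-sampling roadmap item above (hypothetical code, not part of the current pinnfactory API), a 1-D uniform sampler is a few lines of plain PyTorch:

```python
import torch

def sample_collocation(n: int, lo: float, hi: float) -> torch.Tensor:
    """Hypothetical utility: n uniform random collocation points in [lo, hi]."""
    return (lo + (hi - lo) * torch.rand(n, 1)).requires_grad_(True)

# Example: draw fresh collocation points each epoch instead of a fixed grid.
x = sample_collocation(100, 0.0, torch.pi)
```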
Apache License 2.0.
You may use, modify, and distribute this project under the terms of the Apache License 2.0.