Physics-Informed Neural Networks (PINNs) are a mesh-free deep learning (DL) framework for solving Partial Differential Equations (PDEs). The technique embeds physical laws directly into the training process, enabling the solution of forward and inverse problems governed by PDEs. Unlike traditional neural networks, PINNs incorporate the governing equations, initial conditions, and boundary conditions directly into the loss function, and automatic differentiation avoids truncation errors and enforces the governing equations with high precision. Despite these advantages, PINNs face several challenges; in particular, they struggle to solve Convection-Diffusion Equations (CDEs) in regions where the convection term dominates. To overcome this problem, an extended form of PINNs is discussed here. Adaptive Gradient-enhanced PINNs (AG-PINNs) augment standard PINNs by also enforcing the derivatives of the governing equations during training and by applying Residual-based Adaptive Refinement (RAR). Adding gradient constraints alone, however, can over-constrain the network, increase computational cost, and lead to inefficient learning in smooth regions; this motivates RAR, which improves solution accuracy while avoiding over-constraining the network where the solution is smooth. In this paper we consider the convection-diffusion equation at high Péclet number (Pe). As Pe increases, the convection term dominates and the problem becomes challenging for standard PINNs; AG-PINNs are used to mitigate these difficulties. The paper shows that AG-PINNs outperform standard PINNs by comparing the results of the two techniques. The work is carried out in Python Jupyter notebooks using the DeepXDE library.
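As a concrete illustration of the approach summarized above, the following minimal sketch sets up an AG-PINN-style model in DeepXDE for a steady one-dimensional convection-diffusion problem: the PDE residual and its spatial derivative are both enforced during training (gradient enhancement), and one round of residual-based adaptive refinement adds collocation points where the residual is largest. The model problem, network architecture, loss weights, and point counts are illustrative assumptions and are not taken from the paper's experiments.

import numpy as np
import deepxde as dde

# Model problem (illustrative): u_x - (1/Pe) * u_xx = 0 on (0, 1),
# with u(0) = 0, u(1) = 1. For large Pe a sharp boundary layer forms near x = 1.
Pe = 100.0

def pde(x, u):
    # PDE residual plus its x-derivative (gradient enhancement).
    du_x = dde.grad.jacobian(u, x, i=0, j=0)
    du_xx = dde.grad.hessian(u, x, i=0, j=0)
    du_xxx = dde.grad.jacobian(du_xx, x, i=0, j=0)
    res = du_x - (1.0 / Pe) * du_xx
    res_x = du_xx - (1.0 / Pe) * du_xxx  # d/dx of the governing equation
    return [res, res_x]

def residual(x, u):
    # Plain PDE residual, used only to rank candidate points for RAR.
    du_x = dde.grad.jacobian(u, x, i=0, j=0)
    du_xx = dde.grad.hessian(u, x, i=0, j=0)
    return du_x - (1.0 / Pe) * du_xx

geom = dde.geometry.Interval(0.0, 1.0)
# Dirichlet boundary conditions u(0) = 0, u(1) = 1 (u equals x at the endpoints).
bc = dde.icbc.DirichletBC(geom, lambda x: x[:, 0:1], lambda x, on_boundary: on_boundary)

data = dde.data.PDE(geom, pde, bc, num_domain=64, num_boundary=2)
net = dde.nn.FNN([1, 32, 32, 32, 1], "tanh", "Glorot normal")
model = dde.Model(data, net)
# Small weight on the gradient-enhanced loss term (illustrative choice).
model.compile("adam", lr=1e-3, loss_weights=[1.0, 0.01, 1.0])
model.train(iterations=5000)

# One round of Residual-based Adaptive Refinement (RAR): evaluate the PDE
# residual on a dense candidate set and add the worst points as new anchors.
X_cand = geom.random_points(1000)
err = np.abs(model.predict(X_cand, operator=residual)).reshape(-1)
data.add_anchors(X_cand[np.argsort(err)[-20:]])
model.compile("adam", lr=1e-3, loss_weights=[1.0, 0.01, 1.0])
model.train(iterations=2000)

In this layout the gradient constraint is applied globally, while RAR concentrates additional collocation points in the boundary layer, which is precisely where standard PINNs tend to lose accuracy at high Péclet number.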