Synopsis

Teaching a Neural Network the Hard Way

Physics 14, s25
A neural network can be made to produce more reliable predictions of nonlinear systems if it is created with conservation laws built in.
Image credit: R. Gauthier-Butterfield/UC Irvine

A baby learns by observation how objects move. But without accounting for conservation of momentum, the understanding it develops is just an educated guess. Similarly, an artificial neural network (ANN) learns from empirical data how a particular system behaves, but without explicitly considering the conservation laws that govern that system, it risks making unreliable predictions. To address this limitation, Tom Beucler at the University of California, Irvine, and colleagues have devised a way to hardwire such laws into an ANN. They demonstrated the technique on an atmospheric model used for climate simulations, but they say that their method can be applied to models of any physical system [1].

An ANN is an algorithm-based tool for turning a set of inputs into a set of outputs. The development or “training” process for an ANN involves incrementally adjusting its internal parameters with data from observations until the outputs accurately match reality. During this process, a conventional ANN can be discouraged from making physically impossible predictions by imposing “soft constraints” on its outputs, penalizing predictions that violate physical laws rather than forbidding them outright.
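For a concrete picture of a soft constraint, here is a minimal sketch (not the authors' code): a physics penalty is added to the ordinary data-fitting loss. The toy conservation check, in which predicted tendencies are assumed to sum to a net flux stored in the last entry of each input column, along with the function names and the penalty weight, are illustrative assumptions.

```python
import torch

def conservation_residual(prediction, column_input):
    # Toy conservation check (an assumption for illustration): the predicted
    # tendencies in each column should sum to the net flux carried in the
    # last entry of the input vector; the residual is zero when they do.
    return prediction.sum(dim=-1) - column_input[..., -1]

def soft_constrained_loss(prediction, target, column_input, penalty_weight=1.0):
    mse = torch.mean((prediction - target) ** 2)              # data-fitting term
    residual = conservation_residual(prediction, column_input)
    penalty = torch.mean(residual ** 2)                       # physics penalty
    # Violations raise the loss but remain possible: a "soft" constraint.
    return mse + penalty_weight * penalty
```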

Beucler and colleagues used an ANN to simulate the climatic effects of atmospheric convection. Such models are reliable only if they respect strict conservation of mass and energy. Instead of imposing these laws using soft constraints, the team embedded them in the architecture of the ANN as “hard constraints.” Returning to the infant analogy, the method amounts to making physics-defying predictions literally unthinkable.
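The sketch below, again illustrative rather than the authors' implementation, shows one common way to build a conservation law directly into a network: all but one output component are predicted freely, and the last component is computed analytically so that a toy budget holds exactly for every input. The layer sizes and the specific budget are assumptions.

```python
import torch
import torch.nn as nn

class HardConstrainedNet(nn.Module):
    """Hypothetical architecture-level ("hard") constraint: the final output
    component is fixed analytically so that the outputs always sum to the net
    input flux, making budget-violating predictions impossible by construction."""

    def __init__(self, n_in, n_out, n_hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_out - 1),   # one fewer free output
        )

    def forward(self, x):
        free = self.body(x)
        # Residual output closes the budget exactly for every input.
        last = x[..., -1:] - free.sum(dim=-1, keepdim=True)
        return torch.cat([free, last], dim=-1)
```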

The addition of these hard constraints makes it more difficult for the ANN to reach its optimal output. The researchers say that this price is worth paying, though, as conservation laws are critical to climate models.

–Marric Stephens

Marric Stephens is a Corresponding Editor for Physics Magazine based in Bristol, UK.

References

  1. T. Beucler et al., “Enforcing analytic constraints in neural networks emulating physical systems,” Phys. Rev. Lett. 126, 098302 (2021).

Subject Areas

Biological Physics, Complex Systems, Nonlinear Dynamics
