How Anesthesia Switches Off Consciousness

Physics 8, 85
A computer model of a network of neurons shows that a sudden breakdown in the net's ability to transmit information mimics the brain wave changes that accompany anesthesia.
Using your network. A model of networked neurons can make a sudden transition to a state of low information flow that mimics anesthesia.

As anesthesia takes hold, the mind seems to shut down abruptly, and it later re-emerges from the blackness just as swiftly. A new theoretical model suggests that these changes may result from a sudden, global change in the ability of the network of neurons to transmit information. The model can reproduce the changes in electrical activity (“brain waves”) seen with anesthetized patients. The researchers say that their theory could provide a simple foundation for understanding how the brain acquires its conscious cognitive functions.

Researchers don't understand how the activity of individual, interlinked neurons leads to the overall effect of anesthesia. Yan Xu and colleagues at the University of Pittsburgh School of Medicine wondered whether the loss of consciousness might be related to a reduced ability of sensory information to find its way through the brain’s neural network. This information must be transmitted from a region called the thalamus, which regulates consciousness and alertness, to the cortex, where “higher” cognitive functions process the information into a picture of the world.

The researchers developed a simple model with a tree-like network of "nodes," in which an input signal entering the “trunk” is disseminated through the branches. Each node—which could represent an individual neuron or an entire brain region—sums the inputs it receives from connected nodes and then passes this sum on to other nodes.

In this model, the chance of a signal being successfully passed along any link is controlled by a probability factor p that applies to the entire network. For a given run of the simulation, p is fixed at a value between 1 (100% probability of transmission) and 0 (zero probability of transmission), and each link's on/off state is re-evaluated periodically. If p is set to 0.6, for example, then there is a 60% chance, at each instant, of a successful transmission between any two connected nodes. Unless p is zero, a link that is off at one moment can switch back on at the next. Decreasing p mimics the effect of anesthetics, which block signals between neurons.
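A minimal sketch of this kind of model can be written in a few lines. The branching factor, depth, and the simplification that each node simply relays its single parent's value (in a pure tree, the "sum of inputs" reduces to one term) are all illustrative assumptions, not the paper's actual parameters; the function names are hypothetical.

```python
import random

def build_tree(branching=3, depth=5):
    """Rooted tree as a parent -> children map; returns (map, tip ids)."""
    children = {0: []}
    next_id = 1
    frontier = [0]
    for _ in range(depth):
        new_frontier = []
        for node in frontier:
            kids = list(range(next_id, next_id + branching))
            next_id += branching
            children[node] = kids
            for k in kids:
                children[k] = []
            new_frontier.extend(kids)
        frontier = new_frontier
    return children, frontier  # frontier now holds the branch tips

def propagate(children, p, trunk_input, rng):
    """Push one input sample from the trunk toward the tips.
    Each link transmits with probability p, re-drawn at every instant."""
    value = {0: trunk_input}
    queue = [0]
    for node in queue:
        for child in children[node]:
            queue.append(child)
            # the link is momentarily 'on' with probability p
            value[child] = value[node] if rng.random() < p else 0.0
    return value

rng = random.Random(42)
children, tips = build_tree()
probe = rng.choice(tips)  # one randomly chosen output node
# drive the trunk with white noise, as in the simulations described in the text
output = [propagate(children, 0.6, rng.gauss(0.0, 1.0), rng)[probe]
          for _ in range(1000)]
```

Lowering p in this sketch starves the probed tip of input in the same qualitative way the article describes, since every link along the path must be on at the same instant.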

Xu and colleagues found that a computer realization of their branching model, with 7381 nodes in all, captures the electrical signature of anesthesia in patients. Electrodes on the scalp pick up some of the neural signals and generate complicated waveforms (EEGs) that are processed to indicate their component frequencies (using the standard technique of Fourier analysis). The waves are categorized based on these frequency components. An anesthetized brain undergoes a change from the so-called gamma and beta waves associated with consciousness to alpha waves associated with relaxation and drowsiness and delta waves associated with deep sleep.
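The band classification above can be illustrated with a toy spectral analysis. The band boundaries below are conventional EEG values that vary somewhat between sources, the sampling rate is assumed, and the naive DFT stands in for the fast Fourier transform used in practice.

```python
import cmath
import math

def band_powers(signal, fs, bands):
    """Naive DFT: total spectral power falling in each named frequency band."""
    n = len(signal)
    powers = {name: 0.0 for name in bands}
    for k in range(1, n // 2):          # skip DC, keep positive frequencies
        freq = k * fs / n
        coeff = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(signal))
        for name, (lo, hi) in bands.items():
            if lo <= freq < hi:
                powers[name] += abs(coeff) ** 2
    return powers

# conventional EEG frequency bands, in Hz (boundaries vary by source)
BANDS = {"delta": (0.5, 4), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 100)}

fs = 128  # samples per second (assumed)
times = [i / fs for i in range(fs)]  # one second of data
# synthetic trace: a 10 Hz (alpha) rhythm plus a weaker 40 Hz (gamma) one
trace = [math.sin(2 * math.pi * 10 * t) + 0.3 * math.sin(2 * math.pi * 40 * t)
         for t in times]
power = band_powers(trace, fs, BANDS)
```

For this synthetic trace, the alpha band dominates, just as an EEG dominated by a 10 Hz rhythm would be classed as alpha-wave activity.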

When the researchers ran their simulations using the random signal known as white noise as the input, they found that a randomly selected node at the output layer of the network (branch tips) produced signals that matched those of patients. The signals switched from predominantly gamma- and beta-like waves at high p (close to 1) to mostly alpha and delta waves at low p (below 0.5).

The researchers also drew on standard information theory to define the amount of information encoded in the input and output signals of the network in terms of their so-called information entropy. The entropy of the output relative to the input dropped abruptly at a p value of about 0.3, meaning that very little information was getting transmitted through the network. The relative abruptness of this transition matched the observation that there is a critical concentration of anesthetic for which consciousness is abruptly and completely lost.
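The idea that blocked transmission collapses the output's entropy can be sketched generically. This is not the paper's exact entropy measure; the bin count, amplitude range, and the crude "pass or zero" channel are illustrative assumptions.

```python
import math
import random
from collections import Counter

def binned_entropy(samples, bins=16, lo=-3.0, hi=3.0):
    """Shannon entropy (in bits) of a signal after binning its amplitudes."""
    width = (hi - lo) / bins
    idx = (min(bins - 1, max(0, int((x - lo) / width))) for x in samples)
    counts = Counter(idx)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

rng = random.Random(1)
white_noise = [rng.gauss(0.0, 1.0) for _ in range(5000)]
# a heavily blocked network (low p) delivers mostly zeros to the output node
blocked = [x if rng.random() < 0.1 else 0.0 for x in white_noise]

relative = binned_entropy(blocked) / binned_entropy(white_noise)
```

With 90% of samples replaced by zeros, the output entropy falls well below the input entropy, mimicking the low-p regime in which little information gets through.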

The researchers say that this breakdown of information transmission reflects the fact that at low p it becomes almost impossible for the information to find a continuous path through the network. This loss of a fully connected route is called a percolation transition, resembling the way that a fluid flowing through a random porous network (like hot water through packed coffee grains) “searches” for a complete path. But even at very low p, the researchers say, a route might still open up transiently by chance.
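A percolation transition of this kind is easy to observe numerically. For an idealized tree in which each node has b branches, the classic critical probability is p_c = 1/b; the branching factor and depth below are toy choices, not the paper's network.

```python
import random

def open_path_to_tips(branching, depth, p, rng):
    """Is there a chain of 'on' links from the trunk to at least one tip?"""
    def survives(level):
        if level == depth:
            return True
        # a branch helps only if its link is on AND its subtree reaches a tip
        return any(rng.random() < p and survives(level + 1)
                   for _ in range(branching))
    return survives(0)

rng = random.Random(7)
trials = 2000
for p in (0.2, 1 / 3, 0.5):  # 1/b is the classic threshold for branching b
    hits = sum(open_path_to_tips(3, 8, p, rng) for _ in range(trials))
    print(f"p = {p:.2f}: open path in {hits / trials:.0%} of runs")
```

Below the threshold an open path from trunk to tip becomes vanishingly rare, though, as the researchers note, one can still appear transiently by chance.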

“It is intriguing that a simple model within the traditional percolation theory framework, with a single parameter related to network connectivity, can account for several features of brain dynamics under anesthesia,” says Plamen Ivanov of Boston University. “The beauty of the approach is that it is simple, built from first principles, and generates rich dynamics controlled by a single parameter.” But he cautions that the model is still a long way from explaining actual mechanisms of consciousness.

This research is published in Physical Review Letters.

–Philip Ball

Philip Ball is a freelance science writer in London. His latest book is Beautiful Experiments (University of Chicago Press, 2023).

Subject Areas

Biological Physics
