Viewpoint

Time Delays Improve Performance of Certain Neural Networks

Sarah Marzen
W. M. Keck Science Department, Pitzer, Scripps, and Claremont McKenna College, Claremont, CA, US
Physics 17, 111
Both the predictive power and the memory storage capability of an artificial neural network called a reservoir computer increase when time delays are added into how the network processes signals, according to a new model.
Figure 1: Researchers tested an improved reservoir computer’s ability to memorize time-series data from a Lorenz attractor such as that shown in orange. (Image credit: Steve Young/stock.adobe.com)

A reservoir computer—a type of artificial neural network—can use information about a system’s past to predict the system’s future. Reservoir computers are far easier to train than their more general counterpart, recurrent neural networks. However, researchers have yet to develop a way to determine the optimal reservoir-computer construction for memorizing and forecasting the behavior of a given system. Recently, Seyedkamyar Tavakoli and André Longtin of the University of Ottawa, Canada, took a step toward solving that problem by demonstrating a way to enhance the memory and prediction capabilities of a reservoir computer [1]. Their demonstration could, for example, allow researchers to make a chatbot or virtual assistant, such as ChatGPT, using a reservoir computer, a possibility that so far has been largely unexplored.

For those studying time-series-forecasting methods—those that can predict the future outcomes of complex systems using historical time-stamped data—the recurrent neural network is king [2]. Recurrent neural networks contain a “hidden state” that stores information about features of the system being modeled. The information in the hidden state is updated every time the network gains new information about the system and is then fed into an algorithm that is used to predict what will happen next to the system.
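
For readers who want a concrete picture, that update can be sketched in a few lines of code. The sketch below is purely illustrative: its dimensions, tanh nonlinearity, and toy input are assumptions rather than details of any network discussed here, but it shows how the hidden state absorbs each new observation before a readout turns it into a prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and weights; in a real recurrent network all three
# weight matrices would be learned from data.
n_hidden, n_input = 50, 1
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))   # hidden-to-hidden weights
W_in = rng.normal(scale=0.5, size=(n_hidden, n_input))   # input-to-hidden weights
W_out = rng.normal(scale=0.5, size=(1, n_hidden))        # readout (prediction) weights

h = np.zeros(n_hidden)                                    # the hidden state
for u in np.sin(np.linspace(0, 10, 200)).reshape(-1, 1):  # toy input time series
    h = np.tanh(W_h @ h + W_in @ u)   # hidden state absorbs the new observation
    y_next = W_out @ h                # prediction of what happens next
```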

Both the hidden-state-update process and the prediction process are optimized by training algorithms incorporated into the recurrent neural network. But current training methods tend to lose key information about the system of interest, which degrades the neural network’s performance [3].
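
One well-known manifestation of this problem is the vanishing gradient described in Ref. [3]. The toy sketch below, which assumes tanh units and modest random recurrent weights chosen only for illustration, shows how the training signal for the hidden-state update is multiplied by one Jacobian per time step and collapses after propagating back through many steps.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy demonstration of the vanishing-gradient effect: all numbers here are
# illustrative assumptions, not values from Ref. [3].
n = 50
W_h = rng.normal(scale=0.5 / np.sqrt(n), size=(n, n))   # modest recurrent weights
grad = rng.normal(size=n)             # gradient arriving at the latest time step

for step in range(1, 101):
    h = rng.normal(size=n)                    # stand-in for a typical hidden state
    d = 1 - np.tanh(W_h @ h) ** 2             # tanh derivative at the preactivation
    grad = (W_h.T * d) @ grad                 # one backprop step: W_h^T diag(d) grad
    if step % 25 == 0:
        print(f"{step:3d} steps back: |grad| = {np.linalg.norm(grad):.2e}")
```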

To get around the information-loss problem, researchers developed the reservoir computer, which is essentially a recurrent neural network in which the hidden-state-update process is fixed rather than trained. Training still happens, but only on how the network turns the hidden state into predictions. As a result, when compared with a recurrent neural network of similar size, a reservoir computer usually makes less accurate predictions. The lack of hidden-state-update training also affects how large the reservoir computer must be. Without such training, the reservoir must store all the information it might need to make a prediction, so the reservoir computer required to solve a given problem is typically larger than the corresponding recurrent neural network, making it more resource intensive to construct. Researchers have shown that they can reduce a given reservoir computer’s size by adding time delays into the method by which it processes signals. But how to choose the optimal time delays has been an open question.
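
A minimal echo-state-network-style sketch makes this division of labor explicit. Everything about the reservoir below, including its size, random weights, and spectral-radius scaling, is an illustrative assumption rather than the construction used by Tavakoli and Longtin; the only trained object is the linear readout, fitted here by ridge regression.

```python
import numpy as np

rng = np.random.default_rng(2)

# --- fixed, untrained reservoir (sizes and scalings are illustrative) ---
N = 300
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep the spectral radius below 1
W_in = rng.uniform(-0.5, 0.5, size=(N, 1))

# Toy one-step-ahead prediction task on a sine wave
u = np.sin(np.linspace(0, 60, 3000)).reshape(-1, 1)
X = np.zeros((len(u), N))
x = np.zeros(N)
for t in range(len(u)):
    x = np.tanh(W @ x + W_in @ u[t])   # hidden-state update: never trained
    X[t] = x

# --- the only trained part: a linear readout fitted by ridge regression ---
states, targets = X[:-1], u[1:]        # predict the next value of the input
W_out = np.linalg.solve(states.T @ states + 1e-6 * np.eye(N), states.T @ targets)
predictions = states @ W_out
```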

To address this question, Tavakoli and Longtin considered a theoretical reservoir computer that operates using optoelectronic oscillators—oscillators in which electronic and optical signals interact in feedback loops. The final signal produced by an oscillator is inherently cyclic, with a period known as the clock cycle. After leaving the oscillator, the signal passes into a “delay loop,” which could, for example, be an optical fiber. As the signal travels through the loop, it interacts with nodes of the neural network that delay some fraction of the signal by a certain length of time.
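
A caricature of such a delay-based reservoir can be written down as follows. The single nonlinear node, the feedback gain, the input mask, and the number of “virtual nodes” per clock cycle are all assumptions made for illustration and do not reproduce the optoelectronic model studied in Ref. [1], but they convey how a delay loop turns one physical node into many effective ones.

```python
import numpy as np

rng = np.random.default_rng(3)

# Caricature of a delay-based reservoir: one nonlinear node plus a feedback
# loop. All parameter values are illustrative assumptions.
N_virtual = 100                                  # virtual nodes along the delay loop
mask = rng.choice([-0.1, 0.1], size=N_virtual)   # fixed mask spreading the input over the loop
feedback_gain = 0.8

u = np.sin(np.linspace(0, 30, 500))              # toy input, one value per clock cycle
loop = np.zeros(N_virtual)                       # loop contents one clock cycle ago
states = np.zeros((len(u), N_virtual))

for k, u_k in enumerate(u):
    # each virtual node combines the signal returning from the delay loop
    # with its masked share of the current input
    loop = np.tanh(feedback_gain * loop + mask * u_k)
    states[k] = loop

# `states` plays the role of the reservoir state matrix; as before, only a
# linear readout fitted to it would be trained.
```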

To study the impact of these time delays, Tavakoli and Longtin adjusted the spacing between time delays and the number of time delays. They then tested the reservoir computer’s ability to memorize time-series data from three different systems—a Lorenz attractor (Fig. 1), a Mackey-Glass model, and a NARMA10 task—and to make predictions about the future behavior of those systems.
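
The benchmark series themselves are easy to generate. The snippet below sketches two of them, a commonly used NARMA10 recurrence and an Euler-integrated Lorenz attractor, using standard parameter values that may differ in detail from those chosen in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def narma10(T):
    """A common formulation of the NARMA10 benchmark (constants may differ
    in detail from those used in Ref. [1])."""
    u = rng.uniform(0, 0.5, size=T)
    y = np.zeros(T)
    for t in range(9, T - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * np.sum(y[t - 9:t + 1])
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y

def lorenz(T, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz attractor with the standard parameters, integrated with a
    simple Euler step."""
    state = np.array([1.0, 1.0, 1.0])
    out = np.zeros((T, 3))
    for t in range(T):
        x, y, z = state
        state = state + dt * np.array([sigma * (y - x),
                                       x * (rho - z),
                                       x * y - beta * z])
        out[t] = state
    return out

u, y = narma10(2000)
trajectory = lorenz(5000)
```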

The results of these tests reveal that adding time delays improves both the reservoir computer’s memory capacity and its predictive capabilities, with each additional delay further improving performance. But this enhancement occurs only under certain conditions, a result in line with previous studies [4]. For example, Tavakoli and Longtin show that when the length of a single time delay matches the clock cycle, the reservoir computer does not retain all the input data and so has a lower memory capacity and makes less accurate predictions than it otherwise would.
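
Memory capacity here refers, roughly, to how many past inputs the reservoir can linearly reconstruct from its present state. One common way to quantify it, which may differ in detail from the authors’ exact definition, is sketched below: for each lag, fit a linear readout to recover the delayed input and add up the squared correlations.

```python
import numpy as np

def memory_capacity(X, u, max_lag=50, ridge=1e-6):
    """Linear memory capacity: sum over lags k of the squared correlation
    between the past input u(t - k) and its best linear reconstruction from
    the reservoir states X (rows = time steps). A common definition; the
    authors' exact formulation may differ."""
    mc = 0.0
    for k in range(1, max_lag + 1):
        states = X[k:]                 # reservoir states at time t
        target = u[:-k]                # input k steps in the past
        W = np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                            states.T @ target)
        reconstruction = states @ W
        r = np.corrcoef(reconstruction, target)[0, 1]
        mc += r ** 2
    return mc

# Example use: memory_capacity(states, u) with the state matrix and input
# series from any of the sketches above.
```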

Interestingly, Tavakoli and Longtin found that a reservoir computer with a higher memory capacity has a lower prediction error, and vice versa. Previous studies, including my own, have shown that this correlation is far from inevitable—a reservoir computer can have an infinite memory and no predictive capabilities, for example [5].

Together, these findings provide both a qualitative and a quantitative starting point for constructing an optimal reservoir computer. They also suggest that incorporating time delays could offer advantages to living neural networks, such as those found in human and animal brains. Such a finding would be tantalizing, as time delays are known to decrease performance in living systems [6]. For example, for a baseball player facing an oncoming ball, a longer time delay between perception and action (a delay learned from experience) decreases the likelihood that they hit a home run. Are there instead cases in which time delays increase an organism’s ability to perform some task? Has evolution shaped our brains, which could perhaps be thought of as a collection of reservoir computers, so that the time delay between one neuron sending a signal and a second receiving it is exactly the right length for making sense of the visual and auditory signals that constantly impinge on our eyes and ears? Does adding time delays affect the number of neurons the brain needs to operate correctly? Further work is needed to answer these questions, but such work could lead to a new understanding of how biological organisms function.

References

  1. S. Kamyar Tavakoli and A. Longtin, “Boosting reservoir computer performance with multiple delays,” Phys. Rev. E 109, 054203 (2024).
  2. S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Comput. 9, 1735 (1997).
  3. R. Pascanu et al., “On the difficulty of training recurrent neural networks,” Proc. International Conference on Machine Learning/Proc. Machine Learning Research (PMLR) 28, 1310 (2013), https://proceedings.mlr.press/v28/pascanu13.html.
  4. F. Stelzer et al., “Performance boost of time-delay reservoir computing by non-resonant clock cycle,” Neural Networks 124, 158 (2020).
  5. S. Marzen, “Difference between memory and prediction in linear recurrent networks,” Phys. Rev. E 96, 032308 (2017).
  6. Y. Sawaya et al., “Framework for solving time-delayed Markov Decision Processes,” Phys. Rev. Res. 5, 033034 (2023).

About the Author

Sarah Marzen is an assistant professor of physics at Pitzer College, Scripps College, and Claremont McKenna College, California. Her research focuses on biophysics problems, but she dabbles in information theory and machine learning. In her spare time, she writes and composes music, while managing her paranoid schizophrenia.


Subject Areas

Nonlinear Dynamics, Computational Physics
