Next Generation Reservoir Computing: A New Method to Solve the “Hardest of the Hardest” Computer Problems

Scientists can now tackle the most complex information-processing problems with a new type of computing that works much like the brain.

Researchers have found a way to make reservoir computing between 33 and 163 times faster while using significantly fewer computing resources.

Researchers solved a complex computing problem on a desktop computer using next-generation reservoir computing in less than one second.

The same problem can be solved with current state-of-the-art technology, but it takes much longer, according to Daniel Gauthier, lead author and professor of physics at The Ohio State University.

Gauthier said, “We can perform very complex information-processing tasks in a fraction of the time, using much less computing power than what reservoir computing can currently accomplish.”

“Reservoir computing was already a significant improvement over what was previously possible.”

The study was published in Nature Communications on September 21, 2021.

Gauthier explained that reservoir computing, a machine-learning algorithm developed in the early 2000s, is used to solve the hardest computing problems, such as forecasting how dynamical systems evolve over time.

He said that dynamical systems like the weather are hard to predict because a slight change in one condition can have huge effects later on.

The butterfly effect is a famous example: in this illustration, the changes caused by a butterfly flapping its wings may eventually influence the weather weeks later.

Gauthier stated that previous research has shown reservoir computing is well-suited to learning dynamical systems and can give accurate forecasts of how they will behave in the future.

Artificial neural networks, which work somewhat like the human brain, are used to accomplish this. Scientists feed data from a dynamical system into a “reservoir,” a collection of randomly connected artificial neurons. The network’s output can then be used to produce increasingly accurate forecasts of the future.

The more complex the problem and the more accurate the desired forecast, the larger the network of artificial neurons must be, which requires more computing resources and time.
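The classic approach described above can be illustrated with a minimal echo-state-style reservoir computer. This is a hedged sketch, not the study's code: the reservoir size, spectral radius, input scaling, and the toy sine-wave task are arbitrary choices made for demonstration. Only the final linear readout is trained; the random reservoir itself stays fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D signal to learn: a sine wave; the task is to predict the next sample.
t = np.linspace(0, 40 * np.pi, 4000)
u = np.sin(t)

N = 100                                   # number of reservoir neurons
W_in = rng.uniform(-0.5, 0.5, N)          # input weights: random and fixed
W = rng.normal(0, 1, (N, N))              # recurrent weights: random and fixed
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

# Drive the reservoir with the signal and record its internal states.
x = np.zeros(N)
states = []
for u_t in u[:-1]:
    x = np.tanh(W @ x + W_in * u_t)
    states.append(x)
states = np.array(states)

# Train only the linear readout (ridge regression): state -> next sample.
warmup = 200                              # discard the initial transient states
X, y = states[warmup:], u[warmup + 1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

# One-step-ahead prediction from the final reservoir state.
pred = states[-1] @ W_out
err = abs(pred - u[-1])
print(err)                                # small error on this easy toy task
```

Note the `warmup` step: the first few hundred states are thrown away because the reservoir needs time to "forget" its arbitrary starting state, a cost the article returns to below.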

Gauthier stated that one issue is that the artificial neuron reservoir is a “black box,” and scientists don’t know what it contains. They only know that it works.

Gauthier explained that artificial neural networks, which are built on mathematics, are at the heart of reservoir computing.

He said that mathematicians looked at these networks and asked, “To what extent is all this machinery really needed?”

Gauthier and colleagues examined this question in their study. They found that the entire reservoir computing system could be greatly simplified, reducing the computing resources required and saving significant time.

Their concept was tested on a forecasting task involving the Lorenz system, a simplified weather model created by Edward Lorenz, whose work helped establish the butterfly effect.
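The Lorenz system is compact enough to sketch directly. Below is a standard fourth-order Runge-Kutta integration of the Lorenz ’63 equations with their usual chaotic parameters (sigma = 10, rho = 28, beta = 8/3); the step size and initial conditions are illustrative choices, not taken from the study. It also demonstrates the butterfly effect the article mentions: two nearly identical starting points drift far apart.

```python
# The Lorenz '63 equations with their standard chaotic parameters.
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

# One step of classical fourth-order Runge-Kutta integration.
def rk4_step(state, dt=0.01):
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, dt / 2))
    k3 = lorenz(add(state, k2, dt / 2))
    k4 = lorenz(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Butterfly effect: perturb one coordinate by a billionth and integrate.
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)
for _ in range(2500):                 # integrate out to t = 25
    a, b = rk4_step(a), rk4_step(b)

sep = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
print(sep)                            # the tiny initial gap has grown enormously
```

Both trajectories stay on the bounded Lorenz attractor, yet their separation grows exponentially, which is exactly what makes long-range forecasting of such systems so hard.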

Their next-generation reservoir computing clearly beat the current state-of-the-art in Lorenz forecasting on this task. In a relatively simple simulation on a desktop computer, the new system was 33 to 163 times faster than the existing model.

When the goal was forecasting accuracy, the next-generation reservoir computing was about 1 million times faster. Gauthier said the new system achieved the same accuracy with the equivalent of just 28 neurons, far fewer than the current-generation model requires.

The next generation of reservoir computing is so fast because it requires less training and warmup than the current generation.

Warmup refers to training data that must be entered into the reservoir computer to prepare it to perform its task.

Gauthier said, “For our next-generation reservoir computing, there is almost no warmup time required.”

“Currently, to warm it up, scientists need to input 1,000 to 10,000 data points or more. And that’s all data that is lost; it isn’t needed for the actual work. We only have to put in one or two or three data points,” he said.

And once researchers are ready to train the reservoir computer to make forecasts, the next-generation system needs much less data.

Depending on the desired accuracy, the researchers got the same results in their Lorenz forecasting test using 400 data points as the current generation achieved using 5,000 data points.
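The kind of simplification behind these numbers can be sketched in a few lines: replace the big random reservoir with a handful of time-delayed samples plus their pairwise products, then train only a linear readout. This is a toy illustration on the chaotic logistic map, not the paper's actual Lorenz benchmark; the number of delays (k = 2) and the ridge regularization are illustrative assumptions.

```python
import numpy as np

# Features built directly from the last k samples: a constant term,
# the samples themselves, and their unique pairwise products.
def features(u, k=2):
    lin = np.array(u[-k:])
    quad = np.outer(lin, lin)[np.triu_indices(k)]   # u_i * u_j terms
    return np.concatenate(([1.0], lin, quad))

# Training data: the chaotic logistic map x_{t+1} = 3.9 * x * (1 - x).
x = [0.3]
for _ in range(400):
    x.append(3.9 * x[-1] * (1 - x[-1]))

k = 2
X = np.array([features(x[i - k:i], k) for i in range(k, len(x))])
y = np.array(x[k:])

# Ridge-regression readout: the ONLY trained part of the whole system.
W = np.linalg.solve(X.T @ X + 1e-8 * np.eye(X.shape[1]), X.T @ y)

# One-step forecast needs only the last k samples: almost no warmup.
pred = features(x[-k:], k) @ W
true_next = 3.9 * x[-1] * (1 - x[-1])
err = abs(pred - true_next)
print(err)
```

Because the map’s update rule is itself a quadratic polynomial, it lies inside this tiny feature set, so the readout learns it almost exactly from a few hundred points; the feature vector here has just 6 entries where a classic reservoir would use hundreds of neurons.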

Gauthier said, “What’s exciting about this next-generation reservoir computing is that it takes what was already very good and makes it significantly more efficient.”

He and his colleagues plan to expand this work to address more complex computing problems, such as forecasting fluid dynamics.

“That is a complicated problem to solve. We are trying to speed up the process using our simplified reservoir computing model.”

