
Reservoir Computing meets Recurrent Kernels and Structured Transforms


Document length: 18 pages

Abstract: Reservoir Computing is a class of simple yet efficient Recurrent Neural Networks where internal weights are fixed at random and only a linear output layer is trained. In the large-size limit, such random neural networks have a deep connection with kernel methods. Our contributions are threefold: a) We rigorously establish the recurrent kernel limit of Reservoir Computing and prove its convergence. b) We test our models on chaotic time series prediction, a classic but challenging benchmark in Reservoir Computing, and show how the Recurrent Kernel is competitive and computationally efficient when the number of data points remains moderate. c) When the number of samples is too large, we leverage the success of structured Random Features for kernel approximation by introducing Structured Reservoir Computing. The two proposed methods, Recurrent Kernel and Structured Reservoir Computing, turn out to be much faster and more memory-efficient than conventional Reservoir Computing.
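To make the setup concrete, here is a minimal echo-state-network-style sketch of conventional Reservoir Computing as described in the abstract: the input and recurrent weights are drawn once at random and kept fixed, and only a linear readout is trained (here by ridge regression). The reservoir size, weight scaling, and the toy sine-wave prediction task are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
n_res, n_in = 200, 1

# Fixed random weights: input matrix and recurrent (reservoir) matrix.
# The 1/sqrt(n_res) scaling keeps the recurrent dynamics roughly stable.
W_in = rng.normal(0.0, 1.0, (n_res, n_in))
W = rng.normal(0.0, 1.0 / np.sqrt(n_res), (n_res, n_res))

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x)
    return np.array(states)

# Toy one-step-ahead prediction task on a sine wave
t = np.arange(300)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])  # reservoir states for inputs u[0..298]
y = u[1:]                  # targets: the next input value

# Train only the linear output layer (ridge regression)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Only `W_out` is learned; `W_in` and `W` never change, which is what makes the large-size kernel limit and the structured-transform speedups discussed in the paper possible.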
