Poster + Paper (Conference Poster)
Cascading photonic reservoirs with deep neural networks increases computational performance
18 June 2024
Abstract
Deep neural networks (DNNs) have been successfully applied to solve complex problems, such as pattern recognition when analyzing big data. To achieve good computational performance, these networks are often designed to contain a large number of trainable parameters. As a result, such DNNs are often energy-intensive and time-consuming to train. In this work, we propose to use a photonic reservoir to preprocess the input data instead of injecting it directly into the DNN. A photonic reservoir consists of a network of many randomly connected nodes that do not need to be trained. It acts as an additional layer in front of the deep neural network and can transform the input data into a state in a higher-dimensional state space. This allows us to reduce both the size of the DNN and the amount of training it requires. We test this approach using numerical simulations, which show that such a photonic reservoir used as a preprocessor improves the performance of a deep neural network, reflected in a lower test error, on the one-step-ahead prediction task of the Santa Fe time series. The stand-alone DNN performs poorly on this task, resulting in a high test error. As we also discuss in detail in [Bauwens et al., Frontiers in Physics 10, 1051941 (2022)], we conclude that photonic reservoirs are well-suited as physical preprocessors for deep neural networks tackling time-dependent tasks, due to their fast computation times and low energy consumption.
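The reservoir-preprocessing idea described in the abstract can be sketched in software. The following is a minimal, illustrative echo-state-style reservoir: a network of randomly connected, untrained nodes expands a scalar time series into a higher-dimensional state, and only the readout is trained. This is not the authors' photonic simulation; a chaotic logistic-map series stands in for the Santa Fe laser data, ridge regression stands in for the trainable DNN, and all parameter values here (reservoir size, spectral radius, regularization) are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chaotic logistic-map series as a stand-in for the Santa Fe laser data
u = np.empty(1000)
u[0] = 0.4
for t in range(len(u) - 1):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

N = 100  # number of reservoir nodes (illustrative choice)
Win = rng.uniform(-0.5, 0.5, size=N)          # fixed random input weights
W = rng.normal(size=(N, N))                   # fixed random internal weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Drive the untrained reservoir and collect its high-dimensional states
x = np.zeros(N)
states = np.empty((len(u), N))
for t, ut in enumerate(u):
    x = np.tanh(W @ x + Win * ut)
    states[t] = x

# Train only the readout (ridge regression here, standing in for the DNN)
washout = 100                      # discard initial transient states
X = states[washout:-1]             # reservoir state at time t
y = u[washout + 1:]                # target: input at time t + 1
Wout = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

pred = X @ Wout
nmse = np.mean((pred - y) ** 2) / np.var(y)
print(f"one-step-ahead NMSE: {nmse:.4f}")
```

Because the reservoir weights are random and fixed, only the final readout is fitted, which mirrors the claimed reduction in training effort for the downstream network.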
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Ian Bauwens, Guy Van der Sande, Peter Bienstman, and Guy Verschaffelt "Cascading photonic reservoirs with deep neural networks increases computational performance", Proc. SPIE 13017, Machine Learning in Photonics, 1301710 (18 June 2024); https://doi.org/10.1117/12.3017209
KEYWORDS
Artificial neural networks, Neural networks, Semiconductor lasers, Reservoir computing, Machine learning, Network architectures, Photonics systems
