Paper, 4 April 1997
Chebyshev polynomials-based (CPB) unified model neural networks for function approximation
Tsu-Tian Lee, Jin-Tsong Jeng
Abstract
In this paper, we propose an approximate transformable technique, comprising a direct transformation and an indirect transformation, to obtain a CPB unified model neural network for feedforward/recurrent neural networks via Chebyshev polynomial approximation. Based on this approximate transformable technique, we derive the relationship between single-layer neural networks and multilayer perceptron neural networks. It is shown that the CPB unified model neural network can be represented as a functional link network based on Chebyshev polynomials, trained with the recursive least squares method with a forgetting factor as the learning algorithm. It turns out that the CPB unified model neural network not only retains the universal approximation capability, but also learns faster than conventional feedforward/recurrent neural networks. Computer simulations show that the proposed method acts as a universal approximator on several function approximation problems with a considerable reduction in learning time.
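As an illustrative sketch only (not the authors' original implementation), the following Python example shows the kind of network the abstract describes: a single-layer functional link network on Chebyshev polynomial features, trained sample by sample with recursive least squares and a forgetting factor. The class and parameter names (ChebyshevFLN, order, lam, delta) and the target function sin(pi x) are assumptions chosen for demonstration.

```python
import numpy as np

def chebyshev_features(x, order):
    """Expand inputs in [-1, 1] into Chebyshev polynomials T_0..T_order
    using the recurrence T_0 = 1, T_1 = x, T_{n+1} = 2x*T_n - T_{n-1}."""
    x = np.asarray(x, dtype=float).reshape(-1)
    T = np.empty((x.size, order + 1))
    T[:, 0] = 1.0
    if order >= 1:
        T[:, 1] = x
    for n in range(1, order):
        T[:, n + 1] = 2.0 * x * T[:, n] - T[:, n - 1]
    return T

class ChebyshevFLN:
    """Illustrative functional link network on Chebyshev features, trained
    online with recursive least squares (RLS) and forgetting factor lam."""

    def __init__(self, order=8, lam=0.99, delta=1e3):
        self.order = order
        self.lam = lam
        self.w = np.zeros(order + 1)        # linear output weights
        self.P = np.eye(order + 1) * delta  # inverse correlation matrix

    def update(self, x, y):
        phi = chebyshev_features(x, self.order)[0]   # feature vector for one sample
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)           # RLS gain
        e = y - self.w @ phi                         # a priori prediction error
        self.w = self.w + k * e
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return e

    def predict(self, x):
        return chebyshev_features(x, self.order) @ self.w

# Usage: approximate f(x) = sin(pi*x) on [-1, 1] from random samples.
rng = np.random.default_rng(0)
net = ChebyshevFLN(order=8, lam=0.995)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0)
    net.update(x, np.sin(np.pi * x))

xs = np.linspace(-1.0, 1.0, 200)
print("max abs error:", np.max(np.abs(net.predict(xs) - np.sin(np.pi * xs))))
```

Because the Chebyshev expansion is fixed and only the linear output weights are learned, each RLS update is a closed-form rank-one correction; a forgetting factor slightly below 1 discounts old samples, which is what allows the faster convergence claimed for this family of networks relative to gradient-trained multilayer perceptrons.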
© 1997 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Tsu-Tian Lee and Jin-Tsong Jeng "Chebyshev polynomials-based (CPB) unified model neural networks for function approximation", Proc. SPIE 3077, Applications and Science of Artificial Neural Networks III, (4 April 1997); https://doi.org/10.1117/12.271500
CITATIONS
Cited by 9 scholarly publications.
KEYWORDS
Neural networks, Computer simulations, Evolutionary algorithms