Paper
Pseudo orthogonal bases give the optimal generalization capability in neural network learning
26 October 1999
Masashi Sugiyama, Hidemitsu Ogawa
Abstract
Pseudo orthogonal bases are a class of frames proposed in the engineering field; in frame terminology, the concept is equivalent to a tight frame with frame bound 1. This paper shows that pseudo orthogonal bases play an essential role in neural network learning. One of the most important issues in neural network learning is "what training data provides the optimal generalization capability?", a problem referred to as active learning in the neural network community. We derive a necessary and sufficient condition on training data for providing the optimal generalization capability in the trigonometric polynomial space, where the concept of pseudo orthogonal bases is essential. By utilizing useful properties of pseudo orthogonal bases, we clarify the mechanism by which the optimal generalization is achieved.
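For reference, the frame-theoretic notion the abstract equates with a pseudo orthogonal basis can be stated explicitly; the identity below is standard frame theory rather than a result of this paper. A tight frame with frame bound 1 is a family \(\{\varphi_i\}\) in a Hilbert space \(H\) satisfying

\[
\sum_i |\langle f, \varphi_i \rangle|^2 = \|f\|^2
\quad\text{for all } f \in H,
\qquad\text{equivalently}\qquad
f = \sum_i \langle f, \varphi_i \rangle\, \varphi_i ,
\]

so every element of \(H\) is reproduced exactly as with an orthonormal basis, without requiring the \(\varphi_i\) to be linearly independent.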
© (1999) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Masashi Sugiyama and Hidemitsu Ogawa "Pseudo orthogonal bases give the optimal generalization capability in neural network learning", Proc. SPIE 3813, Wavelet Applications in Signal and Image Processing VII, (26 October 1999); https://doi.org/10.1117/12.366809
KEYWORDS
Neural networks, Space operations, Solids, Computer simulations, Curium, Inverse problems, Interference (communication)
