Theoretical Advances in Neural Computation and Learning, edited by Vwani Roychowdhury, Kai-Yeung Siu, and Alon Orlitsky.

For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for st...


Bibliographic Details
Corporate Author: SpringerLink (Online service)
Other Authors: Roychowdhury, Vwani (Editor), Kai-Yeung Siu (Editor), Orlitsky, Alon (Editor)
Format: eBook
Language: English
Published: New York, NY : Springer US : Imprint: Springer, 1994.
Edition: 1st ed. 1994.
Series: Springer eBook Collection.
Subjects:
Online Access: Click to view e-book
Holy Cross Note:Loaded electronically.
Electronic access restricted to members of the Holy Cross Community.
Table of Contents:
  • I Computational Complexity of Neural Networks
  • 1 Neural Models and Spectral Methods
  • 2 Depth-Efficient Threshold Circuits for Arithmetic Functions
  • 3 Communication Complexity and Lower Bounds for Threshold Circuits
  • 4 A Comparison of the Computational Power of Sigmoid and Boolean Threshold Circuits
  • 5 Computing on Analog Neural Nets with Arbitrary Real Weights
  • 6 Connectivity Versus Capacity in the Hebb Rule
  • II Learning and Neural Networks
  • 7 Computational Learning Theory and Neural Networks: A Survey of Selected Topics
  • 8 Perspectives of Current Research about the Complexity of Learning on Neural Nets
  • 9 Learning an Intersection of K Halfspaces Over a Uniform Distribution
  • 10 On the Intractability of Loading Neural Networks
  • 11 Learning Boolean Functions via the Fourier Transform
  • 12 LMS and Backpropagation are Minimax Filters
  • 13 Supervised Learning: Can It Escape Its Local Minimum?