Sunday, July 23, 2006

Sources

Have a look at these sources:

* Neural Network Toolbox for MATLAB
www.mathworks.com/products/neuralnet
* Neural Nets
Lecture notes from an MSc course. Covers TLUs, the delta rule, multilayer nets, Hopfield nets, Kohonen nets, node types, and cubic nodes.
www.shef.ac.uk/psychology/gurney/notes/contents.html
* Neural Networks at your Fingertips
Neural network simulators for eight different network architectures, with embedded example applications coded in portable ANSI C.
www.neural-networks-at-your-fingertips.com
* Neural Network Using Genetic Algorithms (NNUGA)
Includes screen shots and the free source code.
www.cs.bgu.ac.il/~omri/NNUGA
* Neural Machines
Discusses the creation and application of advanced neural network technology in optimization and pattern recognition.
www.neuralmachines.com
* The Perceptron
A small, free neural network software package demonstrating the use of neural networks for pattern classification. Includes screen shots and the source code.
www.cs.bgu.ac.il/~omri/Perceptron
* Neural Network Announcements and General Information
At Los Alamos.
www-xdiv.lanl.gov/XCM/neural/neural_announcements.html
* Web Directory: Sumeet's Neural Net Links
www.geocities.com/CapeCanaveral/Lab/3765/neural.html
* FAQ - comp.ai.neural-nets
www.cs.cmu.edu/Groups/AI/html/faqs/ai/neural/faq.html
* Joone: Java Object Oriented Neural Engine
A neural net framework to create, train, and test neural networks.
www.joone.org
* Web Directory: Neural Networks
Lists newsgroups, list servers, and mailing lists.
www.emsl.pnl.gov:2080/docs/cie/neural/newsgroups.html
* LANL Advanced Adaptive Control
Applications of neural networks to adaptive control.
www-xdiv.lanl.gov/XCM/neural/projects/projects.html
* FAQ - Neural Networks
ftp://ftp.sas.com/pub/neural/FAQ.html

Tuesday, July 04, 2006

Network Architectures

Supervised Networks

Supervised neural networks are trained to produce desired outputs in response to sample inputs, making them particularly well suited to modeling and controlling dynamic systems, classifying noisy data, and predicting future events.

* Feedforward networks have one-way connections from input to output layers. They are most commonly used for prediction, pattern recognition, and nonlinear function fitting. Supported feedforward networks include feedforward backpropagation, cascade-forward backpropagation, feedforward input-delay backpropagation, linear, and perceptron networks. (See the code sketch after this list.)
* Radial basis networks provide an alternative, fast method for designing nonlinear feedforward networks. Supported variations include generalized regression and probabilistic neural networks.
* Dynamic networks use memory and recurrent feedback connections to recognize spatial and temporal patterns in data. They are commonly used for time-series prediction, nonlinear dynamic system modeling, and control system applications. Prebuilt dynamic networks in the toolbox include focused and distributed time-delay, nonlinear autoregressive (NARX), layer-recurrent, Elman, and Hopfield networks. The toolbox also supports dynamic training of custom networks with arbitrary connections.
* Learning vector quantization (LVQ) is a powerful method for classifying patterns that are not linearly separable. LVQ lets you specify class boundaries and the granularity of classification.
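
To make the supervised workflow concrete, here is a minimal sketch using the toolbox's mid-2000s syntax (newff/newlvq, train, sim). The network sizes, training parameters, and toy data are my own choices for illustration, not from the MathWorks page. A feedforward backpropagation net is fit to a noisy sine curve, and an LVQ net is trained on a small two-class problem:

    % Feedforward backpropagation: fit a noisy sine curve
    P = 0:0.1:10;                        % 1x101 row of sample inputs
    T = sin(P) + 0.1*randn(size(P));     % noisy targets
    net = newff(minmax(P), [10 1], {'tansig' 'purelin'}, 'trainlm');
    net.trainParam.epochs = 100;         % arbitrary training budget
    net = train(net, P, T);
    Y = sim(net, P);                     % network output on the inputs

    % LVQ: two classes that are not linearly separable
    P  = [-3 -2 -2 0 0 0 0 2 2 3; 0 1 -1 2 1 -1 -2 1 -1 0];
    Tc = [1 1 1 2 2 2 2 1 1 1];          % class index of each column of P
    T  = ind2vec(Tc);                    % convert indices to target vectors
    net = newlvq(minmax(P), 4, [.6 .4]); % 4 hidden neurons; class ratios
    net = train(net, P, T);
    Yc = vec2ind(sim(net, P));           % predicted class indices

The dynamic architectures follow the same create/train/sim pattern through constructors such as newelm and newhop.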

Unsupervised Networks

Unsupervised neural networks are trained by letting the network continually adjust itself to new inputs. They find relationships within data and can automatically define classification schemes.

The Neural Network Toolbox supports two types of self-organizing, unsupervised networks: competitive layers and self-organizing maps.

Competitive layers recognize and group similar input vectors. By using these groups, the network automatically sorts the inputs into categories.
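As a minimal sketch of a competitive layer in the same era's syntax (the cluster count and random data are assumptions for illustration):

    % Competitive layer: group 2-D points into 4 clusters
    P = rand(2, 100);                    % 100 random points in the unit square
    net = newc(minmax(P), 4);            % competitive layer with 4 neurons
    net.trainParam.epochs = 50;
    net = train(net, P);                 % unsupervised: no targets given
    classes = vec2ind(sim(net, P));      % cluster index assigned to each point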

Self-organizing maps learn to classify input vectors according to similarity. Unlike competitive layers, they also preserve the topology of the input vectors, assigning nearby inputs to nearby categories.
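
And a corresponding self-organizing map sketch (the 5x5 grid and the random data are illustrative assumptions):

    % Self-organizing map: a 5x5 grid of neurons over 2-D data
    P = rand(2, 400);                    % 400 random 2-D input vectors
    net = newsom(minmax(P), [5 5]);      % 25 neurons in a 5x5 topology
    net.trainParam.epochs = 100;
    net = train(net, P);
    plotsom(net.iw{1,1}, net.layers{1}.distances); % view the learned map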

(source: http://www.mathworks.com/products/neuralnet/description3.html)