Neural Networks and Pattern Recognition Tutorial
Chapter 1 Pattern Classification
1.1 What is Pattern Recognition?
1.2 Basics
1.3 An Example
1.4 Approaches to Pattern Recognition
1.5 Pattern Recognition Systems
1.5.1 Sensing
1.5.2 Segmentation and Grouping
1.5.3 Feature Extraction
1.5.4 Classification
1.5.5 Post Processing
Chapter 2 Matrix Theory and Applications with MATLAB
2.1 Vectors and Matrices
2.1.1 Vector and Matrix Notation
2.1.2 Matrices
Matrix Addition, Subtraction, Multiplication
Square matrix, diagonal matrix
Identity matrix
Matrix Transpose
The rank and trace of a matrix
Matrix inverse
Singular matrix
Pseudo-inverse matrix
Determinant
Eigenvectors and eigenvalues
2.1.3 Vectors
Magnitude (norm)
Inner product (dot product)
Orthogonal and orthonormal vectors
Projection
2.1.4 Linear Transformations
Linearly dependent / independent vectors
Linear orthonormal transformation
2.1.5 Vector Spaces
Orthogonal and orthonormal vector sets
Gram-Schmidt orthonormalization procedure
Euclidean distance
2.2 Matrix Operations in MATLAB
2.2.1 Array Construction
Manual construction
The 1:n shorthand
The linspace command
The logspace command
2.2.2 Matrix Construction
Manual construction
Concatenating arrays and matrices
2.2.3 Array and Matrix Indexing
2.2.4 Array and Matrix Operations
The rank function
The diag function
The inv function
The det function and singular matrix
The pinv function
The eig function
The norm function
The cross function
The dot function
2.2.5 Standard Arrays and Matrices
The eye function
The ones and zeros functions
2.2.6 Array and Matrix Size
The size function
The length function
Chapter 3 Network Object Reference
3.1 Introduction to Programming with MATLAB
3.2 Notation in Functions
3.2.1 Dimensions
3.2.2 Variables
3.2.3 Utility Function Variables
3.2.4 Other
3.3 Network Object Reference
3.4 Network Properties
3.4.1 Architecture
3.4.2 Sub-object Structures and Properties
Inputs
Layers
Outputs
Targets
Biases
Input Weights
Layer Weights
3.4.3 Functions
3.4.4 Parameters
3.4.5 Weight and Bias Values
Other
3.4.6 Other Issues
Chapter 4 Bayesian Decision Theory
4.1 Introduction
4.2 Bayesian Decision Theory (continuous)
4.2.1 Two-Category Classification
4.3 Minimum Error Rate Classification
4.3.1 Minimax Criterion
4.4 The Gaussian (Normal) Density
4.4.1 Interpretation of eigenvalues and eigenvectors
4.5 Discriminant Functions and Decision Surfaces
4.6 Discriminant Functions For The Normal Density
4.7 Bayesian Decision Theory (discrete)
References
Chapter 5 Principal Component Analysis
5.1 Introduction
5.2 Principal Component Analysis (PCA)
5.3 Principal Component Analysis in MATLAB (prepca, trapca)
5.4 Sample PCA Application in MATLAB
References
Chapter 6 Introduction to Neural Networks
6.1 Introduction
6.2 History of Artificial Neural Networks
6.3 How Artificial Neural Networks Are Being Used
6.3.1 Language Processing
6.3.2 Character Recognition
6.3.3 Image (data) Compression
6.3.4 Pattern Recognition
6.3.5 Signal Processing
6.3.6 Financial
6.3.7 Servo Control
6.4 Summary
References
Chapter 7 Neural Network
7.1 Neurophysiological Motivation
7.2 Mathematical Model of Neural Network
7.3 Neural Network
7.3.1 Architectural Dynamics
7.3.2 Computational Dynamics
7.3.3 Adaptive Dynamics
References
Chapter 8 Classical Models of Neural Network
8.1 The Network of Perceptrons
8.2 A Perceptron as a Pattern Classifier
8.3 Vectors
8.3.1 The length of a vector
8.3.2 Comparing vectors - the inner product
8.3.3 Inner products and perceptrons
8.4 Selection of Weights for The Perceptron
8.4.1 Selection of weights by off-line calculations
8.4.2 The perceptron learning law
8.5 Example
References
Chapter 9 Linear Discriminant Functions
9.1 Introduction
9.2 Linear Discriminant Functions and Decision Surfaces
9.2.1 The Two-Category Case
9.2.2 The Multicategory Case
9.3 Generalized Linear Discriminant Functions
9.4 The Two-Category Linearly Separable Case
9.5 The Perceptron Criterion Function
9.6 Minimum Squared-Error Procedures
9.6.1 Minimum Squared-Error and the Pseudoinverse
9.6.2 The Widrow-Hoff or LMS Procedure
9.7 MATLAB Implementation
References
Chapter 10 Multilayer Neural Networks
10.1 Feedforward Operation and Classification
10.2 Backpropagation Algorithm
10.2.1 Network Learning
10.2.2 Training Protocols
10.2.3 Learning Curves and Stopping Criteria
10.3 Error Surfaces
10.4 Backpropagation as Feature Mapping
10.4.1 Representations at the Hidden Layer: Weights
10.5 Backpropagation, Bayes Theory and Probability
10.6 Practical Techniques for Improving Backpropagation
10.6.1 Activation Function
10.6.2 Parameters for the Sigmoid
10.6.3 Scaling Input
10.6.4 Target Values
10.6.5 Training with Noise
10.6.6 Manufacturing Data
10.6.7 Number of Hidden Units
10.6.8 Initializing Weights
10.6.9 Learning Rates
10.6.10 Momentum
10.6.11 Weight Decay
10.6.12 Hints
10.6.13 On-Line, Stochastic or Batch Training
10.6.14 Stopped Training
10.6.15 Number of Hidden Layers
10.6.16 Criterion Function
10.7 Second-Order Methods
10.7.1 Quickprop
10.7.2 Conjugate Gradient Descent
10.8 Radial Basis Function Networks (RBFs)
10.9 MATLAB Implementation
10.9.1 Algorithm: Batch-Backpropagation
References
Chapter 11 Non-Parametric Techniques
11.1 Introduction
11.2 Density Estimation
11.3 Parzen Windows
11.3.1 Convergence of the Mean
11.3.2 Convergence of the Variance
11.3.3 Probabilistic Neural Networks (PNNs)
11.3.4 Choosing the Window Function
11.3.5 Estimation of Posterior Probabilities
11.4 k_n-Nearest-Neighbor Estimation
11.5 MATLAB Implementation
References