
Introduction to Pattern Recognition [IntroPR]

The goal of this lecture is to familiarize students with the overall pipeline of a Pattern Recognition system. The various steps involved, from data capture to pattern classification, are presented. The lectures start with a short introduction in which the nomenclature is defined. Analog-to-digital conversion is briefly discussed, with a focus on how it impacts further signal analysis. Commonly used preprocessing methods are then described. A key component of Pattern Recognition is feature extraction; thus, several techniques for feature computation will be presented, including the Walsh transform, the Haar transform, linear predictive coding, wavelets, moments, principal component analysis, and linear discriminant analysis. The lectures conclude with a basic introduction to classification. The principles of statistical, distribution-free, and nonparametric classification approaches will be presented; within this context we will cover Bayesian and Gaussian classifiers as well as artificial neural networks. The accompanying exercises will provide further details on the methods and procedures presented in this lecture, with particular emphasis on their application.

Dates & Rooms:
Tuesday, 10:15 - 11:45; Room: 0.68
Wednesday, 10:15 - 11:45; Room: 0.68

General Information

  • The Friday and Wednesday exercises are again synchronized.
  • Tentative course syllabus: It provides a roadmap to the lectures throughout the semester.
  • The lecture videos from Winter Semester 2013 are available here. (They are only accessible from inside the university network; if you want to watch the videos from home, consider tunnelling the connection.)
  • Prof. Niemann's textbook on Pattern Recognition, Klassifikation von Mustern, is available online.

  • A. Exam Dates

    Monday 17.02.2014 

    Tuesday 18.02.2014

    Thursday 20.02.2014 

    Wednesday 19.03.2014

    Monday 31.03.2014

    Tuesday 01.04.2014


    B. Signing up for the Exam

    Reserving a slot for the exam is only possible after January 6th, 2014.

    You must reserve a time-slot for the exam, regardless of whether you have signed up on meinCampus. You can do so:

    either by personally visiting the secretaries at the Pattern Recognition Lab, room 09.138, Martensstr. 3, 91058 Erlangen,

    or by sending an email to Kristina Müller at mueller(at) or to Iris Koppe at koppe(at) .
    Make sure to include in your email your full name, student ID, program of studies, birth date, number of credits, and type of exam (e.g. benoteter Schein, unbenoteter Schein, Prüfung durch meinCampus, etc.).

    NEWS: The results of the IntroPR exam have been added to meinCampus. If you want to take a look at your corrected exam (Prüfungseinsicht), please contact Simone Gaffling. The deadline for Prüfungseinsicht is Nov 7th; after this date the grades are finalized.

The updated slides will be posted on the web soon after the corresponding lecture is completed.

In order to prepare yourself for an upcoming lecture, look at the slides of the previous Winter semester.

Introduction: course outline, examples of PR systems
Key PR Concepts: the pipeline of a PR system, terminology, postulates of PR
Sampling: review of Fourier analysis, Nyquist sampling theorem
Quantization: signal-to-noise ratio, pulse code modulation, vector quantization, k-means algorithm (see the k-means sketch after this list)
Histogram Equalization and Thresholding: histogram equalization, thresholding, binarization, maximum likelihood estimation, various thresholding algorithms (intersection of Gaussians, Otsu's algorithm, unimodal algorithm, entropy-based; see the Otsu sketch after this list)
Noise Suppression: linear shift-invariant transformations, convolution, mean filter, Gaussian filter, median filter (see the convolution sketch after this list)
Edge Detection: gradient-based edge detector, Laplacian of Gaussian, sharpening
Non-linear Filtering: recursive filters, homomorphic filters, cepstrum, morphological operators, rank operators
Normalization: size normalization, location normalization, pose normalization, geometric moments, central moments
Introduction to Feature Extraction: curse of dimensionality, heuristic versus analytic feature extraction methods, projection on orthogonal bases, Fourier transform as a feature
Orthonormal Bases for Feature Extraction: spectrogram, Walsh-Hadamard transform, Haar transform
LPC and Moments: linear predictive coding, moments as features, Hu moments
Multiresolution Analysis: short-time Fourier transform, continuous wavelet transform, discrete wavelet transform, wavelet series
PCA, LDA: principal component analysis, eigenfaces, linear discriminant analysis, Fisherfaces (see the PCA sketch after this list)
OFT: optimal feature transform, Mahalanobis distance, feature transform
Optimization Methods: gradient descent, coordinate descent
Feature Selection: objective functions for feature selection, including entropy and KL-divergence; strategies for exploring the space of feature subsets, including branch-and-bound
Bayesian Classifier: introduction to classification, decision function, misclassification cost, misclassification risk, Bayesian classifier
Gaussian Classifier: Gaussian classifier, linear vs. quadratic decision boundaries (see the Gaussian classifier sketch after this list)
Polynomial Classifiers: polynomial classifier, discriminant functions
Non-parametric Classifiers: k-nearest neighbor density estimation, Parzen windows (see the k-NN sketch after this list)
An Intro to Artificial Neural Networks: introduction to ANNs, ANNs and classification, radial basis function ANNs
Multilayer Perceptrons: ANN layouts, feed-forward networks, perceptron, MLPs, back-propagation
Deep Convolutional Neural Nets: convolutional neural networks, local computation, shared weights, convolutional layer, examples
Review: end-of-lecture review, brief recap of what was covered in class
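
To give a feel for the k-means algorithm from the quantization lecture, here is a minimal NumPy sketch; the function name kmeans and all parameter choices are our own illustrative assumptions, not taken from the course materials.

import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal k-means: X is an (n, d) array of samples, k the number of clusters."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct samples at random.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each sample goes to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster.
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
centers, labels = kmeans(X, k=2)
print(centers)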
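
As a companion to the thresholding lecture, this is a sketch of Otsu's algorithm, which picks the threshold maximizing the between-class variance of the gray-level histogram. It assumes 8-bit input; otsu_threshold is an illustrative name of our choosing.

import numpy as np

def otsu_threshold(image):
    """Return the threshold that maximizes between-class variance (8-bit input)."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                # gray-level probabilities
    omega = np.cumsum(p)                 # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))   # first moment up to each level
    mu_total = mu[-1]
    # Between-class variance: sigma_B^2(t) = (mu_T*omega(t) - mu(t))^2 / (omega(t)*(1 - omega(t)))
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)
    return int(np.argmax(sigma_b2))

# Toy usage: a bimodal collection of dark and bright pixels.
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(60, 10, 500),
                              rng.normal(180, 10, 500)]), 0, 255).astype(np.uint8)
t = otsu_threshold(img)
print("Otsu threshold:", t)
binary = img > t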
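
For the noise-suppression lecture, here is a direct 2-D convolution sketch applying a mean filter and a small binomial (Gaussian-like) kernel as linear shift-invariant smoothers. It is written for clarity, not speed, and the helper name convolve2d is our own.

import numpy as np

def convolve2d(image, kernel):
    """Direct 2-D convolution with zero padding (no FFT, for clarity)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    flipped = kernel[::-1, ::-1]          # convolution flips the kernel
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

# A 3x3 mean filter and a 3x3 binomial approximation of a Gaussian filter.
mean_kernel = np.full((3, 3), 1.0 / 9.0)
gauss_kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0

img = np.random.default_rng(0).random((8, 8))
smoothed = convolve2d(img, gauss_kernel)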
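
The PCA lecture can be previewed with this minimal sketch computing principal components via the SVD of the centered data matrix, as used for eigenfaces; the function pca and its return values are illustrative assumptions.

import numpy as np

def pca(X, n_components):
    """PCA via SVD: rows of X are samples; returns mean, components, projections."""
    mean = X.mean(axis=0)
    Xc = X - mean                         # center the data
    # Rows of Vt are the principal axes, ordered by singular value.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    projections = Xc @ components.T       # coordinates in the reduced space
    return mean, components, projections

# Toy usage: 100 "images" flattened to 64-dimensional vectors, reduced to 5 features.
X = np.random.default_rng(0).random((100, 64))
mean, comps, feats = pca(X, n_components=5)
print(feats.shape)   # (100, 5)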
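
For the Bayesian and Gaussian classifier lectures, this sketch implements the Bayes rule with Gaussian class-conditional densities, which yields quadratic decision boundaries; the class name and regularization constant are our own choices.

import numpy as np

class GaussianClassifier:
    """Bayes rule with Gaussian class-conditional densities (quadratic boundaries)."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.params = {}
        for c in self.classes:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularize
            prior = len(Xc) / len(X)
            self.params[c] = (mu, np.linalg.inv(cov),
                              np.linalg.slogdet(cov)[1], np.log(prior))
        return self

    def predict(self, X):
        scores = []
        for c in self.classes:
            mu, cov_inv, logdet, logprior = self.params[c]
            d = X - mu
            # Log posterior up to a constant: log p(c) - 0.5 log|Sigma| - 0.5 d^T Sigma^-1 d
            log_post = logprior - 0.5 * logdet - 0.5 * np.einsum("ij,jk,ik->i", d, cov_inv, d)
            scores.append(log_post)
        return self.classes[np.argmax(scores, axis=0)]

# Toy usage: two Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
clf = GaussianClassifier().fit(X, y)
print((clf.predict(X) == y).mean())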
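
Finally, a brute-force k-nearest-neighbor sketch to accompany the non-parametric classifier lecture; knn_predict is an illustrative name, and labels are assumed to be small non-negative integers.

import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest neighbors."""
    dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(dists, axis=1)[:, :k]       # indices of the k closest
    votes = y_train[nearest]                          # their labels
    # Majority vote per test point (ties broken toward the lower label).
    return np.array([np.bincount(row).argmax() for row in votes])

# Toy usage: two clusters, one test point near each.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (30, 2))])
y_train = np.array([0] * 30 + [1] * 30)
X_test = np.array([[0.0, 0.0], [4.0, 4.0]])
print(knn_predict(X_train, y_train, X_test, k=5))    # expect [0 1]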

Supplemental Material

  1. A nice review/tutorial on Fourier Analysis can be found at
  2. A comprehensive yet brief presentation on the K-means algorithm by Prof. Xindong Wu can be found at
  3. Follow this link for additional information on Covariance Matrices.
  4. A more detailed tutorial on Principal Component Analysis can be found here.
  5. Here are some notes on Singular Value Decomposition.
  6. A nice presentation on the use of trees and pruning in game playing can be found at
  7. An analysis on the error rate of the nearest neighbor classifier can be found at
  8. The original paper with the detailed computation of the error probability bounds of the nearest neighbor classifier is: Thomas Cover and Peter Hart. "Nearest neighbor pattern classification." IEEE Transactions on Information Theory, Vol. 13, No. 1 (1967): 21-27.
  9. Hava Siegelmann has extensively analyzed the connection between neural networks and Turing machines in her book Neural Networks and Analog Computation: Beyond the Turing Limit. Boston, MA: Birkhäuser, 1999.
  10. An article summarizing Siegelmann's findings is Hava T. Siegelmann, "Neural and super-Turing computing." Minds and Machines, Vol. 13, No. 1, pp. 103-114, 2003.