
Statistical methods for learning and pattern recognition

Two-layer neural network
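A two-layer network of the kind pictured above (one hidden layer with a nonlinear activation, one linear output layer) can be sketched in a few lines of plain Python. This is an illustrative sketch, not course material; the function name and the choice of tanh as the hidden activation are assumptions.

```python
import math

def two_layer_forward(x, W1, b1, W2, b2):
    """Forward pass of a two-layer network.

    x  : input vector
    W1 : hidden-layer weights (one row per hidden unit), b1: hidden biases
    W2 : output-layer weights (one row per output unit), b2: output biases
    Hidden units use tanh; the output layer is linear.
    """
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(W2, b2)]
```

For classification, the linear outputs would typically be passed through a softmax; training (covered later in the course) adjusts `W1, b1, W2, b2`, e.g. by gradient descent on a loss function.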

Short Description

The course on Statistical Learning and Pattern Recognition presents an introduction to the components and algorithms prevalent in statistical pattern recognition. Both parametric and non-parametric density estimation and classification techniques are presented, as well as supervised and unsupervised learning paradigms. The presented techniques can be applied to a variety of classification problems, whether the input data are one-dimensional (e.g., speech), two-dimensional (e.g., images), or symbolic (e.g., documents).

Contents

  • Decision rules: Bayes decision rule, Discriminant functions and decision boundaries, Discriminants for multivariate Gaussians, Error rates of the Bayesian classifier, Example: Soft-feature speech recognition
  • Supervised training: Mean and covariance of a Gaussian distribution, Bayesian learning
  • Linear dimension reduction: Principal component analysis, Linear discriminant analysis
  • Linear discriminants: Least squares and LDA, LMS, Support vector machines, Example: MLLR speaker adaptation
  • Multilayer perceptron: Classification with neural networks, Training of neural networks
  • Unsupervised training: Mixture densities, Clustering methods
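The first topic above, the Bayes decision rule, can be illustrated concretely for two classes with univariate Gaussian class-conditional densities: assign an observation to the class that maximizes the product of prior and likelihood. The following is a minimal sketch (function names and the univariate setting are illustrative assumptions, not the course's notation):

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a univariate Gaussian N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_decide(x, priors, means, variances):
    """Minimum-error-rate Bayes rule: pick the class k maximizing
    the (unnormalized) posterior priors[k] * p(x | class k)."""
    posteriors = [p * gaussian_pdf(x, m, v)
                  for p, m, v in zip(priors, means, variances)]
    return max(range(len(posteriors)), key=lambda k: posteriors[k])
```

With equal priors and variances and class means 0 and 3, the decision boundary lies midway at x = 1.5, so observations below it are assigned to class 0 and above it to class 1.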

Learning Outcomes, Competences

After completion of the course students will be able to

  • choose an appropriate decision rule for a given classification problem
  • apply supervised or unsupervised learning techniques to data of various kinds and critically assess the outcome of the learning algorithms
  • work with dedicated pattern classification software (e.g., for artificial neural networks, support vector machines) on given data sets and optimize parameter settings
  • find, for a given training set size, an appropriate choice of classifier complexity and feature vector dimensionality

The students

  • have gathered proficiency in Matlab well beyond what is needed to realize pattern classification techniques
  • can assess the importance of the principle of parsimony and are able to transfer it to other problem areas
  • are able to apply the knowledge and skills learnt in this course to a wide range of disciplines
  • can work cooperatively in a team and subdivide an overall task into manageable subtasks and work packages

Implementation

  • Lectures predominantly using the blackboard or overhead projector, with occasional presentations of (PowerPoint) slides,
  • Exercise classes with exercise sheets and demonstrations on the computer, and
  • Implementation of learning and classification algorithms on a computer by the students themselves; application of the algorithms to real-world data or data generated on the computer, and evaluation of the simulation results

Proposed Literature

  • R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, 2nd Edition, Wiley, 2000
  • K. Fukunaga, Introduction to Statistical Pattern Recognition, 2nd Edition, Academic Press, 1990
  • T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning, Springer, 2003
  • G.J. McLachlan, Discriminant Analysis and Statistical Pattern Recognition, John Wiley & Sons, 1992
  • R.M. Neal, Bayesian Learning for Neural Networks, Springer, 1996
  • J. Schürmann, Pattern Classification: A Unified View of Statistical and Neural Approaches, John Wiley & Sons, 1996
  • V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, 2000
  • V. Vapnik, Statistical Learning Theory, John Wiley, 1998
  • M. Vidyasagar, Learning and Generalization, Springer, 2003

General Information

  • Course for master's students
  • ECTS: 6
  • Language: German or English (depending on preference of students)
  • Semester: Summer semester
