Description: |
  The course treats important types of feed-forward neural networks, such as the Multi-Layer Perceptron, Radial Basis Function networks, Deep Convolutional Networks, and Support Vector Machines. The basic problem of algorithmic learning is treated comprehensively, including the Bias-Variance Dilemma, and solutions are presented. In relation to Organic Computing, self-X competences are discussed. Special emphasis is placed on relationships to basic techniques from other fields, e.g. gradient descent, linear and quadratic optimization, and statistical decision theory. Typical applications include signal filtering, pattern recognition, and robot control. Contents at a glance:
- Introduction
  - McCulloch-Pitts cell, Perceptron, Adaline
- Statistical decision theory
- Multi-Layer Perceptron, Deep Convolutional Networks
- Radial Basis Function Networks
- Bias-Variance-Dilemma
- Support Vector Machines
- Organic Computing
|
Learning Targets: |
  The students should understand the structure, learning methods, and mathematical foundations of certain types of neural networks, and they should know possible applications. They have the competence to propose potentially useful types of networks and learning procedures for certain types of problems. |
Literature: |
- C. Bishop: Neural Networks for Pattern Recognition; Oxford Press, 1995.
- C. Bishop: Pattern Recognition and Machine Learning; Springer, 2006.
  - I. Goodfellow, et al.: Deep Learning; MIT Press, 2016.
  - T. Hastie, et al.: The Elements of Statistical Learning; Springer, 2003.
- M. Mohri, et al.: Foundations of Machine Learning; MIT Press, 2012.
- R. Rojas: Neuronale Netze; Springer-Verlag, 1996.
  - A. Zell: Simulation neuronaler Netze; Addison-Wesley, 1994.
  - Current articles of the chair as well as Bachelor's, Master's, and doctoral theses.
|