Seminar Deep Learning Theory & Applications (SemDL)
- Lecturers
- Prof. Dr.-Ing. habil. Andreas Maier, Dipl.-Inf. Vincent Christlein, Tobias Würfl, M. Sc., Katharina Breininger, M. Sc., Dipl.-Ing. Marc Aubreville
- Course details
- Seminar
4 SWS (semester hours per week), graded certificate, ECTS studies, ECTS credits: 5
Specialization studies only; language of instruction: English
Time and location: Mon 16:00 - 17:30, Übung 4 / 01.253-128
- Degree programs / specializations
- WPF INF-MA 1 (ECTS-Credits: 5)
WPF MT-MA-BDV 1 (ECTS-Credits: 5)
WPF CE-MA-TA-MT 1 (ECTS-Credits: 5)
- Content
- Deep neural networks, or so-called deep learning, have attracted significant attention in recent years. Interestingly, the concept of neural networks has inspired researchers for generations, going back to Minsky's famous book (cf. http://en.wikipedia.org/wiki/Society_of_Mind ). Once again, this technology leads researchers to believe that neural networks will eventually be able to learn everything (cf. http://www.ted.com/talks/jeremy_howard_the_wonderful_and_terrifying_implications_of_computers_that_can_learn ).
In this seminar, we will investigate the basics of neural networks as already found in Minsky's design and trace the advances that led to modern deep convolutional neural networks; a minimal perceptron sketch illustrating these basics follows below. In addition, we will examine open-source deep learning libraries and discuss the latest results in the literature.
Application (summer term 2017) via StudOn: https://www.studon.fau.de/studon/goto.php?target=crs_1815925
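As a taste of the "basics of neural networks" mentioned above, here is a minimal sketch of a Rosenblatt-style perceptron, the single-layer model analyzed in Minsky and Papert's work. This is illustrative material only, not seminar code; all names and the toy data are our own.

import numpy as np

# Minimal Rosenblatt-style perceptron for binary labels y in {-1, +1}.
# Illustrative sketch; names and data are hypothetical.

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Learn a weight vector w and bias b with the perceptron update rule."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the sample is misclassified (non-positive margin).
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data: learn the logical AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # expected: [-1. -1. -1.  1.]

By the perceptron convergence theorem, the loop terminates with a correct separator on such linearly separable data; XOR, the classic counterexample from Minsky and Papert, would not converge.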
- Recommended literature
- Representation Learning: A Review and New Perspectives, Yoshua Bengio, Aaron Courville, Pascal Vincent, arXiv, 2012.
Deep Learning, Ian Goodfellow, Yoshua Bengio, Aaron Courville, MIT Press, 2016.
Gradient-Based Learning Applied to Document Recognition, Yann LeCun et al., Proceedings of the IEEE, 1998.
Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Nitish Srivastava et al., JMLR, 2014 (see the sketch after this list).
Greedy Layer-Wise Training of Deep Networks, Yoshua Bengio et al., Advances in Neural Information Processing Systems 19, 2007, p. 153.
Reducing the Dimensionality of Data with Neural Networks, Geoffrey Hinton and Ruslan Salakhutdinov, Science 313(5786), 2006, pp. 504-507.
Training Deep and Recurrent Neural Networks with Hessian-Free Optimization, James Martens and Ilya Sutskever, Neural Networks: Tricks of the Trade, 2012.
Deep Boltzmann Machines, Ruslan Salakhutdinov and Geoffrey Hinton, AISTATS, 2009.
Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion, Pascal Vincent et al., JMLR, 2010.
A Fast Learning Algorithm for Deep Belief Nets, Geoffrey Hinton et al., Neural Computation, 2006.
ImageNet Classification with Deep Convolutional Neural Networks, Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton, NIPS, 2012.
Regularization of Neural Networks using DropConnect, Li Wan et al., ICML, 2013.
OverFeat: Integrated Recognition, Localization and Detection using Convolutional Networks, Pierre Sermanet et al., CoRR, abs/1312.6229, 2013.
http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
http://deeplearning.net/tutorial/
Neural Networks for Machine Learning, Geoffrey Hinton's deep learning course on Coursera.
Deep learning platforms with GPU support: Caffe, Lasagne, Torch, etc.
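To make the dropout entry above (Srivastava et al., 2014) concrete, here is a minimal NumPy sketch of the inverted-dropout variant, which is equivalent in expectation to the paper's formulation of rescaling weights at test time. Function and variable names are illustrative, not taken from the paper or any library.

import numpy as np

def dropout(activations, p_keep=0.5, train=True,
            rng=np.random.default_rng(0)):
    """Randomly zero units at training time; identity at test time.

    Dividing by p_keep ("inverted" dropout) keeps the expected activation
    unchanged, so no extra rescaling is needed when train=False.
    """
    if not train:
        return activations
    # Bernoulli mask: each unit survives independently with probability p_keep.
    mask = rng.random(activations.shape) < p_keep
    return activations * mask / p_keep

h = np.ones((2, 4))            # a toy hidden-layer activation
print(dropout(h, p_keep=0.5))  # roughly half the entries 0, the rest 2.0
print(dropout(h, train=False)) # unchanged at test time

The design choice worth noting is that the randomness lives only in training; at test time the full network is used, which the paper interprets as an inexpensive approximation to averaging over the ensemble of thinned subnetworks.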
- ECTS information:
- Credits: 5
- Additional information
- Keywords: deep learning; neural networks; machine learning; pattern recognition
Expected number of participants: 10; maximum number of participants: 10
www: https://www5.cs.fau.de/lectures/ss-17/seminar-deep-learning-theory-applications-semdl/
Registration is required for this course and is handled via StudOn.
- Used in the following UnivIS modules
- Starting semester SS 2017:
- Seminar Deep Learning Theory & Applications (SemDL)
- Seminar Medizintechnik und Medizinethik (Medtech Ethik)
- Institution: Lehrstuhl für Informatik 5 (Mustererkennung)