UnivIS
Information system of Friedrich-Alexander-University Erlangen-Nuremberg © Config eG 

  Seminar Deep Learning Theory & Applications (SemDL)

Lecturers
Prof. Dr.-Ing. habil. Andreas Maier, Dipl.-Inf. Vincent Christlein, Lennart Husvogt, M. Sc., Katharina Breininger, M. Sc.

Details
Seminar
4 credit hours, graded certificate, ECTS studies, ECTS credits: 5
For students in the Fachstudium (specialization phase) only; language of instruction: English
Time and place: Mon 16:00 - 18:00, 01.134

Fields of study
WPF INF-MA 1 (ECTS-Credits: 5)
WPF MT-MA-BDV 1 (ECTS-Credits: 5)
WPF CE-MA-TA-MT 1 (ECTS-Credits: 5)

Contents
Deep neural networks, commonly referred to as deep learning, have attracted significant attention in recent years. Interestingly, the concept of neural networks has inspired researchers for generations, going back to Minsky's famous book (cf. http://en.wikipedia.org/wiki/Society_of_Mind ). Once again, this technology has led researchers to believe that neural networks will eventually be able to learn everything (cf. http://www.ted.com/talks/jeremy_howard_the_wonderful_and_terrifying_implications_of_computers_that_can_learn ). In this seminar, we will investigate the basics of neural networks as already present in Minsky's designs and trace the advances that led to modern deep convolutional neural networks. In addition, we will explore open-source deep learning libraries and review the latest results in the literature.
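The step from Minsky-era perceptrons to trainable multi-layer networks can be illustrated with a minimal sketch (not part of the official course materials; all hyperparameters are illustrative): a tiny two-layer network trained by plain gradient descent on XOR, the classic problem that a single perceptron cannot solve but one hidden layer can.

```python
import numpy as np

rng = np.random.default_rng(0)
# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# weights and biases for a 2-4-1 network
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: chain rule on the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out).ravel())  # should approach [0, 1, 1, 0]
```

The same forward/backward pattern, scaled up with convolutional layers and automatic differentiation, is what the libraries discussed in the seminar implement.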

Recommended literature
  • Representation Learning: A Review and New Perspectives, Yoshua Bengio, Aaron Courville, Pascal Vincent, arXiv, 2012.
  • Deep Learning, Ian Goodfellow, Yoshua Bengio, Aaron Courville, MIT Press.
  • Gradient-Based Learning Applied to Document Recognition, Yann LeCun et al., 1998.
  • Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Srivastava et al., 2014.
  • Greedy Layer-Wise Training of Deep Networks, Yoshua Bengio et al., Advances in Neural Information Processing Systems 19 (2007): 153.
  • Reducing the Dimensionality of Data with Neural Networks, Hinton et al., Science 313.5786 (2006): 504-507.
  • Training Deep and Recurrent Neural Networks with Hessian-Free Optimization, James Martens and Ilya Sutskever, Neural Networks: Tricks of the Trade, 2012.
  • Deep Boltzmann Machines, Salakhutdinov and Hinton.
  • Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion, Pascal Vincent et al.
  • A Fast Learning Algorithm for Deep Belief Nets, Hinton et al., 2006.
  • ImageNet Classification with Deep Convolutional Neural Networks, Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton, NIPS 2012.
  • Regularization of Neural Networks using DropConnect, Wan et al., ICML.
  • OverFeat: Integrated Recognition, Localization and Detection using Convolutional Networks, Computing Research Repository, abs/1312.6229, 2013.
  • http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
  • http://deeplearning.net/tutorial/
  • Deep Learning course on Coursera by Hinton.
  • Deep learning platforms with GPU support: Caffe, Lasagne, Torch, etc.

ECTS information:
Credits: 5

Additional information
Keywords: deep learning; neural networks; machine learning; pattern recognition
Expected participants: 10, maximum number of participants: 10
www: https://www5.cs.fau.de/lectures/ss-16/seminar-deep-learning-theory-applications-semdl/
Registration is required for this lecture.
Registration: in person with the lecturer

Used in the following UnivIS modules
Starting semester SS 2016:
Seminar Deep Learning Theory & Applications (SemDL)
Seminar Medizintechnik und Medizinethik (Medtech Ethik)

Department: Chair of Computer Science 5 (Pattern Recognition)