Deep Learning [DL]
- Lecturers:
- Andreas Maier, Katharina Breininger, Sulaiman Vesal
- Details:
- Lecture, 2 SWS, ECTS: 2.5, main study phase (Fachstudium) only
- Dates:
- Tue, 8:30 - 10:00, H4
- Degree programmes / subjects:
- WPF INF-MA ab 1
WPF MT-MA-BDV 1
WF CME-MA ab 1
- Prerequisites / Organizational information:
- The following lectures are recommended:
Registration via https://www.studon.fau.de/crs2526786.html
- Content:
- Deep Learning (DL) has attracted great interest from both academia and industry in a wide range of applications, such as image recognition, speech recognition, and artificial intelligence.
This lecture introduces the core elements of neural networks and deep learning. It covers:
- (multilayer) perceptron, backpropagation, fully connected neural networks
- loss functions and optimization strategies
- convolutional neural networks (CNNs)
- activation functions
- regularization strategies
- common practices for training and evaluating neural networks
- visualization of networks and results
- common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
- recurrent neural networks (RNN, TBPTT, LSTM, GRU)
- deep reinforcement learning
- unsupervised learning (autoencoder, RBM, DBM, VAE)
- generative adversarial networks (GANs)
- weakly supervised learning
- applications of deep learning (segmentation, object detection, speech recognition, ...)
The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.
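The first topics on the list (perceptron, backpropagation, loss functions, gradient-based optimization) can be illustrated with a minimal sketch. The example below is not part of the course materials; it is a small NumPy illustration that trains a two-layer fully connected network on the XOR problem with manually derived gradients.

```python
import numpy as np

# Toy data: the XOR problem, which a single-layer perceptron cannot solve.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer (8 units)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
losses = []
for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # predicted probabilities
    loss = np.mean((out - y) ** 2)    # mean squared error loss
    losses.append(loss)

    # Backward pass: apply the chain rule layer by layer.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)   # dLoss/d(pre-sigmoid)
    dW2 = h.T @ d_out; db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)                # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h; db1 = d_h.sum(0)

    # Gradient descent step.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The lecture's convolutional, recurrent, and generative topics build on exactly this forward/backward pattern, applied to more structured layers.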
- Recommended literature:
- Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning. MIT Press, 2016
Christopher Bishop: Pattern Recognition and Machine Learning, Springer Verlag, Heidelberg, 2006
Yann LeCun, Yoshua Bengio, Geoffrey Hinton: Deep learning. Nature 521, 436–444 (28 May 2015)
- Keywords:
- deep learning; machine learning
Deep Learning Exercises [DL E]
- Lecturers:
- Leonid Mill, Hendrik Schröter
- Details:
- Exercise, 2 SWS, ECTS: 2.5, main study phase (Fachstudium) only
- Degree programmes / subjects:
- WPF INF-MA ab 1
WF CME-MA ab 1
- Keywords:
- deep learning; machine learning
- Dates:
- Mon, 12:00 - 14:00, 0.01-142 CIP (Schröter, H.; Mill, L.)
Tue, 10:00 - 12:00, 0.01-142 CIP (Mill, L.; Schröter, H.)
Wed, 10:00 - 12:00, 0.01-142 CIP (Mill, L.; Schröter, H.)
Thu, 14:00 - 16:00, 0.01-142 CIP (Mill, L.; Schröter, H.)
Fri, 8:00 - 10:00, 0.01-142 CIP (Mill, L.; Schröter, H.)
HS Deep Learning for NLP [HSprakt]
- Lecturer:
- Stefan Evert
- Details:
- Advanced seminar (Hauptseminar), 2 SWS, ECTS: 5, Bachelor
- Dates:
- Wed, 10:15 - 11:45, 01.019
- Prerequisites / Organizational information:
- Participants must register for the StudOn course linked below. Seminar places are assigned on a first come, first served basis.
- Content:
- Deep neural networks – also known as deep learning – have attracted significant attention in recent years. They have had a transformative influence on natural language processing (NLP) and artificial intelligence (AI), with numerous success stories and even claims of superhuman learning performance in certain tasks. According to Young et al. (2017), more than 70% of the papers presented at recent NLP conferences made use of deep learning techniques.
This seminar will focus on the application of deep learning techniques to natural language processing tasks and on the topic "Social Bots: Danger or Myth?".
- Recommended literature:
- Bengio, Yoshua; Courville, Aaron; Vincent, Pascal (2012). Representation Learning: A Review and New Perspectives. arXiv.
Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron (2016). Deep Learning. MIT Press.
Goldberg, Yoav (2017). Neural Network Methods for Natural Language Processing. Number 37 in Synthesis Lectures on Human Language Technologies. Morgan & Claypool.
Young, Tom; Hazarika, Devamanyu; Poria, Soujanya; Cambria, Erik (2017). Recent trends in deep learning based natural language processing. CoRR, abs/1708.02709. http://arxiv.org/abs/1708.02709
http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
http://deeplearning.net/tutorial/
Stanford University CS 224d: Deep Learning for NLP (http://cs224d.stanford.edu/)
University of Oxford: Deep Natural Language Processing (https://github.com/oxford-cs-deepnlp-2017/lectures)
|
|
Seminar Deep Learning Theory & Applications [SemDL]
- Lecturers:
- Vincent Christlein, Stefan Evert, Ronak Kosti
- Details:
- Seminar, 4 SWS, graded certificate, ECTS: 5, main study phase (Fachstudium) only
- Dates:
- Wed, 10:15 - 11:45, 01.019
- Degree programmes / subjects:
- WPF INF-MA 1
WPF MT-MA-BDV 1
WPF CE-MA-TA-MT 1
- Content:
- Deep neural networks (so-called deep learning) have attracted significant attention in recent years.
They have had a transformative influence on natural language processing (NLP) and artificial intelligence (AI), with numerous success stories and recent claims of superhuman learning performance in certain tasks.
According to Young et al. (2017), more than 70% of the papers presented at recent NLP conferences made use of deep learning techniques.
Interestingly, the concept of neural networks has inspired researchers for generations, going back to Minsky's famous book (cf. http://en.wikipedia.org/wiki/Society_of_Mind).
Once again, this technology leads researchers to believe that neural networks will eventually be able to learn everything (cf. http://www.ted.com/talks/jeremy_howard_the_wonderful_and_terrifying_implications_of_computers_that_can_learn).
This year's main topic is: "Social Bots: Danger or Myth?"
- Recommended literature:
- Bengio, Yoshua; Courville, Aaron; Vincent, Pascal (2012). Representation Learning: A Review and New Perspectives. arXiv.
Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron (2016). Deep Learning. MIT Press.
Goldberg, Yoav (2017). Neural Network Methods for Natural Language Processing. Number 37 in Synthesis Lectures on Human Language Technologies. Morgan & Claypool.
Young, Tom; Hazarika, Devamanyu; Poria, Soujanya; Cambria, Erik (2017). Recent trends in deep learning based natural language processing. CoRR, abs/1708.02709. http://arxiv.org/abs/1708.02709
Gradient-Based Learning Applied to Document Recognition, Yann LeCun et al., 1998
Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Srivastava et al. 2014
Greedy layer-wise training of deep networks, Bengio, Yoshua, et al. Advances in neural information processing systems 19 (2007): 153.
Reducing the dimensionality of data with neural networks, Hinton et al. Science 313.5786 (2006): 504-507.
Training Deep and Recurrent Neural Networks with Hessian-Free Optimization, James Martens and Ilya Sutskever, Neural Networks: Tricks of the Trade, 2012.
Deep Boltzmann Machines, Salakhutdinov and Hinton
Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion, Pascal Vincent et al.
A fast learning algorithm for deep belief nets, Hinton et al., 2006
ImageNet Classification with Deep Convolutional Neural Networks, Alex Krizhevsky, Ilya Sutskever, Geoffrey E Hinton, NIPS 2012
Regularization of Neural Networks using DropConnect, Wan et al., ICML
OverFeat: Integrated recognition, localization and detection using convolutional networks. Computing Research Repository, abs/1312.6229, 2013.
http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
http://deeplearning.net/tutorial/
Geoffrey Hinton's deep learning course on Coursera
Deep learning platforms with GPU support: Caffe, Lasagne, Torch, etc.
Stanford University CS 224d: Deep Learning for NLP (http://cs224d.stanford.edu)
University of Oxford: Deep Natural Language Processing (https://github.com/oxford-cs-deepnlp-2017/lectures)
- Keywords:
- deep learning; neural networks; machine learning; pattern recognition; natural language processing
UnivIS is a product of Config eG, Buckenhof