Deep learning is the predominant machine learning paradigm in natural language processing (NLP). This approach has brought huge performance improvements across a wide variety of natural language processing tasks.
This time we will focus on the theory of deep learning, as a good understanding of the theory is important for success in practical applications. The topics will be based on chapters of the book Mathematical Aspects of Deep Learning. I am open to suggestions of alternative books; we can discuss this in the first meeting.
Lecturer: Dietrich Klakow
Location: tbd
Time: block course in the spring break 2024; however, preparations start earlier. Here is the specific timeline:
Closing topic doodle: tbd
Kick-off: tbd (doodle)
One page outline: tbd
Draft presentation: tbd
Practice talks and final talks will be during the spring break. Time/date will be decided during the kick-off.
Application for participation: see the CS seminar system (for CS, DSAI, VC, ES, …); for CoLI, LST, LCT use this registration system.
HISPOS registration deadline: tbd
Grading (tentative):
- 5% one page talk outline
- 10% draft presentation
- 10% practice talk
- 35% final talk
- 5% contributions to discussion during final talk of fellow participants
- 35% report
List of Topics (tentative):
The Modern Mathematics of Deep Learning
Generalization in Deep Learning
Expressivity of Deep Neural Networks
Optimization Landscape of Neural Networks
Explaining the Decisions of Convolutional and Recurrent Neural Networks
Stochastic Feedforward Neural Networks: Universal Approximation
Deep Learning as Sparsity-Enforcing Algorithms
The Scattering Transform
Deep Generative Models and Inverse Problems
Dynamical Systems and Optimal Control Approach to Deep Learning
Bridging Many-Body Quantum Physics and Deep Learning via Tensor Networks