Block Seminar Machine Learning for Natural Language Processing (Fall 2020)

Deep learning is the predominant machine learning paradigm in natural language processing (NLP). This approach has not only yielded huge performance improvements across a wide variety of NLP tasks; it has also made it much easier to integrate real-world knowledge and visual information into NLP systems. However, a major drawback of deep learning is its need for massive amounts of training data. Several of the topics therefore look into methods that can cope with this issue, e.g. by automatically creating noisy training data.
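One common way to create noisy training data automatically is distant supervision: labeling unannotated text with simple heuristics instead of manual annotation. The sketch below illustrates the idea for sentiment classification; the keyword lexicons and the tie-breaking rule are illustrative assumptions, not part of the seminar material.

```python
# Distant supervision sketch: derive noisy sentiment labels from
# keyword lexicons. The lexicons below are illustrative assumptions.
POSITIVE = {"great", "excellent", "love", "wonderful"}
NEGATIVE = {"terrible", "awful", "hate", "boring"}

def noisy_label(sentence):
    """Assign a noisy sentiment label by keyword matching.

    Returns None when no heuristic fires, so the sentence is
    simply left out of the auto-generated training set.
    """
    tokens = set(sentence.lower().split())
    pos_hits = len(tokens & POSITIVE)
    neg_hits = len(tokens & NEGATIVE)
    if pos_hits > neg_hits:
        return "positive"
    if neg_hits > pos_hits:
        return "negative"
    return None  # ambiguous or no evidence: skip

# Build a noisy training set from unlabeled text.
corpus = [
    "I love this wonderful movie",
    "What a terrible and boring film",
    "The plot is about a dog",
]
train = [(s, noisy_label(s)) for s in corpus if noisy_label(s) is not None]
```

The resulting labels are noisy (keyword matching ignores negation and context), which is exactly why such data is usually combined with methods that are robust to label noise.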

Lecturer:  Dietrich Klakow

Location:  to be announced

Time: block course in the fall break 2020

Corresponding LSF entry.

Application for participation: 

Fully booked.

After the registration deadline, I will set up a doodle for a kick-off meeting and the topic assignment. Dates for the practice talks and the final talks will be set during that kick-off meeting.

HISPOS registration deadline: 3 July 2020

List of Topics:

  1. Transfer Learning
  2. A Survey on Transfer Learning
  3. Meta-Learning in Neural Networks: A Survey
  4. Curriculum Learning by Transfer Learning: Theory and Experiments with Deep Networks
  5. Transfer Learning using Kolmogorov Complexity: Basic Theory and Empirical Evaluations
  6. Transfer Learning via Dimensionality Reduction
  7. A Theory of Transfer Learning with Applications to Active Learning
  8. Deep Learning of Representations for Unsupervised and Transfer Learning
  9. Meta-Transfer Learning for Few-Shot Learning
  10. To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks
  11. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
  12. Understanding and Improving Information Transfer in Multi-Task Learning