Deep learning is the predominant machine learning paradigm in natural language processing (NLP). This approach has not only yielded huge performance improvements across a wide variety of NLP tasks; it has also made it much easier to integrate real-world knowledge and visual information into NLP systems. However, a major drawback of deep learning is its need for massive amounts of training data. Therefore, some of the topics examine methods that can cope with this issue, e.g. by automatically creating noisy training data.
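To give a rough idea of what "automatically creating noisy training data" can look like in practice, here is a minimal, purely illustrative Python sketch of keyword-based weak labeling for sentiment classification. The keyword lists, example sentences, and function names are invented for illustration and are not taken from any of the seminar papers.

```python
# Illustrative sketch of weak labeling: assign noisy sentiment labels
# to unlabeled text using simple keyword heuristics. All rules and
# example data below are made up for demonstration purposes.

POSITIVE = {"great", "excellent", "love", "wonderful"}
NEGATIVE = {"terrible", "awful", "hate", "disappointing"}

def weak_label(sentence):
    """Return a noisy label from keyword matches, or None to abstain."""
    tokens = set(sentence.lower().split())
    pos_hits = len(tokens & POSITIVE)
    neg_hits = len(tokens & NEGATIVE)
    if pos_hits > neg_hits:
        return "positive"
    if neg_hits > pos_hits:
        return "negative"
    return None  # abstain: no signal or conflicting signal

unlabeled = [
    "I love this phone, the camera is great",
    "Terrible battery life, really disappointing",
    "It arrived on Tuesday",  # no keyword matches: dropped
]

# Keep only sentences where the heuristic produces a label; the result
# is a noisy training set that a classifier could then be trained on.
noisy_training_data = []
for text in unlabeled:
    label = weak_label(text)
    if label is not None:
        noisy_training_data.append((text, label))

print(noisy_training_data)
```

Such heuristic labels are inevitably noisy, which is exactly why several of the topics below study methods for learning robustly from imperfect supervision.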
Lecturer: Dietrich Klakow
Location: to be announced
Time: block course in the fall break 2020
Corresponding LSF entry.
Application for participation:
Fully booked.
After the registration deadline, I will set up a Doodle poll for a kick-off meeting and the topic assignment. Dates for the practice talks and the final talks will be set during that kick-off meeting.
HISPOS registration deadline: 3 July 2020
List of Topics:
- Transfer Learning
- A Survey on Transfer Learning
- Meta-Learning in Neural Networks: A Survey
- Curriculum Learning by Transfer Learning: Theory and Experiments with Deep Networks
- Transfer Learning using Kolmogorov Complexity: Basic Theory and Empirical Evaluations
- Transfer Learning via Dimensionality Reduction
- A Theory of Transfer Learning with Applications to Active Learning
- Deep Learning of Representations for Unsupervised and Transfer Learning
- Meta-Transfer Learning for Few-Shot Learning
- To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks
- Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
- Understanding and Improving Information Transfer in Multi-Task Learning