Block Seminar: Machine Learning for Natural Language Processing (Spring 2020)

Deep learning is the predominant machine learning paradigm in natural language processing (NLP). This approach has not only brought large performance improvements across a wide variety of NLP tasks, but has also made it much easier to integrate real-world knowledge and visual information into NLP systems. A major drawback of deep learning, however, is its need for massive amounts of training data. Several of the topics below therefore cover methods that cope with this issue, e.g. by automatically creating noisy training data (see the sketch below).
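
To make "automatically creating noisy training data" concrete, here is a minimal sketch of distant supervision for named-entity recognition: unlabeled sentences are matched against a small gazetteer, and every match is taken as an entity label. The gazetteer entries and the example sentence are illustrative assumptions, not taken from any of the papers below; the resulting label noise is exactly what several of the topics (e.g. 1-4) aim to learn from.

    # Minimal sketch (Python): distant supervision for NER.
    # Tokens found in a hypothetical gazetteer get an entity tag,
    # everything else gets "O". Matches can be wrong, so labels are noisy.
    GAZETTEER = {"paris": "LOC", "berlin": "LOC", "curie": "PER"}

    def noisy_labels(sentence):
        """Assign a noisy tag to each token via gazetteer lookup."""
        return [(tok, GAZETTEER.get(tok.lower(), "O"))
                for tok in sentence.split()]

    # Here "Paris" (the person) is mislabeled as a location:
    print(noisy_labels("Paris Hilton visited Berlin"))
    # [('Paris', 'LOC'), ('Hilton', 'O'), ('visited', 'O'), ('Berlin', 'LOC')]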

Lecturer:  Dietrich Klakow

Location:  to be announced

Time: block course during the spring break of 2020

Application for participation:  closed

Exam date: there is a nominal exam date of December 2nd, which serves only to create a HISPOS registration deadline. Please keep this in mind and register in HISPOS in time.

List of Topics:

  1. Learning to Reweight Examples for Robust Deep Learning by Mengye Ren, Wenyuan Zeng, Bin Yang, and Raquel Urtasun
  2. Learning from Noisy Labels with Distillation by Yuncheng Li, Jianchao Yang, Yale Song, Liangliang Cao, Jiebo Luo, and Li-Jia Li
  3. Learning with Noisy Labels by Nagarajan Natarajan, Inderjit S. Dhillon, Pradeep K. Ravikumar, and Ambuj Tewari
  4. Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels by Zhilu Zhang and Mert Sabuncu
  5. Pre-Learning Environment Representations for Data-Efficient Neural Instruction Following by David Gaddy and Dan Klein
  6. Joint Concept Learning and Semantic Parsing from Natural Language Explanations by Shashank Srivastava, Igor Labutov, and Tom Mitchell
  7. Learning Cooperative Visual Dialog Agents with Deep Reinforcement Learning by Abhishek Das, Satwik Kottur, Jose M. F. Moura, Stefan Lee, and Dhruv Batra
  8. Visual Coreference Resolution in Visual Dialog Using Neural Module Networks by Satwik Kottur, Jose M. F. Moura, Devi Parikh, Dhruv Batra, and Marcus Rohrbach
  9. Active and Semi-Supervised Learning in ASR: Benefits on the Acoustic and Language Models by Thomas Drugman, Janne Pylkkonen, and Reinhard Kneser
  10. Transfer Learning for Speech Recognition on a Budget by Julius Kunze, Louis Kirsch, Ilia Kurenkov, Andreas Krug, Jens Johannsmeier, and Sebastian Stober
  11. Transfer Learning for Named-Entity Recognition with Neural Networks by Ji Young Lee, Franck Dernoncourt, and Peter Szolovits
  12. Semi-Supervised Sequence Modeling with Cross-View Training by Kevin Clark, Minh-Thang Luong, Christopher D. Manning, and Quoc V. Le
  13. A Survey on Transfer Learning by Sinno Jialin Pan and Qiang Yang
  14. Taskonomy: Disentangling Task Transfer Learning by Amir R. Zamir, Alexander Sax, William Shen, Leonidas Guibas, Jitendra Malik, and Silvio Savarese
  15. Deep Transfer Learning with Joint Adaptation Networks by Mingsheng Long, Han Zhu, Jianmin Wang, and Michael I. Jordan
  16. Learning Transferable Features with Deep Adaptation Networks by Mingsheng Long, Yue Cao, Jianmin Wang, and Michael I. Jordan
  17. Optimal Bayesian Transfer Learning by Alireza Karbalayghareh, Xiaoning Qian, and Edward R. Dougherty