Block Seminar Machine Learning for Natural Language Processing and beyond (Spring 2023)

Deep learning is the predominant machine learning paradigm in natural language processing (NLP) and beyond. This approach has not only brought huge performance improvements across a wide variety of natural language processing, computer vision, and other tasks; it also makes it possible to integrate external knowledge sources. The dominance of deep neural networks has also brought different fields closer together, as many approaches and ideas are shared. Even though the seminar has NLP in its name, it will try to take this more general perspective. It will also put a stronger emphasis on theory.

Lecturer:  Dietrich Klakow, JJ Alabi

Location:  to be announced

Time: block course in the spring break 2023; there will be a kick-off meeting in December

Application for participation: CoLi/LST/LCT students, please use our internal registration system; CS, DSAI, and everybody else, please use the CS seminar assignment system. Please do not apply via the LSF; we will not consider applications made there!

Grading:

  • 5% one page outline of talk (pdf, due in December, exact date will follow)
  • 10% draft presentation (pdf, due in January, exact date will follow)
  • 10% practice talk (two weeks before the official talk)
  • 35% final talk (date in the break set by a doodle)
  • 5% contributions to discussion during final talk of fellow participants
  • 35% report (deadline in May, exact date will follow)

HISPOS registration deadline: 4.12.2022. If you miss it, you will not be able to participate.

Tentative List of Topics:

  1. A survey of data augmentation approaches for NLP by Steven Y. Feng et al. 
  1. Research on data augmentation for image classification based on convolution neural networks by Jia Shijie, Wang Ping, Jia Peiyi, Hu Siping 
  1. Text data augmentation for deep learning by Connor Shorten, Taghi M. Khoshgoftaar, and Borko Furht 
  1. Counterfactual Data Augmentation for Mitigating Gender Stereotypes in Languages with Rich Morphology by Ran Zmigrod, Sabrina J. Mielke, Hanna Wallach, Ryan Cotterell 
  1. GenAug: Data Augmentation for Finetuning Text Generators by Steven Y. Feng, Varun Gangal, Dongyeop Kang, Teruko Mitamura, Eduard Hovy 
  1. Substructure Substitution: Structured Data Augmentation for NLP by Haoyue Shi, Karen Livescu, Kevin Gimpel 
  1. SpecAugment: A Simple Data Augmentation Method for Automatic Speech Recognition by Daniel S. Park, William Chan, Yu Zhang, Chung-Cheng Chiu, Barret Zoph, Ekin D. Cubuk, Quoc V. Le 
  1. In-Domain and Out-of-Domain Data Augmentation to Improve Children’s Speaker Verification System in Limited Data Scenario by S. Shahnawazuddin, Waquar Ahmad, Nagaraj Adiga, Avinash Kumar 
  1. Data Augmentation for Spoken Language Understanding via Joint Variational Generation by Kang Min Yoo, Youhyun Shin, and Sang-goo Lee 
  1. Quantifying the Evaluation of Heuristic Methods for Textual Data Augmentation by Omid Kashefi, Rebecca Hwa 
  1. Data Augmentation for Graph Neural Networks by Tong Zhao et al. 
  1. Data augmentation instead of explicit regularization by A. Hernández-García and P. König 
  1. A Kernel Theory of Modern Data Augmentation by Tri Dao, Albert Gu, Alexander J. Ratner, Virginia Smith, Christopher De Sa, Christopher Ré 
  1. Does Data Augmentation Lead to Positive Margin? by Shashank Rajput et al. 
  1. Data augmentation revisited: Rethinking the distribution gap between clean and augmented data by Zhuoxun He et al.  
  1. A group-theoretic framework for data augmentation by Shuxiao Chen, Edgar Dobriban, and Jane H. Lee 
  1. Data Boost: Text Data Augmentation Through Reinforcement Learning Guided Conditional Generation by Ruibo Liu et al.