Language Modeling for Natural Language Processing

Lecturer: Youssef Oualil

Location: Seminarraum (Seminar Room) C7.2

Time: Block Seminar in August/September 2017


Announcements:

Registration: Due to the high number of students who have already registered, registration for this seminar is now closed.

Please note that the number of slots for this seminar is limited. Students who could not be placed on the main list will be moved to the waiting list, in the date/time order of their registrations.


Course Summary: In this seminar, we will review and discuss language modeling with a particular focus on its application to natural language processing tasks.


Seminar Format and Grading: Each participant will be assigned a paper or a book chapter (based on his/her preferences) to study and present to the rest of the group; the presentation accounts for 50% of the grade. The remaining 50% is based on the final report submitted after the seminar.


Seminar Schedule

Day 1:
09:00 - 10:00: Topic 18 (Natalia Skachkova)
10:00 - 11:00: Topic 5   (Badr Abdullah)
11:00 - 12:00: Topic 6   (Muhammad Ali)
12:00 - 13:00: Break (1 hour)
13:00 - 14:00: Topic 1   (Margarita Ryzhova)
14:00 - 15:00: Topic 12 (Maksym Andriushchenko)
15:00 - 16:00: Topic 22 (Sarah McLeod)

Day 2:
09:00 - 10:00: Topic 3  (Nima Nabizadeh) 
10:00 - 11:00: Topic 17 (Maximilian Wolf)
11:00 - 12:00: Topic 8   (Ali Adib Abbas)
12:00 - 13:00: Break (1 hour)
13:00 - 14:00: Topic 13 (Sourav Dey)
14:00 - 15:00: Topic 14 (Harshita Jhavar)
15:00 - 16:00: Topic 7   (Eleni Metheniti)


Seminar Topics: 

1) Sequence to Sequence Learning with Neural Networks (Margarita Ryzhova)

2) Two Discourse Driven Language Models for Semantics

3) Broad Context Language Modeling as Reading Comprehension (Nima Nabizadeh)

4) Hierarchical Recurrent Neural Network for Document Modeling

5) Character-Word LSTM Language Models (Badr Abdullah)

6) Improving cross-domain n-gram language modelling with skipgrams (Muhammad Ali)

7) Class-Based Language Modeling for Translating into Morphologically Rich Languages (Eleni Metheniti)

8) Improving Lexical Embeddings with Semantic Knowledge (Ali Adib Abbas)

9) Combining Relevance Language Modeling and Clarity Measure for Extractive Speech Summarization

10) Larger-Context Language Modelling with Recurrent Neural Network

11) Compact, Efficient and Unlimited Capacity: Language Modeling with Compressed Suffix Trees

12) Learning Longer Memory in Recurrent Neural Networks (Maksym Andriushchenko)

13) Comparing Character-level Neural Language Models Using a Lexical Decision Task (Sourav Dey)

14) Recurrent Conditional Random Field for Language Understanding (Harshita Jhavar)

15) Contextual Bidirectional Long Short-Term Memory Recurrent Neural Network Language Models: A Generative Approach to Sentiment Analysis

16) Script Induction as Language Modeling

17) Cross-Lingual Projection for Class-based Language Models (Maximilian Wolf)

18) Semantic Spaces for Improving Language Modeling (Natalia Skachkova)

19) Cross-Lingual Word Embeddings for Low-Resource Language Modeling

20) Batch normalized recurrent neural networks

21) Derivation of Document Vectors from Adaptation of LSTM Language Model

22) Strategies for Training Large Vocabulary Neural Language Models (Sarah McLeod)

23) Efficient GPU-based Training of Recurrent Neural Network Language Models Using Spliced Sentence Bunch

24) End-To-End Memory Networks

For more information, please send an email to youssef.oualil@lsv.uni-saarland.de. Please include the course tag "[LM-NLP]" in all emails related to this course.