Block Seminar Machine Learning for Natural Language Processing (Spring 2025)

Deep learning is the predominant machine learning paradigm in natural language processing (NLP). This approach has brought huge performance improvements across a wide variety of NLP tasks.

The topic of this edition still needs to be decided.

Lecturer:  Dietrich Klakow

Location: t.b.d.

Time: block course during the spring break 2025; however, preparations start earlier. Here is the specific timeline:

  • Closing of the topic doodle: tbd
  • Kick-off: tbd (doodle)
  • One-page outline: tbd
  • Draft presentation: tbd

Practice talks and final talks will take place during the spring break; time and date will be decided during the kick-off.

Application for participation: for CS, DSAI, VC, ES, …, see the CS seminar system; for CoLI, LST, and LCT, use this registration system.

HISPOS registration deadline: 31.1.25

Grading (tentative):

  • 5% one-page talk outline
  • 10% draft presentation
  • 10% practice talk
  • 35% final talk
  • 5% contributions to the discussion during the final talks of fellow participants
  • 35% report

List of Topics (tentative):

  1. Transformer for Graphs: An Overview from Architecture Perspective
  2. A generalization of transformer networks to graphs
  3. GraphiT: Encoding graph structure in transformers
  4. Do transformers really perform badly for graph representation?
  5. Structure-aware transformer for graph representation learning
  6. Recipe for a general, powerful, scalable graph transformer
  7. Exphormer: Sparse transformers for graphs
  8. Relational attention: Generalizing transformers for graph-structured tasks
  9. Graph-based molecular representation learning
  10. Geometric deep learning on molecular representations
  11. MG-BERT: leveraging unsupervised atomic representation learning for molecular property prediction
  12. TopoFormer: Multiscale Topology-enabled Structure-to-Sequence Transformer for Protein-Ligand Interaction Predictions

Thanks to Anna Karnysheva for providing her paper list.