Instructor: Ivan Titov

Time: Wednesday, 10.15 - 11.45 am

Location: Building C 7.2, room 2.11 (may be changed later)

Office hours: Wednesday, 2 pm - 3 pm in Ivan's office (C7.4, 3.22), or send me a message by e-mail.

**Short Description**

The class will cover machine learning methods for structured prediction problems. The main focus will be on problems from natural language processing, but most of the methods considered have applications in other domains as well (e.g., bioinformatics, vision, information retrieval).

Structured prediction problems are classification problems where the classifier predicts not a binary/multiclass label but an element of some structured space. Examples of structured problems include sequence labeling, segmentation, parsing (syntactic or semantic in NLP, or, e.g., image parsing in vision) and many others. In the class we will cover most of the state-of-the-art methods for this class of problems, ranging from hidden Markov models, the structured perceptron, and conditional random fields to more advanced techniques such as structured SVMs, SEARN, and others.
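As a toy illustration of the flavor of these methods (not part of the course materials): a minimal sketch of a structured perceptron for sequence labeling with Viterbi decoding. The feature set (word/tag and tag/tag indicator features), the tag names, and all identifiers here are made up for the example.

```python
def viterbi(tokens, tags, emit_w, trans_w):
    """Return the highest-scoring tag sequence under linear weights."""
    # best[i][t]: score of the best tag prefix ending in tag t at position i
    best = [{t: emit_w.get((tokens[0], t), 0.0) for t in tags}]
    back = []  # backpointers for recovering the argmax path
    for tok in tokens[1:]:
        scores, ptrs = {}, {}
        for t in tags:
            e = emit_w.get((tok, t), 0.0)
            prev, s = max(((p, best[-1][p] + trans_w.get((p, t), 0.0) + e)
                           for p in tags), key=lambda x: x[1])
            scores[t], ptrs[t] = s, prev
        best.append(scores)
        back.append(ptrs)
    path = [max(tags, key=lambda t: best[-1][t])]
    for ptrs in reversed(back):
        path.append(ptrs[path[-1]])
    return path[::-1]

def train(data, tags, epochs=5):
    """Structured perceptron: decode with the current weights, then
    reward gold features and penalize the features of the wrong prediction."""
    emit_w, trans_w = {}, {}
    for _ in range(epochs):
        for tokens, gold in data:
            pred = viterbi(tokens, tags, emit_w, trans_w)
            if pred == gold:
                continue
            for tok, g, p in zip(tokens, gold, pred):
                emit_w[(tok, g)] = emit_w.get((tok, g), 0.0) + 1.0
                emit_w[(tok, p)] = emit_w.get((tok, p), 0.0) - 1.0
            for bg, bp in zip(zip(gold, gold[1:]), zip(pred, pred[1:])):
                trans_w[bg] = trans_w.get(bg, 0.0) + 1.0
                trans_w[bp] = trans_w.get(bp, 0.0) - 1.0
    return emit_w, trans_w
```

The key point, which recurs throughout the course, is that the "classifier" outputs an entire tag sequence, and learning interacts with a combinatorial search (here, Viterbi) over the structured output space.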

Though most of the papers considered will come from the NLP domain, I do not require any prior exposure to NLP (though it would be a plus). Ideally, you should have some prior experience with machine learning, statistical NLP, or IR. If in doubt, feel free to contact me and ask.

**Requirements**

- Present a paper to the class (a 30 - 45 minute presentation)
- Write 2 critical reviews (surveys) on two selected topics (1 - 2 pages each)
- Write a term paper (12 - 15 pages) **(if you are registered for 4 points, you do not need to write the term paper)**
- Read papers before the talks and participate in the discussions

**Grading**

- Class participation grade: 60 %
  - Your talk and the discussion after the talk
  - Participation in the discussion of other papers
  - 2 reviews (5 % each)
- Term paper grade: 40 %
  - Only if you are registered for 7 points; otherwise class participation constitutes 100 %

You can skip ONE class without giving any explanation (provided it is not the class at which you are presenting). If you need to skip more, you will have to write an additional critical review for every paper presented while you were absent.

**Presentation**

- Present the chosen paper in an accessible way
- Provide sufficient background; do not expect the audience to know much about machine learning or natural language processing beyond the material already covered in the class (according to the surveys, a good number of participants have no ML background)
- Take a critical view of the paper: discuss shortcomings, possible future work, etc.
- To give a good presentation, in most cases you will need to read one or two additional papers (e.g., those referenced in the paper)
- Have a look at the material on giving a good presentation compiled by Alexander Koller
- The language for talks and discussions will be English
- Given the current number of students, we plan to have 35-minute presentations; on some days we may decide to have 2 presentations
- Send me your slides (preferably in PDF) 4 days before the talk, by 6 pm (the first 2 presenters may send me their slides 2 days before the talk)
- If we keep the class on Friday, this means the deadline is Monday, 6 pm
- I will give my feedback 2 days before the seminar (on Wednesday)

**Critical reviews**

- A short critical (!) essay reviewing one of the papers in the list
- One or two paragraphs presenting the essence of the paper
- The remaining parts should highlight both the strengths of the paper (what you liked) and its shortcomings
- You need to submit 2 reviews; there will be up to 3 reviewers for each presentation
- The review should be submitted (by email, in PDF) before the presentation of the paper in class (the exception is the additional reviews for classes you missed: submit such a review within 2 weeks of the corresponding class and before the end of the term)
- No copying and pasting from the paper; it should be entirely in your own words
- Length: 1 - 1.5 pages each

**Term paper**

**Goal:**

- Describe the paper you presented in class
- It should be written in the style of a research paper; the only difference is that most of the work you present is not your own
- It should be written in English
- Include your own ideas, analysis, and comparisons:
  - A comparison of the methods used in the paper with other material presented in the class or any other related work
  - Any ideas for improving the approach
  - Any alternative interpretation or analysis

**Grading criteria:**

- Clarity
- Paper organization
- Technical correctness
- Style (written in a research style, without inappropriate speculation, with correct citations, etc.)
- Your ideas are meaningful and interesting

**Length:** 12 - 15 pages

**Deadline:** to be announced. I would recommend submitting it soon after your presentation, while the material is still fresh.

**Submission:** in PDF, to my email

**Note:** References to papers, dates, and speakers are provided in the Google Doc (a link was sent to attendees)

- Introduction into structured prediction: problems, settings, etc (given by Ivan)
- Hidden Markov models (Ivan)
- Structured perceptron (Ivan)
- Local models: Maximum entropy Markov models
- Conditional random fields (sequence labeling / segmentation settings)
- SVMs: binary, multilabel, and structured settings (SVM-Struct)
- Maximum margin Markov networks (M3Ns)
- Combining learning and search: SEARN and predecessors
- Parsing: weighted context-free grammars (CFGs): generative vs discriminative training
- Parsing: transition-based vs global models (in dependency parsing context)
- Parsing: CFGs with latent annotation
- Learning with latent representations of the context
- Semi-supervised methods for structured prediction

- Introduction to structured prediction [Speaker: Ivan] [powerpoint] [pdf, no animations]
- Sequence Modeling: Hidden Markov Models vs Structured Perceptron [Speaker: Ivan] [powerpoint] [pdf, no animations] Additional materials: theory for voted perceptron , structured perceptron.
- Maximum likelihood estimation vs discriminative models for WCFGs [Speaker: Lea] [pdf] Additional materials: note that we compared estimation methods, not model classes, as normalization does not affect the class of modeled distributions [see details here]
- Maximum-Entropy Markov Models vs Conditional Random Fields [MEMMs by Florian] [CRFs by Angeliki]
- Dependency parsing: Transition-based (history-based) models vs global models [Nivre's parser by Kang] [MST by Maria]
- Max-margin methods: SVM Struct vs M3Ns [SVM Struct by Qinqing] [M3Ns by Wenbin]
- Inference with Integer Linear Programming, applications: SRL and dependency parsing [SRL by Fang] [Dependency parsing by Todd]
- Latent variables: PCFGs with latent annotation [Prashanth's slides]
- Latent features vs Deterministic Features: Incremental Sigmoid Belief Networks [Ivan's slides]
- Incremental Perceptron vs SEARN [Inc Perceptron by Xiaojun] [SEARN by Olga]
- Semi-Supervised Learning for Parsing: vocabulary clustering vs. self-training [Dep Parsing with Clustering by Luciano] [Self-training by Dominikus]
- Ordinal regression + Multi-aspect Ranking [Besnik's slides]
- Brief conclusions [powerpoint]