Seminar: Efficient Inference and Large-Scale Machine Learning
- The session on Monday 26.06. will start at 12:00. All other sessions will take place according to the regular schedule.
- Slides with organizational updates can be found here.
As both the amount and the complexity of available data grow, machine learning practitioners seek increasingly sophisticated patterns and interactions in it. Bayesian statistics provides a powerful framework for modeling and learning these dependencies, and Bayesian inference has found numerous applications in domains such as computer vision, natural language processing, and data science. Nevertheless, new tools are constantly needed to perform inference in models of ever-growing complexity, and scalable algorithms are becoming essential for handling datasets of massive scale.
The main purpose of the seminar is to acquaint students with recent advances in probabilistic machine learning research. The two core topics are efficient (approximate) inference for Bayesian modeling and large-scale optimization.
To be better prepared for the seminar contents, it might be a good idea to brush up your knowledge of the following topics:
- Murphy*: 2.2, 3, 5 (optional)
- Any statistics textbook: chapter on Bayes' theorem
- Probabilistic Graphical Models — Murphy*: 10.1–10.3, 19.1–19.4
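As a one-line refresher on the Bayes' theorem entry above (notation chosen here for illustration: parameters θ, observed data D):

```latex
% Posterior = likelihood x prior / evidence
p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})},
\qquad
p(\mathcal{D}) = \int p(\mathcal{D} \mid \theta)\, p(\theta)\, \mathrm{d}\theta
```

The intractability of the evidence term p(D) for complex models is precisely what motivates the approximate inference techniques covered in the seminar.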
| Date  | Topic | Student | Supervisor | References | Reviewer 1 | Reviewer 2 |
|-------|-------|---------|------------|------------|------------|------------|
| 08.05 | Variational Inference: Foundations | Ivan | Oleksandr | | Ukrit | Can |
| 15.05 | Variational Inference: Scaling Up | Jakob | Aleksandar | | Ukrit | Yu |
| 22.05 | Variational Inference: Beyond Mean Field | Jan | Oleksandr | | Jakob | Can |
| 29.05 | Message Passing and Expectation Propagation | Christoph | Oleksandr | Bishop*: 8.4, 10.7 | | |
| | Sampling: Advanced Techniques | | | Bishop*: 11.2–11.4; Murphy*: 24.1–24.3 | | |
| 03.07 | Natural Gradients (Cancelled) | Haris | Aleksandar | | ------ | ------ |
| 24.07 | Large-Scale Learning Systems | Ukrit | Amir | | Ivan | Mesut |
*Bishop = Pattern Recognition and Machine Learning
*Murphy = Machine Learning: A Probabilistic Perspective
Both books are available in the TUM library.
- 12 Participants
- 5 ECTS
- Language: English
- Weekly meetings every Monday 12:30-14:00, room 00.08.055
- Mandatory attendance of the weekly sessions
- Please send your questions regarding the seminar to firstname.lastname@example.org.
- The seminar is designed for Master students of the Computer Science department.
- This seminar deals with advanced, cutting-edge topics in machine learning and data mining research. Students are therefore expected to have a solid background in these areas (e.g., having attended at least one related lecture, such as "Mining Massive Datasets" or "Machine Learning").
- Extended abstract: 1 page (LaTeX article document class) with motivation, key concepts, and results.
- Paper: 5-8 pages in ACM format.
- Presentation: 30-minute talk + 15-minute discussion. (Optional: Beamer template)
- Peer-review process.
- 27.01.2017 16:00: Pre-course meeting in Interims Hörsaal 2. Slides can be found here.
- 03.02.17 - 08.02.17: Application and registration in the matching system of the department.
- After 15.02.17: Notification of participants.
- 01.03.2017 11:30: Kick-off meeting in room 02.09.014. Slides can be found here.
- Starting 24.04: Weekly meetings every Monday 12:30-14:00, room 00.08.055.
- 1 week before the talk: submission of the extended abstract and slides
- 1 day before the talk: submission of a preliminary paper for review
- 1 week after the talk: comments received from reviewers
- 2 weeks after the talk: submission of the final paper