Seminar: Efficient Inference and Large-Scale Machine Learning

News

  • The session on Monday 26.06. will start at 12:00. All other sessions will take place according to the regular schedule.
  • Slides with organizational updates can be found here.

Description

As both the amount and complexity of available data grow, machine learning practitioners are interested in finding increasingly sophisticated patterns and interactions in it. The framework of Bayesian statistics provides a powerful tool for modeling and learning these dependencies. Bayesian inference has found numerous applications in domains such as computer vision, natural language processing, and data science. Nevertheless, new tools are constantly needed to perform inference in models of ever-growing complexity. Moreover, scalable algorithms are becoming essential for handling datasets of massive scale.

The main purpose of the seminar is to acquaint students with recent advances in probabilistic machine learning research. The two core topics are efficient (approximate) inference for Bayesian modeling and large-scale optimization.
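
To make the inference problem concrete (a standard identity, not specific to this seminar): for model parameters \theta and observed data \mathcal{D}, Bayes' theorem gives the posterior

    p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})}, \qquad p(\mathcal{D}) = \int p(\mathcal{D} \mid \theta)\, p(\theta)\, d\theta.

The evidence p(\mathcal{D}) is intractable for most models of practical interest, which is what makes approximate-inference techniques such as variational inference, expectation propagation, and sampling necessary.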

Background

It might be a good idea to brush up on the following topics in order to be better prepared for the seminar:

  • Bayesian Inference
    Reading: Murphy*: 2.2, 3, 5 (optional); any statistics textbook, chapter on Bayes' theorem
    Lectures: Lecture 1, Lecture 2, Lecture 3
  • Probabilistic Graphical Models
    Reading: Bishop*: 8.1-8.3; Murphy*: 10.1-10.3, 19.1-19.4
    Lectures: Lecture 1 (start at 34:30), Lecture 2, Lecture 3

Topics

  • 24.04: Tensor Factorization
    Student: Stephan; Supervisor: Oleksandr
    References: Tensor Decompositions and Applications; Tensor Decomposition for Signal Processing and Machine Learning
    Reviewers: Yu, Haris

  • 08.05: Variational Inference: Foundations
    Student: Ivan; Supervisor: Oleksandr
    References: Variational Inference: A Review for Statisticians; Stochastic Variational Inference; Lecture Notes
    Reviewers: Ukrit, Can

  • 15.05: Variational Inference: Scaling Up
    Student: Jakob; Supervisor: Aleksandar
    References: Black Box Variational Inference; Doubly Stochastic Variational Bayes for non-Conjugate Inference; Tutorial on Variational Autoencoders
    Reviewers: Ukrit, Yu

  • 22.05: Variational Inference: Beyond Mean Field
    Student: Jan; Supervisor: Oleksandr
    References: Normalizing flows; Hierarchical Variational Models
    Reviewers: Jakob, Can

  • 29.05: Message Passing and Expectation Propagation
    Student: Christoph; Supervisor: Oleksandr
    References: Murphy*: 20; Bishop*: 8.4, 10.7; Expectation Propagation for approximate Bayesian inference
    Reviewers: Deniz, Raymond

  • 12.06: Sampling: Foundations
    Student: Mesut; Supervisor: Oleksandr
    References: Bishop*: 11.2-11.4; Murphy*: 24.1-24.3; Slice sampling
    Reviewers: Christoph, Jan

  • 19.06: Sampling: Advanced Techniques
    Student: Raymond; Supervisor: Aleksandar
    References: Bishop*: 11.5; The No-U-Turn Sampler
    Reviewers: Deniz, Mesut

  • 26.06: Particle Filters
    Student: Can; Supervisor: Aleksandar
    References: Murphy*: 23.5; A Tutorial on Particle Filtering and Smoothing
    Reviewers: Stephan, Christoph

  • 03.07: Natural Gradients (Cancelled)
    Student: Haris; Supervisor: Aleksandar
    References: New insights and perspectives on the natural gradient method; Revisiting natural gradient for deep networks
    Reviewers: none (session cancelled)

  • 10.07: Bayesian Optimization
    Student: Yu; Supervisor: Aleksandar
    References: A Tutorial on Bayesian Optimization; Practical Bayesian Optimization of ML Algorithms
    Reviewers: Stephan, Jan, Ivan

  • 17.07: Probabilistic Numerics
    Student: Deniz; Supervisor: Oleksandr
    References: Probabilistic numerics and uncertainty in computations; Fast Probabilistic Optimization from Noisy Gradients
    Reviewers: Jakob, Raymond

  • 24.07: Large-Scale Learning Systems
    Student: Ukrit; Supervisor: Amir
    References: MXNet; TensorFlow
    Reviewers: Ivan, Mesut

*Bishop = Pattern Recognition and Machine Learning

*Murphy = Machine Learning: A Probabilistic Perspective

Both books are available in the TUM library.

Organizational Details

  • 12 Participants
  • 5 ECTS
  • Language: English
  • Weekly meetings every Monday 12:30-14:00, room 00.08.055
  • Mandatory attendance of the weekly sessions
  • Please send your questions regarding the seminar to kdd-seminar-inference@in.tum.de.

Prerequisites

  • The seminar is designed for Master students of the Computer Science department.
  • This seminar deals with advanced, cutting-edge topics in machine learning and data mining research. Students are therefore expected to have a solid background in these areas (e.g. having attended at least one related lecture, such as "Mining Massive Datasets" or "Machine Learning").

Requirements

  • Extended abstract: 1 page (article document class) covering motivation, key concepts, and results.
  • Paper: 5-8 pages in ACM format.
  • Presentation: 30-minute talk + 15-minute discussion. (Optional: Beamer template)
  • Peer-review process.

Dates

  • 27.01.2017 16:00: Pre-course meeting in Interims Hörsaal 2. Slides can be found here.
  • 03.02.2017 - 08.02.2017: Application and registration in the matching system of the department.
  • After 15.02.2017: Notification of participants.
  • 01.03.2017 11:30: Kick-off meeting in room 02.09.014. Slides can be found here.
  • Starting 24.04: Weekly meetings every Monday 12:30-14:00, room 00.08.055.

Deadlines

  • 1 week before the talk: submission of the extended abstract and slides
  • 1 day before the talk: submission of a preliminary paper for review
  • 1 week after the talk: reviewers' comments returned
  • 2 weeks after the talk: submission of the final paper