Automatic Control

Faculty of Engineering, LTH

FRTN50 - Optimization for Learning

Optimization for Machine Learning, 7.5 credits

General Information

Elective for: D5-mai, E4, F5, F5-r, F5-mai, I4, M4, Pi5-ssr
The course will be given in English

Aim

Learning from data is becoming increasingly important in many different engineering fields. Models for learning often rely heavily on optimization; training a machine is often equivalent to solving a specific optimization problem. These problems are typically large-scale. In this course, we will learn how to solve such problems efficiently. The large-scale nature of the problems renders traditional methods inapplicable. We will provide a unified view of algorithms for large-scale convex optimization and treat algorithms for the nonconvex problem of training deep neural networks.
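To make the equivalence concrete, training can often be written as a regularized empirical risk minimization problem. The formulation below is a generic illustration; the symbols are chosen here and not taken from the course material:

  \min_{w} \; \frac{1}{N} \sum_{i=1}^{N} \ell\big(h_w(x_i), y_i\big) + \lambda \, g(w)

Here h_w is the model with parameters w, \ell is a loss measuring the fit to the data points (x_i, y_i), g is a regularizer with weight \lambda, and the number of data points N is what makes the problem large-scale.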

Learning outcomes

Knowledge and understanding
For a passing grade the student must

  • know basic convex analysis
  • understand the connection between machine learning and optimization
  • have an understanding of the role of regularization in learning from an optimization point of view
  • understand a unifying framework for large-scale convex optimization
  • understand concepts such as nonexpansiveness and averagedness, their relation to monotone operators, and their role in the convergence of algorithms
  • understand how to derive specific algorithms from a few general ones
  • understand methods for avoiding numerical issues in deep neural network training.

Competences and skills
For a passing grade the student must

  • be able to describe optimality conditions that are useful for large-scale methods
  • be able to describe the building blocks that are the foundations of large-scale optimization algorithms and why they are used
  • be able to analyze performance of optimization algorithms
  • be able to solve optimization problems numerically using software and own implementations
  • be able to present results in writing.

Judgement and approach
For a passing grade the student must

  • understand which algorithm should be used for different machine learning training problems
  • be able to participate in the team-work needed to solve the hand-in assignments.

Contents

The course has lectures, exercises, and four hand-in assignments.

The lectures will cover: convexity, models for learning, a unified view of convex optimization algorithms, fixed-point iterations, monotone operators, nonexpansive mappings, stochastic methods, variance-reduced methods, block-coordinate methods, and nonconvex stochastic gradient descent with variations for deep learning training.
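As a small illustration of the fixed-point iteration viewpoint, the sketch below implements the proximal gradient method on a lasso problem. The problem data, step-size choice, and function names are assumptions made for this example and are not taken from the course material.

  import numpy as np

  def proximal_gradient(grad_f, prox_g, x0, step, iters=500):
      # Fixed-point iteration: x <- prox_g(x - step * grad_f(x), step)
      x = x0
      for _ in range(iters):
          x = prox_g(x - step * grad_f(x), step)
      return x

  # Illustrative lasso problem: minimize 0.5*||A x - b||^2 + lam*||x||_1
  rng = np.random.default_rng(0)
  A = rng.standard_normal((100, 20))
  b = rng.standard_normal(100)
  lam = 0.1

  grad_f = lambda x: A.T @ (A @ x - b)                                   # gradient of the smooth part
  prox_g = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - lam * t, 0)  # prox of t*lam*||.||_1 (soft thresholding)
  step = 1.0 / np.linalg.norm(A, 2) ** 2                                 # 1/L, with L a Lipschitz constant of grad_f

  x_hat = proximal_gradient(grad_f, prox_g, np.zeros(20), step)

The same template recovers other methods by changing the nonsmooth term, for example projected gradient when g is the indicator function of a constraint set.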

Examination details

Grading scale: TH - (U,3,4,5) - (Fail, Three, Four, Five)
Assessment: Written exam (5 hours), 4 hand-in exercises. In case of fewer than 5 registered students, the exam may be given in oral form.

Admission

Recommended prior knowledge: FMAN60 Optimization
The course might be cancelled if the number of applicants is fewer than 12.

Reading list

  • Lecture slides and notes.

Contact and other information

Course coordinator: Pontus Giselsson, pontusg@control.lth.se
