ECE228 and SIO209 Machine learning for physical applications, Spring 2019
Professor Peter Gerstoft, Spiess Hall 462, Gerstoft@ucsd.edu
TA Siva Prasad Varma Chiluvuri, sivapvarma@gmail.com
TA Harshuk Gupta, h6gupta@eng.ucsd.edu
TA Ruixian Liu, rul188@eng.ucsd.edu
Location: SOLIS 107
Time: Monday and Wednesday 5-6:20pm
Syllabus: Machine learning has received enormous interest. To learn from data we use probability theory, which has been the mainstay of statistics and engineering for centuries. The class will focus on implementations for physical problems. Topics: Gaussian probabilities, linear models for regression, linear models for classification, neural networks, kernel methods, support vector machines, graphical models, mixture models, sampling methods, sequential estimation. Prerequisites: graduate standing.
It is not a computer science class, so we go slowly through the fundamentals to appreciate the methods and implement them. We will discuss their use in the physical sciences. In the first part of the class we focus on theory and implementations; we will then transition to the machine learning final projects.
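To give a flavor of the implementations the class works through, here is a minimal sketch (not course material) of Bayesian linear regression in the style of Bishop Ch. 3: a polynomial basis, a Gaussian prior on the weights, and Gaussian noise, with the posterior mean giving the fit. The data, basis degree, and precision values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: noisy samples of sin(2*pi*x) on [0, 1]
x = rng.uniform(0, 1, 25)
t = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)

# Polynomial design matrix, columns x^0 .. x^5
Phi = np.vander(x, 6, increasing=True)

alpha, beta = 1e-3, 25.0  # prior precision, noise precision (illustrative)

# Posterior mean of the weights (Bishop eqs. 3.53-3.54)
A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
m_N = beta * np.linalg.solve(A, Phi.T @ t)

# Predictive mean on a test grid
x_test = np.linspace(0, 1, 100)
y_pred = np.vander(x_test, 6, increasing=True) @ m_N
```

A few lines of linear algebra like this, rather than a library call, is the level of implementation the homework aims at.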
Books (all available online):
Main book: Chris M. Bishop, Pattern Recognition and Machine Learning. A third-party Matlab implementation of many of the algorithms in the book is available.
Other good books:
Hastie, Tibshirani, and Friedman, The Elements of Statistical Learning.
Kevin P. Murphy, Machine Learning: A Probabilistic Perspective (UCSD license). Matlab codes used in Murphy's book are available.
Online resources: While not required, I recommend taking these classes; both online classes are excellent.
Statistical Learning by Hastie and Tibshirani. My favorite class.
Andrew Ng's Coursera class, Machine Learning. This was the first class offered by Coursera.
Grading: Full scale of the letter grade. The grade consists of about 50% homework and 50% final project (10% poster, 10% code, and 30% report). Your purpose and mine is to learn, so a good effort is sufficient.
Homework: Cody homework will be graded. I'm a strong believer in learning by doing, so we will have computer-based homework each week.
Final project: We will form teams on ~April 26. Propose a topic before ~May 5. Preliminary presentation May 20, final poster June 5. Report due Saturday, June 15.
Suggested topics (see last years' projects from Spring 2018 and Spring 2017):
Schedule (15 lectures, two presentation classes)
Lecture  Date  Topic  Required reading  Assignments
1  April 1  Introduction, probability theory [Slides, Annotated Slides]  Bishop CH 1.0-1.2
2  April 3  Gaussian probability theory [Slides, Annotated Slides]; Question to lecture 2 in Stanford CNN lecture  Bishop CH 1.5, 1.6, 2.3
3  April 8  Gaussian probability theory, Gaussian processes [Slides, Annotated Slides]; Question to lectures 3-4 in Stanford CNN lecture  Bishop CH 2.3, 6.4; Murphy CH 15  Homework 1 due 4/22
4  April 10  Gaussian processes, linear models for regression; Diego Nozal: Using Gaussian Processes for outdoor concert sound fields [Slides, Annotated Slides]; Question to lecture 5 in Stanford CNN lecture  Bishop CH 3.0-3.1  Homework 2 due 4/22
5  April 15  Linear models for regression [Slides, Annotated Slides]; Question to lecture 6 in Stanford CNN lecture  Bishop CH 3.2-3.3
6  April 17  Emma Ozanich: ML on NOAA climate data using Jupyter and GPU [Slides, Annotated Slides]; Question to lecture 7 in Stanford CNN lecture  GPU Jupyter homework released, due 5/10
7  April 22  Linear models for classification, backpropagation [Slides, Annotated Slides]; Question to lecture 8 in Stanford CNN lecture  Bishop CH 4.0-4.3.2, 4.3.4, 5.0-5.2  Homework 3 (NN) due 5/5; subset of MNIST data to use for HW3 on your laptop
8  April 24  Kernel methods [Slides, Annotated Slides]; Question to lecture 9 in Stanford CNN lecture  Bishop CH
9  April 29  Christopher Johnson: Machine learning with CNN in seismology [Slides, Annotated Slides]; Question to lecture 10 in Stanford CNN lecture  Bishop CH 6
10  May 1  Support vector machines [Slides, Annotated Slides]; Question to lecture 11 in Stanford CNN lecture  Bishop CH 7
11  May 6  K-means, EM, and mixture models [Slides, Annotated Slides]; Question to lecture 12 in Stanford CNN lecture  Bishop CH 9  Project proposals due; Homework 5 (SVM + Dictionary Learning) due 5/29
12  May 8  Mike Bianco: Dictionary learning; Question to lecture 13 in Stanford CNN lecture  Read Bianco's dictionary learning paper
-  May 13  No class; work on final project.
13  May 15  TA Siva Varma: Jupyter homework and leftover material on dictionary learning
14  May 20  Sequential estimation [Slides, Annotated Slides]; Question to lectures 13 and 14 in Stanford CNN lecture  Bishop CH 13
1  May 22  Mid-project presentation of final projects
15  May 29  Sequential estimation and sparse models; Question to lecture 15 in Stanford CNN lecture
16  June 3  Sparse models; Question to lecture 16 in Stanford CNN lecture  Read Chapter 2 for an introduction to sparse problems; Murphy 13.1, 13.3, 13.6.1  Homework 6 (Kalman filter and sparse signal recovery) due 6/10

2  June 5  Final project poster presentation, Atkinson Hall and Lobby, 5-8 pm  Final report due Saturday, June 15.