STAT 451: Tentative Schedule
(Syllabus)
Day #: Date | Subject | Homework Due (11:59 p.m.) |
1: 1/24/23 | Help with Anaconda installation; 01 Introduction (6): course overview; SVM; 01 Python as a Calculator (1) (Note: Python links are green) | Read introductory email; Q00: background survey (extended to Sa 2/4 to accommodate late-add students) |
2: 1/26 | Corrected a typo in the schedule (below) and syllabus: the midterm is "Th 3/23", not "Tu 3/23"; please remind me to record and to stay by the podium; 02 Jupyter Notebook (4) (JupyterExample.ipynb/.html); discuss survey; 01 Introduction, continued: SVM | Q01: calculator (extended to 2/4) (login help) |
3: 1/31 | (01separatingHyperplane.ipynb/.html); [02 Notation and Definitions: optional reading]; 03 Sequences (2) (stringsDemo.ipynb, TuplesListsDemo.ipynb) | Q02: Jupyter (extended to 2/4) |
4: 2/2 | 03 Fundamental Algorithms, Part 1: linear regression | Q03: sequences (extended to 2/4) |
5: 2/7 | 04 NumPy (2) (04_numpy1demo.ipynb) | |
6: 2/9 | 03 Fundamental Algorithms, Part 2: logistic regression (4) (03logisticRegression.ipynb/.html) | HW01 2/10: SVM, linear regression |
7: 2/14 | 05 pandas (2) (05_pandasDemo.ipynb) | Q04: NumPy |
8: 2/16 | 03 Fundamental Algorithms, Part 3: decision tree (4) (03decisionTree.ipynb/.html) | Q05: pandas |
9: 2/21 | 06 matplotlib (3) (06_matplotlibDemo.ipynb); continue decision tree | |
10: 2/23 | 03 Fundamental Algorithms, Part 4: more on SVM (4) (03SVM.ipynb/.html) | HW02 2/24: logistic regression, decision tree |
11: 2/28 | 03 Fundamental Algorithms, Part 5: k-NN (2) (03kNN.ipynb/.html) | Q06: matplotlib |
12: 3/2 | 07 write functions (2); 04 Anatomy of a Learning Algorithm: 04.pdf (4): gradient descent, scikit-learn (04gradientDescent.ipynb/.html) | |
13: 3/7 | 05 Basic Practice, Part 1: feature engineering (4) (05featureEngineering.ipynb/.html) | Q07: functions |
14: 3/9 | Discuss exam rules; conditional expressions (2); 05 Basic Practice, Part 2: algorithm / data split / model fit / regularize (4) (05modelFitRegularize.ipynb/.html) | HW03 3/10: more SVM, kNN, gradient descent, SGD, feature engineering |
[3/14, 3/16] | [spring break] | |
15: 3/21 | Introduce project; (optional) Q&A review | Q08 |
16: 3/23 | Midterm exam | Midterm exam Th 3/23 in class |
17: 3/28 | 05 Basic Practice, Part 2, continued | Project Tu 3/28: form a group |
18: 3/30 | 05 Basic Practice, Part 3: assessment / hyperparameter tuning / cross-validation (5) (05assessmentTuningCV.ipynb/.html) | HW04 3/31: feature engineering, data split, model fit and regularization |
19: 4/4 | 07 Problems and Solutions, Part 1: kernel regression (2) (07kernelRegression.ipynb/.html) | |
20: 4/6 | 07 Problems and Solutions, Part 2: multiclass, one-class, and multilabel classification (4) | |
21: 4/11 | 07 Problems and Solutions, Part 3: ensemble learning (3) (07ensemble.ipynb/.html) | Project Tu 4/11: proposal |
22: 4/13 | Project proposal feedback: meet in class with instructor and/or TA | HW05 4/14: algorithm selection, multiclass classification, assessment, tuning, ensemble learning, imbalance |
23: 4/18 | 08 Advanced Practice: 08.pdf (5): imbalance, combining/stacking, efficiency, multicore (08stackingTiming.ipynb/.html) | |
24: 4/20 | Assign presentation order; 09 Unsupervised Learning: 09.pdf (4) (09densityEstimation.ipynb/.html, 09clustering.ipynb/.html, 09PCA.ipynb/.html) | |
25: 4/25 | Project help | Project We 4/26: slides |
26: 4/27 | Project: first half of presentations | Project Th 4/27: presentations and peer feedback |
27: 5/2 | Project: second half of presentations | Project Tu 5/2: presentations and peer feedback; Project We 5/3: report; Project Fr 5/5: report peer feedback |
28: 5/4 | (optional) Q&A review | |
Exam week | Final exam | Final exam 2:45-4:45 Tu 5/9 |
Note: