(Syllabus)
Day #: Date | Subject | Homework Due (11:59 p.m.) |
1: Th 9/5/24 |
Help with Anaconda installation; 01 Introduction (6): course overview; SVM; 01 Python as a Calculator (1) |
Read introductory email; Q00: background survey (extended to Th 9/17 to accommodate the Fr 9/13 add-class deadline) |
2: Tu 9/10 |
02 Jupyter Lab (4) (JupyterExample.ipynb/.html); 01 Introduction, continued: SVM |
Q01: calculator (extended to 9/17) (login help) |
3: Th 9/12 |
(01separatingHyperplane.ipynb/.html) [02 Notation and Definitions: optional reading] 03 Sequences (2) (stringsDemo.ipynb, TuplesListsDemo.ipynb) |
Q02: Jupyter (extended to 9/17) |
4: Tu 9/17 |
03 Fundamental Algorithms, Part 1: linear regression (4) (03linearRegression.ipynb/.html); discuss HW01 |
|
5: Th 9/19 |
(04_numpy was here, but I am a little behind my planned schedule) |
Q03: sequences |
6: Tu 9/24 |
04_NumPy (2) (04_numpy.ipynb/.html); 03 Fundamental Algorithms, Part 2: logistic regression (4) (03logisticRegression.ipynb/.html) |
HW01: SVM, linear regression |
7: Th 9/26 |
05 pandas (2) (05_pandas.ipynb/.html) |
Q04: NumPy |
8: Tu 10/1 |
Continue 05_pandas from value_counts() (8:00), Create(?) (11:00); 03 Fundamental Algorithms, Part 3: decision tree (4) (03decisionTree.ipynb/.html) |
(Q05 pandas was here) |
9: Th 10/3 |
Mention new links in "Midterm exam" line below; 06 matplotlib (3) (06_matplotlib.ipynb/.html); continue decision tree |
Q05: pandas |
10: Tu 10/8 |
03 Fundamental Algorithms, Part 4: more on SVM (4) (03SVM.ipynb/.html) |
HW02: logistic regression, decision tree |
11: Th 10/10 |
03 Fundamental Algorithms, Part 5: k-NN (2) (03kNN.ipynb/.html); 07 write functions (2) |
Q06: matplotlib |
12: Tu 10/15 |
04 Anatomy of a Learning Algorithm (4): gradient descent, scikit-learn (04gradientDescent.ipynb/.html) |
|
13: Th 10/17 |
05 Basic Practice, Part 1: feature engineering (6) (05featureEngineering.ipynb/.html) |
Q07: functions |
14: Tu 10/22 |
Discuss exam rules; 08 conditional expressions (2); 05 Basic Practice, Part 2: algorithm / data split / model fit / regularize (4) (05modelFitRegularize.ipynb/.html) |
HW03: more SVM, k-NN, gradient descent, feature engineering |
15: Th 10/24 |
(optional) Q&A review |
Q08: conditional expressions |
16: Tu 10/29 |
Midterm exam in class (rules, spring2022/key, fall2022/key, spring2023/key, fall2023/key, spring2024/key) |
Midterm exam |
17: Th 10/31 |
05 Basic Practice, Part 3: assessment / hyperparameter tuning / cross-validation (5) (05assessmentTuningCV.ipynb/.html) |
Project: form a group |
18: Tu 11/5 |
05 Basic Practice, Part 3, continued: tuning & CV; 07 Problems and Solutions, Part 1: kernel regression (1) (07kernelRegression.ipynb/.html) |
|
19: Th 11/7 |
(FYI: Undergraduate Research Quick Talks); 07 Problems and Solutions, Part 2: multiclass, one-class, and multilabel classification (4) |
HW04: feature engineering, data split, model fit and regularization |
20: Tu 11/12 |
07 Problems and Solutions, Part 3: ensemble learning (3) (07ensemble.ipynb/.html) |
Project: proposal |
21: Th 11/14 |
08 Advanced Practice: 08.pdf (5): imbalance, combining/stacking, efficiency, multicore (08imbalance_stacking_timing_multicore.ipynb/.html) |
|
22: Tu 11/19 |
Project proposal feedback: meet in class with teacher and/or TA |
Project proposal feedback meeting; HW05: algorithm selection, multiclass classification, assessment, tuning, ensemble learning, imbalance |
23: Th 11/21 |
09 Unsupervised Learning: 09.pdf (7)
(09densityEstimation.ipynb/.html, 09clustering.ipynb/.html, 09PCA.ipynb/.html) |
|
24: Tu 11/26 |
Presentation schedule; 09 Unsupervised Learning, continued: k-means code, DBSCAN, PCA; project help |
Mo 12/2: Project: turn in slides |
[Th 11/28] |
[no class--Thanksgiving] |
|
25: Tu 12/3 |
Project: first 1/2 of presentations |
Project: peer feedback on first 1/2 |
26: Th 12/5 |
Project: second 1/2 of presentations |
Project: peer feedback on second 1/2 |
27: Tu 12/10 |
(optional) Q&A review |
Project: report |
Tu 12/17 | Final exam: 7:45-9:45 a.m. |
Final exam |
Note: