Regarding the Readings

Below I’ve outlined the complete reading list for PPOL564, broken into “Required” and “Additional Resources and Suggested Materials” for each week of class. Some weeks have many different readings (e.g., Week 7), while others have fewer but more involved readings (e.g., Week 12). The readings for this course are meant to support the concepts and materials covered during lecture. I encourage students to first lightly skim the readings before consuming the asynchronous lecture material, and then to read (or heavily skim) them before the synchronous lecture. This will help you better absorb the main points: it’s always easier to read when you have a sense of where you’re going.

For most weeks, the readings are freely available online. If any of the links break, please let the professor and/or TA know as soon as possible.




Week 1: Choosing your Poison
Introductions, Installations, and IDEs




Week 2: Time Travel and Other Necessities
Version Control, Workflow, and Reproducibility




Week 3: Learning Parseltongue
Object-Oriented Programming in Python




Week 4: On Time and Space
Introduction to Algorithms




Week 5: Long Live the Data Frame
From Nested Lists to Data Frames



Week 6: Modern Snake Charming
Approaches to Data Manipulation in Python




Week 7: Interrogation Techniques
Data Visualization and Exploration




Week 8: Automated Heists
Drawing from (Un-)Structured Data Sources




Week 9: The Signal and the Noise
Introduction to Statistical Learning


  • Required Readings:
    • What is statistical learning? - James et al. Ch. 2
    • Resampling Methods - James et al. Ch. 5



Week 10: Casting Shadows in \(N\)-Dimensions
Continuous Outcomes and Linear Regression



  • Additional Resources and Suggested Materials


Week 11: Hot Dog, Not Hot Dog
Probability, Bayes’ Theorem, and Classification



  • Additional Resources and Suggested Materials
    • Scikit-learn documentation
      • Naive Bayes - scikit-learn.org
      • On using a linear model for classification


Week 12: Trees and Neighbors
Algorithmic Approaches to Supervised Learning


  • Required Readings:
    • K-Nearest Neighbors - James et al. Ch. 2.2.3 & 3.5
    • Tree-Based Methods - James et al. Ch. 8.1 - 8.2



Week 13: Peeking inside the Black Box
Interpretable Machine Learning