2️⃣ Week 2: Model Training and Evaluation (August 11 - August 25)

Introduction

Welcome to Week 2 of our AI/ML study group!

This week, we're building a solid base by reviewing the Core Concepts of AI/ML.

We'll cover the essential concepts that power how models learn, including Regression and Classification (making predictions), Loss Functions (measuring model error), and the fundamentals of Optimization Algorithms (how models improve over time). These topics form the bedrock of understanding how we train machine learning models.

Don't worry if it seems like a lot; the focus is on making progress and learning together.

2.1 Regression and Classification

2.1.1 Logistic Regression, Part 01: Classification | ML-005 Lecture 6 | Stanford University | Andrew Ng

https://www.youtube.com/watch?v=4u81xU7BIOc

2.1.2 ML Lecture 1: Regression - Case Study | Prof. Hung-Yi Lee (optional, in Chinese)

https://www.youtube.com/watch?v=fegAeph9UaA

2.1.3 ML Lecture 4: Classification | Prof. Hung-Yi Lee (optional, in Chinese)

https://www.youtube.com/watch?v=fZAZUYEeIMg&list=PLJV_el3uVTsPy9oCRY30oBPNLCo89yu49&index=9&pp=iAQB
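
To make the regression/classification distinction concrete before moving on, here is a tiny sketch (my own illustration, not taken from the lectures above) that fits both kinds of model on toy data. It assumes NumPy and scikit-learn are installed; the data and numbers are arbitrary.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Regression: predict a continuous value (y is roughly 3x + 2 plus noise).
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(scale=1.0, size=100)
reg = LinearRegression().fit(X, y)
print("learned slope and intercept:", reg.coef_[0], reg.intercept_)

# Classification: predict a discrete label (1 if x > 5, else 0).
labels = (X[:, 0] > 5).astype(int)
clf = LogisticRegression().fit(X, labels)
print("P(label=1 | x=7):", clf.predict_proba([[7.0]])[0, 1])
```

The takeaway: regression outputs a number on a continuous scale, while classification outputs a probability over discrete classes.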

2.2 Loss Functions (Mean Squared Error, Cross-Entropy, etc.)

2.2.1 Cost Function by Andrew Ng

https://www.youtube.com/watch?v=0DqnDGV_p9c

2.2.2 IBM Loss Function

IBM's topic page of tutorials on loss functions
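
Since this section names Mean Squared Error and Cross-Entropy explicitly, here is a short NumPy sketch (my own example, not from the IBM material) showing how each loss is computed; the sample predictions are made up for illustration.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Average squared difference; large errors are penalized heavily."""
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    """Negative log-likelihood of the true labels under predicted probabilities."""
    p = np.clip(p_pred, eps, 1 - eps)  # clip to avoid log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

y_true = np.array([1.0, 0.0, 1.0, 1.0])
p_pred = np.array([0.9, 0.2, 0.6, 0.8])
print("MSE:", mean_squared_error(y_true, p_pred))
print("Cross-entropy:", binary_cross_entropy(y_true, p_pred))
```

Roughly speaking, MSE suits regression targets, while cross-entropy suits classification, since it measures how well predicted probabilities match the true labels.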

2.3 Optimization Algorithms
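
The workhorse optimization algorithm in introductory courses is gradient descent: repeatedly nudge the parameters in the direction that reduces the loss. Below is a minimal NumPy sketch (my own toy example; the learning rate and step count are arbitrary) that fits a 1-D linear model by gradient descent on the MSE loss.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=100)
y = 4 * X + 1 + rng.normal(scale=0.1, size=100)  # true w = 4, b = 1

w, b = 0.0, 0.0   # parameters to learn
lr = 0.1          # learning rate (step size)

for step in range(2000):
    y_pred = w * X + b
    error = y_pred - y
    # Gradients of the MSE loss with respect to w and b.
    grad_w = 2 * np.mean(error * X)
    grad_b = 2 * np.mean(error)
    # Step each parameter a small amount against its gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w = {w:.2f}, b = {b:.2f} (true values: 4, 1)")
```

This is the same idea that stochastic gradient descent, Adam, and other optimizers build on; they differ mainly in how the step size and direction are adjusted.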