Merge pull request #299 from slds-lmu/new_chunks
Add new chapters to regu and advrisk and add jupyter notebook solutio…
chriskolb authored Mar 28, 2024
2 parents 0054630 + 8450412 commit 2116051
Showing 19 changed files with 115 additions and 63 deletions.
15 changes: 15 additions & 0 deletions content/chapters/11_advriskmin/11-03-regression-l2-l1.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,15 @@
---
title: "Chapter 11.03: L2- and L1 Loss"
weight: 11003
---
In this section, we revisit the \\(L2\\) and \\(L1\\) loss and present their risk minimizers and optimal constant models. For the \\(L2\\) loss, we derive the risk minimizer -- the conditional mean -- and the optimal constant model -- the empirical mean of the observed target values. For the \\(L1\\) loss, we introduce the risk minimizer -- the conditional median -- and the optimal constant model -- the empirical median of the observed target values.

<!--more-->

### Lecture video

{{< video id="agQQzTI_6HI" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-advriskmin-regression-l2-l1.pdf" >}}
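As a quick numerical illustration of these results, a sketch using numpy with a hypothetical sample (not part of the lecture materials): the empirical mean minimizes the average squared loss, the empirical median the average absolute loss.

```python
import numpy as np

# Hypothetical sample of observed target values
y = np.array([1.0, 2.0, 2.5, 4.0, 10.0])

# Evaluate every constant model c on a fine grid
grid = np.linspace(y.min(), y.max(), 10001)
l2_risk = np.array([np.mean((y - c) ** 2) for c in grid])   # empirical L2 risk
l1_risk = np.array([np.mean(np.abs(y - c)) for c in grid])  # empirical L1 risk

best_l2 = grid[np.argmin(l2_risk)]  # close to np.mean(y)
best_l1 = grid[np.argmin(l1_risk)]  # close to np.median(y)
```

Note how the outlier at 10.0 pulls the \\(L2\\)-optimal constant upward while the \\(L1\\)-optimal constant stays at the median.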
15 changes: 0 additions & 15 deletions content/chapters/11_advriskmin/11-03-regression-l2.md

This file was deleted.

@@ -1,5 +1,5 @@
---
title: "Chapter 11.04: L1 Loss"
title: "Chapter 11.04: L1 Loss Deepdive"
weight: 11004
---
In this section, we revisit \\(L1\\) loss and derive its risk minimizer -- the conditional median -- and optimal constant model -- the empirical median of observed target values.
@@ -12,4 +12,4 @@ In this section, we revisit \\(L1\\) loss and derive its risk minimizer -- the conditional median

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-advriskmin-regression-l1.pdf" >}}
{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-advriskmin-regression-l1-deepdive.pdf" >}}
15 changes: 0 additions & 15 deletions content/chapters/15_regularization/15-02-l1l2.md

This file was deleted.

15 changes: 15 additions & 0 deletions content/chapters/15_regularization/15-02-l2.md
@@ -0,0 +1,15 @@
---
title: "Chapter 15.02: Ridge Regression"
weight: 15002
---
We introduce Ridge regression as a key approach to regularizing linear models.

<!--more-->

### Lecture video

{{< video id="yeN-xRfheYU" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-l2.pdf" >}}
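To make the penalized estimator concrete, a minimal sketch of the Ridge closed form \\((X^\top X + \lambda I)^{-1} X^\top y\\) on simulated data (numpy; the data and \\(\lambda\\) values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=50)

def ridge(X, y, lam):
    """Closed-form Ridge estimate: solve (X'X + lam*I) theta = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

theta_ols = ridge(X, y, 0.0)    # lam = 0 recovers ordinary least squares
theta_reg = ridge(X, y, 10.0)   # larger lam shrinks the coefficient vector
```

Increasing \\(\lambda\\) trades variance for bias: the norm of the coefficient vector shrinks monotonically toward zero.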
15 changes: 15 additions & 0 deletions content/chapters/15_regularization/15-03-l1.md
@@ -0,0 +1,15 @@
---
title: "Chapter 15.03: Lasso Regression"
weight: 15003
---
We introduce Lasso regression as a key approach to regularizing linear models.

<!--more-->

### Lecture video

{{< video id="yeN-xRfheYU" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-l1.pdf" >}}
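Since the Lasso has no closed form, a common solver is coordinate descent with soft-thresholding. A sketch (numpy; the data, \\(\lambda\\), and iteration count are hypothetical choices, not the lecture's reference implementation):

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for (1/(2n)) * ||y - X theta||^2 + lam * ||theta||_1."""
    n, p = X.shape
    theta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ theta + X[:, j] * theta[j]  # residual excluding feature j
            rho = X[:, j] @ resid / n
            theta[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j] / n)
    return theta

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, 0.0, -1.5, 0.0]) + rng.normal(scale=0.1, size=100)
theta = lasso_cd(X, y, lam=0.1)  # irrelevant coefficients are driven to (near) zero
```

The sparsity-inducing behavior shows up directly: coefficients of the three irrelevant features are thresholded to (essentially) zero, while the two true signals survive with a small shrinkage bias of roughly \\(\lambda\\).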
@@ -1,6 +1,6 @@
---
title: "Chapter 15.03: Lasso vs Ridge Regression"
weight: 15003
title: "Chapter 15.04: Lasso vs Ridge Regression"
weight: 15004
---
This section provides a detailed comparison between Lasso and Ridge regression.

@@ -1,6 +1,6 @@
---
title: "Chapter 15.04: Elastic Net and Regularization for GLMs"
weight: 15004
title: "Chapter 15.05: Elastic Net and Regularization for GLMs"
weight: 15005
---
In this section, we introduce the elastic net as a combination of Ridge and Lasso regression and discuss regularization for logistic regression.

15 changes: 0 additions & 15 deletions content/chapters/15_regularization/15-05-l0.md

This file was deleted.

15 changes: 15 additions & 0 deletions content/chapters/15_regularization/15-06-other.md
@@ -0,0 +1,15 @@
---
title: "Chapter 15.06: Other Types of Regularization"
weight: 15006
---
In this section, we introduce other regularization approaches besides the important special cases \\(L1\\) and \\(L2\\).

<!--more-->

### Lecture video

{{< video id="gw6yLFoQzdQ" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-others.pdf" >}}
15 changes: 15 additions & 0 deletions content/chapters/15_regularization/15-07-nonlin.md
@@ -0,0 +1,15 @@
---
title: "Chapter 15.07: Regularization in NonLinear Models"
weight: 15007
---
In this section, we demonstrate regularization in non-linear models like neural networks.

<!--more-->

### Lecture video

{{< video id="MdwK9e2wR_U" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-nonlin.pdf" >}}
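One widely used instance in neural networks is weight decay; for plain gradient descent it coincides with \\(L2\\) regularization. A minimal sketch of that equivalence (numpy; all values are hypothetical):

```python
import numpy as np

theta = np.array([0.5, -1.2, 2.0])   # current weights (hypothetical values)
grad = np.array([0.1, 0.3, -0.2])    # gradient of the unpenalized empirical risk
lr, lam = 0.1, 0.01

# Weight decay: first shrink the weights, then take the usual gradient step
wd_step = (1 - lr * lam) * theta - lr * grad
# Gradient step on the L2-penalized risk R(theta) + (lam/2) * ||theta||^2
pen_step = theta - lr * (grad + lam * theta)
```

Both updates expand to \\(\theta - \eta \nabla R(\theta) - \eta \lambda \theta\\), so they are identical here; with adaptive optimizers the two notions differ.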
@@ -1,6 +1,6 @@
---
title: "Chapter 15.06: Regularization in NonLinear Models and Bayesian Priors"
weight: 15006
title: "Chapter 15.08: Regularization and Bayesian Priors"
weight: 15008
---
In this section, we motivate regularization from a Bayesian perspective, showing how different penalty terms correspond to different Bayesian priors.

@@ -12,4 +12,4 @@

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-nonlin-bayes.pdf" >}}
{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-bayes.pdf" >}}
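The core correspondence, sketched for linear regression with Gaussian noise variance \\(\sigma^2\\) and an i.i.d. prior on the coefficients (details in the slides):

```latex
\hat{\theta}_{\mathrm{MAP}}
  = \arg\max_{\theta}\, p(\theta \mid \mathbf{y})
  = \arg\min_{\theta}\, \bigl[\, -\log p(\mathbf{y} \mid \theta) - \log p(\theta) \,\bigr]
% Gaussian prior  theta_j ~ N(0, tau^2):
%   -log p(theta) = ||theta||_2^2 / (2 tau^2) + const   =>  Ridge with lambda = sigma^2 / tau^2
% Laplace prior   theta_j ~ Laplace(0, b):
%   -log p(theta) = ||theta||_1 / b + const             =>  Lasso
```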
@@ -1,6 +1,6 @@
---
title: "Chapter 15.07: Geometric Analysis of L2 Regularization and Weight Decay"
weight: 15007
title: "Chapter 15.09: Geometric Analysis of L2 Regularization and Weight Decay"
weight: 15009
---
In this section, we provide a geometric understanding of \\(L2\\) regularization, showing how parameters are shrunk according to the eigenvalues of the Hessian of empirical risk, and discuss its correspondence to weight decay.
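The key identity can be checked numerically: with \\(X^\top X = Q\,\mathrm{diag}(d_i)\,Q^\top\\), the Ridge solution rescales the OLS solution along eigenvector \\(i\\) by the factor \\(d_i/(d_i+\lambda)\\). A sketch with simulated data (numpy; all constants hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)
lam = 2.0

theta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Eigendecomposition of the (unscaled) Hessian of the squared-error risk
d, Q = np.linalg.eigh(X.T @ X)

# Ridge shrinks the OLS solution along eigenvector i by d_i / (d_i + lam)
shrunk = Q @ np.diag(d / (d + lam)) @ Q.T @ theta_ols
```

Directions with small eigenvalues (where the data constrain the fit weakly) are shrunk the most, which is the geometric picture the chapter develops.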

@@ -1,6 +1,6 @@
---
title: "Chapter 15.08: Geometric Analysis of L1 Regularization"
weight: 15008
title: "Chapter 15.10: Geometric Analysis of L1 Regularization"
weight: 15010
---
In this section, we provide a geometric understanding of \\(L1\\) regularization and show that it encourages sparsity in the parameter vector.

@@ -1,6 +1,6 @@
---
title: "Chapter 15.10: Early Stopping"
weight: 15010
title: "Chapter 15.11: Early Stopping"
weight: 15011
---
In this section, we introduce early stopping and show how it can act as a regularizer.
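A minimal sketch of the mechanism (numpy; the data, learning rate, and patience are hypothetical): track validation error during gradient descent and keep the parameters from the best iteration.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + rng.normal(size=200)
X_tr, y_tr, X_va, y_va = X[:100], y[:100], X[100:], y[100:]  # train/validation split

theta = np.zeros(10)
lr, patience = 0.001, 10
best_val, best_theta, since_best = np.inf, theta.copy(), 0
for step in range(5000):
    theta -= lr * X_tr.T @ (X_tr @ theta - y_tr) / len(y_tr)  # gradient step on training risk
    val = np.mean((X_va @ theta - y_va) ** 2)                 # validation error
    if val < best_val:
        best_val, best_theta, since_best = val, theta.copy(), 0
    else:
        since_best += 1
    if since_best >= patience:  # stop once validation error stops improving
        break
```

Halting the optimization early keeps the parameters close to their initialization, which is how early stopping acts as an implicit regularizer.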

11 changes: 11 additions & 0 deletions content/chapters/15_regularization/15-12-ridge-deep.md
@@ -0,0 +1,11 @@
---
title: "Chapter 15.12: Details on Ridge Regression: Deep Dive"
weight: 15012
---
In this section, we consider Ridge regression as row-augmentation and as minimizing risk under feature noise. We also discuss the bias-variance tradeoff.

<!--more-->

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-ridge-deepdive.pdf" >}}
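The row-augmentation view can be verified directly: Ridge on \\((X, y)\\) equals ordinary least squares on data augmented with \\(p\\) pseudo-observations. A sketch (numpy; simulated data):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))
y = rng.normal(size=30)
lam = 1.5

# Augment: p extra pseudo-rows with features sqrt(lam) * I and target 0
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(4)])
y_aug = np.concatenate([y, np.zeros(4)])

theta_aug = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]  # plain OLS on augmented data
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(4), X.T @ y)
```

The augmented normal equations are \\((X^\top X + \lambda I)\theta = X^\top y\\), which is exactly the Ridge solution.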
@@ -1,6 +1,6 @@
---
title: "Chapter 15.09: Soft-thresholding and L1 regularization: Deep Dive"
weight: 15009
title: "Chapter 15.13: Soft-thresholding and L1 regularization: Deep Dive"
weight: 15013
---
In this section, we prove the previously stated proposition regarding soft-thresholding and L1 regularization.
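The proposition states that the scalar proximal problem \\(\min_\theta \tfrac{1}{2}(\theta - z)^2 + \lambda|\theta|\\) is solved in closed form by soft-thresholding. A sketch one can check against brute-force minimization (numpy; the example inputs are hypothetical):

```python
import numpy as np

def soft_threshold(z, lam):
    """Closed-form minimizer of 0.5 * (theta - z)**2 + lam * |theta|."""
    return float(np.sign(z) * max(abs(z) - lam, 0.0))

# Inputs inside the threshold are zeroed out, others are shrunk toward zero
examples = {z: soft_threshold(z, 1.0) for z in (-3.0, -0.4, 0.0, 0.7, 2.5)}
```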

11 changes: 11 additions & 0 deletions content/chapters/15_regularization/15-14-bagging-deep.md
@@ -0,0 +1,11 @@
---
title: "Chapter 15.14: Bagging as Regularization Method: Deep Dive"
weight: 15014
---
In this section, we consider bagging as a form of regularization. We also discuss which factors influence the effectiveness of bagging.

<!--more-->

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-regu-bagging-deepdive.pdf" >}}
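To illustrate the variance-reduction effect with a deliberately unstable base learner, a sketch using a one-split regression stump (numpy; the data-generating process and all constants are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)

def fit_stump(x, y):
    """Best single-split piecewise-constant regressor (an unstable learner)."""
    best = (np.inf, 0.0, y.mean(), y.mean())
    for s in np.unique(x):
        left, right = y[x <= s], y[x > s]
        if len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, s, left.mean(), right.mean())
    return best[1:]

def predict_stump(model, x):
    split, left_mean, right_mean = model
    return np.where(x <= split, left_mean, right_mean)

def bagged_predict(x, y, x_new, n_bags=25):
    """Average the predictions of stumps fit on bootstrap resamples."""
    preds = [predict_stump(fit_stump(x[i], y[i]), x_new)
             for i in (rng.integers(0, len(x), len(x)) for _ in range(n_bags))]
    return np.mean(preds, axis=0)

# Variance over repeated datasets, evaluated at a point near the true split
x_new = np.array([0.05])
single, bagged = [], []
for _ in range(100):
    x = rng.uniform(-1, 1, 60)
    y = (x > 0).astype(float) + rng.normal(scale=0.3, size=60)
    single.append(predict_stump(fit_stump(x, y), x_new)[0])
    bagged.append(bagged_predict(x, y, x_new)[0])
```

Because the stump's split point is highly sensitive to the sample, averaging over bootstrap fits stabilizes the prediction, which is the regularizing effect the chapter analyzes.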
2 changes: 1 addition & 1 deletion content/exercises/_index.md
@@ -17,7 +17,7 @@ __Exercises for Chapters 1-10 (LMU Lecture I2ML):__
| Exercise 9 &nbsp;| [Random forests](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/ex_forests.pdf) | [Random forests](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/sol_forests.pdf) | [Random forests](https://github.com/slds-lmu/lecture_i2ml/blob/master/exercises/forests/sol_forests_py.ipynb) |
| Exercise 10 &nbsp;| [Neural networks](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/ex_nn.pdf) | [Neural networks](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/sol_nn.pdf) |
| Exercise 11 &nbsp;| [Tuning](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/ex_tuning.pdf) | [Tuning](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/sol_tuning.pdf) | [Tuning](https://github.com/slds-lmu/lecture_i2ml/blob/master/exercises/tuning/sol_tuning_py.ipynb) |
| Exercise 12 &nbsp;| [Nested resampling](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/ex_nested_resampling.pdf) &emsp;| [Nested resampling](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/sol_nested_resampling.pdf) &emsp;| |
| Exercise 12 &nbsp;| [Nested resampling](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/ex_nested_resampling.pdf) &emsp;| [Nested resampling](https://github.com/slds-lmu/lecture_i2ml/raw/master/exercises-pdf/sol_nested_resampling.pdf) &emsp;| [Nested resampling](https://github.com/slds-lmu/lecture_i2ml/blob/master/exercises/nested-resampling/sol_nested_resampling_py.ipynb) |

<br>
