---
author: Wu Zhenyu (SA21006096)
title: Homework 2 for Statistical Learning
documentclass: article
---

# 6

We already know that if we use $\hat{y} = \frac{\sum_i y_i}{N}$ to estimate a variable, this corresponds to minimizing the least-squares objective $\min_y \sum_i {(y_i - y)}^2$. If we instead use $\hat{y} = \sqrt[N]{\prod_i y_i}$, what is the corresponding minimization problem?


$$\min_{y > 0}\sum_i y\Big(\ln\frac{y}{y_i} - 1\Big)$$

$$\begin{aligned}
\frac{\mathrm{d}}{\mathrm{d}y}\sum_i y\Big(\ln\frac{y}{y_i} - 1\Big) \Big|_{y = \hat{y}} & = 0\\
\frac{\mathrm{d}}{\mathrm{d}y}\sum_i\Big(y(\ln y - 1) - y\ln y_i\Big) \Big|_{y = \hat{y}} & = 0\\
\sum_i\Big(\ln \hat{y} - \ln y_i\Big) & = 0\\
\sum_i \ln\frac{\hat{y}}{y_i} & = 0\\
\ln\prod_i\frac{\hat{y}}{y_i} & = 0\\
\prod_i\frac{\hat{y}}{y_i} & = 1\\
\hat{y} & = \sqrt[N]{\prod_i y_i}
\end{aligned}$$

Hence $\hat{y}$ is a stationary point.

$$\begin{aligned} & \frac{\mathrm{d}^2}{\mathrm{d}y^2}\sum_i y\Big(\ln\frac{y}{y_i} - 1\Big)\\ = & \frac{\mathrm{d}}{\mathrm{d}y}\sum_i \Big(\ln y - \ln y_i\Big)\\ = & \frac{N}{y} > 0 \end{aligned}$$

Since the second derivative is positive for all $y > 0$, $\hat{y}$ is a minimum.
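
As a quick numerical check of this result (a minimal sketch; the sample values `y_i`, the search bounds, and the use of `scipy.optimize.minimize_scalar` are illustrative assumptions, not part of the assignment), the minimizer of the objective agrees with the geometric mean:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Arbitrary positive test data (hypothetical values).
y_i = np.array([1.0, 2.0, 4.0, 8.0])

# Objective from the derivation above: sum_i y * (ln(y / y_i) - 1), y > 0.
loss = lambda y: np.sum(y * (np.log(y / y_i) - 1.0))

res = minimize_scalar(loss, bounds=(1e-6, 100.0), method="bounded")
geometric_mean = np.exp(np.log(y_i).mean())
print(res.x, geometric_mean)  # both approximately 2.828
```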

# 8

Solve the weighted least squares problem:

$$\min_\mathbf{w}\frac12\sum_{i = 1}^N r_i{(y_i - \mathbf{w} \cdot \mathbf{x}_i)}^2$$

where $r_i > 0$ is the weight of $(\mathbf{x}_i , y_i)$.


$$\begin{aligned}
\frac{\mathrm{d}}{\mathrm{d}\mathbf{w}} \frac12\sum_{i = 1}^N r_i{(y_i - \mathbf{x}_i^\mathsf{T}\mathbf{w})}^2 & = 0\\
-\sum_{i = 1}^N r_i \mathbf{x}_i^\mathsf{T}(y_i - \mathbf{x}_i^\mathsf{T}\mathbf{w}) & = 0\\
\sum_{i = 1}^N r_i \mathbf{x}_i(y_i - \mathbf{x}_i^\mathsf{T}\mathbf{w}) & = 0\\
\sum_{i = 1}^N r_i y_i\mathbf{x}_i & = \Big(\sum_{i = 1}^N r_i\mathbf{x}_i\mathbf{x}_i^\mathsf{T}\Big)\mathbf{w}\\
\mathbf{w} & = \Big(\sum_{i = 1}^N r_i\mathbf{x}_i\mathbf{x}_i^\mathsf{T}\Big)^{-1}\sum_{i = 1}^N r_i y_i\mathbf{x}_i
\end{aligned}$$
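
Equivalently, with design matrix $X$, target vector $\mathbf{y}$, and $R = \mathrm{diag}(r_1, \dots, r_N)$, this is $\mathbf{w} = (X^\mathsf{T} R X)^{-1} X^\mathsf{T} R \mathbf{y}$. A minimal numerical sketch of the closed form (the synthetic data, `w_true`, and the comparison against ordinary least squares on $\sqrt{r}$-scaled data are illustrative assumptions):

```python
import numpy as np

# Synthetic data for illustration only.
rng = np.random.default_rng(0)
N, d = 50, 3
X = rng.normal(size=(N, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.1, size=N)
r = rng.uniform(0.5, 2.0, size=N)  # positive weights

# Normal equations from the derivation above:
# w = (sum_i r_i x_i x_i^T)^{-1} sum_i r_i y_i x_i
A = (X * r[:, None]).T @ X   # sum_i r_i x_i x_i^T
b = (X * r[:, None]).T @ y   # sum_i r_i y_i x_i
w = np.linalg.solve(A, b)

# Equivalent: ordinary least squares on sqrt(r)-scaled data.
w_check, *_ = np.linalg.lstsq(np.sqrt(r)[:, None] * X, np.sqrt(r) * y, rcond=None)
print(np.allclose(w, w_check))  # expected: True
```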

# 12

In the following, $\lambda > 0$.

## 1

Solve the following optimization problem.

$$x_1 = \mathrm{argmin}_x\ \big(x^2 + ax + \lambda x^2\big)$$

Under what condition is $x_1 = 0$?


$$\begin{aligned} \frac{\mathrm{d}}{\mathrm{d}x}\Big(x^2 + ax + \lambda x^2\Big) \Big|_{x = x_1} & = 0\\ 2(1 + \lambda)x_1 + a & = 0\\ x_1 & = -\frac{a}{2(1 + \lambda)} \end{aligned}$$

Hence $x_1 = 0$ if and only if $a = 0$.
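
A small numerical check (the values of `a` and `lam` below are arbitrary assumptions): the quadratic penalty only shrinks the minimizer toward zero; it reaches exactly zero only when $a = 0$.

```python
import numpy as np

a, lam = 1.5, 2.0  # arbitrary test values
x = np.linspace(-5, 5, 200001)
objective = x**2 + a * x + lam * x**2

x1_grid = x[np.argmin(objective)]   # grid-search minimizer
x1_closed = -a / (2 * (1 + lam))    # closed form from above
print(x1_grid, x1_closed)           # both approximately -0.25, nonzero since a != 0
```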

## 2

Solve the following optimization problem.

$$x_2 = \mathrm{argmin}_x\ \big(x^2 + ax + \lambda\lvert x \rvert\big)$$

Under what condition is $x_2 = 0$?


Setting the derivative to zero (at $x = 0$, where $\lvert x \rvert$ is not differentiable, this is the subgradient condition $0 \in 2x_2 + a + \lambda\,[-1, 1]$):

$$\frac{\mathrm{d}}{\mathrm{d}x}\Big(x^2 + ax + \lambda\lvert x \rvert\Big) \Big|_{x = x_2} = 0$$

$$x_2 = \begin{cases} -\frac{a + \lambda}{2} & a < -\lambda\\ 0 & -\lambda \leqslant a \leqslant \lambda\\ -\frac{a - \lambda}{2} & a > \lambda \end{cases}$$

Hence $x_2 = 0$ exactly when $-\lambda \leqslant a \leqslant \lambda$.
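
This is the soft-thresholding behaviour that underlies $\ell_1$-penalized (lasso-style) estimates: the absolute-value penalty yields exact zeros for a whole range of $a$. A minimal sketch comparing the piecewise closed form with a grid search (the `(a, lam)` test values are arbitrary assumptions):

```python
import numpy as np

def x2(a, lam):
    """Piecewise minimizer of x^2 + a*x + lam*|x| from the cases above."""
    if a < -lam:
        return -(a + lam) / 2
    if a > lam:
        return -(a - lam) / 2
    return 0.0  # |a| <= lam: the penalty forces an exact zero

# Compare against a grid search; the (a, lam) pairs are arbitrary test points.
xs = np.linspace(-5, 5, 200001)
for a, lam in [(-3.0, 1.0), (0.4, 1.0), (3.0, 1.0)]:
    obj = xs**2 + a * xs + lam * np.abs(xs)
    print(a, x2(a, lam), xs[np.argmin(obj)])
```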