From d7d9f34734127a08f363b6a0478ffc7e46b47e33 Mon Sep 17 00:00:00 2001
From: mbaugustyn
Date: Thu, 1 Feb 2024 17:22:55 +0100
Subject: [PATCH] fixed PCA Lagrangian

---
 Lectures/07-notes.ipynb | 12 +++++-------
 1 file changed, 5 insertions(+), 7 deletions(-)

diff --git a/Lectures/07-notes.ipynb b/Lectures/07-notes.ipynb
index 61846e0..21fd396 100644
--- a/Lectures/07-notes.ipynb
+++ b/Lectures/07-notes.ipynb
@@ -99,27 +99,25 @@
     "\n",
     "where we defined $S = \frac{1}{N}X^TX \in\R^{D\times D}$ to be the _data covariance matrix_ (recall that $X$ is centered).\n",
     "\n",
-    "We wish to maximize the variance, with respect to the _direction_ of $v$, in other words, length of $v$ is irrelevant and we can fix $\|v\|=1$. Then, the 1D PCA problem is simply:\n",
+    "We wish to maximize the variance, with respect to the _direction_ of $v$, in other words, length of $v$ is irrelevant and we can fix $\|v\|=1$. Then, the 1D PCA problem can be presented as:\n",
     "\n",
     "$$\n",
     "\begin{align}\n",
-    "\underset{v}{\operatorname{argmax}}\; & \frac{1}{2}v^TSv \\\n",
+    "\underset{v}{\operatorname{argmax}}\; & v^TSv \\\n",
     "\sjt & v^Tv = 1\n",
     "\end{align}\n",
     "$$\n",
     "\n",
-    "where we have multiplied the objective by 1/2 for convenience.\n",
-    "\n",
     "We can solve this problem using the standard method of [Lagrange Multipliers](https://en.wikipedia.org/wiki/Lagrange_multiplier). First let's construct the Lagrangian:\n",
     "\n",
     "$$\n",
-    "\mathcal{L} = \frac{1}{2}v^TSv - \lambda(v^Tv - 1)\n",
+    "\mathcal{L} = v^TSv - \lambda(v^Tv - 1)\n",
     "$$\n",
     "\n",
-    "differentiating the lagrangian with respect to $v$ yields:\n",
+    "differentiating the Lagrangian with respect to $v$ yields:\n",
     "\n",
     "$$\n",
-    "\frac{\partial \mathcal{L}}{\partial v} = Sv - \lambda v\n",
+    "\frac{\partial \mathcal{L}}{\partial v} = 2 Sv - 2 \lambda v = 2 (Sv - \lambda v)\n",
     "$$\n",
     "\n",
     "At optimum $\frac{\partial \mathcal{L}}{\partial v} = 0$, or in other words $Sv = \lambda v$. This means that $v$ needs to be an eigenvector of $S$!\n"
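The corrected derivation in this patch can be checked numerically. Below is a minimal sketch (not part of the patch itself, and not from `07-notes.ipynb`): it builds the covariance matrix $S = \frac{1}{N}X^TX$ from centered data, takes the eigenvector for the largest eigenvalue, and verifies both the stationarity condition $Sv = \lambda v$ and that no other unit vector attains a larger variance $v^TSv$.

```python
import numpy as np

# Random centered data, as the notes assume (X is centered).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X -= X.mean(axis=0)
S = X.T @ X / X.shape[0]  # data covariance matrix S = X^T X / N

# The direction maximizing v^T S v subject to v^T v = 1 is the
# eigenvector of S with the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(S)
lam, v = eigvals[-1], eigvecs[:, -1]

# Stationarity condition from the Lagrangian: S v = lambda v.
assert np.allclose(S @ v, lam * v)

# Sanity check: random unit vectors never beat the top eigenvector,
# since v^T S v is the Rayleigh quotient, bounded by lambda_max.
cand = rng.normal(size=(1000, 3))
cand /= np.linalg.norm(cand, axis=1, keepdims=True)
assert (cand @ S * cand).sum(axis=1).max() <= lam + 1e-12
```

This also illustrates why the 1/2 factor removed by the patch was harmless for the argmax but inconsistent with the stated derivative: scaling the objective rescales $\lambda$, not the optimal direction $v$.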