Commit 5b35fde

About to add new poem
1 parent d0b6bad commit 5b35fde

File tree

14 files changed, +1512 -22 lines changed


content/posts/tensors-signals-kernels/index.md

Lines changed: 151 additions & 11 deletions
@@ -3,34 +3,174 @@ title: "Perspectives into Tensors, Signals, and Kernel Methods"
category: technical
date: 2025-09-08
math: true
draft: true
---

{{< toc >}}

## Abstract

Linear algebra, signal processing, and machine learning methods are (one of many groups of) topics that enjoy beautiful relationships enclosed in a dense shell of mathematics. Within the shell, one finds a kernel of surprisingly diverse perspectives which are simply a pleasure to entertain. This article hopes to give the reader a romantic and thematic glimpse of the truths in this particular group of topics.

---

## Overview

I follow an essay-like structure including an introduction, body, and conclusion. The introduction sets the stage, building from "class-style" knowledge and assuming a first course in linear algebra, signal processing, and machine learning. For readers lacking this background, I leave pointers to free resources.

The introduction arrives at the [linear operator](https://en.wikipedia.org/wiki/Linear_map) perspective of [systems of many dimensions](https://en.wikipedia.org/wiki/Multidimensional_system), defining system properties like [time-invariance](https://en.wikipedia.org/wiki/Time-invariant_system) and [causality](https://en.wikipedia.org/wiki/Causal_system) in terms of [tensor](https://en.wikipedia.org/wiki/Tensor) representations. This results in the marriage of many linear algebra and signal processing concepts. These perspectives then carry into the body, where I present a cross-cutting perspective of the [convolution kernel](https://en.wikipedia.org/wiki/Convolution) and the [reproducing kernel](https://en.wikipedia.org/wiki/Reproducing_kernel_Hilbert_space#:~:text=then%20called%20the-,reproducing%20kernel,-%2C%20and%20it%20reproduces).

Finally, the conclusion provides some subjective thematization of the concepts emphasized in the rest of the piece, summarizing mathematical details and indulging in a little bit of sensationalism. An attempt is made to provide pointers to further reading on adjacent concepts.
{{% hint title="Clarification" %}}

The word "kernel" is criminally polysemous in mathematics, and it will be used a lot in this piece. I mostly use the word under two semantics: the convolution kernel from signal processing and the reproducing kernel from machine learning. All other instances of the word should simply refer to its English meaning.

However, a primary objective of this piece is to reach a perspective of the convolution and reproducing kernels that allows them to be seen through a common lens. So an acute reader may interpret this as an explanation of why they are both called "kernel," assuming that they were both named that way because they represent the same kind of mathematical object in some profound way.

What is interesting is that they both received their names for a superficial reason -- because the symbols that represent them show up inside other symbols. One could imagine that people started calling them "kernel" independently just to avoid saying the phrase "that term in the middle" while pointing at a blackboard.
As such, the connecting view of the convolution and reproducing kernels that we will work towards only applies to these two kernels (convolution and reproducing). If these were the only two kernels in mathematics, maybe one could now say that they are named the same for a profound reason. But there are numerous other kinds of kernels for which the constructions in this piece simply do not apply.
{{% /hint %}}

---

## Introduction
This section provides not much more than a definition-based refresher on select topics from first courses in linear algebra, signal processing, and machine learning. For readers in need of comprehensive review or first-time coverage, I leave these free resources on said topics:
* **Kernel methods.** [_Foundations of Machine Learning_](https://cs.nyu.edu/~mohri/mlbook/) by Mohri, Rostamizadeh, and Talwalkar.
* **Signals and Systems.** [_Signals & Systems: Theory and Applications_](https://ss2-2e.eecs.umich.edu/) by Ulaby and Yagle.
* **Linear algebra.** [_Linear Algebra Done Right_](https://linear.axler.net/) by Axler.
### Bases of Vector Spaces
In what is nowadays close to being a canon of linear algebra education, Sheldon Axler opens with the statement below to set the stage for the rest of _Linear Algebra Done Right_:
> Linear algebra is the study of linear maps on finite-dimensional vector spaces.

Here, the restriction of vector spaces to the finite-dimensional case was one of the most mathematically respectful ways to negotiate generality with practical pedagogy. However, the spirit of linear algebra is alive well beyond the finite-dimensional case.
#### Hamel Bases
Most engineers are familiar with the concept of a (Hamel) basis of a vector space. If we have a vector space $V$ over a field $\mathbb{F}$ and a Hamel basis $\mathcal{B}$, then "$\mathcal{B}$ spans $V$" translates to

$$
\begin{equation}
\forall v \in V, \; v = \sum_{i \, = \, 0}^{k} c_i b_i \;\; \text{s.t.} \;\; c_i \in \mathbb{F}, \, b_i \in \mathcal{B}, \, k \in \mathbb{N}.
\end{equation}
$$

Importantly, for $\mathcal{B}$ to be a Hamel basis, the sum in $(1)$ must have finitely many terms. Note that this is allowed even in cases where $\mathcal{B}$ is infinite (or in other words, where $V$ is infinite-dimensional), as one does not necessarily assign nonzero coefficients $c_i$ to each element of $\mathcal{B}$.
{{% hint title="Example" %}}

The vector space of polynomials (each having finitely many terms) with coefficients in a field $\mathbb{F}$,

$$
\mathbb{F}[x] = \left\{ \sum_{i=0}^n a_i x^i \;\Big|\; n \in \mathbb{N},\ a_i \in \mathbb{F} \right\},
$$

has the infinite basis $\mathcal{B}_{\mathbb{F}[x]} = \\{1, x, x^2, x^3, \dots \\}$. Each of its elements, however, is a linear combination of a finite number of basis elements. For example, the polynomial

$$
p(x) = 3 + 4x^2 + x^3
$$
can be expressed as a (finite) linear combination of basis elements,

$$
p(x) = 3
\begin{bmatrix}
1 \\
0 \\
0 \\
0 \\
\vdots
\end{bmatrix} + 4
\begin{bmatrix}
0 \\
0 \\
1 \\
0 \\
\vdots
\end{bmatrix} + 1
\begin{bmatrix}
0 \\
0 \\
0 \\
1 \\
\vdots
\end{bmatrix}.
$$

Here we imposed a (canonical) representation such that, for example, $x^2 = \left[ 0, \\, 0, \\, 1, \\, 0, \\, {\dots} \right]^\top$. We see that, despite each basis vector having an infinitely long coordinate representation, all polynomials are determined by a finite number of them.
{{% /hint %}}
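
For readers who prefer code to coordinate columns, the finite-combination idea above can also be sketched programmatically. The snippet below is only an illustration (the `Poly` class and its coefficient-dictionary storage are hypothetical, not any standard library's API): a polynomial over the infinite monomial basis is pinned down by the finitely many basis elements to which it assigns nonzero coefficients.

```python
# A minimal sketch of the finite-combination idea: a polynomial over the
# infinite basis {1, x, x^2, ...} is stored via the finitely many
# coefficients it actually uses. All names here are illustrative only.

from dataclasses import dataclass


@dataclass
class Poly:
    coeffs: dict[int, float]  # maps basis index i (the monomial x^i) to c_i

    def __call__(self, x: float) -> float:
        # Evaluate the finite linear combination sum_i c_i * x^i.
        return sum(c * x**i for i, c in self.coeffs.items())


# p(x) = 3 + 4x^2 + x^3: nonzero coefficients on just three basis elements.
p = Poly({0: 3.0, 2: 4.0, 3: 1.0})
assert p(2.0) == 27.0  # 3 + 4*(2**2) + 2**3

# The basis {1, x, x^2, ...} is infinite, but this vector touches only a
# finite subset of it, exactly as the sum in (1) requires.
```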
#### Schauder Bases

Interpreting Axler strictly, $\mathbb{F}[x]$ is already beyond linear algebra because it is of countably infinite dimension (its canonical basis is a [semi-infinite](https://en.wikipedia.org/wiki/Semi-infinite) sequence of monomials). But definitionally, it is a perfectly valid vector space. Just as finite dimensionality is not necessary in order to access the theorems of linear algebra, having a countable Hamel basis is also not necessary; all vector spaces do have a Hamel basis[^axiom-choice], but not all of them have a countable one.
{{% hint title="Note" %}}

Countable bases are desirable not for being countable per se, but rather because, in most cases where a vector space does not have a countable Hamel basis, the uncountable Hamel basis is nonconstructive and unutterable. Put another way, the most useful fact about an uncountable Hamel basis, in many cases, is that it exists.
{{% /hint %}}
For some vector spaces that do not have a countable Hamel basis, one can relax the definition of a basis itself to obtain one that is countable. Specifically, we redefine the phrase "$\mathcal{B}$ spans $V$" to

$$
\begin{equation}
\forall v \in V, \; v = \sum_{i \, = \, 0}^{\infty} c_i b_i \;\; \text{s.t.} \;\; c_i \in \mathbb{F}, \, b_i \in \mathcal{B}.
\end{equation}
$$

If the above is true for a vector space $V$ over $\mathbb{F}$, then $\mathcal{B}$ is a Schauder basis of said space. The critical difference from a Hamel basis is of course the generality afforded by the possibility of infinitely many terms in the sum in $(2)$, giving us a new countably infinite flavor of linear combination. (Implicit here is a norm, or at least a topology, on $V$ under which the partial sums of $(2)$ converge; the space $\ell^2$ below comes equipped with exactly such a norm.)
{{% hint title="Example" %}}
The vector space of square-summable sequences,

$$
\ell^2 = \left\{ (x_1, x_2, x_3, \dots) \;:\; \sum_{n=1}^\infty |x_n|^2 < \infty \right\},
$$

has no countable Hamel basis because, no matter which countable candidate you propose, you can come up with an element of $\ell^2$ that would require a decomposition into infinitely many of its elements (which is not allowed). However, it does have the countably infinite Schauder basis

$$
\mathcal{B}_{\ell^2} = \left\{
\left( 1, \, 0, \, 0, \, \dots \right), \,
\left( 0, \, 1, \, 0, \, \dots \right), \,
\left( 0, \, 0, \, 1, \, \dots \right), \,
{\dots}
\right\}.
$$

This is quite remarkable: $\ell^2$, whose Hamel bases are necessarily uncountable, is spanned by the countably infinite $\mathcal{B}_{\ell^2}$. Allowing infinite linear combinations bridges this countability barrier via convergence.
{{% /hint %}}
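
As a complement, here is a small numerical sketch of what the infinite sum in $(2)$ buys us (purely illustrative; the helper `tail_norm` is hypothetical, not from any library): the $\ell^2$ distance between an element and its truncated expansions over $\mathcal{B}_{\ell^2}$ shrinks toward zero, which is the precise sense in which the infinite linear combination converges.

```python
# Illustrative sketch: partial sums over the Schauder basis of l^2 converge
# to the target element in the l^2 norm.

import math


def x(n: int) -> float:
    """Coordinate of the target element x = (1, 1/2, 1/4, ...) on basis index n."""
    return 0.5**n


def tail_norm(k: int, terms: int = 10_000) -> float:
    """l^2 norm of x minus its k-term partial sum (tail truncated for computation)."""
    return math.sqrt(sum(x(n) ** 2 for n in range(k, terms)))


for k in (1, 5, 10, 20):
    print(k, tail_norm(k))
# The printed norms decrease toward 0: finite partial sums approximate x
# arbitrarily well, even though no finite combination equals it exactly.
```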
### Signals and Systems
### Kernel Methods
[^axiom-choice]: When considering infinite-dimensional vector spaces, this statement is true if and only if one admits the axiom of choice. Perhaps this was another motivation for Axler's restriction to finite-dimensional vector spaces.
Lines changed: 108 additions & 0 deletions
@@ -0,0 +1,108 @@
<!DOCTYPE html>
<html lang="en"><head><script src="/livereload.js?mindelay=10&amp;v=2&amp;port=1313&amp;path=livereload" data-no-instant defer></script>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">

<meta name="description" content="Max&#39;s personal site">


<link rel="icon" type="image/x-icon" href="/favicon.ico" media="(prefers-color-scheme: light)">
<link rel="icon" type="image/x-icon" href="/favicon-dark.ico" media="(prefers-color-scheme: dark)">

<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/[email protected]/dist/katex.min.css" integrity="sha384-n8MVd4RsNIU0tAv4ct0nTaAbDJwPJzDEaqSD1odI+WdtXRGWt2kTvGFasHpSy3SV" crossorigin="anonymous">
<script defer src="https://cdn.jsdelivr.net/npm/[email protected]/dist/katex.min.js" integrity="sha384-XjKyOOlGwcjNTAIQHIpgOno0Hl1YQqzUOEleOLALmuqehneUG+vnGctmUb0ZY0l8" crossorigin="anonymous"></script>
<script defer src="https://cdn.jsdelivr.net/npm/[email protected]/dist/contrib/auto-render.min.js" integrity="sha384-+VBxd3r6XgURycqtZ117nYw44OOcIax56Z4dCRWbxyPt0Koah1uHoK0o4+/RRE05" crossorigin="anonymous"></script>
<script>
document.addEventListener("DOMContentLoaded", function() {
renderMathInElement(document.body, {

delimiters: [
{left: '$$', right: '$$', display: true},
{left: '$', right: '$', display: false},
{left: '\\(', right: '\\)', display: false},
{left: '\\[', right: '\\]', display: true}
],

throwOnError : false
});
});
</script>


<link rel="stylesheet" href="/css/style.min.css">

<link rel="canonical" href="http://localhost:1313/a-perspective-on-tensors-signals-and-kernel-methods/" />
<title>A Perspective on Tensors, Signals, and Kernel Methods</title>
</head>
<body><header id="banner">
<a href="http://localhost:1313/"
><img src="/logo.svg" alt="Logo" class="site-logo"
/></a>
<h2><a href="http://localhost:1313/">Max Fierro</a></h2>
<nav>
<ul>
<li>
<a href="/about/" title="about"
>about</a
>
</li><li>
<a href="/resume/" title="resume"
>resume</a
>
</li><li>
<a href="/index.xml" title=""
>rss</a
>
</li>
</ul>
</nav>
</header>
<main id="content">
<article>
<header id="post-header">
<h1>A Perspective on Tensors, Signals, and Kernel Methods</h1>
<div>
<time>September 8, 2025</time>
</div>
</header><aside id="toc">
<details>
<summary>&nbsp;<strong> Table of contents</strong></summary>
<nav id="TableOfContents">
<ul>
<li><a href="#abstract">Abstract</a></li>
<li><a href="#background">Background</a>
<ul>
<li><a href="#vector-spaces">Vector Spaces</a></li>
<li><a href="#signals-and-systems">Signals and Systems</a></li>
<li><a href="#kernel-methods">Kernel Methods</a></li>
</ul>
</li>
</ul>
</nav>
</details>
</aside>

<h2 id="abstract">Abstract</h2>
<p><a href="https://linear.axler.net/LADR4e.pdf">Linear algebra</a>, <a href="https://en.wikipedia.org/wiki/Signal_processing">signal processing</a>, and <a href="https://en.wikipedia.org/wiki/Machine_learning">machine learning</a> are (one of many groups of) topics that enjoy a beautiful relationship protected by a nutshell of mathematics. Inside we can find a kernel of perspectives which are simply a pleasure to entertain. Here, I provide the <a href="https://en.wikipedia.org/wiki/Linear_map">linear-operator</a> perspective into <a href="https://en.wikipedia.org/wiki/Multidimensional_system">systems of many dimensions</a>, characterizing properties like <a href="https://en.wikipedia.org/wiki/Time-invariant_system">time-invariance</a> and <a href="https://en.wikipedia.org/wiki/Causal_system">causality</a> via <a href="https://en.wikipedia.org/wiki/Tensor">tensor</a> representations. I finish with an original perspective that connects the <a href="https://en.wikipedia.org/wiki/Convolution">convolution kernel</a> to the <a href="https://en.wikipedia.org/wiki/Reproducing_kernel_Hilbert_space#:~:text=then%20called%20the-,reproducing%20kernel,-%2C%20and%20it%20reproduces">reproducing kernel</a>.</p>
<hr>
<h2 id="background">Background</h2>
<p>This section is no more than a refresher on select topics. I will not make an attempt to explain them completely. If this is what you need, I recommend you explore these (completely free) resources:</p>
<ol>
<li><strong>Vector spaces.</strong> <a href="https://linear.axler.net/"><em>Linear Algebra Done Right</em></a> by Axler.</li>
<li><strong>Signals and Systems.</strong> <a href="https://ss2-2e.eecs.umich.edu/"><em>Signals &amp; Systems: Theory and Applications</em></a> by Ulaby and
Yagle.</li>
<li><strong>Kernel methods.</strong> <a href="https://cs.nyu.edu/~mohri/mlbook/"><em>Foundations of Machine Learning</em></a> by Mohri, Rostamizadeh, and Talwalkar.</li>
</ol>
<h3 id="vector-spaces">Vector Spaces</h3>
<h3 id="signals-and-systems">Signals and Systems</h3>
<h3 id="kernel-methods">Kernel Methods</h3>
<hr>
</article>

</main><footer id="footer">
Copyright © 2024 Max Fierro
</footer>
</body>
</html>
