---
tags:
  - interpolation
  - idea
---

# Ideas Around Interpolation

A dump of thoughts from a dream:

- If my theory about interpolation is correct, it would suggest that even when a model interpolates its training points, you should still be able to distinguish the points that are somehow outliers, mostly from the curvature of the fit around each point. That seems to give some notion of certainty? (See the curvature sketch after this list.)
- It almost feels like there should be a way to decompose neural networks so that you start with the low-order functions and move towards higher-order functions. However, that just gets us back to the more classic ML-style algorithms that fit basis functions to the data. What is nice about those algorithms is that you get this decomposition into signal and noise, sort of. (See the basis-function sketch after this list.)
- My feeling is that most of how I think about things is from a regression perspective, even though the biggest wins for neural networks are probably still in classification (actually, [[neural-representations]] is technically regression).
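
The curvature idea in the first bullet can be sanity-checked on a toy problem. This is a minimal sketch under assumptions not in the note: a 1-D dataset with one injected outlier, a cubic spline standing in for an interpolating model, and the absolute second derivative at each training point used as the curvature proxy.

```python
# Hypothetical toy setup: fit an interpolant through every training point,
# then look at the local curvature (second derivative) at each point.
# The injected outlier should stand out with much larger curvature.
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + 0.01 * rng.standard_normal(x.size)
y[7] += 1.5  # inject one outlier into the training set

spline = CubicSpline(x, y)        # interpolates every training point exactly
curvature = np.abs(spline(x, 2))  # |second derivative| at each training point

# Rank training points by local curvature; the outlier (index 7) should top the list.
for idx in np.argsort(curvature)[::-1][:3]:
    print(f"x={x[idx]:.2f}  curvature={curvature[idx]:.1f}")
```

If the idea holds, the curvature ranking is roughly the "notion of certainty" the bullet is gesturing at: points the interpolant had to bend sharply to reach are the ones to trust least.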
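
The second bullet's low-order-to-high-order decomposition is essentially what a classic basis-function fit gives you. A minimal sketch, assuming toy data and a small polynomial basis chosen by hand (neither appears in the note); whatever the low-order basis cannot explain is read as the "noise" component.

```python
# Hypothetical toy data: project onto a low-order polynomial basis and
# treat the residual as "noise".
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 200)
y = 0.5 * x + x**3 + 0.1 * rng.standard_normal(x.size)

degree = 3
basis = np.vander(x, degree + 1, increasing=True)   # columns: 1, x, x^2, x^3
coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)  # least-squares fit of the basis

signal = basis @ coeffs  # the part explained by low-order basis functions
noise = y - signal       # residual, read here as the "noise" component
print("coefficients:", np.round(coeffs, 2))
print("residual std:", round(noise.std(), 3))
```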