index.json
[{"authors":["admin"],"categories":null,"content":"I am a graduate student in Machine Learning at the University of Toronto and the Vector Institute. I am currently pursuing follow-up research to my work on Neural Ordinary Differential Equations, and am generally interested in approximate inference for latent variable models. I have recently completed an M.Sc. supervised by Drs. David Duvenaud and Roger Grosse, and am continuing as a Ph.D. student under David Duvenaud. In Winters 2018 and 2019 I was the instructor for CSC412/2506: Probabilistic Learning and Reasoning.\nWinter 2020 I will be co-instructing, with David Duvenaud, the course STA414: Statistical Methods in Machine Learning II.\n","date":-62135596800,"expirydate":-62135596800,"kind":"taxonomy","lang":"en","lastmod":-62135596800,"objectID":"2525497d367e79493fd32b198b28f040","permalink":"https://jessebett.github.io/authors/admin/","publishdate":"0001-01-01T00:00:00Z","relpermalink":"/authors/admin/","section":"authors","summary":"I am a graduate student in Machine Learning at the University of Toronto and the Vector Institute. I am currently pursuing follow-up research to my work on Neural Ordinary Differential Equations, and am generally interested in approximate inference for latent variable models. I have recently completed an M.Sc. supervised by Drs. David Duvenaud and Roger Grosse, and am continuing as a Ph.D. student under David Duvenaud. In Winters 2018 and 2019 I was the instructor for CSC412/2506: Probabilistic Learning and Reasoning.","tags":null,"title":"Jesse Bettencourt","type":"authors"},{"authors":null,"categories":null,"content":"","date":1493265600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1493265600,"objectID":"ebe4d3d9ffd9fd6cb94dd50f96be1116","permalink":"https://jessebett.github.io/project/torus-knot/","publishdate":"2017-04-27T00:00:00-04:00","relpermalink":"/project/torus-knot/","section":"project","summary":"Master's summer project.","tags":["thesis"],"title":"Torus Knot Fibration","type":"project"},{"authors":null,"categories":null,"content":"","date":1430107200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1430107200,"objectID":"efd58481ffff264f253567212edc6792","permalink":"https://jessebett.github.io/project/penrose-thesis/","publishdate":"2015-04-27T00:00:00-04:00","relpermalink":"/project/penrose-thesis/","section":"project","summary":"Undergraduate thesis.","tags":["thesis"],"title":"Penrose Aperiodic Tiling of the Plane and Graphical Geodesics","type":"project"},{"authors":null,"categories":null,"content":"","date":1401163200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1401163200,"objectID":"b767db1bc47c0969d5e05aa0d9b61e19","permalink":"https://jessebett.github.io/project/rbf-usra/","publishdate":"2014-05-27T00:00:00-04:00","relpermalink":"/project/rbf-usra/","section":"project","summary":"NSERC funded summer undergraduate research project on Radial Basis Functions","tags":["other"],"title":"Radial Basis Functions","type":"project"},{"authors":null,"categories":null,"content":"","date":1369627200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1369627200,"objectID":"739d46a125be035159ec3ee4d2e8def1","permalink":"https://jessebett.github.io/project/egypt/","publishdate":"2013-05-27T00:00:00-04:00","relpermalink":"/project/egypt/","section":"project","summary":"Repository of information about astronomical documents from the pharaonic period of ancient Egypt.","tags":["other"],"title":"Ancient Egyptian Astronomy 
Database","type":"project"},{"authors":["Will Grathwohl\\*","Ricky T. Q. Chen\\*","**Jesse Bettencourt**","Ilya Sutskever","David Duvenaud"],"categories":null,"content":"","date":1529294400,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1529294400,"objectID":"adf2c0ba61bcb5a8f6bf87794fa3bd9c","permalink":"https://jessebett.github.io/publication/ffjord/","publishdate":"2018-06-18T00:00:00-04:00","relpermalink":"/publication/ffjord/","section":"publication","summary":"We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can train by maximum likelihood, without partitioning or ordering the data dimensions. For training, we show how to scalably backpropagate through any ODE solver, without access to its internal operations. This allows end-to-end training of ODEs within larger models. ","tags":[],"title":"FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models","type":"publication"},{"authors":["Ricky T. Q. Chen\\*","Yulia Rubanova\\*","**Jesse Bettencourt\\***","David Duvenaud"],"categories":null,"content":"Best Paper at NeurIPS 2018\n","date":1529294400,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1529294400,"objectID":"cce1ca167687ef6a2525ce2ba441c8da","permalink":"https://jessebett.github.io/publication/neural-odes/","publishdate":"2018-06-18T00:00:00-04:00","relpermalink":"/publication/neural-odes/","section":"publication","summary":"We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can train by maximum likelihood, without partitioning or ordering the data dimensions. For training, we show how to scalably backpropagate through any ODE solver, without access to its internal operations. This allows end-to-end training of ODEs within larger models. ","tags":[],"title":"Neural Ordinary Differential Equations","type":"publication"}]