dsevero/README.md

Research Engineer

Meta Superintelligence Labs - Fundamental AI Research (MSL - FAIR)

I work on advancing pre-training and inference-time compute for autoregressive (next-token prediction), diffusion, and flow matching models at scale. In 2024 I obtained a Ph.D. in generative modelling and information theory from the University of Toronto and the Vector Institute for A.I. I spent most of grad school interning at Meta (FAIR Labs) and Google AI. Before grad school, I worked as an electronics engineer (hardware/firmware for embedded systems) and as a machine learning engineer in recommendation systems and ML for health.

Google Scholar · X (Twitter) · CV

Selected projects from 2024/2025

For a complete list, please see my Google Scholar profile.

More about me

I am originally from Florianópolis, Brazil, but I've lived in NYC, Orlando, Toronto, São Paulo, and (now) Montréal, as well as in other smaller cities in the south of Brazil.

I obtained a Ph.D. in Information Theory and Generative Modelling from the University of Toronto and the Vector Institute for A.I. My thesis studies, and proposes algorithms for, the lossless compression of combinatorial objects such as graphs, multisets, and partitions. Thesis: Random Permutation Codes: Lossless Source Coding of Non-Sequential Data
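The intuition behind compressing non-sequential data is a counting argument: a multiset of n items can be arranged in n! / ∏ (count of each distinct item)! distinct orders, and an encoder that does not need to transmit the order can save log2 of that many bits. A minimal sketch of that arithmetic (this is just the counting argument, not the thesis's actual coding algorithm):

```python
import math
from collections import Counter

def order_bits_saved(seq):
    """Bits saved by encoding seq as an (unordered) multiset
    rather than as a sequence."""
    # Number of distinct orderings of the multiset formed by seq:
    # n! / prod(count_x! for each distinct element x)
    orderings = math.factorial(len(seq))
    for count in Counter(seq).values():
        orderings //= math.factorial(count)
    # Discarding the order is worth log2(orderings) bits,
    # achievable e.g. with bits-back style codes.
    return math.log2(orderings)

# "mississippi" has 11!/(1! * 4! * 4! * 2!) = 34650 orderings,
# so about 15.08 bits of its sequence description is pure ordering.
print(order_bits_saved("mississippi"))
```

The savings grow quickly with n: for a multiset of n distinct items it is log2(n!), which is on the order of n·log2(n) bits.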

Tutorials, Workshops, and Talks on data compression and other things

Pinned repositories

  1. facebookresearch/NeuralCompression (public archive)

     A collection of tools for neural compression enthusiasts.

     Python · 577 stars · 52 forks

  2. facebookresearch/multiset-compression (public archive)

     Official code accompanying the arXiv paper "Compressing Multisets with Large Alphabets".

     Python · 31 stars · 3 forks

  3. j-towns/craystack (public)

     Compression tools for machine learning researchers.

     Python · 85 stars · 8 forks

  4. Linear-Autoregressive-Similarity-Index (public)

     Code for "The Unreasonable Effectiveness of Linear Prediction as a Perceptual Metric".

     Python · 22 stars · 1 fork

  5. craystack (public, forked from j-towns/craystack)

     Compression tools for machine learning researchers.

     Python