In this age of powerful GenAI capabilities for coding, I get wistful sometimes that new devs may never experience that feeling of manually typing out the first few lines of code for a new project
📅 Jan 21 at 10:00 AM ET: Join AC staff scientist Sergio Pablo-García Carrillo to explore lab automation + workflow orchestration in chemistry in a webinar co-hosted by Smart Labs + @SiLA!
Register: www.eventbrite.com/e/towards-th...
December Run 100K Challenge
“I am excited to announce” the achievement of a “key metric” that I actually care about. I don’t, however, have any snappy backronym to tie this into AI.
@thecrashcourse.bsky.social "futures of AI" first episode is out, and it's confirming the fear I had about the series when I first heard about it. Why are bigger channels (Wired was another one) bringing on non-AI experts to talk about AI? An important lesson: pick your AI gurus more carefully, sheesh.
Why on earth is emailing around an Excel spreadsheet still considered a viable means of organization and project management in 2025??
@notion.com I really want to use Notion as a replacement for Word/Google Docs. What's stopping me is the lack of control over how a page is formatted when exporting to PDF. Especially important is producing documents that look like they came from Word/GDocs: removing the title, DB properties, footers, etc.
I’m going through chemotherapy again (ugh), and one side effect this time is tinnitus. I learned it can happen when the brain fills in for damaged auditory nerves—generating sound where input is missing. Not unlike imputation, or how multimodal models handle absent signals. Bug or feature?
Or (and this may seem blasphemous to ML people), you can just use educated guesstimates for the parameters. This is, I would argue, more Bayesian than MLE-based hyperparameter tuning, as such guesses reflect prior knowledge of your system.
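A minimal sketch of this guesstimate approach, assuming scikit-learn and an RBF kernel (the toy data and the lengthscale/amplitude values here are purely illustrative, not from the thread). Passing optimizer=None keeps the kernel hyperparameters exactly where prior knowledge set them:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy low-data setting: 5 noisy observations of a smooth 1-D function.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(5, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(5)

# Hyperparameters set from prior knowledge, not fit to data: suppose we
# "know" the signal varies on a ~2-unit lengthscale with ~unit amplitude.
kernel = ConstantKernel(1.0) * RBF(length_scale=2.0)

# optimizer=None tells scikit-learn to keep the kernel parameters as
# specified instead of maximizing the marginal likelihood over them.
gp = GaussianProcessRegressor(kernel=kernel, optimizer=None, alpha=0.1**2)
gp.fit(X, y)

mean, std = gp.predict(np.array([[5.0]]), return_std=True)
```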
Or you can use MAP estimates instead of likelihoods, incorporating prior knowledge to regularize the ill-posed maximum-likelihood calculation.
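A sketch of the MAP idea, assuming a 1-D RBF kernel and independent standard-normal priors on the log-hyperparameters (the toy data and all names below are illustrative, not from the thread). The log-prior penalty added to the negative log marginal likelihood is what regularizes the optimization:

```python
import numpy as np
from scipy.optimize import minimize

# Toy low-data setting (illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(5, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(5)

def neg_log_marginal_likelihood(theta):
    # theta = (log lengthscale, log signal std, log noise std)
    log_ell, log_sf, log_sn = theta
    d2 = (X - X.T) ** 2  # pairwise squared distances (1-D inputs)
    K = (np.exp(2 * log_sf) * np.exp(-0.5 * d2 / np.exp(2 * log_ell))
         + (np.exp(2 * log_sn) + 1e-8) * np.eye(len(y)))  # jitter for stability
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # 0.5 * y^T K^-1 y + 0.5 * log|K| + (n/2) * log(2 pi)
    return 0.5 * y @ a + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

def neg_log_posterior(theta):
    # MAP objective: the MLE objective plus a log-prior penalty.
    # Independent N(0, 1) priors on the log-hyperparameters, i.e.
    # log-normal priors on the hyperparameters themselves.
    return neg_log_marginal_likelihood(theta) + 0.5 * theta @ theta

theta_map = minimize(neg_log_posterior, x0=np.zeros(3), method="L-BFGS-B").x
```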
What is the alternative to maximum-likelihood estimates of the hyperparameters of a GP model? You can place hierarchical beliefs on these hyperparameters. This shifts the computational burden from the likelihood optimization used to "train" a model to methods such as MCMC that sample from the posterior distribution.
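As a sketch of what that shift looks like, here is a random-walk Metropolis sampler over the GP log-hyperparameters, again assuming a 1-D RBF kernel and N(0, 1) priors on the log-hyperparameters (toy data and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(5, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(5)

def log_posterior(theta):
    # Unnormalized log-posterior of (log lengthscale, log signal std,
    # log noise std) under N(0, 1) priors on each log-hyperparameter.
    log_ell, log_sf, log_sn = theta
    d2 = (X - X.T) ** 2
    K = (np.exp(2 * log_sf) * np.exp(-0.5 * d2 / np.exp(2 * log_ell))
         + (np.exp(2 * log_sn) + 1e-8) * np.eye(len(y)))
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    log_lik = (-0.5 * y @ a - np.log(np.diag(L)).sum()
               - 0.5 * len(y) * np.log(2 * np.pi))
    log_prior = -0.5 * theta @ theta
    return log_lik + log_prior

# Random-walk Metropolis: keep a sample of hyperparameter settings
# weighted by posterior plausibility, instead of one point estimate.
theta = np.zeros(3)
lp = log_posterior(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.2 * rng.standard_normal(3)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[1000:])  # drop burn-in
```

Predictions then average over the sampled hyperparameters rather than committing to a single tuned value.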
Hyperparameters need to be set based on prior information or tuned to data (if you must) using empirical Bayesian methods.
Second, when you're using GPs in a Bayesian context -- representing priors over an unknown function -- naively tuning the prior's hyperparameters to data goes against the Bayesian philosophy.
First, "training" the model, i.e. hyperparameter tuning by calculating maximum likelihood estimates is ill-posed:
www.jmlr.org/papers/v24/2...
This is especially magnified in low-data settings.
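For reference, the objective being maximized here is the standard GP log marginal likelihood (as in Rasmussen & Williams); with only a handful of observations, this surface in the hyperparameters θ can be nearly flat or multimodal, which is the ill-posedness above:

```latex
\log p(\mathbf{y} \mid X, \theta)
  = -\tfrac{1}{2}\,\mathbf{y}^\top K_\theta^{-1}\mathbf{y}
    - \tfrac{1}{2}\,\log\lvert K_\theta\rvert
    - \tfrac{n}{2}\,\log 2\pi,
\qquad K_\theta = K_f(X, X; \theta) + \sigma_n^2 I
```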
So there is a disproportionate (IMO) amount of effort, by both implementers of GP libraries and their users, dedicated to hyperparameter optimization -- at least in small-data settings. This is not great for a few reasons:
That is, people who first encounter GPs from an ML perspective look for parameters to optimize against data, and this becomes their primary preoccupation. Desperate to fit GPs into the ML mold, they turn to the only "parameters" present: the hyperparameters of the mean and covariance functions.
Gaussian Processes are not ML "things". This is akin to saying Normal distributions are ML -- in fact, it IS almost literally saying this -- which is silly, because they existed before ML was a thing. Why care? When GPs are viewed as ML, people incorrectly assume they must fit into the "train/test" modality.
You know who was cool? Mr. Wizard.
Aug 11-14, 2025: Accelerate is back at the University of Toronto this summer. Join us for 4 days of talks, workshops and poster sessions on AI, automation, and the future of materials discovery. Early-bird registration and our call for abstracts are now open: accelerate25.ca
*make variational inference infinite-dimensional again
It was featured on "CNN Headlines", but I don't know if they are running the stories in a loop:
www.cnn.com/videos/fast/...
You are on the front page!