First day and the chicken rice @ HKU
Posts by Zhidi Lin
AI that can improve itself: A deep dive into self-improving AI and the Darwin-Gödel Machine.
richardcsuwandi.github.io/blog/2025/dgm/
Excellent blog post by Richard Suwandi reviewing the Darwin Gödel Machine (DGM) and its future implications.
Duffing oscillator
en.wikipedia.org/wiki/Duffing...
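The Duffing oscillator is the forced, damped nonlinear ODE x'' + δx' + αx + βx³ = γ cos(ωt). A minimal numpy sketch integrating it with classic RK4; the parameter values below are illustrative (a common chaotic double-well regime), not taken from the linked article:

```python
import numpy as np

# Duffing oscillator: x'' + delta*x' + alpha*x + beta*x**3 = gamma*cos(omega*t)
# alpha < 0, beta > 0 gives the double-well potential; values are illustrative.
def duffing_rhs(t, state, delta=0.2, alpha=-1.0, beta=1.0, gamma=0.3, omega=1.2):
    x, v = state
    return np.array([v, -delta * v - alpha * x - beta * x**3
                     + gamma * np.cos(omega * t)])

def rk4_integrate(rhs, state0, t0, t1, n_steps):
    """Classic fourth-order Runge-Kutta with a fixed step size."""
    h = (t1 - t0) / n_steps
    t, state = t0, np.asarray(state0, dtype=float)
    traj = [state.copy()]
    for _ in range(n_steps):
        k1 = rhs(t, state)
        k2 = rhs(t + h / 2, state + h / 2 * k1)
        k3 = rhs(t + h / 2, state + h / 2 * k2)
        k4 = rhs(t + h, state + h * k3)
        state = state + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        traj.append(state.copy())
    return np.array(traj)

traj = rk4_integrate(duffing_rhs, [1.0, 0.0], 0.0, 100.0, 10_000)
```

Plotting `traj[:, 0]` against `traj[:, 1]` gives the familiar phase portrait.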
Excited to share that our latest work on the grid spectral mixture product (GSMP) kernel has been featured in Prof. Sergios Theodoridis' latest book "Machine Learning: From the Classics to Deep Networks, Transformers and Diffusion Models"!
Maybe add me if possible, thanks.
Make sure to get your tickets to AABI if you are in Singapore on April 29 (just after #ICLR2025) and interested in probabilistic modeling, inference, and decision-making!
Tickets (free but limited!): lu.ma/5syzr79m
More info: approximateinference.org
#Bayes #MachineLearning #ICLR2025 #AABI2025
cool
These sparse Gaussian Processes have been around longer than some grad students, but still fun to code! (and today was my first time coding one...)
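For a first pass, the predictive mean is only a few lines of numpy. A minimal subset-of-regressors sketch with inducing inputs (the variational treatment in Titsias 2009 adds a trace correction to the training objective, but the predictive mean takes the same form; kernel, noise, and inducing-point choices below are illustrative):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sparse_gp_predict(X, y, Z, X_star, noise=0.01, jitter=1e-8):
    """Subset-of-regressors predictive mean with M inducing inputs Z.
    Cost is O(N M^2) instead of the O(N^3) of exact GP regression."""
    Kuu = rbf(Z, Z) + jitter * np.eye(len(Z))
    Kuf = rbf(Z, X)
    A = Kuu + Kuf @ Kuf.T / noise      # Kuu + sigma^-2 * Kuf Kfu
    b = Kuf @ y / noise
    K_star_u = rbf(X_star, Z)
    return K_star_u @ np.linalg.solve(A, b)

X = np.linspace(-3, 3, 100)
y = np.sin(X)
Z = np.linspace(-3, 3, 15)             # inducing inputs (illustrative)
mean = sparse_gp_predict(X, y, Z, X)
```

With 15 inducing points the mean recovers the sine curve closely while never factorizing the full 100x100 kernel matrix.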
test
I already advertised this document when I posted it on arXiv, and again when it was published.
This week, with the agreement of the publisher, I uploaded the published version on arXiv.
Fewer typos, more references, and additional sections, including PAC-Bayes Bernstein.
arxiv.org/abs/2110.11216
This paper on sparse variational Gaussian processes is quite intriguing... I always find great enjoyment in reading Titsias's work.
I first read the preprint version of this book online, and it was already fascinating. It seems that even more interesting topics and chapters have been added, awesome!
Schematic illustration of a scalar-valued residual deep GP with L hidden layers. The last layer is a scalar-valued GP on the manifold. If it is not present, the model is manifold-valued. If it is replaced with a Gaussian vector field (GVF), the model is a vector field on the manifold.
Excited to share our ICLR 2025 oral "Residual Deep Gaussian Processes on Manifolds"!
With @vabor112.bsky.social & @arkrause.bsky.social, we introduce manifold-to-manifold GPs that can be composed together, generalising deep GPs to manifolds. Applications include wind prediction & Bayes opt! 1/n
The deadline for abstract submission of contributed papers and posters for BayesComp2025 has been extended to 28 February. Decisions by 14 March. Submit here! bayescomp2025.sg/abstract-sub...
The deadline for early bird registration has been extended to March 22. Hope to see you in Singapore!
Nothing compares to the moment my newborn smiled at me. Pure, unfiltered joy and love. My heart is so full...
Check out our paper if you're interested: ieeexplore.ieee.org/document/108...
In this work, we addressed challenges in Gaussian process (GP) regression for multidimensional and large-scale data. Our key contributions:
1. A new GP kernel that reduces hyperparameters while maintaining strong performance, promoting sparsity for efficient optimization.
2. The SLIM-KL framework, combining quantized ADMM for privacy and communication efficiency with Distributed Successive Convex Approximation for scalable optimization.
3. Theoretical convergence guarantees and superior performance on diverse datasets, showcasing scalability and efficiency.
Thrilled to share that our paper has been accepted by IEEE TNNLS! This is my first journal paper as a mentor and a co-first author. I'm incredibly proud to have collaborated with @richardcsuwandi.bsky.social. Richard's dedication made this a truly rewarding experience.
cool
One #postdoc position is still available at the National University of Singapore (NUS) to work on sampling, high-dimensional data-assimilation, and diffusion/flow models. Applications are open until the end of January. Details:
alexxthiery.github.io/jobs/2024_di...
Langevin Monte Carlo (LMC). Just tried to generate a video to visualize the process
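The update being visualized is the unadjusted Langevin step x ← x + ε ∇log p(x) + √(2ε) ξ. A minimal sketch on a standard 2-D Gaussian target (target, step size, and chain length are illustrative choices, not from the video):

```python
import numpy as np

def grad_log_p(x):
    """Score of a standard 2-D Gaussian target: grad log p(x) = -x."""
    return -x

def langevin_monte_carlo(x0, n_steps=5000, step=0.1, seed=0):
    """Unadjusted Langevin: x <- x + step*grad_log_p(x) + sqrt(2*step)*noise."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

samples = langevin_monte_carlo([3.0, -3.0])
```

Animating `samples` frame by frame gives exactly the kind of video in the post: the chain drifts from the corner into the bulk of the target and then rattles around it.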
Inventors of flow matching have released a comprehensive guide going over the math & code of flow matching!
Also covers variants like non-Euclidean & discrete flow matching.
A PyTorch library is also released with this guide!
This looks like a very good read!
arxiv: arxiv.org/abs/2412.06264
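Not from the guide itself, but a minimal sanity check of the core idea: along the linear path x_t = (1−t)x₀ + t x₁, the marginal velocity E[x₁ − x₀ | x_t] is available in closed form for a 1-D Gaussian pair, so we can Euler-integrate it and watch source samples land on the target (the target mean and step count are illustrative assumptions):

```python
import numpy as np

# Transport N(0,1) to N(m,1) along the linear path x_t = (1-t)x0 + t*x1.
m = 4.0  # target mean (illustrative)

def marginal_velocity(x, t):
    """Closed-form flow-matching velocity E[x1 - x0 | x_t = x] for this pair."""
    var_t = (1 - t) ** 2 + t ** 2       # Var[x_t]
    a = (2 * t - 1) / var_t             # Cov[x1 - x0, x_t] / Var[x_t]
    return m + a * (x - t * m)

rng = np.random.default_rng(0)
x = rng.standard_normal(20_000)         # samples from the source N(0,1)
n_steps = 200
dt = 1.0 / n_steps
for k in range(n_steps):
    x = x + dt * marginal_velocity(x, k * dt)   # Euler step along the ODE
```

After integration the samples are (approximately) distributed as N(4, 1); in practice the velocity field is a learned network trained on the conditional targets x₁ − x₀ rather than this closed form.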
A common question nowadays: Which is better, diffusion or flow matching?
Our answer: They're two sides of the same coin. We wrote a blog post to show how diffusion models and Gaussian flow matching are equivalent. That's great: It means you can use them interchangeably.
The 41st Conference on #Uncertainty in #AI will be held in Rio de Janeiro, July 21-25!
The CfP is out: www.auai.org/uai2025/call...
Feb 10: Paper submission
Apr 3-10: Rebuttal period
May 6: Author notification
#UAI2025 #ML #stats #learning #reasoning #uncertainty
awesome
Generating cat videos is nice, but what if you could tackle real scientific problems with the same methods?
Introducing The Well: 16 datasets (15TB) for Machine Learning, from astrophysics to fluid dynamics and biology.
Code: github.com/PolymathicAI...
Paper: openreview.net/pdf?id=00Sx5...