The recording from this session is now available: youtu.be/gUE-RIOwQKI?...
#Slang #autodiff #shader #graphics #neural #rendering #gpu
Efficiently calculating Jacobians and Hessians is vital. Automatic differentiation (AD) in languages like Julia automates this, providing exact derivatives without symbolic complexity or numerical errors. A game-changer for large-scale models. #AutoDiff 6/6
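The "exact derivatives without symbolic complexity or numerical errors" claim can be illustrated with a toy forward-mode sketch in plain Python (illustrative only, not any particular Julia package): dual numbers carry a derivative alongside each value through ordinary arithmetic, so the result is exact rather than a finite-difference approximation.

```python
# Toy forward-mode AD with dual numbers: exact derivatives, no symbolic
# expression swell and no finite-difference truncation error.
# (Sketch of the idea only; real systems generalize this to Jacobians/Hessians.)

class Dual:
    """Number a + b*eps with eps^2 = 0; the dual part b carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule, applied automatically at every multiplication
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Exact df/dx at x, obtained by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).dot

# d/dx (x^2 + 3x) = 2x + 3, so the derivative at x = 2 is exactly 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # -> 7.0
```

Seeding one input direction at a time like this yields one Jacobian column per pass; nesting the trick gives Hessians.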
A functional framework for nonsmooth autodiff with maxpooling functions
Bruno Després
Action editor: Samuel Vaiter
https://openreview.net/forum?id=qahoztvThX
#maxpooling #autodiff #lipschitz
A comparison of automatic differentiation paradigms between Python and Julia:
- In Python, one chooses the autodiff framework first (PyTorch / JAX), then the appropriate scientific library.
- In Julia, one writes the scientific library first, then tries to make it compatible with several autodiff frameworks (Enzyme, Zygote, etc.)
How to make #autodiff user-friendly? What lies beyond the safety of Python-world? Why does it matter for scientific machine learning?
All this, and more, in our latest preprint with @adrhill.bsky.social! Spoiler alert: it describes the most useful software I ever wrote.
arxiv.org/abs/2505.05542
17/17 That's a wrap on the Automatic Sparse Differentiation discussion! #autodiff #hackerNews #summary 🎉
12/17 Others familiar with autodiff found the ideas novel & exciting! nathan_douglas: "...some of these ideas are very new to me. This seems really, really exciting though." #autodiff #innovation #excited 🎉
5/17 Sparse Jacobians save compute when input/output dependencies are low. rdyro: "Discovering this automatically through coloring is very appealing." #sparsematrix #optimization #autodiff 💡
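A toy illustration of the coloring idea mentioned in that post (plain Python, not tied to any library): columns of a sparse Jacobian that never share a nonzero row can be assigned the same color and probed together. In the extreme case of a diagonal Jacobian, every column shares one color, so a single directional probe recovers all nonzeros.

```python
# Sparse Jacobian via column coloring: columns whose sparsity patterns do not
# overlap can share one directional probe.
# Toy case: f(x)_i = x_i**2 has a diagonal Jacobian, so ALL columns get the
# same color and one probe J @ ones recovers every nonzero entry at once.

def f(x):
    return [xi * xi for xi in x]

def jvp_fd(f, x, v, h=1e-6):
    """Directional derivative J(x) @ v by central finite differences."""
    fp = f([xi + h * vi for xi, vi in zip(x, v)])
    fm = f([xi - h * vi for xi, vi in zip(x, v)])
    return [(p - m) / (2 * h) for p, m in zip(fp, fm)]

x = [1.0, 2.0, 3.0]
ones = [1.0] * len(x)            # one compressed seed for the single color class
diag = jvp_fd(f, x, ones)        # one probe instead of len(x) probes
print([round(d, 4) for d in diag])  # -> [2.0, 4.0, 6.0], i.e. 2*x_i
```

Real tools detect the sparsity pattern and compute a near-minimal coloring automatically; this sketch hard-codes the trivial one-color case.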
1/17 Diving into Automatic Sparse Differentiation on Hacker News! 🧵 Discussion highlights practical uses, math behind it, & tools. From basic explanations to advanced apps! #autodiff #sparse #math 🧮
A meme following the "They have played us for absolute fools" format. It reads: "STOP DOING AUTODIFF. COMPUTER PROGRAMS WERE NOT MEANT TO BE DIFFERENTIATED! HUNDREDS OF AD SYSTEMS and yet NO REAL-WORLD USE FOUND for anything more than applying the CHAIN RULE to derivative functions written BY HAND. Want to differentiate arbitrary functions anyway, for a laugh? We had a tool for that: it was called FINITE DIFFERENCING! 'The pullback is the product of the Jacobian transpose with the sensitivity' — statements dreamed up by the utterly deranged. (This is REAL AD, used in REAL AD systems.)" Two pictures follow: a category-theory optic diagram labelled "?BINOCULARS?", and a line graph showing subderivatives labelled "LINEY-BOY". The text concludes: "`gradient(x->println("$x apples"), 3)` They have played us for absolute fools."
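For the record, the line the meme mocks is just standard reverse mode: the pullback maps an output sensitivity v back to input space as J(x)^T @ v. A hand-rolled check in plain Python (illustrative, not any AD system's API) for a linear map, whose Jacobian is simply its matrix:

```python
# Reverse-mode sanity check: for f(x) = A @ x, the Jacobian is A everywhere,
# so the pullback of an output sensitivity v must be A^T @ v.

A = [[1.0, 2.0],
     [3.0, 4.0],
     [5.0, 6.0]]            # f: R^2 -> R^3, J(x) = A for every x

def pullback(v):
    """J^T @ v: sends an output sensitivity back to input space."""
    return [sum(A[i][j] * v[i] for i in range(3)) for j in range(2)]

v = [1.0, 1.0, 1.0]
print(pullback(v))  # -> [9.0, 12.0], the column sums of A
```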
package: https://www.kernel-operations.io/rkeops/
useR! video: www.youtube.com/watch?v=5DDd...
#rstats #kernels #gpu #autodiff
Automatic differentiation in Julia, demystified! Whether you're building packages or using them, learn how to navigate Julia’s AD ecosystem and choose the right tools for gradients. youtu.be/ww3ntpyxNtI?... #JuliaLang #AutoDiff #MachineLearning #ScientificComputing
Today is the last day to submit a proposal for our #autodiff minisymposium at JuliaCon25! Hurry up!
A new series is up on LA for Programmers: Matrix Calculus and AutoDiff.
www.linearalgebraforprogrammers.com/series/matri...
#linearalgebra #math #autodiff
Example of Slang kernel that uses autodiff, running in the browser to optimize a gaussian mixture to fit an image.
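A rough CPU analogue of that demo in plain Python (NOT the actual Slang kernel, and a single 1D Gaussian rather than a mixture, with a numerical gradient standing in for Slang's builtin autodiff): the same optimize-by-gradient-descent loop, fitting a Gaussian's mean to a target curve.

```python
import math

# Fit the mean of a 1D Gaussian to a target curve by gradient descent.
# The gradient of the loss is taken numerically here; the Slang demo instead
# differentiates the rendering kernel itself and runs on the GPU.

xs = [0.25 * i for i in range(-20, 21)]           # sample grid on [-5, 5]
target = [math.exp(-(x - 2.0) ** 2) for x in xs]  # "image" to match

def loss(mu):
    """Sum of squared residuals between model Gaussian and target."""
    return sum((math.exp(-(x - mu) ** 2) - t) ** 2 for x, t in zip(xs, target))

mu, lr, h = 0.0, 0.1, 1e-5
for _ in range(200):
    grad = (loss(mu + h) - loss(mu - h)) / (2 * h)  # central-difference gradient
    mu -= lr * grad

print(round(mu, 3))  # converges toward the target mean, 2.0
```

Swapping the numerical gradient for an AD-computed one is exactly the step Slang's `[Differentiable]` machinery automates, per-pixel and in parallel.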
Among its many modern language features, #Slang provides #autodiff (forward and backward), which comes in very handy for #MachineLearning applications! And also a #reflection API, which we use for code generation. 6/8