
The recording from this session is now available: youtu.be/gUE-RIOwQKI?...

#Slang #autodiff #shader #graphics #neural #rendering #gpu


Efficiently calculating Jacobians and Hessians is vital. Automatic differentiation (AD) in languages like Julia automates this, providing exact derivatives without symbolic complexity or numerical errors. A game-changer for large-scale models. #AutoDiff 6/6
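The post above mentions exact derivatives via AD. As a minimal illustration of the idea (a sketch in plain Python with dual numbers, not the implementation of any Julia package), here is forward-mode AD computing a dense Jacobian column by column:

```python
# Hedged sketch: forward-mode automatic differentiation with dual numbers.
# Each Dual carries a value and a tangent; arithmetic applies the chain rule
# exactly, so there is no symbolic blow-up and no finite-difference error.
from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # primal value
    dot: float   # tangent (derivative part)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def jacobian(f, x):
    """Dense Jacobian of f: R^n -> R^m via n forward passes (one per input)."""
    n = len(x)
    cols = []
    for i in range(n):
        # Seed the i-th input with tangent 1, all others with 0.
        seeds = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
        cols.append([out.dot for out in f(seeds)])
    # Transpose columns into rows: J[r][c] = d f_r / d x_c
    return [list(row) for row in zip(*cols)]

# f(x, y) = (x*y, x + y) at (2, 3): J = [[y, x], [1, 1]] = [[3, 2], [1, 1]]
J = jacobian(lambda v: [v[0] * v[1], v[0] + v[1]], [2.0, 3.0])
```

Hessians follow the same principle by nesting passes; real systems like Enzyme or Zygote do this far more efficiently, but the chain-rule mechanics are the same.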


A functional framework for nonsmooth autodiff with maxpooling functions

Bruno Després

Action editor: Samuel Vaiter

https://openreview.net/forum?id=qahoztvThX

#maxpooling #autodiff #lipschitz

A comparison of automatic differentiation paradigms between Python and Julia:
- In Python, one chooses the autodiff framework first (PyTorch / JAX), then the appropriate scientific library
- In Julia, one writes the scientific library first, then one tries to make it compatible with several autodiff frameworks (Enzyme, Zygote, etc)


How to make #autodiff user-friendly? What lies beyond the safety of Python-world? Why does it matter for scientific machine learning?
All this, and more, in our latest preprint with @adrhill.bsky.social! Spoiler alert: it describes the most useful software I ever wrote.
arxiv.org/abs/2505.05542


17/17 That's a wrap on the Automatic Sparse Differentiation discussion! #autodiff #hackerNews #summary 🎉


12/17 Others familiar with autodiff found the ideas novel & exciting! nathan_douglas: "...some of these ideas are very new to me. This seems really, really exciting though." #autodiff #innovation #excited 🎉


5/17 Sparse Jacobians save compute when input/output dependencies are low. rdyro: "Discovering this automatically through coloring is very appealing." #sparsematrix #optimization #autodiff 💡
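The coloring idea mentioned in this post can be sketched in a few lines (a hedged illustration in plain Python, not the algorithm of any particular package): columns of a sparse Jacobian that share no row are structurally orthogonal and can be probed in a single AD pass.

```python
# Hedged sketch of column coloring for automatic sparse differentiation.
# Columns with the same color touch disjoint rows, so one seeded AD pass
# recovers all of them at once.

def greedy_column_coloring(sparsity):
    """sparsity: list of rows, each a set of column indices with nonzeros.
    Returns a color per column; same-colored columns share one AD pass."""
    ncols = max((c for row in sparsity for c in row), default=-1) + 1
    # Columns j and k conflict if some row contains both.
    conflicts = [set() for _ in range(ncols)]
    for row in sparsity:
        for j in row:
            conflicts[j] |= row - {j}
    colors = [-1] * ncols
    for j in range(ncols):
        taken = {colors[k] for k in conflicts[j] if colors[k] >= 0}
        c = 0
        while c in taken:   # smallest color unused by conflicting columns
            c += 1
        colors[j] = c
    return colors

# Tridiagonal 4x4 pattern: columns 0 and 3 never share a row, so the
# Jacobian needs only 3 AD passes instead of 4.
pattern = [{0, 1}, {0, 1, 2}, {1, 2, 3}, {2, 3}]
colors = greedy_column_coloring(pattern)
```

For wide banded Jacobians the saving grows with size: a tridiagonal matrix of any dimension needs only three passes.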


1/17 Diving into Automatic Sparse Differentiation on Hacker News! 🧵 Discussion highlights practical uses, math behind it, & tools. From basic explanations to advanced apps! #autodiff #sparse #math 🧮

A meme following the "They have played us for absolute fools" format.

It reads:
"
STOP DOING AUTODIFF
COMPUTER PROGRAMS WERE NOT TO BE DIFFERENTIATED!

HUNDREDS OF AD SYSTEMS and yet NO REAL-WORLD USE FOUND for anything more than applying the CHAINRULE on derivative functions written BY HAND.

Want to differentiate arbitrary functions anyway for a laugh? We had a tool for that, it was called FINITE DIFFERENCING!

"The pullback is product of the jacobian transpose with the sensitivity" — statements dreamed up by the utterly deranged.

(This is REAL AD, used in REAL AD systems)
"
Two pictures are then shown:

a category theory optic diagram labelled "?BINOCULARS?",
and a line graph showing subderivatives labelled "LINEY-BOY".

The text then concludes:
"
`gradient(x->println("$x apples"), 3)`
They have played us for absolute fools
"


#AutoDiff

Kernel Operations on GPU or CPU, with Autodiff, without Memory Overflows The KeOps library lets you compute generic reductions of very large arrays whose entries are given by a mathematical formula with CPU and GPU computing support. It combines a tiled reduction ...

package :https://www.kernel-operations.io/rkeops/
useR! video: www.youtube.com/watch?v=5DDd...
#rstats #kernels #gpu #autodiff
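The tiled-reduction idea behind KeOps can be sketched as follows (a hedged toy version in plain Python: real KeOps compiles symbolic formulas to GPU kernels, which this does not attempt). Entries are computed on the fly from a formula, so the full matrix is never materialized and memory stays bounded by the tile size.

```python
# Hedged sketch of a tiled reduction in the KeOps spirit: out[i] = sum_j K(i, j),
# where K(i, j) is given by a formula instead of a stored array.
import math

def tiled_reduction(formula, n, m, tile=64):
    """Accumulate out[i] = sum over j of formula(i, j), tile by tile over j."""
    out = [0.0] * n
    for j0 in range(0, m, tile):
        hi = min(j0 + tile, m)
        for i in range(n):
            # Only this tile's entries exist at any moment; K is never stored.
            out[i] += sum(formula(i, j) for j in range(j0, hi))
    return out

# Example entry formula: a Gaussian kernel between integer "points" i and j.
vals = tiled_reduction(lambda i, j: math.exp(-((i - j) ** 2)), 8, 8, tile=4)
```

The same loop structure is what makes kernel reductions on millions of points feasible: compute per tile, accumulate, discard.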

Gradients for everyone: a quick guide to autodiff in Julia | Dalle, Hill | JuliaCon 2024 (YouTube video by The Julia Programming Language)

Automatic differentiation in Julia, demystified! Whether you're building packages or using them, learn how to navigate Julia’s AD ecosystem and choose the right tools for gradients. youtu.be/ww3ntpyxNtI?... #JuliaLang #AutoDiff #MachineLearning #ScientificComputing


Today is the last day to submit a proposal for our #autodiff minisymposium at JuliaCon25! Hurry up!


A new series is up on LA for Programmers: Matrix Calculus and AutoDiff.

www.linearalgebraforprogrammers.com/series/matri...

#linearalgebra #math #autodiff


Hey friends, what's the status on #autodiff in #Rust? Any package I should keep an eye on?

Example of Slang kernel that uses autodiff, running in the browser to optimize a gaussian mixture to fit an image.


Among its many modern language features, #Slang provides #autodiff (forward and backward), which comes in very handy for #machine-learning applications! It also offers a #reflection API, which we use for code generation. 6/8
