
Posts by Samuel Ainsworth

Well we finally know what's harder... Gold medal @ IMO or a few hundred random GitHub issues

8 months ago 0 0 0 0

AMD gpu can't multiply matrices... I want my money back :/ youtube.com/shorts/Imj5j...

10 months ago 2 0 0 0

new conspiracy theory: language models write such verbose, spaghetti code bc they charge you per token

11 months ago 1 0 0 0
GitHub - samuela/torch2jax: Run PyTorch in JAX. 🤝

Code: github.com/samuela/torc...

Install with `pip install torch2jax` to get started!

1 year ago 1 0 0 0
Extending PyTorch - PyTorch 2.6 documentation

Adding support for random ops and batch norm required a near-complete rewrite of the library internals. v0.1.0 now makes extensive use of PyTorch Modes (pytorch.org/docs/stable/...), an underrated part of the PyTorch API IMHO.
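For a sense of what a PyTorch Mode buys you: a `TorchDispatchMode` subclass gets a callback for every ATen op executed under it, which is the kind of hook an interception library can build on. A minimal sketch (the mode class and counter here are illustrative, not torch2jax's actual internals):

```python
import torch
from torch.utils._python_dispatch import TorchDispatchMode


class CountingMode(TorchDispatchMode):
    """Counts every ATen op dispatched while the mode is active."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def __torch_dispatch__(self, func, types, args=(), kwargs=None):
        self.count += 1
        # Forward the op unchanged; a real interceptor could rewrite it here.
        return func(*args, **(kwargs or {}))


mode = CountingMode()
with mode:
    x = torch.ones(3)  # factory ops are dispatched too
    y = x + 1          # each tensor op routes through the mode
print(mode.count)
```

Because the hook sees ops rather than Python source, it covers random functions and buffer mutation uniformly, which is what makes this approach attractive for a tracing rewrite.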

Shout out to Nick Boyd who inspired me to undertake this rewrite!

1 year ago 1 0 1 0

As a reminder, torch2jax enables running PyTorch code in JAX. Mix-and-match PyTorch and JAX code with seamless, end-to-end autodiff, use JAX classics like jit, grad, and vmap on PyTorch code, and run PyTorch models on TPUs.
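The jit/grad/vmap workflow mentioned above looks like this in plain JAX; the function `f` below stands in for a converted PyTorch function (the actual conversion goes through torch2jax's own converter, per the repo README):

```python
import jax
import jax.numpy as jnp


def f(x):
    # Stand-in for a PyTorch function converted to JAX.
    return jnp.sum(x ** 2)


grad_f = jax.jit(jax.grad(f))   # compile the gradient function
batched = jax.vmap(grad_f)      # vectorize it over a leading batch axis

out = batched(jnp.ones((4, 3)))  # gradient of sum(x**2) is 2*x
print(out.shape)  # (4, 3)
```

Once a PyTorch function is exposed as a JAX-traceable callable, all three transformations compose on it exactly as they do here.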

See x.com/SamuelAinswo... for the initial project announcement.

1 year ago 0 0 1 0

I'm excited to announce the release of torch2jax v0.1.0!

Now with support for random functions like `torch.rand` and modules that use mutable buffers like BatchNorm 🎲

1 year ago 5 0 1 0

the wild thing, for me, is that i really do not think the lina khan era of tech regulation was anywhere close to reining in the free-for-all of consumer harm

1 year ago 725 32 16 4