[The 3D plot of ReLU above is from addxorrol.blogspot.com/2024/07/some... ]. Guido Montufar has an excellent paper showing that the number of linear regions of the piecewise-affine function defined this way can grow exponentially with the depth of the network: proceedings.neurips.cc/paper_files/... 3/3.
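To make that statement concrete, here is a minimal numpy sketch (mine, not from the thread or the paper): it counts distinct ReLU activation patterns along a line through the input space of a randomly initialized fully connected network, since each pattern identifies one linear region crossed by the line. The width, depth values, Gaussian initialization, and the chosen segment are all illustrative assumptions.

```python
# Count linear regions of a random ReLU MLP crossed by a 1D segment.
# Each distinct on/off pattern of the ReLU units corresponds to one
# linear piece of the network's piecewise-affine function.
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(depth, width, d_in=2):
    """Random Gaussian weights for a fully connected ReLU network (assumed setup)."""
    dims = [d_in] + [width] * depth
    return [(rng.standard_normal((dims[i + 1], dims[i])),
             rng.standard_normal(dims[i + 1])) for i in range(depth)]

def activation_pattern(layers, x):
    """Concatenated on/off pattern of every ReLU unit at input x."""
    pattern, h = [], x
    for W, b in layers:
        z = W @ h + b
        pattern.append(z > 0)
        h = np.maximum(z, 0.0)
    return tuple(np.concatenate(pattern))

# Sample points along a segment through the 2D input plane and count
# how many distinct activation patterns (i.e. linear regions) appear.
t = np.linspace(-3.0, 3.0, 5000)
segment = np.stack([t, 0.5 * t], axis=1)

for depth in (1, 2, 4, 8):
    layers = random_relu_net(depth=depth, width=8)
    n_regions = len({activation_pattern(layers, x) for x in segment})
    print(f"depth={depth}: ~{n_regions} linear regions crossed by the segment")
```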
Posts by Antoine Deleforge
I will use that for the deep learning class I teach. Feel free to reuse! In general, I love the "origami effect" explanation of why deep learning works well. It is particularly clear for ReLU non-linearities: you can view each layer as folding the input space in two. 2/n
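For the folding intuition, a tiny numpy sketch (mine, not from the post): a two-unit ReLU layer with weights +1 and -1, followed by a sum readout, computes |x|, which folds the real line in two; composing shifted folds maps exponentially many input intervals onto the same output range. The shift value 0.5 is just an illustrative choice.

```python
import numpy as np

def fold(x):
    # ReLU layer with weights [+1, -1] and readout [1, 1]: relu(x) + relu(-x) = |x|.
    return np.maximum(x, 0.0) + np.maximum(-x, 0.0)

x = np.linspace(-1.0, 1.0, 9)
print(fold(x))               # one fold: |x| maps [-1, 1] onto [0, 1] two-to-one
print(fold(fold(x) - 0.5))   # two folds: four input intervals land on [0, 0.5]
```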
I just had a lot of fun creating this animation, inspired by a quote from @fchollet.bsky.social's book "Deep Learning with Python". You can view the layers of a DNN as progressively unfolding the intertwined manifolds of the classes you want to separate, like two crumpled sheets of paper. 🧵 1/n
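As a rough illustration of that unfolding (my own sketch, not the animation's code), the PyTorch snippet below trains a small ReLU network on two intertwined spirals and then fits a linear probe on the raw inputs versus the last hidden layer: the classes are not linearly separable at the input, but they typically are after a few layers. The spiral generator, layer sizes, and probe are all illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

def two_spirals(n=1000, noise=0.05):
    """Two intertwined spiral classes in the plane (illustrative toy data)."""
    t = rng.uniform(0.5, 3.0 * np.pi, size=n)
    x0 = np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    X = np.concatenate([x0, -x0]) / (3.0 * np.pi)
    X += noise * rng.standard_normal(X.shape)
    y = np.concatenate([np.zeros(n), np.ones(n)])
    return torch.tensor(X, dtype=torch.float32), torch.tensor(y, dtype=torch.float32)

X, y = two_spirals()

# A small ReLU network: three hidden layers, then a linear classification head.
hidden = nn.Sequential(
    nn.Linear(2, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
)
head = nn.Linear(32, 1)
opt = torch.optim.Adam(list(hidden.parameters()) + list(head.parameters()), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    logits = head(hidden(X)).squeeze(1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, y)
    loss.backward()
    opt.step()

def linear_probe_accuracy(features, labels, steps=500):
    """Fit a fresh linear classifier on fixed features and report its accuracy."""
    probe = nn.Linear(features.shape[1], 1)
    probe_opt = torch.optim.Adam(probe.parameters(), lr=1e-1)
    for _ in range(steps):
        probe_opt.zero_grad()
        out = probe(features).squeeze(1)
        nn.functional.binary_cross_entropy_with_logits(out, labels).backward()
        probe_opt.step()
    preds = (probe(features).squeeze(1) > 0).float()
    return (preds == labels).float().mean().item()

with torch.no_grad():
    h = hidden(X)  # the "unfolded" representation at the last hidden layer

print("linear probe on raw inputs:       ", linear_probe_accuracy(X, y))
print("linear probe on last hidden layer:", linear_probe_accuracy(h, y))
```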
Hi Jonathan!
Hello there! Checking out whether the sky is bluer here.