srihari radhakrishna
A favorite sound
Is that of the ceiling fan
As it turned back on after a long power cut
On a humid summer night in Alleppey.
And the cartoon sounds from the TV
From the hallroom where my brother’s watching
As I wake up on Sunday morning.
There’s no school and mom’s home.
And the Azan from the adjacent mosque
At dawn when I’m down with the books
Letting me know there’s not been a riot or an earthquake
And everything’s still alright this new day.
i used to like twitter but i’m starting to feel differently because the algorithm incentivizes me to think about the same things as everyone else on there
i’ll never be able to think favorably about delivery apps like sw*ggy or z*pto, despite the convenience, because they grow when our cities antagonize people and segregate spaces based on class and wealth
Automatic differentiation, or autodiff, is a set of techniques for computing the gradient of a function using the chain rule of differentiation, and it sits at the core of deep learning frameworks like PyTorch, TensorFlow, and Theano. JAX is one such framework: it can perform autodiff on functions defined in native Python or NumPy code, and it provides other transformations that make gradient-based optimization easy and intuitive. This post attempts to understand the mechanism of autodiff while working with JAX.
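As a quick taste of what this looks like in practice, here is a minimal sketch (assuming only a standard JAX installation) of `jax.grad` turning an ordinary Python function into one that evaluates its gradient:

```python
# Minimal sketch of autodiff with jax.grad: grad transforms a plain
# Python/NumPy-style function into a new function that returns its gradient.
import jax
import jax.numpy as jnp

def f(x):
    # f(x) = x * sin(x); by hand, f'(x) = sin(x) + x * cos(x)
    return x * jnp.sin(x)

df = jax.grad(f)   # df is itself an ordinary Python function
print(df(1.0))     # ~1.3818, i.e. sin(1) + 1 * cos(1)
```

The returned function `df` can be composed with JAX's other transformations (such as `jit` or `vmap`) just like any hand-written function, which is what the rest of the post digs into.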
I won’t pretend this post isn’t a ruse to test features of this new blog, but the method I detail below has served me well.
I’ve finally made a few changes I’d been putting off, and the website looks good to go.
From r/WritingPrompts: You’re beginning to grow suspicious your longtime girlfriend of ten years is actually just Shaquille O’Neal in a wig.