Differentiable Programming in Kotlin

Over the last few years, several frameworks have been developed to support differentiability. The most popular are PyTorch, TensorFlow, and JAX, all of which are built on Python. These frameworks are oriented towards machine learning, which involves building a model, performing batched computations on large tensors, and then using backpropagation to make incremental corrections to that model. These frameworks have been optimized for that purpose.

There are many other uses for differentiability which don't fit that paradigm as well. Physics simulations, computer graphics, and some novel machine learning techniques also need differentiability but have additional requirements. In particular, along with differentiability, fast bespoke calculations are often needed. This is difficult to do in Python, whose performance is usually an order of magnitude slower than that of C or Java.

At Facebook, we are developing a new differentiable programming framework for Kotlin. While we provide a library to support typical machine learning use cases, we have taken a compiler-aware approach. As a result, we can detect tensor shape errors at compile time and also benefit from compile-time optimizations. We also support differentiability on sparse tensors, which are gaining popularity as the data being collected continues to grow at a staggering pace.
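To make the idea of differentiable programming concrete, here is a minimal sketch of forward-mode automatic differentiation using dual numbers in Kotlin. This is not the framework's actual API; the `Dual` class and `derivativeAt` function are hypothetical names used purely for illustration.

```kotlin
// A minimal sketch of forward-mode automatic differentiation via dual
// numbers. Hypothetical illustration only -- not the framework's API.
data class Dual(val value: Double, val derivative: Double) {
    // Sum rule: (f + g)' = f' + g'
    operator fun plus(other: Dual) =
        Dual(value + other.value, derivative + other.derivative)

    // Product rule: (f * g)' = f' * g + f * g'
    operator fun times(other: Dual) =
        Dual(value * other.value,
             derivative * other.value + value * other.derivative)
}

// Differentiate f at x by seeding the derivative component with 1.0.
fun derivativeAt(x: Double, f: (Dual) -> Dual): Double =
    f(Dual(x, 1.0)).derivative

fun main() {
    // f(x) = x * x + x, so f'(x) = 2x + 1 and f'(3) = 7
    println(derivativeAt(3.0) { x -> x * x + x }) // prints 7.0
}
```

Because the derivative is propagated through ordinary operator overloads, the same user code computes both the value and its derivative, with no separate symbolic step; a compiler-aware framework can go further and generate these derivative computations ahead of time.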



Wednesday May 19 / 09:10AM EDT (40 minutes)

TRACK Modern CS in the Real World TOPICS Applied Computer Science