Code
Folx - Forward Laplacian for JAX
Nicholas Gao, Jonas Köhler, Adam Foster.
We (primarily Nick Gao!) wrote a Python module that implements the forward Laplacian from LapNet as a custom interpreter for JAX's jaxprs.
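A rough sketch of the idea: the naive Laplacian is the trace of the full Hessian, whereas the forward Laplacian propagates value, Jacobian and Laplacian together in a single forward pass. The plain-JAX baseline below is exact; the folx call in the comments is illustrative of the intended usage rather than a verbatim excerpt.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Toy scalar-valued function; its Laplacian is the sum of second derivatives.
    return jnp.sum(jnp.sin(x))

# Naive Laplacian: build the full Hessian and take its trace (O(d^2) memory).
naive_laplacian = lambda x: jnp.trace(jax.hessian(f)(x))

x = jnp.arange(3.0)
print(naive_laplacian(x))  # -(sin(0) + sin(1) + sin(2))

# Forward-Laplacian style usage with folx (illustrative; see the folx README):
# from folx import forward_laplacian
# result = forward_laplacian(f)(x)
# result.x, result.jacobian, result.laplacian
```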
PyTorch SimCLR
This is a full PyTorch implementation of the paper A Simple Framework for Contrastive Learning of Visual Representations (SimCLR). The focus of this repository is to accurately reproduce the paper's results in PyTorch, using the original paper and the official TensorFlow repo as our sources.
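At the heart of SimCLR is the NT-Xent (normalized temperature-scaled cross entropy) loss over two augmented views of each image. A self-contained sketch of that loss in PyTorch (a simplified illustration, not the repository's exact code):

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    # z1, z2: [N, D] projection-head outputs for two augmented views of N images.
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # [2N, D], unit norm
    sim = z @ z.t() / temperature                         # [2N, 2N] scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # exclude self-similarity
    # The positive for view i is the other view of the same image: i + N (or i - N).
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```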
Pyro Optimal Experiment Design
I am the primary author of Pyro's support for optimal experimental design (OED). For any model written in Pyro, you can estimate the Expected Information Gain (EIG) of a candidate design using a number of estimators, including Nested Monte Carlo, the Laplace approximation, Donsker-Varadhan (a.k.a. MINE) and LFIRE. We also include the four key estimators introduced in our paper Variational Bayesian Optimal Experimental Design: the posterior, marginal, marginal + likelihood and Variational NMC estimators. To get to grips with using these estimators in an adaptive experimentation loop, we provide two tutorials: one on an adaptive psychology experiment studying working memory, and one on predicting the outcome of a US presidential election with an OED-driven polling strategy.
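As a taste of the API, here is a rough sketch of estimating the EIG of a toy linear model with the Nested Monte Carlo estimator (the shapes, argument names and the `nmc_eig` call follow my reading of the Pyro OED docs; treat it as illustrative rather than a copy of the tutorials):

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.contrib.oed.eig import nmc_eig

def model(design):
    # Batch dimensions of `design` index candidate designs; plate_stack broadcasts over them.
    with pyro.plate_stack("plates", design.shape):
        theta = pyro.sample("theta", dist.Normal(0.0, 1.0))
        return pyro.sample("y", dist.Normal(theta * design, 1.0))

candidate_designs = torch.linspace(-2.0, 2.0, 5)
eig = nmc_eig(model, candidate_designs, observation_labels=["y"],
              target_labels=["theta"], N=500, M=50)
print(eig)  # one EIG estimate per candidate design
```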
Datasketch
I contribute to the open-source machine learning library Datasketch. Datasketch implements probabilistic data structures such as MinHash, together with Locality Sensitive Hashing (LSH) for sublinear-time search over them. I contributed support for a Redis storage layer, allowing Datasketch to be deployed at scale.
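A short sketch of the workflow this enables: build MinHash signatures, index them in an LSH structure backed by Redis, and query for near-duplicates (the `storage_config` keys follow the Datasketch documentation; adjust for your Redis deployment):

```python
from datasketch import MinHash, MinHashLSH

def minhash(tokens, num_perm=128):
    # Build a MinHash signature from a sequence of string tokens.
    m = MinHash(num_perm=num_perm)
    for token in tokens:
        m.update(token.encode("utf8"))
    return m

# LSH index with Jaccard threshold 0.5, persisted in Redis rather than in memory.
lsh = MinHashLSH(
    threshold=0.5,
    num_perm=128,
    storage_config={"type": "redis", "redis": {"host": "localhost", "port": 6379}},
)
lsh.insert("doc1", minhash("the quick brown fox".split()))
print(lsh.query(minhash("the quick brown dog".split())))  # ["doc1"] if similarity clears the threshold
```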
RDBGenerate
I wrote a utility for generating Redis dump (.rdb) files directly from native Python objects.
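A hypothetical usage sketch (the `rdb_generate` name and its keyword arguments are illustrative only, not the utility's documented API):

```python
# Hypothetical example: write a dump.rdb containing one string and one set in database 0.
from rdbgenerate import rdb_generate  # illustrative import

rdb_generate(
    "dump.rdb",
    db0={b"greeting": b"hello", b"tags": {b"python", b"redis"}},
)
```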