Last update: December 2022.
Lightweight normalizing flows for generative modeling in PyTorch.
A normalizing flow models an observed variable $x$ as an invertible, differentiable transformation $f$ of a latent variable $z$,

$$x = f(z),$$

where $z \sim p(z)$. By the change-of-variables formula, the density of $x$ is

$$p(x) = p(z) \left| \det \frac{\partial f}{\partial z} \right|^{-1}.$$

Here the Jacobian determinant accounts for the change in volume under $f$. We typically choose a simple distribution over the latent space, e.g. $z \sim \mathcal{N}(0, I)$.

Suppose we compose functions $f = f_K \circ \dots \circ f_1$, with $z_0 = z$ and $z_k = f_k(z_{k-1})$. Then the log-likelihood decomposes into a sum over layers:

$$\log p(x) = \log p(z) - \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|.$$

Sampling can be done easily, as long as each $f_k$ is cheap to evaluate: draw $z \sim p(z)$ and push it forward through the composition. Density evaluation additionally requires each inverse $f_k^{-1}$ and each log-determinant to be tractable.
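As a concrete sketch of this setup (hypothetical class and method names, not necessarily this repo's exact API), each bijection can expose `forward(z)` and `inverse(x)` methods that return the transformed tensor together with the log-determinant of that direction, so the composition just sums the per-layer terms:

```python
import torch
import torch.nn as nn


class NormalizingFlowModel(nn.Module):
    """Composes bijections f_1, ..., f_K over a simple prior p(z)."""

    def __init__(self, prior, flows):
        super().__init__()
        self.prior = prior                 # e.g. MultivariateNormal(0, I)
        self.flows = nn.ModuleList(flows)

    def log_prob(self, x):
        # Density direction: map x -> z, accumulating log |det| terms.
        log_det = torch.zeros(x.shape[0], device=x.device)
        for flow in reversed(self.flows):
            x, ld = flow.inverse(x)
            log_det = log_det + ld
        return self.prior.log_prob(x) + log_det

    def sample(self, n):
        # Sampling direction: draw z ~ p(z), push it forward through f.
        z = self.prior.sample((n,))
        for flow in self.flows:
            z, _ = flow.forward(z)
        return z
```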
Planar and radial flows [1]. The planar flow applies

$$f(z) = z + u \, h(w^\top z + b),$$

where $h$ is a smooth nonlinearity such as $\tanh$; the radial flow instead perturbs $z$ around a reference point. Note these have no algebraic inverse, so they support sampling but not closed-form density evaluation of arbitrary data points.
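A minimal sketch of the planar case under the formula above (the initialization scale is arbitrary, and the constraint on $u$ from [1] that guarantees invertibility is omitted):

```python
import torch
import torch.nn as nn


class PlanarFlow(nn.Module):
    """f(z) = z + u * tanh(w^T z + b); no algebraic inverse exists."""

    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        a = z @ self.w + self.b                        # (batch,)
        x = z + self.u * torch.tanh(a).unsqueeze(-1)
        # |det J| = |1 + h'(a) u^T w|, with h'(a) = 1 - tanh(a)^2.
        psi = 1 - torch.tanh(a) ** 2
        log_det = torch.log(torch.abs(1 + psi * (self.u @ self.w)) + 1e-8)
        return x, log_det
```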
Real NVP [2]. Partition the vector $z = (z_{1:d}, z_{d+1:D})$ and transform one half conditioned on the other:

$$x_{1:d} = z_{1:d}, \qquad x_{d+1:D} = z_{d+1:D} \odot \exp\big(s(z_{1:d})\big) + t(z_{1:d}).$$

Here the diagonal of the Jacobian is simply $\big(1, \dots, 1, \exp(s(z_{1:d}))\big)$, so the log-determinant is $\sum_j s(z_{1:d})_j$, and both directions require only a forward pass through $s$ and $t$.
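A sketch of one such affine coupling layer (the even split and the hidden width are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn


class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        # A single network outputs both log-scale s(.) and shift t(.).
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, z):
        z1, z2 = z[:, :self.d], z[:, self.d:]
        s, t = self.net(z1).chunk(2, dim=-1)
        x2 = z2 * torch.exp(s) + t
        # Triangular Jacobian: log-det is just the sum of s over dimensions.
        return torch.cat([z1, x2], dim=-1), s.sum(dim=-1)

    def inverse(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=-1)
        z2 = (x2 - t) * torch.exp(-s)
        return torch.cat([x1, z2], dim=-1), -s.sum(dim=-1)
```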
Invertible 1x1 Convolution [3]. A learned linear map $f(z) = Wz$ with $W$ invertible, so that $\log |\det \partial f / \partial z| = \log |\det W|$. Use an LU decomposition $W = PL(U + \mathrm{diag}(s))$ for computational efficiency: the log-determinant reduces to $\sum_i \log |s_i|$.
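A sketch of the LU parameterization for flat vectors (Glow applies it across the channels of an image; `torch.linalg.lu` requires PyTorch ≥ 1.13, and initializing $W$ to a random rotation follows [3]):

```python
import torch
import torch.nn as nn


class Invertible1x1(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Initialize W to a random rotation, then factor W = P L U once.
        W = torch.linalg.qr(torch.randn(dim, dim)).Q
        P, L, U = torch.linalg.lu(W)
        self.register_buffer("P", P)             # fixed permutation
        self.L = nn.Parameter(L)                 # lower-triangular part
        self.U = nn.Parameter(U.triu(1))         # strictly upper part
        self.log_s = nn.Parameter(U.diagonal().abs().log())
        self.register_buffer("sign_s", U.diagonal().sign())

    def _assemble(self):
        dim = self.L.shape[0]
        L = torch.tril(self.L, -1) + torch.eye(dim, device=self.L.device)
        U = torch.triu(self.U, 1) + torch.diag(self.sign_s * self.log_s.exp())
        return self.P @ L @ U

    def forward(self, z):
        # P and unit-diagonal L contribute nothing: log|det W| = sum log|s|.
        return z @ self._assemble().T, self.log_s.sum().expand(z.shape[0])

    def inverse(self, x):
        z = torch.linalg.solve(self._assemble(), x.T).T
        return z, -self.log_s.sum().expand(x.shape[0])
```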
ActNorm [3]. Even more straightforward: an elementwise affine map $f(z) = z \odot s + t$ with a data-dependent initialization, whose log-determinant is $\sum_i \log |s_i|$.
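A sketch, with the data-dependent initialization from [3] performed on the first batch seen in the density direction, so the initial outputs have zero mean and unit variance per dimension:

```python
import torch
import torch.nn as nn


class ActNorm(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.log_s = nn.Parameter(torch.zeros(dim))
        self.t = nn.Parameter(torch.zeros(dim))
        self.initialized = False

    def inverse(self, x):
        if not self.initialized:  # data-dependent init on the first batch
            with torch.no_grad():
                self.log_s.data = x.std(dim=0).log()
                self.t.data = x.mean(dim=0)
            self.initialized = True
        z = (x - self.t) * torch.exp(-self.log_s)
        return z, -self.log_s.sum().expand(x.shape[0])

    def forward(self, z):
        x = z * torch.exp(self.log_s) + self.t
        return x, self.log_s.sum().expand(z.shape[0])
```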
Masked Autoregressive Flow [4]. For each dimension of $x$,

$$x_i = z_i \exp(\alpha_i) + \mu_i, \qquad (\mu_i, \alpha_i) = g_i(x_{1:i-1}),$$

where the conditioners $g_i$ are computed in a single pass by a masked network (MADE). Here the diagonal of the Jacobian is $\exp(\alpha_i)$, so the log-determinant is $\sum_i \alpha_i$; density evaluation is parallel across dimensions, while sampling must proceed sequentially.
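A sketch that keeps the autoregressive structure but uses a single masked linear conditioner (a hypothetical simplification; the actual MAF [4] uses a deeper MADE network for $\mu$ and $\alpha$). Zero initialization makes the layer start at the identity:

```python
import torch
import torch.nn as nn


class MAFLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.dim = dim
        self.W_mu = nn.Parameter(torch.zeros(dim, dim))
        self.W_alpha = nn.Parameter(torch.zeros(dim, dim))
        self.b_mu = nn.Parameter(torch.zeros(dim))
        self.b_alpha = nn.Parameter(torch.zeros(dim))
        # Strictly lower-triangular mask: output i sees only inputs 1..i-1.
        self.register_buffer("mask", torch.tril(torch.ones(dim, dim), -1))

    def _params(self, x):
        mu = x @ (self.W_mu * self.mask).T + self.b_mu
        alpha = x @ (self.W_alpha * self.mask).T + self.b_alpha
        return mu, alpha

    def inverse(self, x):
        # Density direction: all dimensions in parallel, one pass.
        mu, alpha = self._params(x)
        z = (x - mu) * torch.exp(-alpha)
        return z, -alpha.sum(dim=-1)

    def forward(self, z):
        # Sampling direction is sequential: x_i needs x_{1:i-1} first.
        # (Typically run under torch.no_grad().)
        x = torch.zeros_like(z)
        for i in range(self.dim):
            mu, alpha = self._params(x)
            x[:, i] = z[:, i] * torch.exp(alpha[:, i]) + mu[:, i]
        return x, alpha.sum(dim=-1)
```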
Neural Spline Flow [5]. Replaces the affine transformation with a monotonic rational-quadratic spline, which is far more expressive while remaining invertible. Two versions: autoregressive and coupling.
Below we show examples (in 1D and 2D) of transforming a mixture of Gaussians into a unit Gaussian.
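An end-to-end usage sketch in that spirit, reusing the hypothetical `NormalizingFlowModel` and `AffineCoupling` classes from the sketches above (the `Flip` permutation between couplings is needed so both halves of the vector get transformed; all hyperparameters are illustrative):

```python
import torch
from torch.distributions import MultivariateNormal


class Flip(torch.nn.Module):
    """Reverse coordinate order between couplings; log-det is zero."""
    def forward(self, z):
        return z.flip(-1), torch.zeros(z.shape[0], device=z.device)
    def inverse(self, x):
        return x.flip(-1), torch.zeros(x.shape[0], device=x.device)


# Toy data: mixture of two Gaussians in 2D.
x = torch.cat([
    torch.randn(512, 2) * 0.5 + torch.tensor([2.0, 0.0]),
    torch.randn(512, 2) * 0.5 - torch.tensor([2.0, 0.0]),
])

prior = MultivariateNormal(torch.zeros(2), torch.eye(2))
flows = [AffineCoupling(2), Flip(), AffineCoupling(2), Flip(), AffineCoupling(2)]
model = NormalizingFlowModel(prior, flows)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    optim.zero_grad()
    loss = -model.log_prob(x).mean()   # maximize log-likelihood
    loss.backward()
    optim.step()
```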
[1] Rezende, D.J., and Mohamed, S. (2015). Variational Inference with Normalizing Flows. In Proceedings of the 32nd International Conference on Machine Learning, pp. 1530–1538.
[2] Dinh, L., Krueger, D., and Bengio, Y. (2014). NICE: Non-linear Independent Components Estimation. arXiv:1410.8516.
[3] Kingma, D.P., and Dhariwal, P. (2018). Glow: Generative Flow with Invertible 1x1 Convolutions. In Advances in Neural Information Processing Systems 31, pp. 10215–10224.
[4] Papamakarios, G., Pavlakou, T., and Murray, I. (2017). Masked Autoregressive Flow for Density Estimation. In Advances in Neural Information Processing Systems 30, pp. 2338–2347.
[5] Durkan, C., Bekasov, A., Murray, I., and Papamakarios, G. (2019). Neural Spline Flows. In Advances in Neural Information Processing Systems 32.
This code is available under the MIT License.