Generative models such as Generative Adversarial Nets (GANs), Variational Autoencoders, and Normalizing Flows have been very successful in the unsupervised learning task of generating samples from a high-dimensional probability distribution. However, the task of conditioning a high-dimensional distribution on limited empirical samples has attracted less attention in the literature, even though it is a central problem in Bayesian inference and supervised learning. In this talk we discuss some ideas in this direction by viewing generative modelling as a measure transport problem. In particular, we present a simple recipe using block-triangular maps and monotonicity constraints that enables standard models, such as the original GAN, to perform conditional sampling. We demonstrate the effectiveness of our method on examples ranging from synthetic test sets to image inpainting and function-space inference in porous medium flow.
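To give a flavour of the block-triangular idea, the following is a minimal sketch (not the talk's implementation, where the map would be a neural network trained adversarially): a block-triangular map T(x, z) = (x, T₂(x, z)) that is monotone increasing in the latent variable z pushes reference noise to the conditional distribution. For a bivariate standard Gaussian with correlation rho, the map T₂(x, z) = rho·x + √(1 − rho²)·z is known in closed form, so the construction can be checked directly.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8  # correlation of the illustrative bivariate Gaussian target

def block_triangular_map(x, z):
    """Map (x, z) -> (x, T2(x, z)); T2 is monotone increasing in z
    since its coefficient sqrt(1 - rho^2) is positive."""
    return x, rho * x + np.sqrt(1.0 - rho**2) * z

# Conditional sampling: fix the observed conditioning value x and
# push latent Gaussian noise z through the second block of the map.
x_obs = 1.0
z = rng.standard_normal(100_000)
_, y = block_triangular_map(x_obs, z)

# The pushforward samples follow the true conditional N(rho * x_obs, 1 - rho^2).
print(y.mean())  # close to 0.8
print(y.var())   # close to 0.36
```

In the learned setting, T₂ is replaced by a generator network conditioned on x, and the monotonicity constraint in z plays the role that the closed-form positive coefficient plays here.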