Women in Optimal Transport

Women in Optimal Transport 2024 - UBC

Organizers

  • Inwon Kim (University of California, Los Angeles, Department of Mathematics)
  • Katy Craig (University of California, Santa Barbara, Department of Mathematics)
  • Yunan Yang (Cornell University, Department of Mathematics)
  • Li Wang (University of Minnesota, Department of Mathematics)
  • Yanqin Fan (University of Washington, Department of Economics)

Confirmed Speakers

  • Jingwei Hu (University of Washington, Department of Applied Mathematics)
  • Olga Turanova (Michigan State University, Department of Mathematics)
  • Ruiyu Han (Carnegie Mellon, Mathematical Sciences)
  • Caroline Moosmuller (University of North Carolina, Mathematics)
  • Christina Frederick (New Jersey Institute of Technology, Mathematical Sciences)
  • Elisa Negrini (University of California, Los Angeles, Department of Mathematics)
  • Silvana Pesenti (University of Toronto, Department of Statistical Sciences)
  • Xiaohong Chen (Yale University, Department of Economics)
  • Ruidi Chen (Microsoft)
  • Yuan Gao (Purdue University, Department of Mathematics)
  • Siting Liu (University of California, Los Angeles, Department of Mathematics)
  • Jiao Jiao Fan (Georgia Institute of Technology, Aerospace Engineering)
  • Maria Oprea (Cornell University, Mathematics)
  • Yao Xie (Georgia Institute of Technology, H. Milton Stewart School of Industrial and Systems Engineering (ISyE))
  • Jiajin Li (Stanford University, Department of Management Science and Engineering (MS&E))

Workshop Schedule

Wednesday

  • 8:30-9:15: Breakfast
  • 9:15-9:30: Welcome
  • 9:30-10:30: An introduction to Optimal Transport
    Caroline Moosmuller (University of North Carolina)
    This talk provides an overview of optimal transport intended to establish a foundation for the conference. Key topics include the Monge, Kantorovich, and Benamou-Brenier formulations, Wasserstein distances, and linearized optimal transport. We will also explore domains that have significantly benefited from optimal-transport tools, including data science, machine learning, and biology, while pointing to current research directions in these fields.
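    For readers new to the area, the formulations named above can be summarized as follows (standard notation, not necessarily the speaker's): given probability measures $\mu,\nu$ and a cost $c(x,y)$,
    $$\text{(Monge)}\quad \inf_{T:\,T_\#\mu=\nu}\int c(x,T(x))\,d\mu(x), \qquad \text{(Kantorovich)}\quad \inf_{\pi\in\Pi(\mu,\nu)}\int c(x,y)\,d\pi(x,y),$$
    where $\Pi(\mu,\nu)$ denotes the set of couplings with marginals $\mu$ and $\nu$. With $c(x,y)=|x-y|^p$, the $p$-th root of the optimal Kantorovich value defines the Wasserstein distance $W_p(\mu,\nu)$; the Benamou-Brenier formulation recasts $W_2$ dynamically as an action minimization over curves of measures satisfying the continuity equation.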
  • 10:30-11:00: Coffee Break
  • 11:00-12:00: Gradient flows and PDEs
    Olga Turanova (Michigan State University)
    In the first half of the talk I will provide a very brief introduction to gradient flows on the space of probability measures, with an emphasis on the connection to partial differential equations. This will then be used in the second half of the talk, when I present some recent work (joint with Craig, Elamvazhuthi, and Haberland) on a deterministic particle approximation of the inhomogeneous porous medium equation.
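    As a schematic pointer to the connection described above (our notation; standard textbook case only): a curve of measures $\rho_t$ is a 2-Wasserstein gradient flow of an energy $\mathcal{E}$ when $\partial_t\rho = \nabla\cdot\big(\rho\,\nabla\frac{\delta\mathcal{E}}{\delta\rho}\big)$. For example, the homogeneous porous medium equation $\partial_t\rho = \Delta\rho^m$ arises from $\mathcal{E}(\rho)=\frac{1}{m-1}\int\rho^m\,dx$; the inhomogeneous variant in the talk modifies this basic setup.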
  • 12:00-1:40: Lunch
  • 1:40-2:00: Group Photo
  • 2:00-3:00: Discussion: What should be the future goals for Women in OT?
  • 3:00-3:30: Trade-off among Infeasibility, Efficiency and Accuracy for Gromov-Wasserstein Computation
    Jiajin Li (Stanford University)
    In this talk, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance, tailored to large-scale graph learning tasks. Armed with the Luo-Tseng error bound condition, the two proposed algorithms, Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman Proximal Gradient (hBPG), enjoy convergence guarantees. Depending on task-specific properties, our analysis further provides novel theoretical insights into how to select the best-fit method. As a result, we are able to provide comprehensive experiments validating the effectiveness of our methods on a host of tasks, including graph alignment, graph partition, and shape matching. In terms of both wall-clock time and modeling performance, the proposed methods achieve state-of-the-art results.
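    For context, the squared-loss Gromov-Wasserstein distance between metric measure spaces $(X,d_X,\mu)$ and $(Y,d_Y,\nu)$ is, in standard notation,
    $$\mathrm{GW}^2(\mu,\nu)=\min_{\pi\in\Pi(\mu,\nu)}\iint \big(d_X(x,x')-d_Y(y,y')\big)^2\,d\pi(x,y)\,d\pi(x',y'),$$
    a nonconvex quadratic program in the coupling $\pi$, which is what makes efficient algorithms such as BAPG and hBPG nontrivial to design and analyze.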
  • 3:30-4:00: Coffee
  • 4:00-5:00: (Penalized) Sieve Estimation and Inference on Semi-nonparametric Models: a Brief Overview
    Xiaohong Chen (Yale University)
    TBD
  • 6:00: Conference Dinner

Thursday

  • 8:30-9:00: Breakfast
  • 9:00-9:30: Wasserstein gradient flows in inhomogeneous media: convergence and the effective Wasserstein metric
    Yuan Gao (Purdue University)
    The Fokker-Planck equation with rapidly oscillating coefficients can be regarded as a gradient flow in a Wasserstein space with an inhomogeneous dissipation metric and an oscillating free energy. We will use an evolutionary Gamma-convergence approach to obtain the homogenized dynamics, which preserves the gradient flow structure in a limiting homogenized Wasserstein space. We will also discuss the comparison between the limiting Wasserstein distance induced by the gradient flow and the direct Gromov-Hausdorff limit of the Wasserstein distances.
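    Schematically (our notation; the talk's precise setting may differ), the $\varepsilon$-problem is a Fokker-Planck equation of the form
    $$\partial_t\rho^\varepsilon = \nabla\cdot\Big(a(x/\varepsilon)\big(\nabla\rho^\varepsilon+\rho^\varepsilon\nabla V\big)\Big),$$
    the gradient flow of a free energy $\mathcal{F}(\rho)=\int\rho\log\rho\,dx+\int V\rho\,dx$ with respect to a Wasserstein metric weighted by the oscillatory coefficient $a(x/\varepsilon)$; homogenization replaces $a$ by an effective tensor, and the free energy itself may also carry oscillations.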
  • 9:30-10:00: Improving Autoencoder Image Interpolation via Dynamic Optimal Transport
    Xue Feng (University of California, Davis)
    This work integrates dynamic optimal transport with autoencoders to improve their generative ability under data limitations. By viewing image interpolation as a mass transfer problem, we introduce a novel regularization term for the autoencoder's loss function based on dynamic OT, encouraging the output to follow geodesic paths in the $L^2$ Wasserstein space. This method not only enhances the semantic meaningfulness of the autoencoder's output but also adapts to complex cases in which the environment contains obstacles or the mass transfer is unbalanced. The application to signal recovery is left as future work.
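    For reference, the dynamic formulation underlying this regularizer is the Benamou-Brenier problem (standard notation, our sketch):
    $$W_2^2(\mu,\nu)=\min\Big\{\int_0^1\!\!\int \rho_t(x)\,|v_t(x)|^2\,dx\,dt \;:\; \partial_t\rho+\nabla\cdot(\rho v)=0,\ \rho_0=\mu,\ \rho_1=\nu\Big\},$$
    whose minimizers are exactly the Wasserstein geodesics that the autoencoder output is encouraged to follow.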
    Sampling via Nonlinear Diffusion Equations
    Claire Murphy (University of California, Santa Barbara)
    Given a target probability measure, a fundamental problem is to approximate it with samples; that is, to create empirical measures that converge to the target measure. Classically, the method of Langevin dynamics provides a stochastic differential equation for evolving the particles in an empirical measure to this target measure, at least when the target measure is log-concave. In this talk, I will introduce a new approach, based on nonlinear diffusion, that allows us to consider a broader class of target probability measures via the generalized Fokker-Planck equation.
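    As a reminder of the classical baseline mentioned above (standard notation): overdamped Langevin dynamics $dX_t=-\nabla V(X_t)\,dt+\sqrt{2}\,dW_t$ has stationary density proportional to $e^{-V}$, and the law of $X_t$ solves the linear Fokker-Planck equation $\partial_t\rho=\nabla\cdot(\rho\nabla V)+\Delta\rho$; the nonlinear-diffusion approach of the talk replaces this linear equation by a generalized, nonlinear one.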
    Probabilistic Takens Embedding through the Wasserstein Tangent Space
    Maria Oprea (Cornell University)
    In this work, we generalize the Takens embedding theorem to the Eulerian framework by considering an embedding between Wasserstein spaces. We show that the classical delay embedding map, viewed as a push-forward map, provides an embedding between Wasserstein spaces. We present theoretical guarantees for reconstructing the attractor from noisy data and when the dynamics are inherently stochastic. Moreover, the weaker condition we impose when learning the delay embedding map helps improve the algorithm's stability.
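    For orientation (standard notation, not the speaker's): the classical delay map built from an observable $h$ and dynamics $\varphi$ is $\Phi(y)=\big(h(y),h(\varphi(y)),\dots,h(\varphi^{k-1}(y))\big)$ for a suitable number of delays $k$; the Eulerian generalization described above lifts $\Phi$ to measures via the push-forward $\mu\mapsto\Phi_\#\mu$, viewed as a map between Wasserstein spaces.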
  • 10:00-10:30: Applications of no-collision transportation maps in manifold learning
    Elisa Negrini (University of California, Los Angeles)
    We investigate applications of no-collision transportation maps introduced by Nurbekyan et al. in 2020 in manifold learning for image data. Recently, there has been a surge in applying transportation-based distances and features for data representing motion-like or deformation-like phenomena. Indeed, comparing intensities at fixed locations often does not reveal the data structure. No-collision maps and distances developed in [L. Nurbekyan, A. Iannantuono, and A. M. Oberman, J. Sci. Comput., 82 (2020), 45] are sensitive to geometric features similar to optimal transportation (OT) maps but much cheaper to compute due to the absence of optimization. In this work, we prove that no-collision distances provide an isometry between translations (respectively, dilations) of a single probability measure and the translation (respectively, dilation) vectors equipped with a Euclidean distance. Furthermore, we prove that no-collision transportation maps, as well as OT and linearized OT maps, do not in general provide an isometry for rotations. The numerical experiments confirm our theoretical findings and show that no-collision distances achieve similar or better performance on several manifold learning tasks compared to other OT and Euclidean-based methods at a fraction of the computational cost.
  • 10:30-11:00: Coffee Break
  • 11:00-11:30: Score-Based Generative Models through the Lens of Wasserstein Proximal Operators
    Siting Liu (University of California, Los Angeles)
    In this presentation, I will explore how score-based generative models (SGMs) function as the Wasserstein proximal operator (WPO) of cross-entropy. This connection is clarified through the lens of mean-field games (MFGs). Moreover, by exploiting this mathematical structure, we present an interpretable kernel-based model for the score function. This model significantly improves the efficiency of SGMs by reducing the number of training samples needed and shortening the training time. Additionally, the kernel-based approach, together with the terminal condition of the MFG, offers new insights into the manifold learning and generalization properties of SGMs, and suggests a remedy for their memorization effects.
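    Background, in the notation of the score-based diffusion literature (our sketch, not the talk's slides): samples are generated by simulating the reverse-time SDE
    $$dX_t=\big[f(X_t,t)-g(t)^2\,\nabla_x\log p_t(X_t)\big]\,dt+g(t)\,d\bar{W}_t,$$
    where the score $\nabla_x\log p_t$ is the object the kernel-based model represents; the identification of this generation step with a Wasserstein proximal operator of cross-entropy is the subject of the talk.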
  • 11:30-12:00: HV geometry for signal processing
    Ruiyu Han (Carnegie Mellon)
    I will first introduce a Riemannian geometry on the space of signals that allows both horizontal and vertical deformations, and then present a numerical scheme to compute the induced geodesics.
  • 12:00-1:00: Lunch

Friday

  • 8:30-9:30: Breakfast
  • 9:30-10:00: Structure-Preserving Particle Method for the Vlasov-Maxwell-Landau Equation
    Jingwei Hu (University of Washington)
    The Vlasov-Maxwell-Landau equation is often regarded as the first-principles physics model for plasmas. In this talk, we introduce a novel particle method for this equation that collectively models particle transport, electromagnetic field effects, and Coulomb collisions. The method arises from a regularization of the variational formulation of the Landau collision operator, leading to a discretization of the operator that conserves mass, momentum, and energy, and dissipates entropy. The collisional effects appear as a fully deterministic effective force, which can be naturally coupled with the classical particle-in-cell (PIC) method. We validate the method on several plasma benchmark tests, including collisional Landau damping, the two-stream instability, and the Weibel instability.
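    For reference, in standard notation the Landau collision operator reads
    $$Q(f,f)(v)=\nabla_v\cdot\int A(v-v_*)\,\big(f(v_*)\,\nabla_v f(v)-f(v)\,\nabla_{v_*}f(v_*)\big)\,dv_*, \qquad A(z)\propto |z|^{\gamma+2}\Big(I-\frac{z\otimes z}{|z|^2}\Big);$$
    the conservation of mass, momentum, and energy and the entropy dissipation mentioned above follow from this symmetric divergence structure, and the particle method discretizes a regularization of it.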
  • 10:00-10:30: Optimal Transport Divergences induced by Scoring Functions
    Silvana Pesenti (University of Toronto)
    We employ scoring functions, used in statistics for eliciting risk functionals, as cost functions in the Monge-Kantorovich (MK) optimal transport problem. This gives rise to a rich variety of novel asymmetric MK divergences, which subsume the family of Bregman-Wasserstein divergences. The new MK divergences, which can be efficiently calculated, open an array of applications in robust stochastic optimisation: we derive sharp bounds on distortion risk measures under a Bregman-Wasserstein divergence constraint and solve for cost-efficient portfolio strategies under benchmark constraints. We show that for distributions on the real line, the comonotonic coupling is optimal for the majority of the new divergences. Specifically, we derive the optimal coupling of the MK divergences induced by functionals including the mean, generalised quantiles, expectiles, and shortfall measures. Furthermore, we show that while any elicitable law-invariant convex risk measure gives rise to infinitely many MK divergences, the comonotonic coupling is simultaneously optimal for all of them.
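    For reference (standard definitions, our notation): given a strictly convex function $\phi$, the Bregman divergence on the real line is $B_\phi(x,y)=\phi(x)-\phi(y)-\phi'(y)(x-y)$, and the corresponding Bregman-Wasserstein divergence, a special case of the MK divergences above, is $\inf_{\pi\in\Pi(\mu,\nu)}\int B_\phi(x,y)\,d\pi(x,y)$; the asymmetry of $B_\phi$ in its arguments is what makes these divergences asymmetric.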
  • 10:30-11:00: Coffee Break
  • 11:00-11:30: Multi-robot motion planning with intermittent diffusion
    Christina Frederick (New Jersey Institute of Technology)
    This work applies ideas from optimal transport to problems in robotics in which swarms of mobile sensors must achieve collective tasks, such as path planning. We develop an algorithm whose convergence is guaranteed by optimal transport theory and accelerate it using intermittent diffusion. This prevents common problems such as deadlocks, local minima, and less-than-ideal final distributions.
  • 11:30-12:00: Computing high-dimensional optimal transport by flow neural networks
    Yao Xie (Georgia Institute of Technology)
    Flow-based models are widely used in generative tasks, including normalizing flows, where a neural network transports a data distribution P to a normal distribution. This work develops a flow-based model that transports P to an arbitrary Q, where both distributions are only accessible via finite samples. We propose to learn the dynamic optimal transport between P and Q by training a flow neural network. The model is trained to find an invertible transport map between P and Q by minimizing the transport cost. The trained optimal transport flow subsequently allows many downstream tasks to be performed, including infinitesimal density ratio estimation (DRE) and distribution interpolation in the latent space of generative models. The effectiveness of the proposed model on high-dimensional data is demonstrated by strong empirical performance on high-dimensional DRE, comparisons with OT baselines, and image-to-image translation.
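    One standard identity that makes flow-based density ratio estimation possible (whether the talk uses exactly this is our assumption, not stated in the abstract): if particles follow the learned velocity field, $\dot{x}(t)=v(x(t),t)$, then the log-density along trajectories evolves as $\frac{d}{dt}\log p_t(x(t))=-\nabla\cdot v(x(t),t)$, so integrating the divergence of $v$ along the flow yields log-density ratios between the two endpoint distributions.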
  • 12:00-2:00: Lunch, discussion, and departure

Photos

Women in Optimal Transport - UBC
Women in Optimal Transport - UBC Ropes Course