Hyemin Gu

PhD candidate in Mathematics at the University of Massachusetts Amherst.


In the lab of my advisor, professor Markos Katsoulakis, I work alongside professor Luc Rey-Bellet, postdocs Benjamin Zhang and Ziyu Chen, and, more broadly, professors Panagiota Birmpa and Yannis Pantazis. My research translates generative models into mathematical objects supported by underlying theory. My work focuses on

  • Formulating well-posed models based on these theories, and
  • Developing mathematically derived estimators to improve training processes.

This research aims to address chronic issues in training generative models, such as instability in loss optimization and the dependence of result quality on implementation details. To achieve these improvements, we leverage mathematical tools such as Wasserstein proximal regularization and entropic regularization. I am also interested in applying and adapting our proposed models to problems in applied fields.

Barbara Oakley, in her book A Mind for Numbers, highlights a common challenge in both mathematics and applied fields:

The challenge is that it’s often easier to pick up on a mathematical idea if it is applied directly to a concrete problem – even though that can make it more difficult to transfer the mathematical idea to new areas later. Unsurprisingly, there ends up being a constant tussle between concrete and abstract approaches to learning mathematics. Mathematicians try to hold the high ground by stepping back to make sure that abstract approaches are central to the learning process. In contrast, engineering, business, and many other professions all naturally gravitate toward math that focuses on their specific areas to help build student engagement and avoid the complaint of “When am I ever going to use this?”

My ultimate goal is to reduce the gap between mathematics and applied fields by continually carrying insights from each over to the other.

news

Jun 11, 2024 Paper on Wasserstein-1/Wasserstein-2 generative flows was released on arXiv.
Jun 11, 2024 Paper on the Lipschitz-regularized generative particles algorithm was accepted for publication in the SIAM Journal on Mathematics of Data Science.
May 09, 2023 Presented a poster at Optimal Transport in Data Science – ICERM, Brown University.
Apr 12, 2023 Passed the oral exam.
Sep 05, 2022 Began a role as a TWIGS coordinator.

selected publications

  1. GPA
    Lipschitz-Regularized Gradient Flows and Generative Particle Algorithms for High-Dimensional Scarce Data
    Hyemin Gu, Panagiota Birmpa, Yannis Pantazis, Luc Rey-Bellet, and Markos A. Katsoulakis
    SIAM Journal on Mathematics of Data Science, to appear, 2024
  2. W-Proximals
    Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows
    Hyemin Gu, Markos A. Katsoulakis, Luc Rey-Bellet, and Benjamin J. Zhang
    arXiv preprint, 2024
  3. Heavytail-W-Proximals
    Learning heavy-tailed distributions with Wasserstein-proximal-regularized α-divergences
    Ziyu Chen, Hyemin Gu, Markos A. Katsoulakis, Luc Rey-Bellet, and Wei Zhu
    2024