Hyemin Gu
PhD candidate in Mathematics at the University of Massachusetts Amherst.
In the lab of my advisor, Professor Markos Katsoulakis, alongside Professor Luc Rey-Bellet, postdocs Benjamin Zhang and Ziyu Chen, and, more broadly, Professors Panagiota Birmpa and Yannis Pantazis, I conduct research that translates generative models into mathematical objects grounded in underlying theory. My work focuses on
- Formulating well-posed models based on these theories, and
- Developing mathematically derived estimators to improve training processes.
This research aims to address chronic issues in the training of generative models, such as instability in loss optimization and the dependence of result quality on implementation details. To achieve these improvements, we leverage mathematical tools such as Wasserstein proximal regularization and entropic regularization. I am also interested in applying and adapting our proposed models to applied fields.
Barbara Oakley, in her book A Mind for Numbers, highlights a common challenge in both mathematics and applied fields:
The challenge is that it’s often easier to pick up on a mathematical idea if it is applied directly to a concrete problem – even though that can make it more difficult to transfer the mathematical idea to new areas later. Unsurprisingly, there ends up being a constant tussle between concrete and abstract approaches to learning mathematics. Mathematicians try to hold the high ground by stepping back to make sure that abstract approaches are central to the learning process. In contrast, engineering, business, and many other professions all naturally gravitate toward math that focuses on their specific areas to help build student engagement and avoid the complaint of “When am I ever going to use this?”
My ultimate goal is to reduce the gap between mathematics and applied fields by continuously reflecting insights from one onto the other.
news
Jun 11, 2024 | Paper on the Wasserstein-1/Wasserstein-2 generative flow was released on arXiv.
Jun 11, 2024 | Paper on the Lipschitz-regularized generative particles algorithm was published in SIAM Journal on Mathematics of Data Science.
May 09, 2023 | Presented a poster at Optimal Transport in Data Science, ICERM, Brown University.
Apr 12, 2023 | Passed oral exam. |
Sep 05, 2022 | Began serving as a TWIGS coordinator.
selected publications
- GPA: Lipschitz-Regularized Gradient Flows and Generative Particle Algorithms for High-Dimensional Scarce Data. SIAM J. Data Science, to appear, 2024.
- W-Proximals: Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows. 2024.
- Heavytail-W-Proximals: Learning heavy-tailed distributions with Wasserstein-proximal-regularized α-divergences. 2024.