DMS Graduate Student Seminar
Oct 22, 2025 03:00 PM
354 Parker Hall
Speaker: Professor Stu Baldwin (Auburn University)
Title: The Mandelbrot Set and Julia Sets
DMS Graduate Student Seminar
Oct 15, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. Yimin Zhong
Title: Learn PDE from its solution
Abstract: In this talk, I will discuss our motivation and address a few basic questions about learning a PDE from observed solution data, as well as how I approached these questions in the project. Using a few types of PDEs as examples, the talk will show 1) how the approximate dimension (richness) of the data space spanned by all snapshots along a solution trajectory depends on the differential operator and the initial data, and 2) the identifiability of a differential operator from solution data on local patches. I will then discuss the Consistent and Sparse Local Regression (CaSLR) method for general PDE identification, a data-driven procedure that requires minimal local measurements in space and time from a single solution trajectory while enforcing global consistency and sparsity.
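The "approximate dimension of the data space spanned by snapshots" can be made concrete with a toy computation. The sketch below is our own illustration, not taken from the talk: it evolves the 1-D heat equation u_t = u_xx from a localized bump, stacks the snapshots into a matrix, and reads off an effective rank from the singular values (the grid sizes and tolerance are arbitrary choices).

```python
import numpy as np

# Hypothetical illustration: approximate dimension of the snapshot space
# for the 1-D heat equation u_t = u_xx (fixed boundary values).
n, steps, dt = 128, 200, 2e-5
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u = np.exp(-100.0 * (x - 0.5) ** 2)  # localized initial bump

snapshots = [u.copy()]
for _ in range(steps):
    # explicit finite-difference step (dt is below the stability limit dx^2/2)
    u[1:-1] += dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    snapshots.append(u.copy())

S = np.array(snapshots)               # (steps + 1, n) snapshot matrix
sv = np.linalg.svd(S, compute_uv=False)
dim = int(np.sum(sv / sv[0] > 1e-8))  # effective rank at tolerance 1e-8
print("approximate snapshot-space dimension:", dim)
```

Because the heat semigroup smooths rapidly, the singular values decay fast and the effective rank is far below the number of snapshots; a rougher operator or initial data would give a richer snapshot space.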
DMS Graduate Student Seminar
Oct 08, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. Hans-Werner van Wyk (Auburn University)
Title: Statistical Sampling for Uncertainties in Partial Differential Equations
Abstract: Statistical sampling is the process of selecting a subset from a population with the aim of estimating population-level properties. For uncertain natural or engineered systems governed by partial differential equations, sampling presents a computational bottleneck, since the per-sample cost is high. We discuss several strategies for enhancing the sampling efficiency for both the prediction and control of these systems.
DMS Graduate Student Seminar
Oct 01, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. Jordan Eckert (Auburn University)
Title: Prototype Selection Using Topological Data Analysis
Abstract: Prototype selection has become an active area of research in statistical learning. We introduce a novel topological data analysis (TDA)-based framework for selecting representative prototypes from large datasets. We show that this approach preserves classification performance while substantially reducing data size. Such methods are crucial in resource-constrained environments where memory and computation are limited. Together, these contributions advance both algorithmic and geometric aspects of prototype learning and offer practical tools for scalable, interpretable, and efficient classification.
DMS Graduate Student Seminar
Sep 24, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. Luke Oeding (Auburn University)
Title: Invariant Theory in Quantum Information
Abstract: Entanglement is a resource that is crucial for the so-called quantum advantage promised by quantum computing. How can algebra help us to understand quantum entanglement? I’ll explain our recent work that combines algebra and optimization to find new instances of maximally entangled states.
This is based on joint work with my recent PhD student, Ian Tan, who is now a postdoc at Charles University in Prague.
DMS Graduate Student Seminar
Sep 17, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. John Cobb
Title: Algebraic Geometry and Its Applications
Abstract: I will give an overview of (some of) the sorts of things people think about in algebraic geometry, focusing on applications to other fields of math and the sciences. I will then survey several different threads of research, highlighting work by your resident Auburn algebraists and geometers. This will be light on details and contain approximately a million pictures. Maybe even an animation.
DMS Graduate Student Seminar
Sep 10, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. James Scott (Auburn University)
Title: New Horizons in Nonlocal Modeling
Abstract: In recent years, continuum models that incorporate nonlocal effects have seen greatly increased use. They have been applied in many areas, including diffusion modeling, image processing, and mechanics. These models are characterized by partial integro-differential equations; that is, in place of partial derivatives, integral operators that act on difference quotients of multi-variable functions are used. In this talk, we will introduce some of these nonlocal models and discuss recent contributions to their mathematically rigorous underpinning across several different contexts. Such contributions include the well-posedness and regularity of nonlocal equations, the robust nature of their discretizations, rigorous characterizations of long-range and other phenomena captured by the equations, and the consistency of nonlocal models with classical models in suitable asymptotic regimes. The contexts include continuum mechanics, semi-supervised learning, fractional PDEs, and coupled local-nonlocal equations. We will conclude with a presentation of some unanswered questions.
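To make "integral operators that act on difference quotients" concrete, here is a representative nonlocal diffusion operator, standard in the nonlocal literature (the specific operators in the talk may differ):

```latex
\mathcal{L}_\delta u(x) \;=\; \int_{B_\delta(x)} \rho_\delta(|y - x|)\,\bigl(u(y) - u(x)\bigr)\,\mathrm{d}y ,
```

where the kernel \rho_\delta \ge 0 is supported in a ball of radius \delta (the "horizon"). With suitable scaling of \rho_\delta, one has \mathcal{L}_\delta u \to C\,\Delta u as \delta \to 0, which is the sense in which nonlocal models are consistent with classical local models in an asymptotic regime.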
DMS Graduate Student Seminar
Sep 03, 2025 03:00 PM
354 Parker Hall
Speaker: Dr. Le Chen
Title: How do surfaces grow?
Abstract: How do surfaces grow, and why do so many look statistically alike? This talk connects intuitive simulations with modern probability to explore surface growth and universality. We begin with a central limit theorem (CLT) refresher as a baseline for randomness, then show why it fails for growing interfaces: local interactions and spatial-temporal dependencies break independence. Using Tetris-like ballistic deposition (sticky and non-sticky) as model systems, we compare simulated interfaces and empirical fluctuation scaling, and discuss their relation to the KPZ universality class. We then highlight experimental evidence from thin-film growth where universal scaling emerges in real materials. Along the way, we emphasize what "universality" means, how scaling exponents organize phenomena, and where open questions remain—such as identifying non-KPZ behaviors. The goal is a concrete, visual understanding of stochastic growth, bridging simulations, data, and theory.
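The two deposition models in the abstract are simple to simulate. The sketch below is a minimal illustration under our own choices of lattice size and particle count (not the talk's simulations): "sticky" ballistic deposition lets a falling particle attach to the first neighboring column it touches, while non-sticky deposition just stacks particles, and the interface width (standard deviation of the heights) distinguishes the two.

```python
import numpy as np

rng = np.random.default_rng(0)

def deposit(L=200, n=20000, sticky=True):
    """Grow an interface on L columns (periodic boundaries) by dropping n particles."""
    h = np.zeros(L, dtype=int)
    for _ in range(n):
        i = rng.integers(L)
        if sticky:
            # ballistic deposition: stick at the highest of landing on
            # column i or catching on either neighbor
            h[i] = max(h[(i - 1) % L], h[i] + 1, h[(i + 1) % L])
        else:
            # random deposition: the particle simply stacks up
            h[i] += 1
    return h

def width(h):
    # interface width = standard deviation of column heights
    return float(np.std(h))

w_sticky = width(deposit(sticky=True))
w_random = width(deposit(sticky=False))
print("sticky:", w_sticky, "non-sticky:", w_random)
```

Non-sticky deposition gives independent Poisson columns, so the width grows without bound like the square root of time; the lateral correlations in the sticky model change the fluctuation scaling, which is the doorway to the KPZ universality class discussed in the talk.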
DMS Graduate Student Seminar
Aug 27, 2025 03:00 PM
354 Parker Hall
Please note the change of location: Parker Hall 354
Speaker: Dr. Rob Molinari
Title: More of Less: A Rashomon Algorithm for Sparse Model Sets
Abstract: The current paradigm of machine learning consists of finding a single best model to deliver predictions and, if possible, interpretations for a specific problem. This paradigm has, however, been strongly challenged in recent years through the study of the “Rashomon Effect,” a term coined by Leo Breiman. This phenomenon occurs when there exist many good predictive models for a given dataset or problem, with considerable practical implications for interpretation, usability, variable importance, replicability, and more. The set of models (within a specific class of functions) that satisfy this definition is referred to as the “Rashomon set,” and a substantial amount of recent work has focused on ways of finding these sets as well as studying their properties. Developed in parallel to current research on the Rashomon Effect, and motivated by sparse latent representations for high-dimensional problems, we present a heuristic procedure that aims to find sets of sparse models with good predictive power through a greedy forward search that explores the low-dimensional variable space. Throughout this algorithm, good low-dimensional models identified in previous steps are used to build models with more variables in subsequent steps. While preserving almost-equal performance with respect to a single reference model in a given class (i.e., forming a Rashomon set), the sparse model sets from this algorithm include diverse models that can be combined into networks delivering additional layers of interpretation and new insights into how variable combinations can explain the Rashomon Effect.
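The greedy forward search over sparse models can be sketched as follows. This is our own simplification for illustration, not the authors' algorithm: at each sparsity level it keeps every variable subset whose least-squares loss is within a factor (1 + eps) of the best found at that level, then extends each kept subset by one variable; the synthetic data and the tolerance eps are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 8
X = rng.normal(size=(n, p))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)  # two true variables

def loss(cols):
    """In-sample mean squared error of the least-squares fit on a column subset."""
    A = X[:, list(cols)]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r) / n

def rashomon_forward(max_size=3, eps=0.05):
    frontier = [()]          # start from the empty model
    rashomon = {}            # subset -> loss, over all kept sparse models
    for _ in range(max_size):
        # extend every kept subset by one new variable
        cand = {tuple(sorted(set(s) | {j}))
                for s in frontier for j in range(p) if j not in s}
        scored = {s: loss(s) for s in cand}
        best = min(scored.values())
        # keep every near-best model at this sparsity level
        frontier = [s for s, l in scored.items() if l <= (1 + eps) * best]
        rashomon.update({s: scored[s] for s in frontier})
    return rashomon

models = rashomon_forward()
print(sorted(models.items(), key=lambda kv: kv[1]))
```

On this toy problem the search recovers the true support {0, 1}, and at sparsity level 3 many near-equivalent extensions survive — a small Rashomon set of almost equally good models.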
DMS Graduate Student Seminar
Apr 23, 2025 03:00 PM
ACLC 010
Speaker: Dr. Phuong Hoang (Auburn University)
Title: Domain decomposition-based exponential time differencing methods for stiff evolution problems
Abstract: Exponential integrators, among them the Exponential Time Differencing (ETD) methods, have been widely used for solving stiff, nonlinear equations due to their accuracy, stability, and ability to preserve exponential behavior. The cost of these methods is dominated by the computation of matrix exponentials and their products with vectors. In this talk, we will use domain decomposition methods to enhance the computational efficiency of ETD schemes. In particular, we will present localized ETD Runge-Kutta schemes for the rotating shallow water equations and discuss their numerical performance compared to explicit Runge-Kutta and strong stability preserving Runge-Kutta (SSP-RK) algorithms.
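The simplest member of the ETD family, the first-order "exponential Euler" scheme, illustrates the role of matrix exponentials. The sketch below is a minimal toy (a 1-D diffusion matrix with an Allen-Cahn-type nonlinearity, not the rotating shallow water setting of the talk) for u' = A u + f(u), stepping with u_{n+1} = e^{dt A} u_n + dt * phi_1(dt A) f(u_n), where phi_1(z) = (e^z - 1)/z.

```python
import numpy as np

n, dt, steps = 50, 0.01, 100

# Stiff linear part: standard 1-D finite-difference Laplacian (Dirichlet).
h = 1.0 / (n + 1)
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

f = lambda u: u - u**3  # mild Allen-Cahn-type nonlinearity

# A is symmetric, so e^{dt A} and phi_1(dt A) = (e^{dt A} - I)/(dt A)
# can be formed from its eigendecomposition.
w, V = np.linalg.eigh(A)
E = V @ np.diag(np.exp(dt * w)) @ V.T
phi1 = V @ np.diag(np.expm1(dt * w) / (dt * w)) @ V.T

x = np.linspace(h, 1.0 - h, n)
u = np.sin(np.pi * x)
for _ in range(steps):
    # ETD1 step: exact on the linear part, explicit on the nonlinearity
    u = E @ u + dt * (phi1 @ f(u))

print(float(np.abs(u).max()))
```

The step size is unconstrained by the stiff linear part, since that part is treated exactly; the expense sits in forming (or applying) E and phi1, which is precisely the cost that the talk's domain decomposition approach localizes.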