Fall 2024
Time & Location: All talks are on Wednesdays in Gibson Hall 126 at 4:00 PM unless otherwise noted.
Organizers: Xiang Ji, Michelle Lacey, and Yuwei Bao
October 9
Title: SOFARI: High-Dimensional Manifold-Based Inference
Jinchi Lv – University of Southern California
Abstract: Multi-task learning is a widely used technique for harnessing information from various tasks. Recently, the sparse orthogonal factor regression (SOFAR) framework, based on the sparse singular value decomposition (SVD) of the coefficient matrix, was introduced for interpretable multi-task learning, enabling the discovery of meaningful latent feature-response association networks across different layers. However, conducting precise inference on the latent factor matrices has remained challenging due to the orthogonality constraints inherited from the sparse SVD. In this paper, we suggest a novel approach called high-dimensional manifold-based SOFAR inference (SOFARI), drawing on Neyman near-orthogonality inference while incorporating the Stiefel manifold structure imposed by the SVD constraints. By leveraging the underlying Stiefel manifold structure that is crucial to enabling inference, SOFARI provides easy-to-use bias-corrected estimators for both latent left factor vectors and singular values, which we show enjoy asymptotic mean-zero normal distributions with estimable variances. We introduce two SOFARI variants to handle strongly and weakly orthogonal latent factors, where the latter covers a broader range of applications. We illustrate the effectiveness of SOFARI and justify our theoretical results through simulation examples and a real data application in economic forecasting. This is joint work with Yingying Fan, Zemin Zheng and Xin Zhou.
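For readers new to this framework, a minimal sketch of the kind of model in play (the notation below is illustrative and not taken from the talk): a multi-response regression whose coefficient matrix admits a sparse SVD,
\[
Y = X C + E, \qquad C = U D V^{\top} = \sum_{k=1}^{r} d_k\, u_k v_k^{\top},
\]
where the columns of U and V are sparse and satisfy orthonormality constraints (the Stiefel manifold structure) and D = diag(d_1, ..., d_r) collects the singular values; inference of the SOFARI type then targets the left factor vectors u_k and the singular values d_k.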
Time: 4:00 pm
Location: Gibson Hall 126
October 16
Title: MCMC Importance Sampling via Moreau-Yosida Envelopes
Eric Chi – Rice University
Abstract: Markov chain Monte Carlo (MCMC) is the workhorse computational algorithm employed for inference in Bayesian statistics. Gradient-based MCMC algorithms are known to yield faster-converging Markov chains. In modern parsimonious models, the use of non-differentiable priors is fairly standard, yielding non-differentiable posteriors. Without differentiability, gradient-based MCMC algorithms cannot be employed effectively. Recently proposed proximal MCMC approaches, however, can partially remedy this limitation. These approaches employ the Moreau-Yosida (MY) envelope to smooth the non-differentiable prior, enabling sampling from an approximation to the target posterior. In this work, we leverage properties of the MY envelope to construct an importance sampling paradigm to correct for this approximation error. We establish asymptotic normality of the importance sampling estimators with an explicit expression for the asymptotic variance, which we use to derive a practical metric of sampling efficiency. Numerical studies show that the proposed scheme can yield lower-variance estimators compared to existing proximal MCMC alternatives.
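As a rough illustration of the smoothing step (assuming a posterior proportional to exp{-g(x) - f(x)} with g smooth and f non-differentiable; this notation is illustrative and not from the abstract), the MY envelope, the smoothed posterior, and the corresponding importance weights can be written as
\[
f^{\lambda}(x) = \min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\lVert x - y \rVert_2^2 \Big\},
\qquad
\pi^{\lambda}(x) \propto e^{-g(x) - f^{\lambda}(x)},
\qquad
w(x) \propto e^{\,f^{\lambda}(x) - f(x)},
\]
so draws from the smoothed posterior can be reweighted by w to estimate expectations under the exact posterior; since the envelope satisfies f^{\lambda} \le f, the weights are bounded.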
Time: 4:00 pm
Location: Gibson Hall 126
November 20
Title: Probabilistic Interpretations of the Boltzmann and Enskog Equations
Christian Ennis – Louisiana State University
Abstract: The Boltzmann equation describes the time evolution of the density function in a phase (position-velocity) space for a classical particle (molecule) under the influence of other particles in a dilute (or rarefied) gas, evolving in vacuum from a given initial distribution. The Enskog equation introduces a function in the collision operator of the Boltzmann equation, allowing one to take into account interactions between molecules a small distance apart, rather than solely at the point of collision. In this talk, we discuss modern results on the stochastic treatment of the spatially homogeneous Boltzmann equation, the Enskog equation, and the connection between methods used in each system. Regularity results and the motivation behind this probabilistic treatment will be given.
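For context, one common way to write the spatially homogeneous Boltzmann equation for the velocity density f(t, v) (the particular normalization below is not taken from the abstract) is
\[
\partial_t f(t, v) = \int_{\mathbb{R}^3} \int_{S^2} B\big(|v - v_*|, \sigma\big)\,
\big[ f(t, v')\, f(t, v_*') - f(t, v)\, f(t, v_*) \big]\, d\sigma\, dv_*,
\qquad
v' = \tfrac{v + v_*}{2} + \tfrac{|v - v_*|}{2}\,\sigma, \quad
v_*' = \tfrac{v + v_*}{2} - \tfrac{|v - v_*|}{2}\,\sigma,
\]
where B is the collision kernel; the spatially inhomogeneous equation adds a transport term v \cdot \nabla_x f, and the Enskog collision operator evaluates the colliding densities at positions separated by a molecular diameter rather than at a single point.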
Time: 4:00 pm
Location: Gibson Hall 126
December 4
Title: Shrinkage-based phylogenetic modeling
Alexander Fisher – Duke University
Abstract: In many phylogenetic models, the number of parameters to estimate grows with the number of taxa under study. However, parsimonious models of evolution demand local similarity in parameters on subtrees. To achieve scalable inference in such a setting, we employ auto-correlated, shrinkage-based models. We compare inference under these models to the previous state of the art in a variety of applied settings. In one example, we investigate the heritable clock structure of various surface glycoproteins of influenza A virus in the absence of prior knowledge about molecular clock placement. In another example, we estimate the phylogenetic location of environmental shifts in the ancestry of Anolis lizards.
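As one illustrative form of such a prior (a sketch of the general idea, not necessarily the speaker's exact specification): writing r_b for the evolutionary rate on branch b and pa(b) for its parent branch, an auto-correlated shrinkage model places a zero-centered, heavy-tailed prior on the log-rate increments,
\[
\phi_b = \log r_b - \log r_{\mathrm{pa}(b)}, \qquad \phi_b \mid \tau, \lambda_b \sim \mathcal{N}\big(0, \tau^{2} \lambda_b^{2}\big),
\]
with a global scale \tau and local scales \lambda_b, so that most increments are shrunk toward zero (neighboring branches share similar rates) while a few branches can take abrupt shifts.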
Time: 4:00 pm
Location: Gibson Hall 126