Ruprecht-Karls-Universität Heidelberg
Institut für Angewandte Mathematik
Mathematical Statistics
Colloquium "Mathematische Statistik" (Mathematical Statistics)
Location: Mathematikon, Im Neuenheimer Feld 205, lecture hall (Hörsaal)
14:30 - 15:15  Mark Podolskij (Universität Luxemburg): Semiparametric estimation of McKean-Vlasov stochastic differential equations
15:15 - 16:00  Markus Reiß (Humboldt-Universität zu Berlin): Statistics for SPDEs
16:00 - 17:00  Coffee in Seminar Room A
17:00 - 17:45  Tilmann Gneiting (HITS and Karlsruher Institut für Technologie (KIT)): Isotonic Distributional Regression
Mark Podolskij: Semiparametric estimation of McKean-Vlasov stochastic differential equations
In this talk we study the problem of semiparametric estimation for a class of McKean-Vlasov stochastic differential equations (MV-SDEs). Our aim is to estimate the drift coefficient of an MV-SDE based on observations of the corresponding particle system. We propose a semiparametric estimation procedure and derive the rates of convergence for the resulting estimator. We further prove that the obtained rates are essentially optimal in the minimax sense.
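As a rough illustration of the setting (not the speaker's method), the sketch below simulates an interacting N-particle system that approximates an MV-SDE by Euler-Maruyama, with the measure dependence entering through the empirical mean. The drift form b(x, m) = -theta*x - lam*(x - m) and all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_particle_system(n_particles=500, n_steps=1000, T=1.0,
                             theta=1.0, lam=0.5, sigma=0.3, seed=0):
    """Euler-Maruyama scheme for an N-particle approximation of an MV-SDE
    dX_t = b(X_t, mu_t) dt + sigma dW_t, where mu_t is summarized here by
    the empirical mean of the particles (an illustrative assumption)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    X = rng.standard_normal(n_particles)          # initial positions
    path = np.empty((n_steps + 1, n_particles))
    path[0] = X
    for k in range(n_steps):
        m = X.mean()                              # empirical measure enters via its mean
        drift = -theta * X - lam * (X - m)        # hypothetical drift b(x, m)
        X = X + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_particles)
        path[k + 1] = X
    return path

paths = simulate_particle_system()
print(paths.shape)  # (1001, 500)
```

A drift estimator would then be built from exactly such particle observations; here the simulation only shows the data-generating side.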
Markus Reiß: Statistics for SPDEs
Stochastic partial differential equations (SPDEs) are increasingly used to model real-world phenomena, and statistical methodology for these equations driven by space-time white noise is currently developing rapidly. Based on the classical spectral method for parametric drift estimation, we shall exhibit fundamental differences with the case of stochastic ordinary differential equations. This method, however, is restricted to simple parametric situations, and we therefore discuss the local estimation method in detail, which allows one to estimate varying coefficients in the differential operator of a parabolic SPDE nonparametrically with optimal rates. Extensions to deal with additional measurement errors and to estimate change points in the diffusivity are presented. As an application we consider cell motility experiments with repolarisation described by a stochastic Meinhardt model.
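To give a flavour of the spectral method (a minimal sketch, not the speaker's implementation): projecting a parabolic SPDE du = theta * Laplacian(u) dt + dW onto the k-th eigenfunction yields an Ornstein-Uhlenbeck coefficient process du_k = -theta * lambda_k * u_k dt + dbeta_k, and the MLE for theta from one observed coefficient path is theta_hat = -int u_k du_k / (lambda_k * int u_k^2 dt). All numerical values below are assumptions for demonstration.

```python
import numpy as np

# Simulate one spectral coefficient path u_k as an OU process (Euler scheme),
# then recover theta with the continuous-observation MLE. lambda_k = pi^2 is
# the first Dirichlet-Laplacian eigenvalue on (0, 1), used here illustratively.
rng = np.random.default_rng(1)
theta_true, lam_k, T, n = 2.0, np.pi**2, 10.0, 50_000
dt = T / n
u = np.empty(n + 1)
u[0] = 0.0
dW = np.sqrt(dt) * rng.standard_normal(n)
for i in range(n):
    u[i + 1] = u[i] - theta_true * lam_k * u[i] * dt + dW[i]

du = np.diff(u)
theta_hat = -np.sum(u[:-1] * du) / (lam_k * np.sum(u[:-1] ** 2) * dt)
print(theta_hat)  # typically close to theta_true = 2.0
```

In the SPDE setting one key difference from ordinary SDEs, highlighted in the talk's framework, is that many such coefficient processes are observed simultaneously, which changes the information structure of the estimation problem.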
Tilmann Gneiting: Isotonic Distributional Regression

The ultimate goal of regression analysis is to model the conditional distribution of an outcome, given a set of explanatory variables or covariates. This new approach is called "distributional regression" and marks a clear break from the classical view of regression, which has focused on estimating a conditional mean or quantile only. Isotonic Distributional Regression (IDR) learns conditional distributions that are simultaneously optimal relative to comprehensive classes of relevant loss functions, subject to monotonicity constraints in terms of a partial order on the covariate space. The IDR solution is exactly computable and requires neither approximations nor implementation choices, except for the selection of the partial order. In case studies and benchmark problems, IDR is competitive with state-of-the-art methods for postprocessing in numerical weather prediction, and for uncertainty quantification (UQ) within modern neural network learning.