10:00 – 10:35
Asymptotics beats Monte Carlo: The case of correlated local vol baskets
Christian Bayer
We consider a basket of stocks with both positive and negative weights, in the case where each asset has a smile, i.e., evolves according to its own local volatility, and the driving Brownian motions are correlated. In the case of positive weights, the model has been considered in previous work by Avellaneda, Boyer-Olson, Busca and Friz [Risk, 2004]. We derive highly accurate analytic formulas for the prices and the implied volatilities of such baskets. These formulas are based on a basket Carr-Jarrow formula, a heat kernel expansion for the (multidimensional) density of the assets at expiry, and the Laplace approximation. The formulas are almost explicit, up to a minimization problem, which can be handled with simple Newton iteration coupled with good initial guesses as derived in the paper. Moreover, we also provide asymptotic formulas for the Greeks. Numerical experiments in the context of the CEV model indicate that the relative errors of these formulas are of order $10^{-4}$ (or better) for $T=\frac{1}{2}$, $10^{-3}$ for $T=2$, and $10^{-2}$ for $T=10$ years, for low, moderate and high dimensions. The computational time required to evaluate these formulas is under two seconds, even in the case of a basket on 100 assets. The combination of accuracy and speed makes these formulas potentially attractive both for calibration and for pricing. In comparison, simulation-based techniques are prohibitively slow in achieving a comparable degree of accuracy. Thus the present work opens up a new paradigm in which asymptotics may arguably be used for pricing as well as for calibration. (Joint work with Peter Laurence.)
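The "simple Newton iteration" step of the method can be illustrated in a few lines. A minimal sketch in Python, with an illustrative one-dimensional objective standing in for the paper's actual minimization problem (the function names and the objective are ours, not from the talk):

```python
def newton_minimize(grad, hess, x0, tol=1e-12, max_iter=50):
    """Newton iteration for 1-D minimization: find x with grad(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative objective f(x) = x^2 - log(x) on x > 0, whose minimizer
# is x* = 1/sqrt(2).  This is a stand-in, not the basket functional.
x_star = newton_minimize(lambda x: 2 * x - 1 / x,
                         lambda x: 2 + 1 / x ** 2,
                         x0=1.0)
```

With a good initial guess, as the abstract emphasizes, the quadratic convergence of Newton's method makes this step essentially free compared to Monte Carlo pricing.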
10:35 – 11:10
Brownian motion approach for spatial PDEs with stochastic data
Marcel Ladkau
We deal with PDE problems of elliptic type that are classically solved by the finite element method in the case of deterministic data. Via the Feynman-Kac formula we give a stochastic representation providing pointwise solutions. We show how to use the stochastic representation to generate a spatial approximation of the solution. We then extend this to the case of stochastic data and discuss the advantages over FEM in such situations. Our main application is Darcy's law.
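The pointwise stochastic representation can be sketched in its simplest instance. A hedged example assuming the Laplace equation on the unit disk with an Euler-discretized Brownian walk (the talk's setting, Darcy's law with stochastic coefficients, is richer; all names are illustrative):

```python
import math
import random

def solve_laplace_pointwise(g, x0, y0, n_paths=2000, dt=1e-3, seed=0):
    """Monte Carlo estimate of u(x0, y0) for  Delta u = 0  on the unit
    disk with boundary data g, using the representation
    u(x) = E[ g(B_tau) ]:  run Brownian paths from (x0, y0) until they
    exit the disk and average the boundary values at the exit points."""
    rng = random.Random(seed)
    s = math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        x, y = x0, y0
        while x * x + y * y < 1.0:
            x += s * rng.gauss(0.0, 1.0)
            y += s * rng.gauss(0.0, 1.0)
        r = math.hypot(x, y)          # project the small overshoot
        total += g(x / r, y / r)      # back onto the unit circle
    return total / n_paths

# Boundary data g(x, y) = x^2 - y^2 is itself harmonic, so the exact
# solution is u(x, y) = x^2 - y^2; at (0.3, 0.2) this equals 0.05.
u = solve_laplace_pointwise(lambda x, y: x * x - y * y, 0.3, 0.2)
```

The key advantage noted in the abstract is visible here: the estimator is pointwise, so no global mesh is needed, and stochastic data can be handled by resampling the coefficients per path.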
11:10 – 11:40
Coffee break
11:40 – 12:15
Semiparametric Bernstein-von Mises theorem: a non-asymptotic approach
Maxim Panov
The classical asymptotic Bernstein-von Mises theorem will be reconsidered in a non-asymptotic setup. Special attention will be paid to the applicability of the results in the case of growing parameter dimension and to the notion of effective dimension. The results for growing dimension will be extended to semiparametric estimation with a nuisance parameter from a Sobolev class. The general results will be accompanied by particular examples illustrating the theory.
12:15 – 12:50
Conditional moment restrictions estimation
Nikita Zhivotovskiy
We are interested in statistical models where parameters are identified by a set of conditional
estimating equations (or moment restrictions). Using modern tools proposed by Spokoiny (2011), we reconsider the properties of the generalized method of moments estimator and derive a Wilks expansion for this model.
All results are non-asymptotic and stated for a deterministic design.
12:50 – 14:20
Lunch
14:20 – 14:55
Additive Regularization for Probabilistic Topic Modelling
Konstantin Vorontsov
Probabilistic topic modeling is a powerful tool for statistical text analysis, which has recently been developed mainly within the framework of graphical models and Bayesian inference. We propose an alternative approach: Additive Regularization of Topic Models (ARTM). Our framework is free of redundant probabilistic assumptions and dramatically simplifies the inference of multi-objective topic models. We also take a non-probabilistic view of the EM algorithm as a simple iteration method for solving a system of equations for a stationary point of the optimization problem.
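The "simple iteration" view of the EM algorithm can be sketched for the unregularized case. A minimal PLSA-style EM in Python; ARTM would add regularizer gradients to the M-step numerators, which this sketch omits (all names are illustrative):

```python
import random

def plsa_em(ndw, n_topics, n_iter=50, seed=0):
    """Plain PLSA EM as a fixed-point iteration.  ndw[d][w] holds word
    counts per document; returns phi[w][t] (word-in-topic) and
    theta[t][d] (topic-in-document) distributions."""
    rng = random.Random(seed)
    D, W, T = len(ndw), len(ndw[0]), n_topics
    phi = [[rng.random() for _ in range(T)] for _ in range(W)]
    theta = [[rng.random() for _ in range(D)] for _ in range(T)]
    for _ in range(n_iter):
        nwt = [[0.0] * T for _ in range(W)]
        ntd = [[0.0] * D for _ in range(T)]
        for d in range(D):
            for w in range(W):
                if ndw[d][w] == 0:
                    continue
                # E-step: p(t | d, w) up to normalization
                p = [phi[w][t] * theta[t][d] for t in range(T)]
                z = sum(p)
                for t in range(T):
                    c = ndw[d][w] * p[t] / z
                    nwt[w][t] += c
                    ntd[t][d] += c
        # M-step: re-normalize the expected counts.  ARTM would add
        # phi * dR/dphi (resp. theta * dR/dtheta) to nwt (resp. ntd).
        for t in range(T):
            zt = sum(nwt[w][t] for w in range(W))
            for w in range(W):
                phi[w][t] = nwt[w][t] / zt
        for d in range(D):
            zd = sum(ntd[t][d] for t in range(T))
            for t in range(T):
                theta[t][d] = ntd[t][d] / zd
    return phi, theta
```

The additive-regularization point is that each extra objective contributes only an extra term in the M-step numerators, rather than a new layer of Bayesian inference.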
14:55 – 15:30
Stochastic online gradient-free method with inexact oracle and huge-scale optimization
Alexander Gasnikov
In the talk we generalize results of Yu. Nesterov, "Random gradient-free minimization of convex functions", CORE Discussion Paper 2011/1, 2011.
First of all, we consider the online (two-point) case. Moreover, we consider the general prox case, i.e., we do not restrict ourselves to Euclidean structure. The main ingredient of our generalization, however, is the assumption of an inexact zeroth-order oracle (an oracle returning the value of the function). The inexactness has not only a stochastic but also a deterministic part, which leads to a bias in the stochastic gradient approximation due to the finite differences. The rather unexpected result is that we obtain the same complexity estimate for a properly modified mirror descent algorithm, provided the noise level of the inexact oracle is no more than the desired accuracy (in function value). All results are obtained in terms of probabilities of large deviations. Moreover, the proposed algorithms attain unimprovable complexity estimates up to a multiplicative constant. These algorithms can be used for a special class of huge-scale optimization problems; one such problem (coming from Yandex) we plan to describe briefly, time permitting. Joint work with Anastasia Lagunovskaia.
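The two-point construction can be sketched as follows: a random direction is sampled and the directional derivative is approximated by a symmetric finite difference. This is a minimal Euclidean sketch, with plain gradient steps rather than the general prox/mirror-descent setup of the talk, and with an exact rather than inexact oracle; all names are illustrative:

```python
import math
import random

def two_point_grad(f, x, tau, rng):
    """Two-point zeroth-order gradient estimate: sample a unit direction
    e and return  d * (f(x + tau*e) - f(x - tau*e)) / (2*tau) * e.
    An inexact oracle would add noise and a deterministic bias to f."""
    d = len(x)
    e = [rng.gauss(0.0, 1.0) for _ in range(d)]
    nrm = math.sqrt(sum(c * c for c in e))
    e = [c / nrm for c in e]
    fp = f([xi + tau * ei for xi, ei in zip(x, e)])
    fm = f([xi - tau * ei for xi, ei in zip(x, e)])
    scale = d * (fp - fm) / (2.0 * tau)
    return [scale * ei for ei in e]

def gradient_free_descent(f, x0, steps=2000, lr=0.05, tau=1e-4, seed=0):
    """Euclidean gradient steps driven by the two-point estimator;
    a sketch, not the talk's general-prox algorithm."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(steps):
        g = two_point_grad(f, x, tau, rng)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x) = sum_i (x_i - 1)^2 in dimension 3 from the origin.
x = gradient_free_descent(lambda v: sum((vi - 1.0) ** 2 for vi in v),
                          [0.0, 0.0, 0.0])
```

The bias the abstract refers to enters through the finite difference: if the oracle returns f with a deterministic error of size delta, the estimate acquires a bias of order d*delta/tau, which is why the admissible noise level is tied to the target accuracy.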
15:30 – 16:05
Exponential weighting in the presence of colored noise
Ekaterina Krymova
We present new oracle inequalities for exponential aggregation of regression function estimates under the assumption of heteroscedastic Gaussian noise.
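The exponential aggregation scheme itself is compact. A hedged sketch with a fixed temperature and generic risk estimates; under colored (heteroscedastic) noise the delicate part is constructing valid risk estimates, which this sketch does not address, and all names are illustrative:

```python
import math

def exponential_weights(risks, beta):
    """Exponential weighting: w_k proportional to exp(-risk_k / beta),
    where risks[k] is a risk estimate for the k-th candidate and beta
    is the temperature parameter."""
    m = min(risks)                       # stabilize the exponentials
    w = [math.exp(-(r - m) / beta) for r in risks]
    z = sum(w)
    return [wi / z for wi in w]

def aggregate(estimates, risks, beta=1.0):
    """Pointwise convex aggregation of candidate regression estimates
    (each given as a list of fitted values on a common grid)."""
    w = exponential_weights(risks, beta)
    n = len(estimates[0])
    return [sum(w[k] * estimates[k][i] for k in range(len(estimates)))
            for i in range(n)]
```

Oracle inequalities of the kind mentioned in the abstract bound the risk of this aggregate by the risk of the best candidate plus a remainder term depending on beta and the number of candidates.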
16:05 – 16:35
Coffee break
16:35 – 17:10
Quantification of noise in MR experiments
Joerg Polzehl
We present a novel method for local estimation of the noise level in magnetic resonance images in the presence of a signal. The procedure uses a multiscale approach to adaptively infer local neighborhoods with similar data distribution. It exploits a maximum-likelihood estimator for the local noise level. The information assessed by this method is essential for correct modeling in diffusion magnetic resonance experiments as well as for adequate preprocessing. The validity of the method is evaluated on repeated diffusion data of a phantom and on simulated data. We illustrate the gain from using the method in data enhancement and modeling of a high-resolution diffusion data set.
17:10 – 17:45
Simultaneous Bayesian analysis of contingency tables in genetic association studies
Thorsten Dickhaus
Genetic association studies lead to simultaneous categorical data analysis. The sample for every genetic locus
consists of a contingency table containing the numbers of observed genotype-phenotype combinations. Under a
case-control design, the row counts of every table are identical and fixed, while the column counts are random. The aim of
the statistical analysis is to test independence of the phenotype and the genotype at every locus.
We present an objective Bayesian methodology for these association tests, utilizing the Bayes factor F_2 proposed
by Good (1976) and Crook and Good (1980). It relies on the conjugacy of Dirichlet and multinomial distributions,
where the hyperprior for the Dirichlet parameter is log-Cauchy. Being based on the likelihood principle, the
Bayesian tests avoid looping over all tables with given marginals. Hence, their computational burden does not
increase with the sample size, in contrast to frequentist exact tests.
Making use of data generated by The Wellcome Trust Case Control Consortium (2007), we illustrate that the
ordering of the Bayes factors shows good agreement with that of frequentist p-values.
Finally, we deal with specifying prior probabilities for the hypotheses, by taking linkage disequilibrium structure
into account and exploiting the concept of effective numbers of tests (cf. Dickhaus (2014)).
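For intuition, the Dirichlet-multinomial conjugacy can be sketched with a fixed symmetric Dirichlet(alpha) prior instead of the log-Cauchy hyperprior of the F_2 factor (which would additionally integrate over alpha). The multinomial coefficients cancel in the Bayes factor, so no looping over tables with given marginals is needed:

```python
from math import lgamma

def log_dirmult(counts, alpha):
    """log of  integral of  prod_j p_j^{n_j}  Dir(p | alpha,...,alpha) dp
    (multinomial coefficients omitted; they cancel in the Bayes factor)."""
    k, n = len(counts), sum(counts)
    return (lgamma(k * alpha) - k * lgamma(alpha)
            + sum(lgamma(c + alpha) for c in counts)
            - lgamma(n + k * alpha))

def log_bf_association(table, alpha=1.0):
    """log Bayes factor for a table with fixed row totals:
    H1 (each row has its own multinomial over genotypes) against
    H0 (one shared multinomial, i.e. independence).  This sketch uses
    a fixed symmetric Dirichlet(alpha), not the F_2 factor itself."""
    pooled = [sum(col) for col in zip(*table)]
    return (sum(log_dirmult(row, alpha) for row in table)
            - log_dirmult(pooled, alpha))
```

As the abstract notes, the cost of evaluating such closed-form marginal likelihoods does not grow with the sample size, unlike frequentist exact tests.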
17:45 – 18:20
Computation of an effective number of simultaneous X^2 (chi-square) tests
Jens Stange
X^2 tests are well known and frequently applied in statistical analyses, in particular for discrete models. We consider an application to genetic association studies, where a large number M, say, of 2x3 contingency tables is tested simultaneously. We present a method controlling the family-wise error rate which makes use of an effective number of tests, in the flavor of the Sidak multiplicity correction. The method approximates the full M-dimensional distribution of the involved X^2 test statistics by a product of k-dimensional marginal distributions. A challenge of this procedure is the efficient computation of the k-dimensional distributions. Apart from time-consuming Monte Carlo procedures, there are only few implementations even for small dimensions of multivariate distributions. Existing formulas for the cumulative distribution function of the multivariate X^2 distribution are now implemented for approximations with k up to 4.
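Once an effective number of tests is available, the Sidak-style correction reduces to solving for a per-test level. A minimal sketch (the hard part, deriving the effective number from the k-dimensional marginal distributions, is not shown here):

```python
def sidak_local_level(alpha, m_eff):
    """Per-test significance level solving
    1 - (1 - a)^m_eff = alpha, i.e. a Sidak-type correction with an
    effective number of tests m_eff (typically < M under dependence)."""
    return 1.0 - (1.0 - alpha) ** (1.0 / m_eff)

# With M = 1000 dependent tests but an effective number of, say, 200,
# the local level is less conservative than the full Sidak correction.
full = sidak_local_level(0.05, 1000)
eff = sidak_local_level(0.05, 200)
```

Replacing M by m_eff is exactly where the dependence structure of the test statistics buys power while family-wise error control is retained.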
18:30
Banquet
