# Documents 65C05 | records found: 65


## Markov Chain Monte Carlo Methods - Part 1 Robert, Christian P. | CIRM H

Post-edited

Research talks;Probability and Statistics

In this short course, we recall the basics of Markov chain Monte Carlo (Gibbs & Metropolis samplers) along with the most recent developments like Hamiltonian Monte Carlo, Rao-Blackwellisation, divide & conquer strategies, pseudo-marginal and other noisy versions. We also cover the specific approximate method of ABC that is currently used in many fields to handle complex models in manageable conditions, from the original motivation in population genetics to the several reinterpretations of the approach found in the recent literature. Time allowing, we will also comment on the programming developments like BUGS, STAN and Anglican that stemmed from those specific algorithms.
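The Metropolis sampler recalled in the abstract fits in a few lines; the sketch below is my own minimal illustration (the one-dimensional Gaussian target, step size and chain length are arbitrary choices, not course material):

```python
import numpy as np

def metropolis(log_target, x0, n_iter, step=1.0, seed=0):
    """Random-walk Metropolis sampler with Gaussian proposals.

    `log_target` is the log-density of the target, known up to a constant.
    """
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_target(x)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        # Accept with probability min(1, pi(prop)/pi(x)); log scale for stability
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Target: standard normal, log-density known only up to a constant
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_iter=20_000, step=2.4)
print(chain[5000:].mean(), chain[5000:].var())   # should be near 0 and 1
```

The same accept/reject skeleton underlies the Hamiltonian, pseudo-marginal and noisy variants mentioned above; only the proposal mechanism changes.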

## Monte Carlo and quasi-Monte Carlo methods 2010. Selected papers based on the presentations at the 9th international conference on Monte Carlo and quasi-Monte Carlo in scientific computing (MCQMC 2010), Warsaw, August 15-20, 2010 Plaskota, Leszek ; Wozniakowski, Henryk | Springer 2012

Conference proceedings

- xii; 732 p.
ISBN 978-3-642-27439-8

Springer proceedings in mathematics & statistics

Location: Colloque, 1st floor (WARS)

Monte Carlo method # quasi-Monte Carlo method # high-dimensional statistics # finance # numerical analysis # probability # Markov chain

## Monte Carlo and quasi-Monte Carlo methods 2008. Proceedings of the 8th international conference on Monte Carlo and quasi-Monte Carlo methods in scientific computing, Montréal, July 6-11, 2008 L'Ecuyer, Pierre ; Owen, Art B. | Springer 2009

Conference proceedings

- xii; 672 p.
ISBN 978-3-642-04106-8

Location: Colloque, 1st floor (MONT)

Markov chain # Monte Carlo method # quasi-Monte Carlo method

## Monte Carlo and quasi-Monte Carlo methods 2006 Keller, Alexander ; Heinrich, Stefan ; Niederreiter, Harald | Springer 2008

Conference proceedings

- 698 p.
ISBN 978-3-540-74495-5

Location: Colloque, 1st floor (ULM)

numerical analysis # Monte Carlo method # generation of random numbers # pseudo-random numbers # numerical integration # quadrature # irregularity of distribution # computational geometry # integral equation # mathematics for economics

## Monte Carlo methods: workshop on ..., Oct. 25-29 Madras, Neal | American Mathematical Society 2000

Conference proceedings

- 228 p.
ISBN 978-0-8218-1992-0

Fields Institute Communications, 0026

Location: Collection, 1st floor

probability # numerical analysis # Monte Carlo method # Markov chain # statistical physics # Markov process # simulation # numerical modelling

## Quantum Monte Carlo methods in physics and chemistry: proceedings of the NATO Advanced Study Institute held at Cornell University, Ithaca, July 12-24 Nightingale, M. P. ; Umrigar, C. J. | Kluwer Academic Publishers 1999

Conference proceedings

- 467 p.
ISBN 978-0-7923-5551-9

NATO science series: series C: mathematical and physical sciences, 0525

Location: Colloque, 1st floor (NEW)

numerical analysis # chemistry # statistical mechanics # Monte Carlo method # probability method # physics # quantum physics # simulation # structure of matter # quantum field theory # stochastic differential equation # equilibrium

## Statistical multiple integration: proceedings of a joint summer research conference on ... held at Humboldt State University, June 17-23 Flournoy, Nancy ; Tsutakawa, Robert K. | American Mathematical Society 1990

Conference proceedings

- 276 p.
ISBN 978-0-8218-5122-7

Contemporary Mathematics, 0115

Location: Collection, 1st floor

multivariate analysis # decidability # numerical integration # statistical integration # statistical decision # decision theory

## Monte-Carlo methods and applications in neutronics, photonics and statistical physics: proceedings of the joint Los Alamos National Laboratory - CEA meetings, April 22-26 Alcouffe, R. ; Dautray, R. ; Forster, A. | Springer-Verlag 1985

Conference proceedings

ISBN 978-3-540-16070-0

Lecture Notes in Physics, 0240

Location: Colloque, 1st floor (CADA)

methods # Monte Carlo # statistical physics

65C05

## An introduction to particle filters Chopin, Nicolas | CIRM H

Multi angle

Research School

This course will give a gentle introduction to SMC (Sequential Monte Carlo) algorithms:
• motivation: state-space (hidden Markov) models, sequential analysis of such models; non-sequential problems that may be tackled using SMC;
• formalism: Markov kernels, Feynman-Kac distributions;
• Monte Carlo tricks: importance sampling and resampling;
• standard particle filters: bootstrap, guided, auxiliary;
• maximum likelihood estimation of state-space models;
• Bayesian estimation of these models: PMCMC, SMC$^2$.
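The bootstrap filter in the list above can be sketched end to end on a toy linear-Gaussian state-space model (model, parameters and sample sizes are my own arbitrary choices, not course material):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian state-space model (hypothetical parameters):
# X_t = 0.9 X_{t-1} + U_t,  Y_t = X_t + V_t,  with U_t, V_t ~ N(0, 1).
T, N = 50, 2000
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.standard_normal()
y = x_true + rng.standard_normal(T)

# Bootstrap particle filter: mutate with the prior kernel, weight by the
# likelihood of the new observation, then resample (multinomial).
particles = rng.standard_normal(N)
filt_means = np.empty(T)
for t in range(T):
    particles = 0.9 * particles + rng.standard_normal(N)   # mutation
    logw = -0.5 * (y[t] - particles) ** 2                  # weighting
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filt_means[t] = np.sum(w * particles)
    particles = particles[rng.choice(N, size=N, p=w)]      # resampling

print(np.mean(np.abs(filt_means - x_true)))
```

In this Gaussian setting the exact filter is the Kalman filter; the particle filter is of course aimed at the nonlinear, non-Gaussian models the course covers, where the same three steps apply unchanged.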

## Bayesian computational methods Robert, Christian P. | CIRM H

Multi angle

Research School

This is a short introduction to the many directions of current research in Bayesian computational statistics, from accelerating MCMC algorithms, to using partly deterministic Markov processes like the bouncy particle and the zigzag samplers, to approximating the target or the proposal distributions in such methods. The main illustration focuses on the evaluation of normalising constants and ratios of normalising constants.
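The simplest instance of the normalising-constant problem mentioned above is a plain importance-sampling ratio estimator. The sketch below is my own toy example, not from the talk: two unnormalized Gaussian densities are used so that the true ratio is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two unnormalized densities on R (toy choice): q0 is N(0,1) and q1 is
# N(0, sigma^2), both written without their normalizing constants, so the
# true ratio of normalizing constants Z0/Z1 equals 1/sigma.
sigma = 2.0
q0 = lambda x: np.exp(-0.5 * x ** 2)
q1 = lambda x: np.exp(-0.5 * x ** 2 / sigma ** 2)

# Basic importance-sampling identity: Z0/Z1 = E_{p1}[ q0(X) / q1(X) ],
# sampling from the *wider* density p1 so the weights stay bounded.
x = sigma * rng.standard_normal(200_000)
ratio = np.mean(q0(x) / q1(x))
print(ratio)   # close to 1/sigma = 0.5
```

Sampling in the other direction (from the narrower density) would give an infinite-variance estimator here, which is precisely the kind of failure mode that motivates the more refined methods of the talk.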

## From cluster algorithms to PDMP algorithms: a Monte Carlo story of symmetry exploitation Michel, Manon | CIRM H

Multi angle

Research talks

During this talk, I will present how the development of non-reversible algorithms by piecewise deterministic Markov processes (PDMP) was first motivated by the impressive successes of cluster algorithms for the simulation of lattice spin systems. I will especially stress how the spin involution symmetry crucial to the cluster schemes was replaced by the exploitation of more general symmetry, in particular thanks to the factorization of the energy function.

## Rare event simulation for molecular dynamics Guyader, Arnaud | CIRM H

Multi angle

Research talks

This talk is devoted to the presentation of algorithms for simulating rare events in a molecular dynamics context, e.g., the simulation of reactive paths. We will consider $\mathbb{R}^d$ as the space of configurations for a given system, where the probability of a specific configuration is given by a Gibbs measure depending on a temperature parameter. The dynamics of the system is given by an overdamped Langevin (or gradient) equation. The problem is to find how the system can evolve from a local minimum of the potential to another, following the above dynamics. After a brief overview of classical Monte Carlo methods, we will expose recent results on adaptive multilevel splitting techniques.
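The overdamped Langevin dynamics mentioned above is straightforward to simulate with an Euler-Maruyama scheme. The sketch below uses a hypothetical one-dimensional double-well potential (not the molecular systems of the talk; all constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative double-well potential V(x) = (x^2 - 1)^2 with minima at +/-1;
# beta, dt and the run length are arbitrary choices for the sketch.
grad_V = lambda x: 4.0 * x * (x ** 2 - 1.0)
beta, dt, n = 3.0, 1e-3, 200_000

x = -1.0                        # start in the left well
traj = np.empty(n)
for i in range(n):
    # Euler-Maruyama step of dX = -grad V(X) dt + sqrt(2/beta) dW
    x += -grad_V(x) * dt + np.sqrt(2.0 * dt / beta) * rng.standard_normal()
    traj[i] = x

# The process spends most of its time near the two minima; the transitions
# between wells (the reactive paths) are the rare events of interest.
print(np.mean(np.abs(np.abs(traj) - 1.0) < 0.5))
```

At low temperature, direct simulation of a well-to-well transition becomes prohibitively expensive, which is what the splitting techniques of the talk address.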

## Splitting algorithm for nested events Goudenège, Ludovic | CIRM H

Multi angle

Research talks;Probability and Statistics

Consider a problem of Markovian trajectories of particles for which you are trying to estimate the probability of an event.
Under the assumption that you can represent this event as the last event of a nested sequence of events, it is possible to design a splitting algorithm to estimate the probability of the last event in an efficient way. Moreover, you can obtain a sequence of trajectories which realize this particular event, giving access to a statistical representation of quantities conditional on realizing the event.
In this talk I will present the "Adaptive Multilevel Splitting" algorithm and its application to various toy models. I will explain why it creates an unbiased estimator of a probability, and I will give results obtained from numerical simulations.
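A minimal version of the idea uses the nested events $\{\max_k S_k > z\}$ for increasing levels $z$. The sketch below (toy random-walk model; all parameters are my own choices) implements the kill-one-particle-per-iteration variant of Adaptive Multilevel Splitting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate p = P(max_k S_k > L) for a Gaussian random walk with T steps,
# a toy stand-in for a rare trajectory event; T, L, N are arbitrary choices.
T, L, N = 30, 15.0, 200

def walk(prefix=None):
    """Simulate a walk of length T, optionally continuing a given prefix."""
    start = prefix if prefix is not None else np.zeros(0)
    steps = rng.standard_normal(T - len(start))
    last = start[-1] if len(start) else 0.0
    return np.concatenate([start, last + np.cumsum(steps)])

paths = [walk() for _ in range(N)]
scores = np.array([p.max() for p in paths])

J = 0
while scores.min() < L and J < 20_000:   # the cap is only a safety net
    i = scores.argmin()
    z = scores[i]
    # Branch from a random survivor: keep its path up to its first
    # exceedance of the killed level z, then resimulate the remainder.
    j = rng.choice([k for k in range(N) if k != i])
    cut = int(np.argmax(np.maximum.accumulate(paths[j]) > z)) + 1
    paths[i] = walk(prefix=paths[j][:cut])
    scores[i] = paths[i].max()
    J += 1

p_hat = (1.0 - 1.0 / N) ** J   # AMS estimator after J killing steps
print(p_hat)
```

Each iteration raises the current level to the minimal score, so the nested events are built adaptively rather than fixed in advance, which is the "adaptive" in the algorithm's name.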

## Metamodels for uncertainty quantification and reliability analysis Marelli, Stefano | CIRM H

Multi angle

Research schools;Probability and Statistics

Uncertainty quantification (UQ) in the context of engineering applications aims at quantifying the effects of uncertainty in the input parameters of complex models on their output responses. Due to the increased availability of computational power and advanced modelling techniques, current simulation tools can provide unprecedented insight into the behaviour of complex systems. However, the associated computational costs have also increased significantly, often hindering the applicability of standard UQ techniques based on Monte-Carlo sampling. To overcome this limitation, metamodels (also referred to as surrogate models) have become a staple tool in the Engineering UQ community. This lecture will introduce a general framework for dealing with uncertainty in the presence of expensive computational models, in particular for reliability analysis (also known as rare event estimation). Reliability analysis focuses on the tail behaviour of a stochastic model response, so as to compute the probability of exceedance of a given performance measure that would result in a critical failure of the system under study. Classical approximation-based techniques, as well as their modern metamodel-based counterparts, will be introduced.
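The metamodel idea can be illustrated end to end on a toy problem: fit a cheap surrogate on a small number of "expensive" runs, then estimate the exceedance probability by massive sampling of the surrogate. The model, basis and threshold below are invented for the sketch, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "expensive" model (stand-in): failure when g(x) > 3.
def g(x):
    return x[:, 0] ** 2 + np.sin(x[:, 1])

# 1. Small experimental design of expensive runs
X_train = rng.standard_normal((200, 2))
y_train = g(X_train)

# 2. Cheap surrogate: least-squares fit on a quadratic polynomial basis
def features(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

# 3. Probability of exceedance estimated on the surrogate with many samples
X_big = rng.standard_normal((1_000_000, 2))
p_surrogate = np.mean(features(X_big) @ coef > 3.0)
p_direct = np.mean(g(X_big) > 3.0)   # reference, affordable only for this toy
print(p_surrogate, p_direct)
```

In practice the surrogate would be a polynomial chaos expansion or a Gaussian process rather than this bare quadratic fit, and the design would be enriched adaptively near the failure surface; the three-step structure stays the same.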

## Subsurface flow with uncertainty : applications and numerical analysis issues Charrier, Julia | CIRM H

Multi angle

Research schools;Mathematical Physics

In this talk we first quickly present a classical and simple model used to describe flow in porous media (based on Darcy's Law). The high heterogeneity of the media and the lack of data are taken into account by the use of random permeability fields. We then present some mathematical particularities of the random fields frequently used for such applications and the corresponding theoretical and numerical issues.
After giving a short overview of various applications of this basic model, we study in more detail the problem of the contamination of an aquifer by migration of pollutants. We present a numerical method to compute the mean spreading of a diffusive set of particles representing a tracer plume in an advecting flow field. We deal with the uncertainty thanks to a Monte Carlo method and use a stochastic particle method to approximate the solution of the transport-diffusion equation. Error estimates will be established and numerical results (obtained by A. Beaudoin et al. using PARADIS software) will be presented. In particular, the influence of the molecular diffusion and the heterogeneity on the asymptotic longitudinal macrodispersion will be investigated thanks to numerical experiments. Studying qualitatively and quantitatively the influence of molecular diffusion, correlation length and standard deviation is an important question in hydrogeology.
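In a homogeneous velocity field the stochastic particle method mentioned above reduces to a few lines. This sketch (constants are illustrative, not the heterogeneous Darcy fields of the talk) checks the spreading of the plume against the exact value $\mathrm{Var}\,X(t) = 2Dt$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tracer plume as a cloud of particles following dX = u dt + sqrt(2 D) dW.
# Constant u and D are hypothetical; with a heterogeneous velocity field
# the same particle scheme applies, evaluated along each trajectory.
u, D, dt, n_steps, n_part = 1.0, 0.1, 0.01, 1000, 50_000
X = np.zeros(n_part)
for _ in range(n_steps):
    X += u * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_part)

# Longitudinal spreading: Var X(t) = 2 D t in a homogeneous field
t = n_steps * dt
print(X.mean(), X.var(), 2.0 * D * t)
```

The interesting regime studied in the talk is precisely where this closed form breaks down: heterogeneity induces a macrodispersion larger than $2Dt$, which is then estimated by Monte Carlo over realizations of the permeability field.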

## Least squares regression Monte Carlo for approximating BSDES and semilinear PDES Turkedjiev, Plamen | CIRM H

Multi angle

Research schools;Control Theory and Optimization;Probability and Statistics

In this lecture, we shall discuss the key steps involved in the use of least squares regression for approximating the solution to BSDEs. This includes how to obtain explicit error estimates, and how these error estimates can be used to tune the parameters of the numerical scheme based on complexity considerations.
The algorithms are based on a two-stage approximation process. Firstly, a suitable discrete time process is chosen to approximate the continuous time solution of the BSDE. The nodes of the discrete time processes can be expressed as conditional expectations. As we shall demonstrate, the choice of discrete time process is very important, as its properties will impact the performance of the overall numerical scheme. In the second stage, the conditional expectation is approximated in functional form using least squares regression on synthetically generated data - Monte Carlo simulations drawn from a suitable probability distribution. A key feature of the regression step is that the explanatory variables are built on a user-chosen finite dimensional linear space of functions, which the user specifies by setting basis functions. The choice of basis functions is made on the hypothesis that it contains the solution, so regularity and boundedness assumptions are used in its construction. The impact of the choice of the basis functions is exposed in error estimates.
In addition to the choice of discrete time approximation and the basis functions, the Markovian structure of the problem gives significant additional freedom with regards to the Monte Carlo simulations. We demonstrate how to use this additional freedom to develop generic stratified sampling approaches that are independent of the underlying transition density function. Moreover, we demonstrate how to leverage the stratification method to develop a HPC algorithm for implementation on GPUs.
Thanks to the Feynman-Kac relation between the solution of a BSDE and its associated semilinear PDE, the approximation of the BSDE can be directly used to approximate the solution of the PDE. Moreover, the smoothness properties of the PDE play a crucial role in the selection of the hypothesis space of regression functions, so this relationship is vitally important for the numerical scheme.
We conclude with some drawbacks of the regression approach, notably the curse of dimensionality.
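A single regression step of such a scheme can be sketched as follows: the conditional expectation is approximated by least squares on a polynomial basis, then compared with the closed form available for this toy transition (dynamics, payoff, basis and design are my own choices, not from the lecture):

```python
import numpy as np
from math import erf, exp, pi, sqrt

rng = np.random.default_rng(0)

# One regression step: approximate x -> E[ max(X_{k+1}, 0) | X_k = x ] for
# the toy dynamics X_{k+1} = X_k + Z, Z ~ N(0,1), standing in for one
# backward time step of the scheme.
n = 100_000
Xk = rng.uniform(-2.0, 2.0, n)                       # design over the region of interest
Y = np.maximum(Xk + rng.standard_normal(n), 0.0)     # "payoff" at the next step

# User-chosen finite-dimensional hypothesis space: polynomials up to degree 4
B = np.vander(Xk, 5)
coef, *_ = np.linalg.lstsq(B, Y, rcond=None)

# Closed form for comparison: E[max(x + Z, 0)] = x Phi(x) + phi(x)
Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))
phi = lambda t: exp(-0.5 * t * t) / sqrt(2.0 * pi)
for xq in (-1.0, 0.0, 1.0):
    print(xq, np.polyval(coef, xq), xq * Phi(xq) + phi(xq))
```

In the full scheme this regression is repeated backwards in time, the regressed function at step $k+1$ providing the response variable at step $k$, which is how the regression error estimates propagate through the analysis.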

## Multilevel and multi-index sampling methods with applications - Lecture 2: Multilevel and Multi-index Monte Carlo methods for the McKean-Vlasov equation Tempone, Raul | CIRM H

Multi angle

Research schools;Partial Differential Equations;Probability and Statistics

We describe and analyze the Multi-Index Monte Carlo (MIMC) and the Multi-Index Stochastic Collocation (MISC) method for computing statistics of the solution of a PDE with random data. MIMC is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Instead of using first-order differences as in MLMC, MIMC uses mixed differences to reduce the variance of the hierarchical differences dramatically. These mixed differences yield new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence. In the same vein, MISC is a deterministic combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. Provided enough mixed regularity, MISC can achieve better complexity than MIMC. Moreover, we show that, in the optimal case, the convergence rate of MISC is only dictated by the convergence of the deterministic solver applied to a one-dimensional spatial problem. We propose optimization procedures to select the most effective mixed differences to include in MIMC and MISC. Such optimization is a crucial step that allows us to make MIMC and MISC computationally efficient. We show the effectiveness of MIMC and MISC in some computational tests using the mimclib open source library, including PDEs with random coefficients and Stochastic Interacting Particle Systems. Finally, we will briefly discuss the use of Markovian projection for the approximation of prices in the context of American basket options.
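A bare-bones MLMC estimator, the starting point that MIMC generalizes with mixed differences, might look like the sketch below (toy scalar SDE, geometric level hierarchy and hand-picked sample sizes; all choices are mine, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# MLMC for E[X_T] under geometric Brownian motion dX = mu X dt + sig X dW,
# X_0 = 1, whose mean is exp(mu T). Parameters are illustrative.
mu, sig, T = 0.05, 0.2, 1.0

def euler_pair(n_samples, level):
    """Coupled fine/coarse Euler paths driven by the same Brownian increments."""
    nf = 2 ** level                  # fine steps; the coarse path uses nf // 2
    dt = T / nf
    xf = np.ones(n_samples)
    xc = np.ones(n_samples)
    for _ in range(nf // 2):
        dw1 = np.sqrt(dt) * rng.standard_normal(n_samples)
        dw2 = np.sqrt(dt) * rng.standard_normal(n_samples)
        xf += mu * xf * dt + sig * xf * dw1
        xf += mu * xf * dt + sig * xf * dw2
        xc += mu * xc * (2 * dt) + sig * xc * (dw1 + dw2)
    return xf, xc

# Telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], with decreasing
# sample sizes at the finer (more expensive) levels.
L, N = 5, [200_000, 100_000, 50_000, 25_000, 12_000, 6_000]
est = np.mean(1 + mu * T + sig * np.sqrt(T) * rng.standard_normal(N[0]))
for level in range(1, L + 1):
    xf, xc = euler_pair(N[level], level)
    est += np.mean(xf - xc)
print(est, np.exp(mu * T))
```

The key trick is the coupling: fine and coarse paths share the same Brownian increments, so the correction terms have small variance and need few samples; MIMC applies the same idea along several discretization directions at once.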

## Multilevel and multi-index sampling methods with applications - Lecture 1: Adaptive strategies for Multilevel Monte Carlo Tempone, Raul | CIRM H

Multi angle

Research schools;Partial Differential Equations;Probability and Statistics

We will first recall, for a general audience, the use of Monte Carlo and Multi-level Monte Carlo methods in the context of Uncertainty Quantification. Then we will discuss the recently developed Adaptive Multilevel Monte Carlo (MLMC) Methods for (i) Itô Stochastic Differential Equations, (ii) Stochastic Reaction Networks modeled by Pure Jump Markov Processes and (iii) Partial Differential Equations with random inputs. In this context, the notion of adaptivity includes several aspects such as mesh refinements based on either a priori or a posteriori error estimates, the local choice of different time stepping methods and the selection of the total number of levels and the number of samples at different levels. Our Adaptive MLMC estimator uses a hierarchy of adaptively refined, non-uniform time discretizations, and, as such, it may be considered a generalization of the uniform discretization MLMC method introduced independently by M. Giles and S. Heinrich. In particular, we show that our adaptive MLMC algorithms are asymptotically accurate and have the correct complexity with an improved control of the multiplicative constant factor in the asymptotic analysis. In this context, we developed novel techniques for estimation of parameters needed in our MLMC algorithms, such as the variance of the difference between consecutive approximations. These techniques take particular care of the deepest levels, where for efficiency reasons only few realizations are available to produce essential estimates. Moreover, we show the asymptotic normality of the statistical error in the MLMC estimator, justifying in this way our error estimate that allows prescribing both the required accuracy and confidence level in the final result. We present several examples to illustrate the above results and the corresponding computational savings.
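The sample-allocation step discussed above can be made concrete. Given pilot estimates of the per-level variances $V_\ell$ and per-sample costs $C_\ell$, the cost-optimal sizes are $N_\ell \propto \sqrt{V_\ell / C_\ell}$; the numbers below are invented pilot values with the typical geometric decay and growth:

```python
import numpy as np

# Optimal MLMC sample allocation N_l ~ sqrt(V_l / C_l), scaled so that the
# total statistical variance sum_l V_l / N_l meets a target eps^2 / 2.
# V and C are made-up pilot estimates, purely for the sketch.
V = np.array([1.0e-1, 2.5e-2, 6.2e-3, 1.6e-3])   # variance of level-l correction
C = np.array([1.0, 2.0, 4.0, 8.0])               # cost per sample at level l
eps = 1e-2

N = np.ceil((2.0 / eps ** 2) * np.sqrt(V / C) * np.sum(np.sqrt(V * C))).astype(int)
print(N)                                  # many cheap coarse samples, few fine ones
print(np.sum(V / N) <= eps ** 2 / 2)      # True: the variance target is met
```

In the adaptive algorithms of the lecture, $V_\ell$ itself must be estimated on the fly from few realizations at the deepest levels, which is exactly the delicate point the abstract highlights.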

## Optimal vector quantization: from signal processing to clustering and numerical probability Pagès, Gilles | CIRM H

Multi angle

Research schools;Computer Science;Probability and Statistics

Optimal vector quantization was originally introduced in signal processing as a discretization method for random signals, leading to an optimal trade-off between the speed of transmission and the quality of the transmitted signal. In machine learning, similar methods applied to a dataset are the historical core of unsupervised classification methods known as “clustering”. In both cases it appears as an optimal way to produce a set of weighted prototypes (or codebook) which makes up a kind of skeleton of a dataset, a signal and, more generally, from a mathematical point of view, of a probability distribution.
Quantization has encountered in recent years a renewed interest in various application fields like automatic classification, learning algorithms, optimal stopping and stochastic control, backward SDEs and, more generally, numerical probability. In all these applications, practical implementation of such clustering/quantization methods more or less relies on two procedures (and their countless variants): the Competitive Learning Vector Quantization (CLVQ), which appears as a stochastic gradient descent derived from the so-called distortion potential, and the (randomized) Lloyd's procedure (also known as the k-means algorithm, or nuées dynamiques), which is but a fixed-point search procedure. Batch versions of those procedures can also be implemented when dealing with a dataset (or more generally a discrete distribution).
More formally, if $\mu$ is a probability distribution on a Euclidean space $\mathbb{R}^d$, the optimal quantization problem at level $N$ boils down to exhibiting an $N$-tuple $(x_{1}^{*}, \dots, x_{N}^{*})$, solution to

$\mathrm{argmin}_{(x_1,\dotsb,x_N)\in(\mathbb{R}^d)^N} \int_{\mathbb{R}^d} \min_{1\le i\le N} |x_i-\xi|^2 \,\mu(d\xi)$

and its distribution, i.e. the weights $(\mu(C(x_{i}^{*})))_{1\le i\le N}$, where $(C(x_{i}^{*}))_{1\le i\le N}$ is a (Borel) partition of $\mathbb{R}^d$ satisfying

$C(x_{i}^{*})\subset \lbrace\xi\in\mathbb{R}^d : |x_{i}^{*}-\xi|\le \min_{1\le j\le N}|x_{j}^{*}-\xi|\rbrace$.

To produce an unsupervised classification (or clustering) of a (large) dataset $(\xi_k)_{1\le k\le n}$, one considers its empirical measure

$\mu=\frac{1}{n}\sum_{k=1}^{n}\delta_{\xi_k}$

whereas in numerical probability $\mu = \mathcal{L}(X)$ where $X$ is an $\mathbb{R}^d$-valued simulatable random vector. In both situations, CLVQ and Lloyd's procedures rely on massive sampling of the distribution $\mu$.
As for clustering, the classification into $N$ clusters is produced by the partition of the dataset induced by the Voronoi cells $C(x_{i}^{*}), i = 1, \dotsb, N$ of the optimal quantizer.
In this second case, which is of interest for solving nonlinear problems like optimal stopping problems (variational inequalities in terms of PDEs) or stochastic control problems (HJB equations) in medium dimensions, the idea is to produce a quantization tree optimally fitting the dynamics of (a time discretization of) the underlying structure process.
We will explore (briefly) this vast panorama with a focus on the algorithmic aspects, where few theoretical results coexist with many heuristics in a burgeoning literature. We will present a few simulations in two dimensions.
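As a concrete companion to the definitions above, here is a randomized Lloyd iteration for $\mu = \mathcal{N}(0, I_2)$ at level $N = 8$, using a large i.i.d. sample in place of $\mu$ (sample sizes, initialization and iteration count are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomized Lloyd (k-means) iteration for quantizing mu = N(0, I_2) at
# level N = 8, a large i.i.d. sample standing in for the distribution.
N_proto, n_samples = 8, 200_000
xi = rng.standard_normal((n_samples, 2))      # samples from mu
x = rng.standard_normal((N_proto, 2))         # initial codebook

for _ in range(30):
    # Assign each sample to its nearest prototype (the Voronoi cells C(x_i))
    d2 = ((xi[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    cell = d2.argmin(1)
    # ... then move each prototype to the barycenter of its cell
    for i in range(N_proto):
        pts = xi[cell == i]
        if len(pts):
            x[i] = pts.mean(0)

# Final codebook, cell weights mu(C(x_i)) and empirical distortion
d2 = ((xi[:, None, :] - x[None, :, :]) ** 2).sum(-1)
weights = np.bincount(d2.argmin(1), minlength=N_proto) / n_samples
distortion = d2.min(1).mean()
print(distortion, weights)
```

Each pass is exactly one fixed-point step of the Lloyd map (assign, then recenter), and the reported distortion is the empirical counterpart of the integral criterion defined above; CLVQ would instead move one prototype per sample, in stochastic-gradient fashion.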

## The Metropolis Hastings algorithm: introduction and optimal scaling of the transient phase Jourdain, Benjamin | CIRM H

Multi angle

Research schools;Probability and Statistics

We first introduce the Metropolis-Hastings algorithm. We then consider the Random Walk Metropolis algorithm on $\mathbb{R}^n$ with Gaussian proposals, and when the target probability measure is the $n$-fold product of a one dimensional law. It is well-known that, in the limit as $n$ tends to infinity, starting at equilibrium and for an appropriate scaling of the variance and of the timescale as a function of the dimension $n$, a diffusive limit is obtained for each component of the Markov chain. We generalize this result when the initial distribution is not the target probability measure. The obtained diffusive limit is the solution to a stochastic differential equation nonlinear in the sense of McKean. We prove convergence to equilibrium for this equation. We discuss practical counterparts in order to optimize the variance of the proposal distribution to accelerate convergence to equilibrium. Our analysis confirms the interest of the constant acceptance rate strategy (with acceptance rate between 1/4 and 1/3).
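The scaling behaviour described above is easy to observe numerically: with proposal standard deviation $2.38/\sqrt{n}$ on a product target, the empirical acceptance rate of the Random Walk Metropolis chain lands near the classical optimum of about 0.234. This sketch starts at equilibrium; the dimension, target and run length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Walk Metropolis on an i.i.d. standard normal target in R^n with
# the 2.38/sqrt(n) proposal scaling.
n, n_iter = 50, 20_000
x = rng.standard_normal(n)                 # start at (approximate) equilibrium
step = 2.38 / np.sqrt(n)
log_pi = lambda x: -0.5 * np.sum(x * x)    # product target, up to a constant

accepts = 0
lp = log_pi(x)
for _ in range(n_iter):
    prop = x + step * rng.standard_normal(n)
    lp_prop = log_pi(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
        accepts += 1
print(accepts / n_iter)   # in the ballpark of 0.234
```

Out of equilibrium the talk's analysis suggests a somewhat higher target rate (between 1/4 and 1/3); a practical tuning loop simply adjusts `step` until the running acceptance rate hits the chosen target.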
