
Documents 65C40 | records found: 21


Research talks; Probability and Statistics

In this short course, we recall the basics of Markov chain Monte Carlo (Gibbs & Metropolis samplers) along with the most recent developments such as Hamiltonian Monte Carlo, Rao-Blackwellisation, divide & conquer strategies, pseudo-marginal and other noisy versions. We also cover the specific approximate method of ABC that is currently used in many fields to handle complex models under manageable conditions, from the original motivation in population genetics to the several reinterpretations of the approach found in the recent literature. Time allowing, we will also comment on programming developments such as BUGS, STAN and Anglican that stemmed from these specific algorithms.

65C05 ; 65C40 ; 60J10 ; 62F15
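
A minimal sketch of the random-walk Metropolis sampler recalled in this course, assuming a user-supplied log-density; the function name, the Gaussian proposal and all parameter values are illustrative assumptions, not the course's own implementation.

```python
import numpy as np

def metropolis(log_target, x0, n_steps=10_000, step=1.0, rng=None):
    """Minimal random-walk Metropolis sampler (illustrative sketch only).

    log_target : callable returning the log of an unnormalised target density.
    x0         : starting point (float or 1-D array).
    step       : standard deviation of the Gaussian proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    lp = log_target(x)
    samples = np.empty((n_steps, x.size))
    for i in range(n_steps):
        prop = x + step * rng.standard_normal(x.size)   # symmetric proposal
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:          # accept/reject step
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Example: sample a standard normal target.
draws = metropolis(lambda x: -0.5 * np.sum(x**2), x0=0.0)
```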


Research schools

This course presents an overview of modern Bayesian strategies for solving imaging inverse problems. We will start by introducing the Bayesian statistical decision theory framework underpinning Bayesian analysis, and then explore efficient numerical methods for performing Bayesian computation in large-scale settings. We will pay special attention to high-dimensional imaging models that are log-concave w.r.t. the unknown image, related to so-called “convex imaging problems”. This will provide an opportunity to establish connections with the convex optimisation and machine learning approaches to imaging, and to discuss some of their relative strengths and drawbacks. Examples of topics covered in the course include: efficient stochastic simulation and optimisation numerical methods that tightly combine proximal convex optimisation with Markov chain Monte Carlo techniques; strategies for estimating unknown model parameters and performing model selection; methods for calculating Bayesian confidence intervals for images and performing uncertainty quantification analyses; and new theory regarding the role of convexity in maximum-a-posteriori and minimum-mean-square-error estimation. The theory, methods, and algorithms are illustrated with a range of mathematical imaging experiments.

49N45 ; 65C40 ; 65C60 ; 65J22 ; 68U10 ; 62C10 ; 62F15 ; 94A08
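
A hedged sketch of the kind of proximal MCMC step described in the course, assuming a toy denoising model y = x + noise with an l1 prior handled through its proximal operator (a Moreau-Yosida regularised Langevin step). All names and parameter values below are illustrative assumptions, not the course's own code.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def myula_sketch(y, sigma=0.1, reg=0.05, smooth=0.01, delta=5e-4,
                 n_iter=5_000, rng=None):
    """Illustrative Moreau-Yosida regularised Langevin sampler for the toy
    posterior p(x|y) proportional to exp(-||y-x||^2/(2*sigma^2) - reg*||x||_1).
    Parameter values are placeholders, not recommendations."""
    rng = np.random.default_rng() if rng is None else rng
    x = y.copy()
    samples = []
    for _ in range(n_iter):
        grad_f = (x - y) / sigma**2                            # smooth data-fit term
        grad_g = (x - soft_threshold(x, smooth * reg)) / smooth  # gradient of the Moreau envelope of the l1 prior
        x = x - delta * (grad_f + grad_g) \
            + np.sqrt(2 * delta) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Example: denoise a sparse signal observed with Gaussian noise.
rng = np.random.default_rng(0)
truth = np.zeros(128)
truth[::16] = 1.0
y = truth + 0.1 * rng.standard_normal(truth.shape)
post = myula_sketch(y, rng=rng)
print(post.mean(axis=0)[:8])   # posterior-mean estimate of the first coefficients
```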


- xxiv; 446 p.
ISBN 978-3-642-33304-0

Lecture notes in mathematics, 2068

Location: Collection, 1st floor

stochastic geometry # spatial analysis # random fields # spatial statistics

60D05 ; 52A22 ; 60G55 ; 60G60 ; 60G57 ; 60F05 ; 60F15 ; 60J25 ; 62M30 ; 65C40 ; 60-06 ; 00B25


Research School

This talk focuses on the estimation of the distribution of unobserved nodes in large random graphs from the observation of very few edges. These graphs naturally model tournaments involving a large number of players (the nodes) whose ability to win is unknown. The players are only partially observed through discrete-valued scores (edges) describing the results of contests between players. In this very sparse setting, we present the first nonasymptotic risk bounds for maximum likelihood estimators (MLE) of the unknown distribution of the nodes. The proof relies on the construction of a graphical model encoding conditional dependencies that is extremely efficient for studying n-regular graphs obtained using a round-robin scheduling. This graphical model allows us to prove geometric loss-of-memory properties and to deduce the asymptotic behaviour of the likelihood function. Following a classical construction in learning theory, the asymptotic likelihood is used to define a measure of performance for the MLE. Risk bounds for the MLE are finally obtained by subgaussian deviation results derived from concentration inequalities for Markov chains applied to our graphical model.

62F15 ; 62C10 ; 65C60 ; 65C40
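
As a hedged aside (not part of the record), the n-regular comparison graphs mentioned above can be generated by a round-robin schedule. The sketch below uses the standard circle method; it is an illustrative assumption, not the authors' exact construction.

```python
def round_robin_schedule(n_players):
    """Circle-method round-robin schedule: every player meets every other
    player exactly once, yielding an (n_players - 1)-regular comparison graph.
    Assumes n_players is even for simplicity."""
    players = list(range(n_players))
    rounds = []
    for _ in range(n_players - 1):
        pairs = [(players[i], players[n_players - 1 - i])
                 for i in range(n_players // 2)]
        rounds.append(pairs)
        # keep the first player fixed and rotate the others
        players = [players[0]] + [players[-1]] + players[1:-1]
    return rounds

# Example: schedule for 6 players, 5 rounds of 3 matches each.
for r, matches in enumerate(round_robin_schedule(6), 1):
    print(f"round {r}: {matches}")
```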


Research talks

During this talk, I will present how the development of non-reversible algorithms based on piecewise deterministic Markov processes (PDMP) was first motivated by the impressive successes of cluster algorithms for the simulation of lattice spin systems. I will especially stress how the spin-involution symmetry crucial to the cluster schemes was replaced by the exploitation of more general symmetries, in particular thanks to the factorization of the energy function.

65C05 ; 65C40 ; 60K35 ; 68K87


Research talks; Probability and Statistics

Consider a problem of Markovian trajectories of particles for which you are trying to estimate the probability of an event.
Under the assumption that you can represent this event as the last event of a nested sequence of events, it is possible to design a splitting algorithm that estimates the probability of the last event in an efficient way. Moreover, you can obtain a sequence of trajectories which realize this particular event, giving access to a statistical representation of quantities conditional on realizing the event.
In this talk I will present the "Adaptive Multilevel Splitting" algorithm and its application to various toy models. I will explain why it yields an unbiased estimator of the probability, and I will give results obtained from numerical simulations.

60J22 ; 65C35 ; 65C05 ; 65C40
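
A minimal sketch of the adaptive multilevel splitting idea on a toy model, assuming a Gaussian random-walk trajectory and the rare event that its running maximum exceeds a level L; function names and parameter values are illustrative, not the speaker's implementation.

```python
import numpy as np

def ams_estimate(L=6.0, n_traj=100, T=50, max_iter=100_000, rng=None):
    """Toy adaptive multilevel splitting estimate of P(max_{t<=T} X_t >= L)
    for the random walk X_t = X_{t-1} + N(0,1), X_0 = 0. Sketch only."""
    rng = np.random.default_rng() if rng is None else rng

    def walk(x0, steps):
        return x0 + np.cumsum(rng.standard_normal(steps))

    trajs = [np.concatenate(([0.0], walk(0.0, T))) for _ in range(n_traj)]
    scores = np.array([t.max() for t in trajs])
    prob = 1.0
    for _ in range(max_iter):
        if scores.min() >= L:
            break
        worst = int(np.argmin(scores))             # trajectory to kill
        level = scores[worst]
        prob *= (n_traj - 1) / n_traj              # this factor yields the unbiased estimator
        donor = worst
        while donor == worst:                      # pick a surviving trajectory to branch from
            donor = int(rng.integers(n_traj))
        d = trajs[donor]
        k = int(np.argmax(d > level))              # donor's first crossing of the current level
        new = np.concatenate((d[:k + 1], walk(d[k], T - k)))
        trajs[worst] = new
        scores[worst] = new.max()
    return prob * np.mean(scores >= L)

print(ams_estimate())
```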


Research schools; Probability and Statistics

We first introduce the Metropolis-Hastings algorithm. We then consider the Random Walk Metropolis algorithm on $R^n$ with Gaussian proposals, when the target probability measure is the $n$-fold product of a one-dimensional law. It is well known that, in the limit as $n$ tends to infinity, starting at equilibrium and for an appropriate scaling of the variance and of the timescale as a function of the dimension $n$, a diffusive limit is obtained for each component of the Markov chain. We generalize this result when the initial distribution is not the target probability measure. The obtained diffusive limit is the solution to a stochastic differential equation nonlinear in the sense of McKean. We prove convergence to equilibrium for this equation. We discuss practical counterparts in order to optimize the variance of the proposal distribution and thereby accelerate convergence to equilibrium. Our analysis confirms the interest of the constant acceptance rate strategy (with acceptance rate between 1/4 and 1/3).

60J22 ; 60J10 ; 60G50 ; 60F17 ; 60J60 ; 60G09 ; 65C40 ; 65C05
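
A hedged sketch of the dimension scaling discussed above: random-walk Metropolis on the $n$-fold product of standard normals with proposal variance ell^2 / n. The value ell = 2.38 and all other choices are illustrative, not part of the lecture.

```python
import numpy as np

def rwm_acceptance(n=100, ell=2.38, n_steps=20_000, rng=None):
    """Average acceptance rate of random-walk Metropolis on the n-fold product
    of standard normals, with proposal variance ell^2 / n (the dimension
    scaling discussed in the abstract). Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.standard_normal(n)                    # start at equilibrium
    lp = -0.5 * np.dot(x, x)
    accepted = 0
    for _ in range(n_steps):
        prop = x + (ell / np.sqrt(n)) * rng.standard_normal(n)
        lp_prop = -0.5 * np.dot(prop, prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
    return accepted / n_steps

# With ell close to 2.38 the empirical acceptance rate is roughly 0.23,
# close to the 1/4-1/3 range discussed in the abstract.
print(rwm_acceptance())
```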


Research schools

This course presents an overview of modern Bayesian strategies for solving imaging inverse problems. We will start by introducing the Bayesian statistical decision theory framework underpinning Bayesian analysis, and then explore efficient numerical methods for performing Bayesian computation in large-scale settings. We will pay special attention to high-dimensional imaging models that are log-concave w.r.t. the unknown image, related to so-called “convex imaging problems”. This will provide an opportunity to establish connections with the convex optimisation and machine learning approaches to imaging, and to discuss some of their relative strengths and drawbacks. Examples of topics covered in the course include: efficient stochastic simulation and optimisation numerical methods that tightly combine proximal convex optimisation with Markov chain Monte Carlo techniques; strategies for estimating unknown model parameters and performing model selection; methods for calculating Bayesian confidence intervals for images and performing uncertainty quantification analyses; and new theory regarding the role of convexity in maximum-a-posteriori and minimum-mean-square-error estimation. The theory, methods, and algorithms are illustrated with a range of mathematical imaging experiments.

49N45 ; 65C40 ; 65C60 ; 65J22 ; 68U10 ; 62C10 ; 62F15 ; 94A08


Research schools

This course presents an overview of modern Bayesian strategies for solving imaging inverse problems. We will start by introducing the Bayesian statistical decision theory framework underpinning Bayesian analysis, and then explore efficient numerical methods for performing Bayesian computation in large-scale settings. We will pay special attention to high-dimensional imaging models that are log-concave w.r.t. the unknown image, related to so-called “convex imaging problems”. This will provide an opportunity to establish connections with the convex optimisation and machine learning approaches to imaging, and to discuss some of their relative strengths and drawbacks. Examples of topics covered in the course include: efficient stochastic simulation and optimisation numerical methods that tightly combine proximal convex optimisation with Markov chain Monte Carlo techniques; strategies for estimating unknown model parameters and performing model selection; methods for calculating Bayesian confidence intervals for images and performing uncertainty quantification analyses; and new theory regarding the role of convexity in maximum-a-posteriori and minimum-mean-square-error estimation. The theory, methods, and algorithms are illustrated with a range of mathematical imaging experiments.

49N45 ; 65C40 ; 65C60 ; 65J22 ; 68U10 ; 62C10 ; 62F15 ; 94A08


Research schools

This course presents an overview of modern Bayesian strategies for solving imaging inverse problems. We will start by introducing the Bayesian statistical decision theory framework underpinning Bayesian analysis, and then explore efficient numerical methods for performing Bayesian computation in large-scale settings. We will pay special attention to high-dimensional imaging models that are log-concave w.r.t. the unknown image, related to so-called “convex imaging problems”. This will provide an opportunity to establish connections with the convex optimisation and machine learning approaches to imaging, and to discuss some of their relative strengths and drawbacks. Examples of topics covered in the course include: efficient stochastic simulation and optimisation numerical methods that tightly combine proximal convex optimisation with Markov chain Monte Carlo techniques; strategies for estimating unknown model parameters and performing model selection; methods for calculating Bayesian confidence intervals for images and performing uncertainty quantification analyses; and new theory regarding the role of convexity in maximum-a-posteriori and minimum-mean-square-error estimation. The theory, methods, and algorithms are illustrated with a range of mathematical imaging experiments.

49N45 ; 65C40 ; 65C60 ; 65J22 ; 68U10 ; 62C10 ; 62F15 ; 94A08


- xiv; 667 p.
ISBN 978-1-4398-4095-5

Texts in statistical science

Location: Books, ground floor (BAYE)

Bayesian statistics # data analysis # inference # generalized linear model # multivariate model # integration # Markov chain simulation # BayesDA # R # BUGS

62-01 ; 62F15 ; 62-07 ; 62Jxx ; 62Pxx ; 65C40


- xvii; 308 p.
ISBN 978-3-540-79225-3

Mathématiques & applications, 0063

Location: Collection, 1st floor

spatial analysis # statistical method # distribution # lattice model # variogram # autoregression # Gibbs-Markov field # point process # MCMC algorithm

62M30 ; 62H11 ; 86A32 ; 62-02 ; 65C40 ; 65C05 ; 00A71


- xvi; 232 p.
ISBN 978-1-118-51707-9

Wiley series in probability and statistics

Location: Books, ground floor (GRAH)

Markov process # Monte Carlo method # Markov chain # numerical analysis

60-01 ; 60J10 ; 60J22 ; 65C40


- xvi; 447 p.
ISBN 978-0-521-13951-9

Cambridge series in statistical and probabilistic mathematics

Location: Books, ground floor (MONA)

numerical analysis # mathematical analysis # mathematical statistics # non-linear regression # Monte Carlo method # random numbers # Markov chain # sorting and searching # numerical linear algebra # Fourier transform # systems of non-linear equations # quadrature # mathematical programming

65C60 ; 65-01 ; 62-01 ; 62J02 ; 65C10 ; 65C05 ; 65C40 ; 68P10 ; 65Fxx ; 65T50 ; 65D32 ; 65H10 ; 65K05


- xv; 256 p.
ISBN 978-2-8178-0180-3

Pratique R

Location: Books, ground floor (ROBE)

Monte Carlo method # R programming language # statistical simulation # Bayesian analysis

65C05 ; 11K45 ; 65C10 ; 65C20 ; 65C40 ; 65C35 ; 62J10 ; 65K05 ; 65C60 ; 65-02 ; 62D05 ; 60J65 ; 62-01 ; 62-04 ; 90C15 ; 90C27 ; 65Y15


- xii; 243 p.
ISBN 978-0-387-87836-2

Undergraduate texts in mathematics

Location: Books, ground floor (SHON)

computer simulation # probability # distribution # algorithm # mathematical optimization # stochastic process # Monte Carlo method # game theory # numerical simulation

65C05 ; 68T05 ; 60J20 ; 60G40 ; 60-01 ; 60J22 ; 65C40 ; 82C20 ; 65-02 ; 65C35 ; 60G05 ; 11K45 ; 65C50 ; 68W30 ; 90C15 ; 60G50


- xvi; 387 p.
ISBN 978-3-540-44213-4

Applications of mathematics, 0027

Location: Books, ground floor (WINK)

image analysis # random field # Monte Carlo method # image processing # simulation # sampling # Markov chain # Metropolis algorithm # texture # neural networks # tomography # parameter estimation

62H35 ; 62M40 ; 68U20 ; 65C05 ; 65C40 ; 65Y05 ; 60J20 ; 60K35 ; 68U10 ; 68-02 ; 65K10 ; 93E10


- 645 p.
ISBN 978-0-387-21239-5

Springer texts in statistics

Location: Books, ground floor (ROBE)

simulation method # Monte Carlo method # random generation # Markov chain # latent variable # Bayesian statistics # Metropolis algorithm

62-01 ; 65C40 ; 65-01


- 114 p.
ISBN 978-0-521-89001-4

London mathematical society student texts, 0052

Location: Collection, 1st floor

probability # Markov chain # Monte Carlo method # perfect simulation # Propp-Wilson algorithm # Ising model # travelling salesman problem

60-01 ; 60J10 ; 60J22 ; 65C40


- 309 p.
ISBN 978-0-387-95547-6

Applied mathematical sciences, 0155

Location: Books, ground floor (CHAL)

image processing # stochastic modelling # statistics # Markov chain # image analysis # spline model # parameter estimation # stochastic optimization # continuous Gaussian fields

68U10 ; 00A71 ; 62-02 ; 62H35 ; 62M40 ; 65C40 ; 68-02 ; 90C90
