Bayesian Modeling with RJAGS
In this course, you'll learn how to implement more advanced Bayesian models using RJAGS.
4 Hours · 15 Videos · 58 Exercises
Course Description
The Bayesian approach to statistics and machine learning is logical, flexible, and intuitive. In this course, you will engineer and analyze a family of foundational, generalizable Bayesian models. These range in scope from fundamental one-parameter models to intermediate multivariate & generalized linear regression models. The popularity of such Bayesian models has grown along with the availability of computing resources required for their implementation. You will utilize one of these resources: the rjags package in R. Combining the power of R with the JAGS (Just Another Gibbs Sampler) engine, rjags provides a framework for Bayesian modeling, inference, and prediction.
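To work through the examples, the JAGS engine must be installed on your computer first (it is a standalone program, separate from R); the rjags package can then be installed from CRAN and loaded as usual. A minimal setup sketch:

```r
# Assumes the JAGS engine is already installed on your system
# (it is a separate download, not an R package).
install.packages("rjags")

# Loading rjags also attaches coda, which supplies tools for
# summarizing and plotting Markov chain output.
library(rjags)
```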
- 1
Introduction to Bayesian Modeling
Bayesian models combine prior insights with insights from observed data to form updated, posterior insights about a parameter. In this chapter, you will review these Bayesian concepts in the context of the foundational Beta-Binomial model for a proportion parameter. You will also learn how to use the rjags package to define, compile, and simulate this model in R; a code sketch follows the exercise list below.
- The prior model (50 xp)
- Simulating a Beta prior (100 xp)
- Comparing & contrasting Beta priors (100 xp)
- Which prior? (50 xp)
- Data & the likelihood (50 xp)
- Simulating the dependence of X on p (100 xp)
- Approximating the likelihood function (100 xp)
- Interpreting the likelihood function (50 xp)
- The posterior model (50 xp)
- Define, compile, and simulate (100 xp)
- Updating the posterior (100 xp)
- Influence of the prior & data on the posterior (50 xp)
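Here is a minimal sketch of the define, compile, and simulate workflow for a Beta-Binomial model in rjags. The prior parameters (a Beta(45, 55) prior) and data (10 successes in 20 trials) are illustrative stand-ins, not the values used in the exercises:

```r
library(rjags)

# DEFINE the model: a Binomial likelihood for the count X and a
# Beta prior for the proportion p
beta_binomial_model <- "model{
  X ~ dbin(p, n)    # likelihood
  p ~ dbeta(a, b)   # prior
}"

# COMPILE the model, supplying the observed data and prior parameters
beta_binomial_jags <- jags.model(
  textConnection(beta_binomial_model),
  data = list(X = 10, n = 20, a = 45, b = 55)
)

# SIMULATE a posterior sample of p via Markov chain Monte Carlo
beta_binomial_sim <- coda.samples(
  model = beta_binomial_jags,
  variable.names = c("p"),
  n.iter = 10000
)

# Summarize and plot the posterior sample
summary(beta_binomial_sim)
plot(beta_binomial_sim)
```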
- 2
Bayesian Models & Markov Chains
The two-parameter Normal-Normal Bayesian model provides a simple foundation for Normal regression models. In this chapter, you will engineer the Normal-Normal model and define, compile, and simulate it using rjags; a code sketch follows the exercise list below. You will also explore the magic of the Markov chain mechanics behind rjags simulation.
- The Normal-Normal model (50 xp)
- Normal-Normal priors (100 xp)
- Sleep study data (100 xp)
- Insights from the prior and data (50 xp)
- Simulating the Normal-Normal in RJAGS (50 xp)
- Define, compile, & simulate the Normal-Normal (100 xp)
- Posterior insights on sleep deprivation (50 xp)
- Markov chains (50 xp)
- Storing Markov chains (100 xp)
- Markov chain trace plots (100 xp)
- Markov chain density plots (100 xp)
- Markov chain diagnostics & reproducibility (50 xp)
- Multiple chains (100 xp)
- Naive standard errors (100 xp)
- Reproducibility (100 xp)
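Here is a minimal sketch of the Normal-Normal workflow with multiple Markov chains. The data are simulated stand-ins for the sleep study outcomes and the priors are illustrative, not the course's exact choices:

```r
library(rjags)

# Simulated stand-in for the sleep study outcomes:
# change in reaction time (ms) after sleep deprivation
set.seed(10)
Y <- rnorm(18, mean = 30, sd = 25)

# DEFINE: Normal likelihood with mean m and sd s; Normal prior on m,
# Uniform prior on s. JAGS parameterizes dnorm by mean and precision,
# hence the s^(-2).
normal_normal_model <- "model{
  for(i in 1:length(Y)) {
    Y[i] ~ dnorm(m, s^(-2))
  }
  m ~ dnorm(50, 25^(-2))
  s ~ dunif(0, 200)
}"

# COMPILE with 4 parallel chains for convergence diagnostics
normal_normal_jags <- jags.model(
  textConnection(normal_normal_model),
  data = list(Y = Y),
  n.chains = 4
)

# SIMULATE the chains
normal_normal_sim <- coda.samples(
  model = normal_normal_jags,
  variable.names = c("m", "s"),
  n.iter = 10000
)

# Trace plots, density plots, and numerical summaries
# (including naive standard errors)
plot(normal_normal_sim)
summary(normal_normal_sim)
```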
- 3
Bayesian Inference & Prediction
In this chapter, you will extend the Normal-Normal model to a simple Bayesian regression model. Within this context, you will explore how to use rjags simulation output to conduct posterior inference. Specifically, you will construct posterior estimates of regression parameters using posterior means & credible intervals, test hypotheses using posterior probabilities, and construct posterior predictive distributions for new observations; a code sketch follows the exercise list below.
- A simple Bayesian regression model (50 xp)
- Regression priors (100 xp)
- Visualizing the regression priors (100 xp)
- Weight & height data (100 xp)
- Bayesian regression in RJAGS (50 xp)
- Define, compile, & simulate the regression model (100 xp)
- Regression Markov chains (50 xp)
- Posterior estimation & inference (50 xp)
- Posterior point estimates (100 xp)
- Posterior credible intervals (100 xp)
- Posterior probabilities (100 xp)
- Posterior prediction (50 xp)
- Inference for the posterior trend (100 xp)
- Calculating posterior predictions (100 xp)
- Posterior predictive distribution (100 xp)
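Here is a minimal sketch of a simple Bayesian regression with posterior estimation, hypothesis testing, and prediction. The height and weight data are simulated stand-ins and the priors are illustrative:

```r
library(rjags)

# Simulated stand-in data: height X (cm) and weight Y (kg)
set.seed(1)
X <- rnorm(100, mean = 170, sd = 10)
Y <- -100 + 1.1 * X + rnorm(100, sd = 10)

# DEFINE: Normal likelihood with linear trend a + b*X, plus priors on
# the intercept a, slope b, and residual standard deviation s
regression_model <- "model{
  for(i in 1:length(Y)) {
    Y[i] ~ dnorm(a + b * X[i], s^(-2))
  }
  a ~ dnorm(0, 200^(-2))
  b ~ dnorm(1, 0.5^(-2))
  s ~ dunif(0, 20)
}"

# COMPILE and SIMULATE
regression_jags <- jags.model(
  textConnection(regression_model),
  data = list(X = X, Y = Y)
)
regression_sim <- coda.samples(
  model = regression_jags,
  variable.names = c("a", "b", "s"),
  n.iter = 10000
)

# Posterior estimation & inference from the chain output
regression_chains <- data.frame(regression_sim[[1]])
mean(regression_chains$b)                        # posterior mean of the slope
quantile(regression_chains$b, c(0.025, 0.975))   # 95% credible interval
mean(regression_chains$b > 1)                    # posterior probability that b > 1

# Posterior prediction for a new subject with X = 180 cm
predictions <- rnorm(nrow(regression_chains),
                     mean = regression_chains$a + regression_chains$b * 180,
                     sd   = regression_chains$s)
quantile(predictions, c(0.025, 0.975))           # 95% posterior predictive interval
```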
- 4
Multivariate & Generalized Linear Models
In this final chapter, you will generalize the simple Normal regression model for application in broader contexts. You will incorporate categorical predictors, engineer a multivariate regression model with two predictors, and finally extend this methodology to Poisson multivariate regression models for count variables; a code sketch follows the exercise list below.
- Bayesian regression with a categorical predictor (50 xp)
- RailTrail sample data (100 xp)
- RJAGS simulation with categorical variables (100 xp)
- Interpreting categorical coefficients (50 xp)
- Inference for volume by weekday (100 xp)
- Multivariate Bayesian regression (50 xp)
- Re-examining the RailTrail data (100 xp)
- RJAGS simulation for multivariate regression (100 xp)
- Interpreting multivariate regression parameters (50 xp)
- Posterior inference for multivariate regression (100 xp)
- Bayesian Poisson regression (50 xp)
- RJAGS simulation for Poisson regression (100 xp)
- Plotting the Poisson regression model (100 xp)
- Inference for the Poisson rate parameter (100 xp)
- Poisson posterior prediction (100 xp)
- Conclusion (50 xp)
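Here is a minimal sketch of Bayesian Poisson regression with one quantitative and one categorical predictor, in the spirit of the RailTrail example but using simulated stand-in data and hypothetical variable names:

```r
library(rjags)

# Simulated stand-in data: daily trail volume (a count), daily high
# temperature, and a weekday indicator
set.seed(1)
n <- 90
hightemp <- rnorm(n, mean = 70, sd = 10)
weekday  <- rbinom(n, size = 1, prob = 5 / 7)
volume   <- rpois(n, lambda = exp(2 + 0.02 * hightemp - 0.1 * weekday))

# DEFINE: Poisson likelihood with a log-linear rate, and vague Normal
# priors (specified by mean and precision) on the coefficients
poisson_model <- "model{
  for(i in 1:length(Y)) {
    Y[i] ~ dpois(l[i])
    log(l[i]) <- a + b * X[i] + c * Z[i]
  }
  a ~ dnorm(0, 0.001)
  b ~ dnorm(0, 0.001)
  c ~ dnorm(0, 0.001)
}"

# COMPILE and SIMULATE
poisson_jags <- jags.model(
  textConnection(poisson_model),
  data = list(Y = volume, X = hightemp, Z = weekday)
)
poisson_sim <- coda.samples(
  model = poisson_jags,
  variable.names = c("a", "b", "c"),
  n.iter = 10000
)

# Posterior summaries of the coefficients (on the log scale)
summary(poisson_sim)
```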
Datasets
Sleep study data

Collaborators



Alicia Johnson
Associate Professor, Macalester College
Professor Johnson’s primary research interest is in Markov chain Monte Carlo (MCMC) methods. The focus of her research has been on the convergence rates of chains corresponding to MCMC algorithms on general state spaces. In addition, she enjoys the unlimited applications of statistics. As a consultant for a division of the Centers for Disease Control and Prevention and the University of Minnesota, she has collaborated on projects in entomology, forestry, primatology, public health, and others. In other words, her work in statistics allows her to keep learning a little about a lot!