# Bayesian Modeling with RJAGS

In this course, you'll learn how to implement more advanced Bayesian models using RJAGS.
4 Hours · 15 Videos · 58 Exercises · 5,448 Learners · 4,650 XP


## Course Description

The Bayesian approach to statistics and machine learning is logical, flexible, and intuitive. In this course, you will engineer and analyze a family of foundational, generalizable Bayesian models. These range in scope from fundamental one-parameter models to intermediate multivariate & generalized linear regression models. The popularity of such Bayesian models has grown along with the availability of computing resources required for their implementation. You will utilize one of these resources: the rjags package in R. Combining the power of R with the JAGS (Just Another Gibbs Sampler) engine, rjags provides a framework for Bayesian modeling, inference, and prediction.

### 1. Introduction to Bayesian Modeling (Free)
Bayesian models combine prior insights with insights from observed data to form updated, posterior insights about a parameter. In this chapter, you will review these Bayesian concepts in the context of the foundational Beta-Binomial model for a proportion parameter. You will also learn how to use the rjags package to define, compile, and simulate this model in R.
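
As a preview of that workflow, here is a minimal sketch of defining, compiling, and simulating a Beta-Binomial model with rjags. The prior parameters and data values are illustrative placeholders, not the course's own examples.

```r
library(rjags)

# DEFINE the Beta-Binomial model:
# prior p ~ Beta(a, b), likelihood X ~ Binomial(n, p)
bb_model <- "model{
    p ~ dbeta(a, b)    # prior for the proportion p
    X ~ dbin(p, n)     # note JAGS argument order: dbin(probability, size)
}"

# COMPILE the model (prior and data values here are hypothetical)
bb_jags <- jags.model(
  textConnection(bb_model),
  data = list(a = 1, b = 1, X = 6, n = 10),
  n.chains = 1
)

# SIMULATE the posterior of p with 10,000 Markov chain draws
bb_sim <- coda.samples(bb_jags, variable.names = "p", n.iter = 10000)

summary(bb_sim)   # posterior summary statistics
plot(bb_sim)      # trace and density plots
```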
### 2. Bayesian Models & Markov Chains

The two-parameter Normal-Normal Bayesian model provides a simple foundation for Normal regression models. In this chapter, you will engineer the Normal-Normal and define, compile, and simulate this model using rjags. You will also explore the magic of the Markov chain mechanics behind rjags simulation.
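
A minimal sketch of what this two-parameter model can look like in rjags, assuming simulated data and illustrative priors. Note that JAGS parameterizes the Normal distribution by mean and precision, so `s^(-2)` converts a standard deviation into a precision:

```r
library(rjags)

# DEFINE a two-parameter Normal model with Normal & Uniform priors
nn_model <- "model{
    for(i in 1:length(Y)) {
        Y[i] ~ dnorm(m, s^(-2))   # likelihood, precision = 1/s^2
    }
    m ~ dnorm(0, 100^(-2))        # prior for the mean
    s ~ dunif(0, 20)              # prior for the standard deviation
}"

# COMPILE with simulated data and run 4 parallel Markov chains
set.seed(10)
nn_jags <- jags.model(
  textConnection(nn_model),
  data = list(Y = rnorm(50, mean = 5, sd = 2)),
  n.chains = 4
)

# SIMULATE: the resulting trace plots expose the Markov chain mechanics
nn_sim <- coda.samples(nn_jags, variable.names = c("m", "s"), n.iter = 5000)
plot(nn_sim)
```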
### 3. Bayesian Inference & Prediction

In this chapter, you will extend the Normal-Normal model to a simple Bayesian regression model. Within this context, you will explore how to use rjags simulation output to conduct posterior inference. Specifically, you will construct posterior estimates of regression parameters using posterior means & credible intervals, you will test hypotheses using posterior probabilities, and you will construct posterior predictive distributions for new observations.
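
For concreteness, here is a sketch of that inference workflow for a simple regression model, using hypothetical simulated data and vague priors. The posterior mean, credible interval, hypothesis probability, and predictive draws are all computed directly from the Markov chain output:

```r
library(rjags)

# Simple Bayesian regression: Y[i] ~ N(a + b*X[i], s^2), vague priors
reg_model <- "model{
    for(i in 1:length(Y)) {
        mu[i] <- a + b * X[i]
        Y[i] ~ dnorm(mu[i], s^(-2))
    }
    a ~ dnorm(0, 0.001)
    b ~ dnorm(0, 0.001)
    s ~ dunif(0, 20)
}"

# Hypothetical data with true intercept 2 and slope 1.5
set.seed(1)
X <- runif(100, 0, 10)
Y <- 2 + 1.5 * X + rnorm(100, sd = 3)

reg_jags <- jags.model(textConnection(reg_model),
                       data = list(X = X, Y = Y), n.chains = 1)
reg_sim  <- coda.samples(reg_jags, c("a", "b", "s"), n.iter = 10000)

# Posterior inference directly from the chain output
chains <- as.data.frame(reg_sim[[1]])
mean(chains$b)                       # posterior mean estimate of the slope
quantile(chains$b, c(0.025, 0.975))  # 95% posterior credible interval
mean(chains$b > 1)                   # posterior probability that b > 1

# Posterior predictive distribution for a new observation at X = 5
pred <- rnorm(nrow(chains), chains$a + chains$b * 5, chains$s)
```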
### 4. Multivariate & Generalized Linear Models

In this final chapter, you will generalize the simple Normal regression model for application in broader contexts. You will incorporate categorical predictors, engineer a multivariate regression model with two predictors, and finally extend this methodology to Poisson multivariate regression models for count variables.
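
A sketch of where that progression ends up: a Poisson regression with one quantitative predictor X and one 0/1 categorical indicator Z, fit to hypothetical simulated counts. The log link keeps the Poisson rate positive:

```r
library(rjags)

# Poisson regression with quantitative (X) and categorical (Z) predictors
pois_model <- "model{
    for(i in 1:length(Y)) {
        Y[i] ~ dpois(lambda[i])
        log(lambda[i]) <- a + b * X[i] + c * Z[i]   # log link
    }
    a ~ dnorm(0, 0.01)
    b ~ dnorm(0, 0.01)
    c ~ dnorm(0, 0.01)   # effect of the categorical indicator Z
}"

# Hypothetical simulated count data
set.seed(2)
X <- runif(200, 0, 2)
Z <- rbinom(200, 1, 0.5)                       # 0/1 categorical indicator
Y <- rpois(200, exp(0.5 + 0.8 * X + 0.4 * Z))  # simulated counts

pois_jags <- jags.model(textConnection(pois_model),
                        data = list(Y = Y, X = X, Z = Z), n.chains = 1)
pois_sim <- coda.samples(pois_jags, c("a", "b", "c"), n.iter = 10000)
summary(pois_sim)
```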
In the following tracks: Statistician

Collaborators: Nick Solomon, Chester Ismay, Eunkyung Park

#### Alicia Johnson

Associate Professor, Macalester College
Professor Johnson’s primary research interest is in Markov chain Monte Carlo (MCMC) methods. The focus of her research has been on the convergence rates of chains corresponding to MCMC algorithms on general state spaces. In addition, she enjoys the unlimited applications of statistics. As a consultant for a division of the Centers for Disease Control and Prevention and the University of Minnesota, she has collaborated on projects in entomology, forestry, primatology, public health, and others. In other words, her work in statistics allows her to keep learning a little about a lot!

## What do other learners have to say?

> "I've used other sites—Coursera, Udacity, things like that—but DataCamp's been the one that I've stuck with."
>
> Devon Edwards Joseph, Lloyds Banking Group

> "DataCamp is the top resource I recommend for learning data science."
>
> Louis Maiden