12 Linear regression


This chapter introduces the basics of linear regression modeling. It covers ordinary least-squares (OLS) regression, a maximum-likelihood approach, and finally a Bayesian approach. Impatient readers may wish to skip directly to the Bayesian analyses, but understanding the OLS and MLE approaches helps to see the bigger (historical) picture and also to appreciate some of the formal results pertaining to the Bayesian analysis.

All concrete calculations in this chapter are based on the same running example using the murder data set.

Based on a different example, the following video gives an overview of the main ideas behind a Bayesian approach to simple linear regression.

The learning goals for this chapter are:

  • understand what a linear regression model is
  • see the conceptual differences between OLS, MLE and Bayesian approaches to linear regression
  • be able to find best-fitting values for OLS and MLE regression using R’s built-in functions (see the code sketch after this list)
  • be able to sample a posteriori credible values for a regression model using the non-informative standard model
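
To give a first impression of the last two goals, here is a minimal R sketch (not the chapter's own code). It assumes a data frame `murder_data` with columns `murder_rate` and `low_income`; these column names are placeholders for whatever the murder data set actually provides. The first part fits the model with R's built-in `lm()` (for the regression coefficients, the OLS solution coincides with the maximum-likelihood estimate); the second part samples a posteriori credible values under the non-informative standard model, i.e. the improper prior p(β, σ²) ∝ 1/σ², using the standard conjugate results.

```r
## OLS / MLE fit with R's built-in lm()
fit <- lm(murder_rate ~ low_income, data = murder_data)
coef(fit)     # best-fitting intercept and slope
sigma(fit)    # residual standard error

## posterior sampling under the non-informative standard model:
##   sigma^2 | y           ~ scaled-inverse-chi-squared(n - k, s^2)
##   beta    | sigma^2, y  ~ Normal(beta_hat, sigma^2 (X'X)^{-1})
X        <- model.matrix(fit)                  # design matrix
n        <- nrow(X); k <- ncol(X)
beta_hat <- coef(fit)
XtX_inv  <- solve(t(X) %*% X)
s2       <- sum(residuals(fit)^2) / (n - k)    # residual variance estimate

n_samples      <- 10000
sigma2_samples <- (n - k) * s2 / rchisq(n_samples, df = n - k)
beta_samples   <- t(sapply(sigma2_samples, function(s2_i) {
  MASS::mvrnorm(1, mu = beta_hat, Sigma = s2_i * XtX_inv)
}))
head(beta_samples)   # posterior samples for intercept and slope
```

The chapter spells out where these sampling formulas come from; the sketch is only meant to preview what "finding best-fitting values" and "sampling a posteriori credible values" look like in practice.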