Week 1: Introduction to time series and the AR(1) process - Introduction
Welcome to Bayesian Statistics: Time Series
Introduction to R
List of References
Week 1: Introduction to time series and the AR(1) process - Stationarity, the ACF and the PACF
Stationarity
The autocorrelation function (ACF)
The partial autocorrelation function (PACF)
Differencing and Smoothing
ACF, PACF, Differencing and Smoothing: Examples
R Code: Differencing and filtering via moving averages
R Code: Simulate data from a white noise process
Week 1: Introduction to time series and the AR(1) process - The AR(1) process: Definition and properties
The AR(1)
The PACF of the AR(1) process
Simulating from an AR(1) process
R Code: Sample data from AR(1) processes
Week 1: Introduction to time series and the AR(1) process - The AR(1): Maximum likelihood estimation and Bayesian inference
Review of maximum likelihood and Bayesian inference in regression
Maximum likelihood estimation in the AR(1)
R Code: MLE for the AR(1), examples
Bayesian inference in the AR(1)
Bayesian inference in the AR(1): Conditional likelihood example
R Code: AR(1) Bayesian inference, conditional likelihood example
Bayesian inference in the AR(1): Full likelihood example
Week 2: The AR(p) process - The general AR(p) process
Definition and state-space representation
Examples
ACF of the AR(p)
Simulating data from an AR(p)
R Code: Computing the roots of the AR polynomial
R Code: Simulating data from an AR(p)
The AR(p): Review
Week 2: The AR(p) process - Bayesian inference in the AR(p)
Bayesian inference in the AR(p): Reference prior, conditional likelihood
R Code: Maximum likelihood estimation, AR(p), conditional likelihood
Model order selection
Example: Bayesian inference in the AR(p), conditional likelihood
R Code: Bayesian inference, AR(p), conditional likelihood
R Code: Model order selection
Spectral representation of the AR(p)
Spectral representation of the AR(p): Example
R Code: Spectral density of AR(p)
ARIMA processes
Week 3: Normal dynamic linear models, Part I - The Normal Dynamic Linear Model: Definition, model classes, and the superposition principle
NDLM: Definition
Polynomial trend models
Regression models
Summary of polynomial trend and regression models
The superposition principle
Superposition principle: General case
Week 3: Normal dynamic linear models, Part I - Bayesian inference in the NDLM: Part I
Filtering
Summary of the filtering distributions
Filtering in the NDLM: Example
R Code: Filtering in the NDLM, Example
Smoothing and forecasting
Summary of the smoothing and forecasting distributions
Smoothing in the NDLM: Example
R Code: Smoothing in the NDLM, Example
Second order polynomial: Filtering and smoothing example
Using the dlm package in R
R Code: Using the dlm package in R
Week 4: Normal dynamic linear models, Part II - Seasonal NDLMs
Fourier representation
Fourier Representation: Example 1
Building NDLMs with multiple components: Examples
Summary: DLM Fourier representation
Week 4: Normal dynamic linear models, Part II - Bayesian inference in the NDLM: Part II
Filtering, Smoothing and Forecasting: Unknown observational variance
Summary of filtering, smoothing and forecasting distributions, NDLM with unknown observational variance
Specifying the system covariance matrix via discount factors
NDLM, Unknown Observational Variance: Example
R Code: NDLM, Unknown Observational Variance Example
Week 4: Normal dynamic linear models, Part II - Case studies
EEG data
Google trends