## Theory

Dynamic autoregression models extend autoregressive models by allowing the autoregressive coefficients to vary over time. For a univariate time series, the model can be written as:

$$y_{t} = \alpha + \sum_{i=1}^{p} \beta_{i,t} y_{t-i} + \epsilon_{t}, \quad \epsilon_{t} \sim N\left(0, \sigma^{2}\right)$$

$$\beta_{i,t} = \beta_{i,t-1} + \eta_{i,t}, \quad \eta_{i,t} \sim N\left(0, \sigma_{\eta_{i}}^{2}\right)$$

In other words, the autoregressive coefficients follow a random walk. This model can be estimated through the Kalman Filter and Smoother.
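Although PyFlux handles estimation for us, the filtering recursion is simple enough to sketch by hand. Below is a minimal Kalman filter for a time-varying AR(1) coefficient that follows a random walk; the noise variances and the simulated series are illustrative assumptions, not estimated values:

```python
import numpy as np

def kalman_tvar1(y, sigma_eps=1.0, sigma_eta=0.05):
    """Filter y_t = beta_t * y_{t-1} + eps_t, where the coefficient
    beta_t = beta_{t-1} + eta_t follows a random walk.
    Returns the filtered estimates of beta_t (illustrative sketch)."""
    n = len(y)
    beta, P = 0.0, 1e6          # diffuse initialization: huge prior variance
    betas = np.zeros(n)
    for t in range(1, n):
        P += sigma_eta ** 2     # predict: random-walk state transition
        H = y[t - 1]            # the 'observation matrix' is the lagged value
        S = H * P * H + sigma_eps ** 2
        K = P * H / S           # Kalman gain
        beta += K * (y[t] - H * beta)
        P *= (1 - K * H)
        betas[t] = beta
    return betas

# Simulate an AR(1) whose coefficient drifts slowly over time
rng = np.random.default_rng(0)
n = 500
true_beta = 0.5 + 0.3 * np.sin(np.linspace(0, 3, n))
y = np.zeros(n)
for t in range(1, n):
    y[t] = true_beta[t] * y[t - 1] + rng.normal()

est = kalman_tvar1(y)
```

Because the prior variance is diffuse, the first few updates are large; the estimate settles down once the data begin to dominate, which is the same behavior PyFlux's fitted coefficients show below.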

## PyFlux

We’ll run a Dynamic Autoregressive (DAR) model on yearly sunspot data:

```
import numpy as np
import pandas as pd
import pyflux as pf
from datetime import datetime
import matplotlib.pyplot as plt
%matplotlib inline
data = pd.read_csv('https://vincentarelbundock.github.io/Rdatasets/csv/datasets/sunspot.year.csv')
data.index = data['time'].values
plt.figure(figsize=(15,5))
plt.plot(data.index,data['sunspot.year'])
plt.ylabel('Sunspots')
plt.title('Yearly Sunspot Data');
```

Here we specify an arbitrary DAR(9) model:

```
model = pf.DAR(data=data,ar=9,integ=0,target='sunspot.year')
```
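The `integ` argument is the order of differencing applied to the series before fitting; `integ=0` means the model is fit on the levels. As a quick standalone illustration of what a higher value would operate on (this snippet is not part of the PyFlux call):

```python
import numpy as np

y = np.array([10.0, 12.0, 15.0, 11.0])
dy = np.diff(y)   # what integ=1 would fit on: the first differences
print(dy)         # [ 2.  3. -4.]
```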

Next we estimate the latent variables. For this example we will use a maximum likelihood point mass estimate:

```
x = model.fit("MLE")
x.summary()
```

```
DAR(9, integrated=0)
======================================================= =================================================
Dependent Variable: sunspot.year Method: MLE
Start Date: 1709 Log Likelihood: -1179.097
End Date: 1988 AIC: 2380.194
Number of observations: 280 BIC: 2420.1766
=========================================================================================================
Latent Variable Estimate Std Error z P>|z| 95% C.I.
======================================== ========== ========== ======== ======== ========================
Sigma^2 irregular 0.301
Constant 60.0568 23.83 2.5202 0.0117 (13.3499 | 106.7637)
Sigma^2 AR(1) 0.005
Sigma^2 AR(2) 0.0
Sigma^2 AR(3) 0.0005
Sigma^2 AR(4) 0.0001
Sigma^2 AR(5) 0.0002
Sigma^2 AR(6) 0.0011
Sigma^2 AR(7) 0.0002
Sigma^2 AR(8) 0.0003
Sigma^2 AR(9) 0.032
=========================================================================================================
```
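As a sanity check on the output above, the information criteria can be reproduced from the reported log likelihood. With 11 latent variables (the constant, the irregular variance, and the nine AR coefficient variances) and 280 observations:

```python
import math

log_lik = -1179.097  # log likelihood from the summary table above
k = 11               # number of latent variables
n = 280              # number of observations

aic = 2 * k - 2 * log_lik            # 2380.194, matching the table
bic = k * math.log(n) - 2 * log_lik  # about 2420.18, matching the table
```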

Note that the variance terms have no standard errors in the results table because it shows the transformed parameters. If we want standard errors, we can call `x.summary(transformed=False)`. Next we will plot the in-sample fit and the dynamic coefficients using **plot_fit**:

```
model.plot_fit(figsize=(15,40))
```

The sharp changes at the beginning reflect the diffuse initialization; together with high initial uncertainty, this leads to stronger updates towards the beginning of the series. We can predict forward using **plot_predict**:

```
model.plot_predict(h=50,past_values=40,figsize=(15,5))
```

The prediction intervals here are unrealistic and reflect the Gaussian distributional assumption we’ve chosen – we can’t have negative sunspots! – but if we just want the point predictions themselves, we can use the **predict** method:

```
model.predict(h=50)
```

| | sunspot.year |
|---|---|
| 1989 | 139.637190 |
| 1990 | 166.080697 |
| 1991 | 158.856260 |
| 1992 | 123.442112 |
| 1993 | 84.140754 |
| 1994 | 42.171881 |
| 1995 | 15.495871 |
| 1996 | 4.020670 |
| 1997 | 40.250429 |
| 1998 | 84.376278 |
| 1999 | 136.250978 |
| 2000 | 163.849298 |
| 2001 | 162.628458 |
| 2002 | 134.597983 |
| 2003 | 90.473426 |
| 2004 | 45.883060 |
| 2005 | 9.204294 |
| 2006 | 7.601138 |
| 2007 | 30.750374 |
| 2008 | 80.300545 |
| 2009 | 128.983109 |
| 2010 | 162.943127 |
| 2011 | 166.774942 |
| 2012 | 142.477788 |
| 2013 | 98.476385 |
| 2014 | 47.089022 |
| 2015 | 12.418343 |
| 2016 | 2.323678 |
| 2017 | 26.707932 |
| 2018 | 71.845696 |
| 2019 | 124.183167 |
| 2020 | 160.798035 |
| 2021 | 170.815410 |
| 2022 | 149.830120 |
| 2023 | 104.539250 |
| 2024 | 53.246798 |
| 2025 | 12.534159 |
| 2026 | 0.601882 |
| 2027 | 19.683715 |
| 2028 | 65.149369 |
| 2029 | 117.634307 |
| 2030 | 159.336605 |
| 2031 | 173.882239 |
| 2032 | 155.956235 |
| 2033 | 112.454194 |
| 2034 | 58.413341 |
| 2035 | 15.136978 |
| 2036 | -2.363851 |
| 2037 | 14.070445 |
| 2038 | 57.020995 |
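Since the Gaussian assumption allows impossible negative forecasts (see 2036 above), a simple pragmatic fix is to clip the predictions at zero after the fact. Here `preds` is a hypothetical stand-in for the tail of the DataFrame returned by `model.predict(h=50)`:

```python
import pandas as pd

# Hypothetical stand-in for the last rows of model.predict(h=50)
preds = pd.Series([15.136978, -2.363851, 14.070445, 57.020995],
                  index=[2035, 2036, 2037, 2038], name='sunspot.year')

# Clip at zero: sunspot counts cannot be negative
clipped = preds.clip(lower=0)
```

A more principled alternative would be a non-Gaussian observation model, but clipping is often good enough when only point forecasts are needed.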