Time Series for Actuaries

Akrit Awasthi
6 min read · Oct 6, 2021

Introduction

Time series: a set of observations indexed by time. A time series is a sequence of data points that occur in successive order over some period of time. In investing, a time series tracks the movement of chosen data points, such as a security’s price, over a specified period, with data points recorded at regular intervals.

A time series can be modelled as a stochastic process with:

  1. Discrete time
  2. Continuous state space

The key idea is that the observations are related to one another and are therefore not independent. It is this dependency that we want to analyse and take advantage of: if we understand the past, we can use it to predict the future.

Five key aims

  1. Describe the data
  2. Fit a model
  3. Forecast future value
  4. Decide if a process is out of control
  5. Look for connections with other time series

Time series stationarity

Strict stationarity

Strictly stationary: the statistical properties (the full joint distributions) do not change over time. In particular, the expected value and the variance of the series are constant for all t.

But it is very difficult to test all of the statistical properties in this way.

So another, weaker notion is used in practice: weak stationarity.

Weak stationarity

The mean is constant over time.

The covariance between two observations depends only on the time difference (the lag) between them.
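In the usual notation (not shown in the original post), these two conditions read: E[X_t] = μ for all t, and Cov(X_t, X_(t+k)) = γ(k), a function of the lag k only, so that in particular Var(X_t) = γ(0) is also constant.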

Importance

We can only estimate parameter values from historical data if the series is stationary; otherwise the estimates are not meaningful.

If the time series is non-stationary, we can difference it until it becomes stationary.
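A rough sketch of this in R (not from the original post: it assumes the tseries package for the augmented Dickey-Fuller test and uses a simulated random walk as the non-stationary series):

library(tseries)          # provides adf.test()
set.seed(1)
y <- cumsum(rnorm(200))   # a random walk, which is non-stationary
adf.test(y)               # large p-value: no evidence of stationarity
y_diff <- diff(y)         # take first differences
adf.test(y_diff)          # small p-value: the differenced series looks stationary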

Markov property

In order to forecast the future, one simply needs the present, not the past: the present value contains all the relevant information about the past (similar to the salvation principle in Christianity).

Auto-covariance and autocorrelation functions

The auto-covariance function (ACVF) is defined as the covariance between two observations of the series; the autocorrelation function (ACF) is the ACVF scaled by the variance, so it always lies between -1 and 1.

The partial autocorrelation function (PACF) measures the correlation between observations k steps apart after the effect of the intermediate observations has been removed, i.e. the correlation of the residuals.
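In R, the sample versions of these functions can be computed with the base acf() and pacf() functions (the simulated AR(1) series below is purely illustrative):

x <- arima.sim(list(order = c(1, 0, 0), ar = 0.6), n = 300)  # an AR(1) series for illustration
acf(x, type = "covariance")   # sample auto-covariance function (ACVF)
acf(x)                        # sample autocorrelation function (ACF)
pacf(x)                       # sample partial autocorrelation function (PACF)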

White noise and other common types of time series

White noise

Its mean function is 0: the average value of white noise is zero, and the series just fluctuates randomly up and down around it.

Its ACF (autocorrelation function) is 1 if k = 0 and 0 everywhere else.

It is a purely indeterministic process: the ACF tends to 0 as the lag k tends to infinity.
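The post does not show how the WN series below was generated; one simple possibility (an assumption here) is to simulate 500 independent standard normal values:

set.seed(2021)             # illustrative seed for reproducibility
WN <- rnorm(500)           # 500 independent N(0, 1) draws, i.e. Gaussian white noise
ts.plot(WN, ylab = "WN")   # produces a plot like the one below
acf(WN)                    # sample ACF: spike at lag 0, roughly 0 at all other lags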

white noise series graph

Now you can calculate the mean and standard deviation of this series and notice that the mean is close to 0 and the standard deviation is close to 1.

> mean(WN)
[1] -0.02708959
> var(WN)
[1] 1.04946
> sd(WN)
[1] 1.024431

When we plot the autocorrelation function we can see that we get a spike at lag zero and nothing significant elsewhere.

ACF Plot

ARIMA Time Series

AR stands for autoregressive process

I stands for integrated

MA stands for Moving Average

White noise = ARIMA(0,0,0): the case where all three of these parameters are zero.

Creating white noise in R:

> WhiteN <- arima.sim(model = list(order = c(0, 0, 0)), n = 500)

Autoregressive: AR(p)

The idea is that the current value of X is a linear combination of its past values plus a random variation term.
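In standard notation (not from the original post), an AR(p) process can be written as X_t = α_1 X_(t-1) + α_2 X_(t-2) + … + α_p X_(t-p) + e_t, where e_t is a white noise error term.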

AR2 = arima.sim(list(order=c(2,0,0), ar=c(0.7,0.1)), n = 300)

To be stationary, the parameters must satisfy certain conditions: the roots of the characteristic equation must all lie outside the unit circle (for an AR(1) process this simply means |α_1| < 1).
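For the AR(2) example above we can check this condition with the base R function polyroot (a quick sketch, not from the original post):

roots <- polyroot(c(1, -0.7, -0.1))   # roots of the characteristic equation 1 - 0.7z - 0.1z^2 = 0
Mod(roots)                            # both moduli are greater than 1, so this AR(2) is stationary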

Moving average — MA(q)

The idea is that the current value of X depends on recent past random error terms as well as the current one. It is “smoothed noise” and is stationary regardless of the parameter values.
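In standard notation (again not from the original post), an MA(q) process can be written as X_t = e_t + β_1 e_(t-1) + … + β_q e_(t-q), where the e_t are white noise error terms.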

MA2 = arima.sim(list(order=c(0,0,2), ma=c(0.7,0.1)), n = 300)

Integrated I(d)

What is a random walk?

A random walk says that my current position is equal to my previous position plus a random step. The expected value of any position is simply where I started, so the mean is constant, but the variance grows over time, which is why a random walk is not stationary.

The “integrated” part refers to the use of differencing of raw observations (e.g. subtracting an observation from the observation at the previous time step) in order to make the time series stationary.
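A quick illustration in R (a sketch, not from the original post): build a random walk by cumulatively summing white noise, then recover the white noise by differencing.

set.seed(7)
e  <- rnorm(300)       # white noise
RW <- cumsum(e)        # random walk = integrated white noise
ts.plot(RW)            # wanders without settling around a level: non-stationary
ts.plot(diff(RW))      # the first difference returns the white noise steps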

Parameters of the ARIMA model are defined as follows:

p: the number of lag observations included in the model, also called the lag order.

d: the number of times that the raw observations are differenced, also called the degree of differencing.

q: the size of the moving average window, also called the order of the moving average.
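In R these three parameters are passed to arima() and arima.sim() in the same order, as order = c(p, d, q). A small sketch (the series and orders are illustrative only, not the post’s data):

x   <- arima.sim(list(order = c(2, 0, 0), ar = c(0.7, 0.1)), n = 300)  # p = 2, d = 0, q = 0
fit <- arima(x, order = c(2, 0, 0))                                    # fit with the same (p, d, q)
fit$coef                                                               # estimated ar1, ar2 and intercept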

Fitting time series to data

Before we fit a model:

Test for stationarity.

If the series is not stationary, decide how to transform it to make it stationary.

Then we fit an ARIMA model and determine its parameters.

Then we can forecast future values (a short sketch follows below).
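Putting the last two steps together, a minimal end-to-end sketch (simulated data; not from the original post):

set.seed(11)
y   <- arima.sim(list(order = c(1, 1, 0), ar = 0.5), n = 300)  # a non-stationary ARIMA(1,1,0) series
fit <- arima(y, order = c(1, 1, 0))                            # d = 1, so arima() differences once internally
fc  <- predict(fit, n.ahead = 12)                              # forecast the next 12 values
ts.plot(y, fc$pred, lty = c(1, 2))                             # original series plus the forecast path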

How do we determine that the data is not stationary?

We need to isolate the deterministic patterns:

  1. Trends — growing company
  2. Seasonality — tourism Revenue

To make the series stationary, we can treat it as the integrated version of a more fundamental (stationary) process.

Example: a random walk is the integrated version of white noise.

Detecting non-stationary time series

First we look at our plots.

The standard one is simply the time series values plotted against time; from it we can visually identify trend and seasonality.

Next, the autocorrelation function: if the series is stationary, the ACF will converge to zero as the lag increases.

If the convergence is slow, the series needs to be differenced.

If the ACF oscillates, there is an underlying cause of the variation (such as seasonality).

If it does not converge at all, that suggests there is a trend.

Least squares trend removal

A time series with a trend is called non-stationary.

An identified trend can be modeled. Once modeled, it can be removed from the time series dataset. This is called detrending the time series.

If a dataset does not have a trend or we successfully remove the trend, the dataset is said to be trend stationary.

Let’s do this in R code:

> set.seed(121117)
> y = arima.sim(list(order = c(1,1,0),ar = 0.5), n = 300)
> ts.plot(y, xlab = "Time", ylab = "Simulated Value ARIMA(1,1,0)")
Output

You can see in this plot that there is a very clear trend.

Plot a least squares fit in R:

> x <- 1:301
> leastsquarefit <- lm(y~x)
> leastsquarefit$coefficient
(Intercept) x
-7.24371389 0.06500729
> abline(leastsquarefit)

Check the Residuals

> plot(leastsquarefit$residuals, xlab = "Time", ylab = "Residuals")
> acf(leastsquarefit$residuals)

The first line draws this graph. We can see that the residuals do not look randomly distributed; there is a clear pattern going on when we inspect them.

ACF graph

The second line draws the autocorrelation function of the residuals. It shows a very slow decay, which strongly indicates that there is still a trend present.

Now, when it comes to fitting these models, we can use the autocorrelation function and the partial autocorrelation function to determine the parameters.

PACF Graph

The partial autocorrelation function shows a value close to 1 at the first lag followed by negative values.

This shows that the series is not stationary, so we will take differences.
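To finish the example, a short sketch (not in the original post) of taking that difference and re-checking the plots:

ydiff <- diff(y)   # first difference of the simulated ARIMA(1,1,0) series
ts.plot(ydiff)     # the trend is gone
acf(ydiff)         # the ACF now dies away quickly
pacf(ydiff)        # a single clear spike at lag 1, consistent with an AR(1) for the differenced series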

Conclusion

In this blog I have explained ARIMA time series, the Markov property, and what stationary and non-stationary mean. I have also discussed white noise and residual errors, and shown how to detect non-stationary data.
