
Understanding White Noise in Time Series Models: The Case of x_t = x_{t-1} + e_t

Jan 23, 2024

In time series analysis, the concept of white noise plays a crucial role in understanding and modeling variables observed over time. In this article, we'll examine the equation x_t = x_{t-1} + e_t, where e_t is strict white noise, and discuss its implications for time series models.

To delve into this subject, let's first define what we mean by 'white noise.' White noise is a sequence of random variables with mean zero, constant variance, and zero correlation across time. Strict white noise strengthens this: the terms are independent and identically distributed (i.i.d.). In time series models, white noise serves as the basic building block for understanding, modeling, and making inferences about the underlying structure of a given series.
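These properties are easy to check empirically. A minimal sketch, assuming Gaussian i.i.d. draws as the white-noise sequence (one common choice, not the only one): the sample mean and the lag-1 sample autocorrelation should both be close to zero.

```python
import numpy as np

# Illustrative strict white noise: i.i.d. Gaussian draws with mean 0.
rng = np.random.default_rng(42)
e = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Sample mean should be close to zero.
print(e.mean())

# Lag-1 sample autocorrelation (correlation of the series with itself
# shifted by one step) should also be close to zero.
lag1 = np.corrcoef(e[:-1], e[1:])[0, 1]
print(lag1)
```

With 10,000 draws, both statistics land within a few hundredths of zero, consistent with the definition above.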

Now let's consider the equation x_t = x_{t-1} + e_t, where x_t represents our variable of interest at time t, and e_t is strict white noise. This equation describes a random walk. It is the special case of a first-order autoregressive, or AR(1), model x_t = φ·x_{t-1} + e_t in which the coefficient φ equals 1.

In this model, the current value, x_t, is the previous value, x_{t-1}, plus a random shock, e_t, which is assumed to be white noise. The shock adds an element of unpredictability, so the behavior of the series is not perfectly determined by its past values. Because the coefficient on x_{t-1} is exactly 1, each shock persists indefinitely: the process accumulates its errors over time.
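The recursion can be simulated directly. A minimal sketch, assuming Gaussian shocks and a starting value of x_0 = 0 (both illustrative choices, not specified above):

```python
import numpy as np

# Simulate x_t = x_{t-1} + e_t with Gaussian strict white noise shocks.
rng = np.random.default_rng(0)
n = 500
e = rng.normal(size=n)

x = np.empty(n)
x[0] = e[0]                 # x_1 = x_0 + e_1, taking x_0 = 0
for t in range(1, n):
    x[t] = x[t - 1] + e[t]  # current value = previous value + shock

# Equivalently, the whole path is the cumulative sum of the shocks,
# which makes the accumulation of errors explicit.
assert np.allclose(x, np.cumsum(e))
```

The cumulative-sum identity shows why the shocks never fade: x_t is the sum of every e_s up to time t.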

So, where is the strict white noise located in this equation? It is the term e_t, the random error component of the model. Because the shocks are i.i.d. with mean zero, the past contains no information about the next shock, and the best forecast of x_t given the history of the series is simply x_{t-1}.
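Rearranging the equation gives e_t = x_t - x_{t-1}, so first-differencing the series recovers the white-noise shocks exactly. A short sketch, again assuming illustrative Gaussian noise and x_0 = 0:

```python
import numpy as np

# Build a random walk, then recover its shocks by first-differencing.
rng = np.random.default_rng(7)
e = rng.normal(size=1000)
x = np.cumsum(e)            # x_t = x_{t-1} + e_t, taking x_0 = 0

recovered = np.diff(x)      # x_t - x_{t-1} for t = 2, ..., n
assert np.allclose(recovered, e[1:])
```

This is why differencing is the standard way to turn a random walk back into a series that behaves like white noise.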

In summary, strict white noise plays a pivotal role in time series models, and in the equation x_t = x_{t-1} + e_t it is the error term e_t. Understanding white noise and its role in time series analysis enables analysts to make better inferences, forecasts, and decisions about the underlying data-generating processes.
