The autocovariance function for two time points, \(s\) and \(t\), is the expected value of the product of the deviations of observations from their mean values at those times.
\[ \gamma(s, t) = \mathbb{E}[(x_s - \mu_s)(x_t - \mu_t)] \]
Given that \(\mathbb{E}[x_t] = \mu_t\), we can expand \(\gamma(s, t)\) and use the properties of expectation to show that it simplifies to \(\mathbb{E}[x_s x_t] - \mu_s \mu_t\).
Expanding the product inside the expectation: \[ \mathbb{E}[(x_s - \mu_s)(x_t - \mu_t)] = \mathbb{E}[x_s x_t - x_s \mu_t - \mu_s x_t + \mu_s \mu_t] \]
Using linearity of expectation: \[ = \mathbb{E}[x_s x_t] - \mathbb{E}[x_s \mu_t] - \mathbb{E}[\mu_s x_t] + \mathbb{E}[\mu_s \mu_t] \]
Since \(\mu_s\) and \(\mu_t\) are constants (mean values), their expectation is just themselves: \[ = \mathbb{E}[x_s x_t] - \mu_t \mathbb{E}[x_s] - \mu_s \mathbb{E}[x_t] + \mu_s \mu_t \]
Given that \(\mathbb{E}[x_s] = \mu_s\) and \(\mathbb{E}[x_t] = \mu_t\), substituting these into the equation: \[ = \mathbb{E}[x_s x_t] - \mu_t \mu_s - \mu_s \mu_t + \mu_s \mu_t \]
Simplifying: \[ = \mathbb{E}[x_s x_t] - \mu_s \mu_t \]
So, using only the linearity of expectation and the fact that \(\mathbb{E}[x_t] = \mu_t\), we have shown that the autocovariance function can indeed be written as \(\gamma(s, t) = \mathbb{E}[(x_s - \mu_s)(x_t - \mu_t)] = \mathbb{E}[x_s x_t] - \mu_s \mu_t\).
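As a quick sanity check, the identity can be verified numerically. The sketch below uses two arbitrary correlated normal draws standing in for \(x_s\) and \(x_t\); the model and parameters are illustrative assumptions, not part of the problem.
set.seed(1)
n <- 1e5
xs <- rnorm(n, mean = 2)               # stands in for x_s
xt <- 0.5 * xs + rnorm(n, mean = 1)    # stands in for x_t, correlated with xs
# Left-hand side: E[(x_s - mu_s)(x_t - mu_t)]
lhs <- mean((xs - mean(xs)) * (xt - mean(xt)))
# Right-hand side: E[x_s x_t] - mu_s * mu_t
rhs <- mean(xs * xt) - mean(xs) * mean(xt)
c(lhs = lhs, rhs = rhs)                # identical, as the algebra predicts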
t <- 1:200
# The two signals share the same form but differ in the damping rate:
# series (a) decays with factor 20, series (b) with factor 200
s_a <- c(rep(0, length(t[1:100])), 10*exp(-(t[101:200] - 100)/20)*cos(2*pi*(t[101:200] - 100)/4))
# Add Gaussian white noise to each signal (no RNG seed is set, so the
# numerical results below vary from run to run)
x_a <- s_a + rnorm(200)
s_b <- c(rep(0, length(t[1:100])), 10*exp(-(t[101:200] - 100)/200)*cos(2*pi*(t[101:200] - 100)/4))
x_b <- s_b + rnorm(200)
plot.ts(x_a)
# Estimate the mean level of each series with its sample mean
mu_x_a <- mean(x_a)
mu_x_b <- mean(x_b)
cat("Mean of Series a is: ",mu_x_a)
## Mean of Series a is: 0.09728917
cat("\nMean of Series b is: ",mu_x_b)
##
## Mean of Series b is: -0.09941994
# Plot x_t for series a
plot.ts(x_a, main = "Series a: x_t = s_t + w_t", ylab = "x_t")
abline(h = mu_x_a, col = "red", lty = 2)
# Add legend
legend("topright", legend = paste("Mean µ_x_a ≈", round(mu_x_a, 2)), col = "red", lty = 2)
# Plot x_t for series b
plot.ts(x_b, main = "Series b: x_t = s_t + w_t", ylab = "x_t")
abline(h = mu_x_b, col = "blue", lty = 2)
legend("topright", legend = paste("Mean µ_x_b ≈", round(mu_x_b, 2)), col = "blue", lty = 2)
Given the time series model \(x_t = s_t + w_t\), where \(s_t\) is a fixed (deterministic) signal and \(w_t\) is white noise, the mean function is \(\mu_t = \mathbb{E}[x_t] = s_t\). The autocovariance function \(\gamma_x(s, t)\) is then:
\[ \gamma_x(s, t) = E[(x_s - \mu_s)(x_t - \mu_t)] \]
Substituting \(x_t = s_t + w_t\) and \(\mu_t = s_t\), we obtain:
\[ \gamma_x(s, t) = E[((s_s + w_s) - s_s)((s_t + w_t) - s_t)] = E[w_sw_t] \]
For \(s = t\), the autocovariance function reduces to the variance of the white noise:
\[ \gamma_x(t, t) = \mathbb{E}[w_t^2] = \sigma^2_w = 1 \]
That is, at lag 0 the autocovariance equals the white-noise variance, which is 1 for the standard normal noise generated above.
For \(s \neq t\), white-noise terms at different times are uncorrelated with mean zero, so:
\[ \gamma_x(s, t) = \mathbb{E}[w_s w_t] = 0 \]
In other words, there is no correlation between distinct time points; the only nonzero autocovariance occurs at lag 0, where it equals \(\sigma^2_w = 1\). This is the defining property of white noise.
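This can be confirmed with a one-off check in R (a sketch, assuming Gaussian white noise with \(\sigma^2_w = 1\)):
set.seed(1)
w <- rnorm(1e4)   # white noise, sigma_w^2 = 1
# Sample autocovariances: close to 1 at lag 0, close to 0 at other lags
acf(w, lag.max = 5, type = "covariance", plot = FALSE)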
To show that the given time series
\[ x_t = U_1 \sin(2\pi\omega_0 t) + U_2 \cos(2\pi\omega_0 t) \]
is weakly stationary and to derive its autocovariance function
\[ \gamma(h) = \sigma^2 \cos(2\pi\omega_0 h), \]
we need to verify two conditions for weak stationarity: (i) the mean function is constant in \(t\), and (ii) the autocovariance depends only on the lag \(h\), not on \(t\) itself.
Proving the Mean is Constant
Given \(U_1\) and \(U_2\) are independent random variables with zero means, \(E[U_1] = E[U_2] = 0\), and \(E[U_1^2] = E[U_2^2] = \sigma^2\), let’s calculate the mean of \(x_t\):
\[ E[x_t] = E[U_1 \sin(2\pi\omega_0 t) + U_2 \cos(2\pi\omega_0 t)] \]
By linearity of expectation, and because \(\sin(2\pi\omega_0 t)\) and \(\cos(2\pi\omega_0 t)\) are deterministic (for a fixed \(t\) they are simply constants that factor out of the expectation):
\[ E[x_t] = E[U_1] \sin(2\pi\omega_0 t) + E[U_2] \cos(2\pi\omega_0 t) = 0 \]
So, the mean of \(x_t\) is constant (specifically, zero) and does not depend on \(t\).
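A quick numerical check of this (a sketch, with Gaussian \(U_1, U_2\), \(\sigma^2 = 1\), and arbitrary illustrative values of \(\omega_0\) and \(t\)):
set.seed(1)
omega0 <- 1/12; t0 <- 7                    # arbitrary choices
U1 <- rnorm(1e5); U2 <- rnorm(1e5)         # sigma^2 = 1
mean(U1 * sin(2*pi*omega0*t0) + U2 * cos(2*pi*omega0*t0))   # close to 0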
To calculate the autocovariance function \(\gamma(h)\):
\[ \gamma(h) = E\left[(x_t - E[x_t])(x_{t+h} - E[x_{t+h}])\right] \]
Given \(E[x_t] = 0\) and \(E[x_{t+h}] = 0\), this simplifies to:
\[ \gamma(h) = E[x_t x_{t+h}] \]
Substituting \(x_t\) and \(x_{t+h}\) into this expression:
\[ \gamma(h) = E\left[\left(U_1 \sin(2\pi\omega_0 t) + U_2 \cos(2\pi\omega_0 t)\right)\left(U_1 \sin(2\pi\omega_0 (t+h)) + U_2 \cos(2\pi\omega_0 (t+h))\right)\right] \]
Expanding the product, the cross terms involve \(E[U_1 U_2] = E[U_1]E[U_2] = 0\) (by independence and zero means), so only the squared terms survive:
\[ \gamma(h) = E[U_1^2] \sin(2\pi\omega_0 t)\sin(2\pi\omega_0 (t+h)) + E[U_2^2] \cos(2\pi\omega_0 t)\cos(2\pi\omega_0 (t+h)) \]
Since \(E[U_1^2] = E[U_2^2] = \sigma^2\), we can factor out \(\sigma^2\):
\[ \gamma(h) = \sigma^2 \left(\sin(2\pi\omega_0 t)\sin(2\pi\omega_0 (t+h)) + \cos(2\pi\omega_0 t)\cos(2\pi\omega_0 (t+h))\right) \]
Using the property \(\sin(A)\sin(B) + \cos(A)\cos(B) = \cos(A-B)\), we simplify to:
\[ \gamma(h) = \sigma^2 \cos(2\pi\omega_0 t - 2\pi\omega_0 (t+h)) = \sigma^2 \cos(-2\pi\omega_0 h) \]
Since \(\cos(-\theta) = \cos(\theta)\), it follows that:
\[ \gamma(h) = \sigma^2 \cos(2\pi\omega_0 h) \]
This establishes the second property of weak stationarity: the autocovariance depends only on the lag \(h\) and not on the time \(t\). We conclude that the series is weakly stationary with autocovariance function \(\gamma(h) = \sigma^2 \cos(2\pi\omega_0 h)\).
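The result can also be checked by simulation. The sketch below assumes Gaussian \(U_1, U_2\) with \(\sigma^2 = 4\) and \(\omega_0 = 1/12\) (both arbitrary choices) and estimates \(\gamma(h) = E[x_t x_{t+h}]\) at the fixed time \(t = 1\) by averaging over independent replications:
set.seed(1)
sigma2 <- 4; omega0 <- 1/12
h <- 0:24; nrep <- 5000
gamma_hat <- rowMeans(replicate(nrep, {
  U1 <- rnorm(1, sd = sqrt(sigma2))
  U2 <- rnorm(1, sd = sqrt(sigma2))
  x <- U1 * sin(2*pi*omega0*(1:25)) + U2 * cos(2*pi*omega0*(1:25))
  x[1] * x[1 + h]          # products x_1 * x_(1+h) for one replication
}))
plot(h, gamma_hat, ylab = "autocovariance", main = "simulated vs. theoretical")
lines(h, sigma2 * cos(2*pi*omega0*h), col = "red")   # sigma^2 cos(2 pi omega0 h)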
The expected value \(E[x_t]\) for the series \(x_t = \sin(2\pi Ut)\), where \(U\) is uniformly distributed over the interval \((0, 1)\), is calculated by integrating the function \(\sin(2\pi Ut)\) times the probability density function (pdf) of \(U\). The pdf of \(U\) is \(1\) over \((0,1)\). Thus, we calculate the expected value as follows:
\[E[x_t] = \int_0^1 \sin(2\pi Ut) dU\]
Computing this integral gives us:
\[E[x_t] = \left[ -\frac{\cos(2\pi Ut)}{2\pi t} \right]_0^1\]
\[E[x_t] = -\frac{\cos(2\pi t) - \cos(0)}{2\pi t}\]
Given that \(\cos(2\pi t) = \cos(0) = 1\) for any integer \(t\), the expression simplifies to:
\[E[x_t] = -\frac{1 - 1}{2\pi t} = 0\]
This calculation demonstrates that the mean of \(x_t\) is indeed \(0\), satisfying the first condition for weak stationarity that the mean is constant over time.
Note that the integral vanishes precisely because \(t\) takes integer values, making \(\cos(2\pi t) = 1\); for non-integer \(t\), the value \((1 - \cos(2\pi t))/(2\pi t)\) is generally nonzero. Restricted to integer times, therefore, the mean is constant over time, as weak stationarity requires.
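A Monte Carlo check of this point (a sketch; the time points 3 and 2.5 are arbitrary):
set.seed(1)
U <- runif(1e6)
mean(sin(2*pi*U*3))     # integer t: average is close to 0
mean(sin(2*pi*U*2.5))   # non-integer t: close to 2/(5*pi), about 0.127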
To illustrate empirically that the autocovariance structure depends on the lag \(h\) rather than on the time point \(t\), we compare white noise with its three-point moving average:
# Simulate 500 points of Gaussian white noise
w <- rnorm(500, 0, 1)
# smoothing signal using moving average
v = filter(w, sides = 2, filter = rep(1/3, 3))
# plotting white noise and the smoothed signal
plot.ts(w, main = "white noise when points = 500")
plot.ts(v, main = "moving average for series 1 (pts 500)")
ma_clean <- v[!is.na(v)]   # drop the NAs the filter leaves at the ends
acf(w, main = "acf Series s1") # sample ACF of the white noise
acf(ma_clean, lag.max = 20, main = "acf Series MA1") # sample ACF of the moving average
w2 <- rnorm(50, 0, 1)
v2 = filter(w2, sides = 2, filter = rep(1/3, 3))
plot.ts(w2, main = "white noise 50 points")
plot.ts(v2, main = "moving average for series 2 (pts 50)")
ma2 <- v2[!is.na(v2)]
acf(w2, lag.max = 20, main = "acf Series s2")
acf(ma2, lag.max = 20, main = "acf MA Series s2")
With 50 points there is far more discrepancy between the theoretical ACF and the sample ACF than with 500 points. The theoretical ACF of white noise is zero at every nonzero lag (white noise is uncorrelated), whereas the moving average introduces autocorrelation at the first few lags. With only 50 points, the sample ACF of the moving-average series is highly volatile.
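For reference, the theoretical ACF of the three-point moving average can be overlaid on the sample ACF. From \(\gamma(h) = \sigma^2_w (3 - |h|)/9\) for \(|h| \le 2\) and 0 otherwise, the autocorrelation is \(\rho(h) = (3 - |h|)/3\) for \(|h| \le 2\):
h <- 0:20
rho_theo <- ifelse(h <= 2, (3 - h)/3, 0)   # theoretical ACF of the MA
acf(ma_clean, lag.max = 20, main = "sample vs. theoretical ACF (MA, 500 pts)")
points(h, rho_theo, col = "red", pch = 19) # overlay the theoretical values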
phi <- 0.6                # AR(1) coefficient
sigma_squared <- 2.0      # white-noise variance sigma_w^2
n <- 100                  # sample size
X_bar <- 0.271            # observed sample mean
Z_alpha_2 <- 1.96         # two-sided 95% critical value
# Long-run variance: sigma_w^2 / (1 - phi)^2
sigma_adj_squared <- sigma_squared / (1 - phi)^2
# Standard Error of the mean
SE <- sqrt(sigma_adj_squared / n)
# Constructing the 95% Confidence Interval
CI_lower <- X_bar - Z_alpha_2 * SE
CI_upper <- X_bar + Z_alpha_2 * SE
cat("Adjusted Variance: ", sigma_adj_squared, "\n")
## Adjusted Variance: 12.5
cat("Standard Error: ", SE, "\n")
## Standard Error: 0.3535534
cat("95% Confidence Interval: [", CI_lower, ", ", CI_upper, "]\n")
## 95% Confidence Interval: [ -0.4219646 , 0.9639646 ]
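For reference, the interval matches the hand computation based on the large-sample variance of the sample mean of an AR(1) process (a standard result, quoted rather than derived here):
\[ \bar{x} \pm z_{\alpha/2}\sqrt{\frac{\sigma^2_w}{n(1-\phi)^2}} = 0.271 \pm 1.96\sqrt{\frac{2}{100\,(1-0.6)^2}} = 0.271 \pm 0.693. \]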
Since 0 lies inside the 95% confidence interval, we fail to reject the null hypothesis that the true mean is zero at the 5% significance level.