We are interested in finding the Moment Generating Function (MGF) of an \(n\)-dimensional random vector \(\boldsymbol{X}\) having a Multivariate Normal distribution, that is, \(\boldsymbol{X} \sim N_{n}(\boldsymbol{\mu}, \boldsymbol{\Sigma})\). This function, denoted \(M_{\boldsymbol{X}}(\boldsymbol{t})\), provides an alternative characterization of the distribution of \(\boldsymbol{X}\) and is defined as
\[ M_{\boldsymbol{X}}(\boldsymbol{t}) = E[e^{\boldsymbol{t}^{T} \boldsymbol{X} }] \]
The \(k\)-th moment of \(\boldsymbol{X}\) is then given by \(E[\boldsymbol{X}^{k}] = \frac{ d^{k} M_{\boldsymbol{X}}(\boldsymbol{t}) }{d\boldsymbol{t}^{k}} \big|_{\boldsymbol{t=0}}\), that is, the \(k\)-th derivative of the MGF evaluated at \(\boldsymbol{t}=\boldsymbol{0}\).
An \(n\)-dimensional random vector \(\boldsymbol{X}\) has a Multivariate Normal distribution if its density is given by
\[ f(\boldsymbol{x} ) = \frac{1}{(2 \pi)^{n/2} \mid \boldsymbol{\Sigma} \mid^{1/2}} \exp{\bigg(- \frac{1}{2} ( \boldsymbol{x} - \boldsymbol{\mu})^{T} \boldsymbol{\Sigma}^{-1} ( \boldsymbol{x} - \boldsymbol{\mu}) \bigg)} \]
for \(\boldsymbol{x} \in \mathbb{R}^{n}\), where \(\boldsymbol{\mu} = (\mu_{1}, \mu_{2}, \ldots, \mu_{n})^{T}\) is the mean vector and \(\boldsymbol{\Sigma}\) is an \(n \times n\) symmetric positive definite covariance matrix.
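As a quick sanity check of this formula, the short sketch below evaluates the density by hand for an arbitrary two-dimensional example and compares it with dmvnorm from the mvtnorm package; the numbers and the use of mvtnorm are illustrative choices, not part of the derivation.

# evaluate the density formula directly and compare with mvtnorm::dmvnorm
# (illustrative values; assumes the mvtnorm package is installed)
library(mvtnorm)
mu0    <- c(0, 1)
Sigma0 <- matrix(c(2,   0.5,
                   0.5, 1), nrow = 2)
x0     <- c(0.3, 0.7)
n0     <- length(mu0)
quad   <- t(x0 - mu0) %*% solve(Sigma0) %*% (x0 - mu0)   # (x - mu)' Sigma^{-1} (x - mu)
exp(-0.5 * quad) / ((2 * pi)^(n0 / 2) * sqrt(det(Sigma0)))
dmvnorm(x0, mean = mu0, sigma = Sigma0)                  # should match the line above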
The Moment Generating Function (MGF) of \(\boldsymbol{X}\) is then
\[ M_{\boldsymbol{X}}(\boldsymbol{t}) = \exp \bigg( \boldsymbol{t}^{T} \boldsymbol{\mu} + \frac{1}{2} \boldsymbol{t}^{T} \boldsymbol {\Sigma} \boldsymbol{t} \bigg) \]
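The derivation below relies on the MGF of a standard normal vector \(\boldsymbol{Z} \sim N_{n}(\boldsymbol{0}, \boldsymbol{I}_{n})\). Since its components are independent \(N(0,1)\) variables, each with MGF \(e^{t_{i}^{2}/2}\), we have
\[ M_{\boldsymbol{Z}}(\boldsymbol{t}) = E[e^{\boldsymbol{t}^{T} \boldsymbol{Z}}] = \prod_{i=1}^{n} E[e^{t_{i} Z_{i}}] = \prod_{i=1}^{n} e^{t_{i}^{2}/2} = e^{\frac{1}{2} \mid \mid \boldsymbol{t} \mid \mid^{2}} \]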
Indeed, writing \(\boldsymbol{X} = \boldsymbol{\mu} + \boldsymbol{\Sigma}^{1/2} \boldsymbol{Z}\), where \(\boldsymbol{Z} \sim N_{n}(\boldsymbol{0}, \boldsymbol{I}_{n})\) and \(\boldsymbol{\Sigma}^{1/2}\) is a symmetric square root of \(\boldsymbol{\Sigma}\), we have
\[ \begin{align*} M_{\boldsymbol{X}}(\boldsymbol{t}) & = E[e^{\boldsymbol{t}^{T} \boldsymbol{X} }] = E[e^{\boldsymbol{t}^{T}(\boldsymbol{\mu} + \boldsymbol{\Sigma}^{1/2} \boldsymbol{Z}) }] \\ & = e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} } E[e^{(\boldsymbol{\Sigma}^{1/2} \boldsymbol{t})^{T} \boldsymbol{Z}}] \\ & = e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} } M_{\boldsymbol{Z}}(\boldsymbol{\Sigma}^{1/2} \boldsymbol{t} ) \\ & = e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} } e^{\frac{1}{2} \mid \mid \boldsymbol{\Sigma}^{1/2} \boldsymbol{t} \mid \mid^{2}} \\ & = e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} + \frac{1}{2} \boldsymbol{t}^{T} \boldsymbol{\Sigma}^{1/2} \boldsymbol{\Sigma}^{1/2} \boldsymbol{t} } \\ & = e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} + \frac{1}{2} \boldsymbol{t}^{T} \boldsymbol{\Sigma} \boldsymbol{t} } \\ \end{align*} \]
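As an informal numerical check of this closed form, we can compare it with a Monte Carlo estimate of \(E[e^{\boldsymbol{t}^{T} \boldsymbol{X}}]\) at a fixed \(\boldsymbol{t}\). The sketch below does this with mvrnorm from the MASS package; the parameter values, sample size and the use of MASS are arbitrary illustrative choices.

# Monte Carlo check of the closed-form MGF
# (a sketch; assumes the MASS package is installed)
library(MASS)
set.seed(1)
mu_chk    <- c(0, 1)
Sigma_chk <- matrix(c(1,   0.3,
                      0.3, 2), nrow = 2)
t_chk     <- c(0.2, -0.1)
# closed form: exp(t'mu + 0.5 t' Sigma t)
exp(sum(t_chk * mu_chk) + 0.5 * t(t_chk) %*% Sigma_chk %*% t_chk)
# Monte Carlo estimate of E[exp(t'X)] based on simulated draws of X
X_sim <- mvrnorm(n = 1e5, mu = mu_chk, Sigma = Sigma_chk)
mean(exp(X_sim %*% t_chk))   # should be close to the closed-form value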
For the first moment, we have
\[ \begin{align*} E[\boldsymbol{X}] = \frac{ d M_{\boldsymbol{X}}(\boldsymbol{t}) }{d\boldsymbol{t}} \big|_{\boldsymbol{t=0}} = M_{\boldsymbol{X}}'(\boldsymbol{0}) &= (\boldsymbol{\mu} + \boldsymbol{\Sigma} \boldsymbol{t} ) \, e^{ \boldsymbol{t}^{T} \boldsymbol{\mu} + \frac{1}{2} \boldsymbol{t}^{T} \boldsymbol{\Sigma} \boldsymbol{t} } \Big|_{\boldsymbol{t=0}} \\ &= \boldsymbol{\mu} \\ \end{align*} \]
So we conclude that the first moment of a Multivariate Normal random vector is simply its mean vector \(\boldsymbol{\mu} = (\mu_{1}, \mu_{2}, \ldots, \mu_{n})^{T}\). Moments of higher order can be derived from the MGF in a similar way. The R code below evaluates the MGF and its gradient at \(\boldsymbol{t} = \boldsymbol{0}\) for a three-dimensional example, recovering the mean vector.
# parameters
rho_12 <- -0.75; rho_13 <- 0.4; rho_23 <- -0.1
mu1 <- 0; mu2 <- 0; mu3 <- 2
s1 <- 2; s2 <- 1; s3 <- 1
# mean vector
mu <- c(mu1, mu2, mu3)
# covariance matrix (e.g. cov_12 = s1*s2*rho_12)
Sigma <- matrix(c(s1^2,         s1*s2*rho_12, s1*s3*rho_13,
                  s1*s2*rho_12, s2^2,         s2*s3*rho_23,
                  s1*s3*rho_13, s2*s3*rho_23, s3^2), nrow = 3)
# evaluation point; named tt to avoid masking the transpose function t()
tt <- c(0, 0, 0)
# MGF evaluated at tt: exp(t'mu + 0.5 t' Sigma t)
mgf <- exp(t(tt) %*% mu + 0.5 * t(tt) %*% Sigma %*% tt)
# gradient of the MGF: (mu + Sigma t) * exp(t'mu + 0.5 t' Sigma t); at tt = 0 it equals mu
dmgf.dt <- (mu + Sigma %*% tt) %*% mgf
dmgf.dt
## [,1]
## [1,] 0
## [2,] 0
## [3,] 2
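To illustrate the remark about higher-order moments, the Hessian of \(M_{\boldsymbol{X}}(\boldsymbol{t})\) at \(\boldsymbol{t} = \boldsymbol{0}\) equals the second-moment matrix \(E[\boldsymbol{X}\boldsymbol{X}^{T}] = \boldsymbol{\Sigma} + \boldsymbol{\mu}\boldsymbol{\mu}^{T}\). A minimal numerical sketch of this check, assuming the numDeriv package is available (its use here is an illustrative choice):

# second moments from the MGF: numerical Hessian at t = 0 versus Sigma + mu mu'
# (a sketch; assumes the numDeriv package is installed; reuses mu and Sigma from above)
library(numDeriv)
mgf_fun <- function(tvec) as.numeric(exp(sum(tvec * mu) + 0.5 * t(tvec) %*% Sigma %*% tvec))
hessian(mgf_fun, c(0, 0, 0))   # numerical Hessian of the MGF at t = 0
Sigma + mu %*% t(mu)           # analytic second-moment matrix E[X X']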