Introduction

Simulation is an important technique in applied mathematics. It typically means using computers to conduct experiments. In statistics, it is a very useful way to check whether a statistical model has the desired properties, and this involves sampling from probability distributions.

In most real-life situations, analytical derivation of the statistical properties of a model is rarely possible, and one usually works with large-sample approximations. Simulation lets us check such approximations directly, as sketched below.
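
As a minimal sketch of this idea (not part of the original notes; the sample size 30 and the Exp(1) population are illustrative choices), the code below simulates the sampling distribution of the mean of 30 Exp(1) observations and compares it with the normal approximation suggested by the central limit theorem.

# Sampling distribution of the mean of n.obs = 30 Exp(1) observations
n.obs <- 30
reps  <- 10000
xbar  <- replicate(reps, mean(rexp(n.obs, rate = 1)))

# Compare with the CLT approximation N(1, 1/sqrt(n.obs))
hist(xbar, breaks = 30, freq = FALSE, col = "steelblue")
curve(dnorm(x, mean = 1, sd = 1/sqrt(n.obs)), add = TRUE, col = "red", lwd = 2)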

The core step in simulation is drawing samples from probability distributions.

1. Normal Random Variable

Example Generate 10000 samples from a normal distribution with mu = 2 and sd = 0.5.

norm.samples <- rnorm(10000, mean = 2, sd = 0.5)

# Plot histogram 
hist(norm.samples, breaks = 21, col = "steelblue")

# Plot the "density"
plot(density(norm.samples))
polygon(density(norm.samples), col="steelblue", border="red")
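
As a quick sanity check (added here, not in the original notes), the sample mean and standard deviation can be compared with the specified values mu = 2 and sd = 0.5.

# Compare sample statistics with the theoretical values
mean(norm.samples)   # should be close to 2
sd(norm.samples)     # should be close to 0.5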

2. Uniform Random Variable

Example Generate 10000 samples from a uniform distribution with min = 0 and max = 2.

unif.samples <- runif(10000, min = 0, max = 2)

# Plot histogram 
hist(unif.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(unif.samples))
polygon(density(unif.samples), col="steelblue", border="red")
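
A similar check (an added sketch): for Uniform(0, 2), the theoretical mean is (0 + 2)/2 = 1 and the variance is (2 - 0)^2/12 ≈ 0.333.

# Compare sample statistics with the theoretical values
mean(unif.samples)   # should be close to 1
var(unif.samples)    # should be close to 1/3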

3. Exponential Random Variable

Example Generate 10000 samples from an exponential distribution with beta = 3 (interpreted here as the mean, so the rate is 1/3).

expo.samples <- rexp(10000, rate = 1/3)  # rate = 1/beta, assuming beta = 3 is the mean (scale); use rate = 3 if beta denotes the rate

# Plot histogram 
hist(expo.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(expo.samples))
polygon(density(expo.samples), col="steelblue", border="red")
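
Assuming beta = 3 is the mean (the parameterization used in the code above), the theoretical mean is 3 and the variance is beta^2 = 9; this check is an added sketch.

# Compare sample statistics with the theoretical values
mean(expo.samples)   # should be close to beta = 3
var(expo.samples)    # should be close to beta^2 = 9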

4. Gamma Random Variable

Example Generate 10000 samples from a gamma distribution with alpha = 3 and beta = 2.

gamma.samples <- rgamma(10000, shape = 3, rate = 2)

# Plot histogram 
hist(gamma.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(gamma.samples))
polygon(density(gamma.samples), col="steelblue", border="red")
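
With beta = 2 as the rate (the parameterization rgamma uses above), the theoretical mean is alpha/beta = 1.5 and the variance is alpha/beta^2 = 0.75; this check is an added sketch.

# Compare sample statistics with the theoretical values
mean(gamma.samples)   # alpha/beta = 1.5
var(gamma.samples)    # alpha/beta^2 = 0.75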

5. Inverse Gamma Random Variable

Example Generate 10000 samples from an inverse gamma distribution with alpha = 3 and beta = 2.

library(pscl)
inv_gamma.samples <- rigamma(10000, 3, 2)  # arguments: n, alpha, beta

# Plot histogram 
hist(inv_gamma.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(inv_gamma.samples))
polygon(density(inv_gamma.samples), col="steelblue", border="red")
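
Under pscl's inverse-gamma parameterization (density proportional to x^-(alpha+1) e^(-beta/x)), the theoretical mean is beta/(alpha - 1) = 1 and the variance is beta^2/((alpha - 1)^2 (alpha - 2)) = 1; this check is an added sketch.

# Compare sample statistics with the theoretical values
mean(inv_gamma.samples)   # beta/(alpha - 1) = 1
var(inv_gamma.samples)    # beta^2 / ((alpha - 1)^2 * (alpha - 2)) = 1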

6. Beta Random Variable

Example Generate 10000 samples from a beta distribution with alpha = 3 and beta = 2.

beta.samples <- rbeta(10000, shape1 = 3, shape2 = 2)

# Plot histogram 
hist(beta.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(beta.samples))
polygon(density(beta.samples), col="steelblue", border="red")
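
For Beta(alpha = 3, beta = 2), the theoretical mean is alpha/(alpha + beta) = 0.6 and the variance is alpha*beta/((alpha + beta)^2 (alpha + beta + 1)) = 0.04; this check is an added sketch.

# Compare sample statistics with the theoretical values
mean(beta.samples)   # alpha / (alpha + beta) = 0.6
var(beta.samples)    # alpha*beta / ((alpha + beta)^2 * (alpha + beta + 1)) = 0.04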

7. Chi-Square Random Variable

Example Generate 10000 samples from a chi-square distribution with k = 3 degrees of freedom.

chi_sq.samples <- rchisq(10000, df = 3)

# Plot histogram 
hist(chi_sq.samples, breaks = 21, col = "steelblue")

# Plot the density 
plot(density(chi_sq.samples))
polygon(density(chi_sq.samples), col="steelblue", border="red")
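
For a chi-square variable with k = 3 degrees of freedom, the theoretical mean is k = 3 and the variance is 2k = 6; this check is an added sketch.

# Compare sample statistics with the theoretical values
mean(chi_sq.samples)   # k = 3
var(chi_sq.samples)    # 2k = 6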

8. Inverse Transform Method

The inverse transform theorem is a result in probability theory which states that if \(X\) is a random variable with C.D.F. \(F(x)\), and \(U\) is a random sample from the Uniform(0,1) distribution, then \(F^{-1}(U)\) is a random sample of \(X\).

There is a simple, sometimes useful transformation, known as the probability integral transform, that allows us to transform any random variable into a uniform random variable and, more importantly, vice versa. With the inverse transform method, we can therefore generate samples from any distribution whose C.D.F. we can invert.
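
As a quick illustration of the first direction (a minimal sketch, not one of the original examples), applying the normal C.D.F. pnorm to normal samples should produce values that look approximately Uniform(0,1); the Normal(2, 0.5) parameters are reused from Section 1 purely for illustration.

# Probability integral transform: F(X) should be approximately Uniform(0,1)
x.pit <- rnorm(10000, mean = 2, sd = 0.5)
u.pit <- pnorm(x.pit, mean = 2, sd = 0.5)   # apply the C.D.F. to the samples
hist(u.pit, breaks = 21, col = "steelblue")  # roughly flat on (0,1)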

Example 1 Using the inverse transform method, generate 10000 samples from the Exponential distribution with rate 1 (i.e., generate samples from X ~ Exp(1)).

# Step 1: Invert the C.D.F. y = 1 - e^(-x)
#   1 - y = e^(-x)  ==>  x = -log(1 - y)
# Hence F^(-1)(y) = -log(1 - y)

# Step 2: Generate n samples from the Uniform(0,1) distribution

n <- 10000
u <- runif(n)

# Step 3: Compute F^(-1)(u) for each uniform sample (vectorized)
inv_samples <- -log(1 - u)


# Plot histogram
hist(inv_samples, breaks = 21, col = "steelblue", main = "Histogram" )

# Plot the density
plot(density(inv_samples), xlim = c(0, 5), main = "Density plot")
polygon(density(inv_samples), col = "steelblue", border = "red")
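
As an optional check (added here, not in the original notes), the inverse-transform samples can be compared with the theoretical Exp(1) quantiles via a Q-Q plot.

# Q-Q plot of the inverse-transform samples against theoretical Exp(1) quantiles
qqplot(qexp(ppoints(n), rate = 1), inv_samples,
       xlab = "Theoretical Exp(1) quantiles", ylab = "Sample quantiles")
abline(0, 1, col = "red", lwd = 2)   # points should fall near this line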

Example 2 Let \(X\) follow the Logistic distribution with C.D.F. given by \[F(x) = \frac{1}{1 + e^{-(x-\mu)/\beta}}\] for location parameter \(\mu\) and scale parameter \(\beta\).

Generate 10000 random samples using the inverse transform method. Take \(\mu = 5\) and \(\beta = 2\).

# Step 1: Invert the C.D.F. y = 1 / (1 + e^(-(x - mu)/beta))
#   1/y - 1 = e^(-(x - mu)/beta)  ==>  x = mu - beta*log(1/y - 1)
# Hence F^(-1)(y) = mu - beta*log(1/y - 1)

# Step 2: Generate n samples from the Uniform(0,1) distribution

mu <- 5
beta <- 2
n <- 10000
u <- runif(n)

# Step 3: Compute F^(-1)(u) for each uniform sample (vectorized)
inv_samples2 <- mu - beta*log(1/u - 1)


# Plot histogram
hist(inv_samples2, breaks = 21, col = "steelblue", main = "Histogram" )

# Plot the density
plot(density(inv_samples2), main = "Density plot")
polygon(density(inv_samples2), col = "steelblue", border = "red")
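
As an optional check (again an added sketch), the samples can be compared with R's built-in logistic generator rlogis, which uses the same location/scale parameterization as the C.D.F. above.

# Compare with samples from R's built-in rlogis()
builtin.samples <- rlogis(n, location = mu, scale = beta)
qqplot(builtin.samples, inv_samples2,
       xlab = "rlogis quantiles", ylab = "Inverse-transform quantiles")
abline(0, 1, col = "red", lwd = 2)   # points should fall near this line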

