This is an R Markdown Notebook. When you execute code within the notebook, the results appear beneath the code.

Try executing this chunk by clicking the Run button within the chunk or by placing your cursor inside it and pressing Ctrl+Shift+Enter.

# Load Necessary Libraries
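The library() calls themselves do not appear in this output. A minimal sketch of what the first steps below need (an assumption, not necessarily the author's exact setup chunk) is:

library(quantmod)    # getSymbols(), dailyReturn()
library(timeSeries)  # as.timeSeries()
library(fBasics)     # histPlot()

The EVT libraries (evir, fExtremes) and the change point library (bcp) are loaded later, just before they are used.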

# Load Stock Data

Here we use quantmod's getSymbols() to load price data for SPY, the S&P 500 ETF. The data is then turned into daily logarithmic returns.

getSymbols("SPY", from = "2019-01-01")
## 'getSymbols' currently uses auto.assign=TRUE by default, but will
## use auto.assign=FALSE in 0.5-0. You will still be able to use
## 'loadSymbols' to automatically load data. getOption("getSymbols.env")
## and getOption("getSymbols.auto.assign") will still be checked for
## alternate defaults.
## 
## This message is shown once per session and may be disabled by setting 
## options("getSymbols.warning4.0"=FALSE). See ?getSymbols for details.
## [1] "SPY"
x <- dailyReturn(SPY, type = "log")
head(x,10)
##            daily.returns
## 2019-01-02  0.0169304153
## 2019-01-03 -0.0241520913
## 2019-01-04  0.0329469667
## 2019-01-07  0.0078537254
## 2019-01-08  0.0093514680
## 2019-01-09  0.0046626032
## 2019-01-10  0.0035213502
## 2019-01-11  0.0003862279
## 2019-01-14 -0.0061196093
## 2019-01-15  0.0113956308
str(x)
## An 'xts' object on 2019-01-02/2021-09-20 containing:
##   Data: num [1:685, 1] 0.01693 -0.02415 0.03295 0.00785 0.00935 ...
##  - attr(*, "dimnames")=List of 2
##   ..$ : NULL
##   ..$ : chr "daily.returns"
##   Indexed by objects of class: [Date] TZ: UTC
##   xts Attributes:  
##  NULL
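As a quick sanity check (an assumption about quantmod's internals: dailyReturn(type = "log") gives close-to-close log returns, except for the leading first observation, which uses that day's open-to-close), the same series can be roughly reproduced with:

head(diff(log(Cl(SPY))), 5)   # first entry is NA; values from 2019-01-03 on should match dailyReturn above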

# Present Histogram

For the Rmetrics histPlot() function, the quantmod xts data must first be converted to a Rmetrics timeSeries object. Then the histogram is plotted.

x_ts <- as.timeSeries(x)

histPlot(x_ts)
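If the Rmetrics packages are not available, a plain base-R histogram of the same returns is a rough stand-in (a sketch added here, not part of the original analysis):

hist(coredata(x), breaks = "Sturges",
     main = "SPY daily log returns", xlab = "log return")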

# Frequency Table

Here we parse the data into a frequency table. Again the xts data must be converted, this time to a data frame, because these functions come from base R.

spr_dt <- as.data.frame(x)

spr_dt <- spr_dt$daily.returns

factorx <- factor(cut(spr_dt, breaks = nclass.Sturges(spr_dt)))
# turn the bin counts into a data frame
xout <- as.data.frame(table(factorx))
# add cumulative frequencies and relative proportions
xout <- transform(xout, cumFreq = cumsum(Freq), relative = prop.table(Freq))


xout
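nclass.Sturges() implements Sturges' rule, ceiling(log2(n) + 1); with the 685 observations reported by str(x) above, that works out to 11 bins:

ceiling(log2(685) + 1)   # 11
nclass.Sturges(spr_dt)   # same result, computed from the data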

# EVT Estimation

We estimate the extreme values by fitting a generalized Pareto distribution (GPD) to peaks over a threshold. The first charts produced are the empirical distribution plot (emplot) and the mean excess plot (meplot).

library(evir)
library(fExtremes)
library(fBasics)
emplot(x)
## Warning in xy.coords(x, y, xlabel, ylabel, log): 288 x values <= 0 omitted from
## logarithmic plot

meplot(x)
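The warning above appears because emplot() uses a logarithmic axis and the return series contains negative values. A common convention (an addition here, not what the chunk above does) is to examine the loss tail by negating the negative returns:

losses <- as.numeric(-x[x < 0])   # negative returns expressed as positive losses
emplot(losses)                    # empirical distribution of the losses
meplot(losses)                    # mean excess plot of the losses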

# GPD Fit

Here we fit the generalized Pareto distribution to the tail excesses and plot diagnostic charts.

fit <- gpd(x, threshold = NA, nextremes = 25)

# plot(fit) # interactive, menu-driven diagnostics; will not run in knitr
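Since plot.gpd() is menu-driven, a knitted document can instead report the fit non-interactively; the components below are part of the list returned by evir::gpd():

fit$n.exceed   # number of exceedances used in the fit (25 here)
fit$par.ests   # estimated shape (xi) and scale (beta)
fit$par.ses    # their standard errors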

# Expected Shortfall Estimation

Here we calculate the expected shortfall: we plot the tail, estimate the 99.9% quantile from the GPD fit with a 95% confidence interval, compare it with the empirical quantile, and then estimate the expected shortfall at the 99% level.

tp <- tailplot(fit)
gpd.q(tp,pp = .999, ci.p = .95)
##   Lower CI   Estimate   Upper CI 
## 0.06303236 0.08915581 0.13009649
quantile(x, probs = .999)
##      99.9% 
## 0.08351421
gpd.sfall(tp, .99)

##   Lower CI   Estimate   Upper CI 
## 0.04649994 0.06122867 0.13009649
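Because x contains log returns, the estimates above can be read as simple percentage moves via exp(r) - 1 (a convenience conversion added here, not part of the original analysis):

exp(0.08915581) - 1   # ~9.3% simple return at the 99.9% GPD quantile
exp(0.06122867) - 1   # ~6.3% simple return for the 99% expected shortfall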

# Bayesian Change Point Detection

Here I look for the likelihood of an extreme event with the bcp library, and we run the posterior analysis, plotting charts that show the probability of a change point at each point in time.

library(bcp)
## Warning: package 'bcp' was built under R version 4.0.5
## Loading required package: grid
out <- bcp(x)
# summary(out)  # posterior summary (not run)
plot(out)
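To pull out the dates the model flags, the posterior.prob component of the bcp object can be matched back to the return index (a small convenience sketch; the 0.5 cutoff is an arbitrary choice):

cp <- which(out$posterior.prob > 0.5)   # observations with a high change point probability
index(x)[cp]                            # the corresponding dates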
