Data 624 HW 1

library(tidyverse)
## -- Attaching packages --------------------------------------- tidyverse 1.3.1 --
## v ggplot2 3.3.5     v purrr   0.3.4
## v tibble  3.1.4     v dplyr   1.0.7
## v tidyr   1.1.3     v stringr 1.4.0
## v readr   2.0.1     v forcats 0.5.1
## -- Conflicts ------------------------------------------ tidyverse_conflicts() --
## x dplyr::filter() masks stats::filter()
## x dplyr::lag()    masks stats::lag()
library(ggplot2)
library(tsibble)
## 
## Attaching package: 'tsibble'
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, union
library(tsibbledata)
library(feasts)
## Loading required package: fabletools
library(ggfortify)
library(kableExtra)
## 
## Attaching package: 'kableExtra'
## The following object is masked from 'package:dplyr':
## 
##     group_rows

2.1

Use the help function to explore what the series gafa_stock, PBS, vic_elec and pelt represent.

help(gafa_stock)
help(PBS)
help(vic_elec)
help(pelt)
  1. Use autoplot() to plot some of the series in these data sets.
autoplot(gafa_stock)

autoplot(PBS %>% filter(ATC2 == "A10"))

autoplot(vic_elec, .vars = Demand)

autoplot(pelt, .vars = `Lynx`)

  2. What is the time interval of each series?
  • gafa_stock - daily historical stock prices for four stocks, 2014-2018 (trading days only, so the index is irregular)
  • PBS - monthly Medicare Australia prescription data
  • vic_elec - half-hourly electricity demand for Victoria, Australia
  • pelt - annual Hudson Bay Company trading records for Snowshoe Hare and Lynx pelts
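
These intervals can be confirmed programmatically with tsibble's interval() helper (a quick sanity check, not part of the exercise):

interval(gafa_stock) # irregular: trading days only
interval(PBS)        # monthly
interval(vic_elec)   # half-hourly
interval(pelt)       # annual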

2.2

Use filter() to find what days corresponded to the peak closing price for each of the four stocks in gafa_stock.

AAPLClose <- gafa_stock %>% as_tibble %>% filter(Symbol =="AAPL") %>%
    summarise(maximum = max(Close), .groups = 'drop')
AMZNClose <- gafa_stock %>% as_tibble %>% filter(Symbol =="AMZN") %>%
    summarise(maximum = max(Close), .groups = 'drop')
FBClose <- gafa_stock %>% as_tibble %>% filter(Symbol =="FB") %>%
    summarise(maximum = max(Close), .groups = 'drop')
GOOGClose <- gafa_stock %>% as_tibble %>% filter(Symbol =="GOOG") %>%
    summarise(maximum = max(Close), .groups = 'drop')
kable(tibble(stock = c('AAPL', 'AMZN', 'FB', 'GOOG'),
             PeakClose = c(AAPLClose[[1]], AMZNClose[[1]], FBClose[[1]], GOOGClose[[1]])))
stock  PeakClose
AAPL      232.07
AMZN     2039.51
FB        217.50
GOOG     1268.33
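
The exercise asks for the days on which those peaks occurred. A more compact sketch (same maxima as above) groups by Symbol and keeps the rows where Close equals its maximum, returning the peak dates directly:

gafa_stock %>%
  as_tibble() %>%
  group_by(Symbol) %>%
  filter(Close == max(Close)) %>%
  select(Symbol, Date, Close)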

2.3

Download the file tute1.csv from the book website, open it in Excel (or some other spreadsheet application), and review its contents. You should find four columns of information. Columns B through D each contain a quarterly series, labelled Sales, AdBudget and GDP. Sales contains the quarterly sales for a small company over the period 1981-2005. AdBudget is the advertising budget and GDP is the gross domestic product. All series have been adjusted for inflation.

  1. You can read the data into R with the following script:
tute1 <- readr::read_csv("tute1.csv")
## Rows: 100 Columns: 4
## -- Column specification --------------------------------------------------------
## Delimiter: ","
## dbl  (3): Sales, AdBudget, GDP
## date (1): Quarter
## 
## i Use `spec()` to retrieve the full column specification for this data.
## i Specify the column types or set `show_col_types = FALSE` to quiet this message.
kableExtra::kable(head(tute1))
Quarter      Sales  AdBudget    GDP
1981-03-01  1020.2     659.2  251.8
1981-06-01   889.2     589.0  290.9
1981-09-01   795.0     512.5  290.8
1981-12-01  1003.9     614.1  292.4
1982-03-01  1057.7     647.2  279.1
1982-06-01   944.4     602.0  254.0
  2. Convert the data to time series
mytimeseries <- tute1 %>%
  mutate(Quarter = yearmonth(Quarter)) %>%
  as_tsibble(index = Quarter)
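
The conversion above uses yearmonth(), which works here, but since the data are quarterly an alternative (a sketch, not required by the exercise) is yearquarter(), which makes the index explicitly quarterly:

mytimeseries_q <- tute1 %>%
  mutate(Quarter = yearquarter(Quarter)) %>%  # quarterly index instead of monthly
  as_tsibble(index = Quarter)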
  3. Construct time series plots of each of the three series
mytimeseries %>%
  pivot_longer(-Quarter) %>%
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line() +
  facet_grid(name ~ ., scales = "free_y")

Check what happens when you don’t include facet_grid().

mytimeseries %>%
  pivot_longer(-Quarter) %>%
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line()

Without facet_grid(), all three series are drawn in a single panel on a shared y-axis. That actually works reasonably well here, except that GDP sits on a much smaller scale and ends up looking misleadingly flat.
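
If a single panel is preferred without the scale distortion, one option (a sketch, not part of the exercise) is to standardise each series before plotting:

mytimeseries %>%
  as_tibble() %>%
  pivot_longer(-Quarter) %>%
  group_by(name) %>%
  mutate(value = as.numeric(scale(value))) %>%  # z-score each series separately
  ungroup() %>%
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line()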

2.4

The USgas package contains data on the demand for natural gas in the US.

  1. Install the USgas package.
install.packages("USgas")
  2. Create a tsibble from us_total with year as the index and state as the key.
usGasTotalTsibble <- USgas::us_total %>%
  as_tsibble(index = year, key = state)
  3. Plot the annual natural gas consumption by state for the New England area (comprising the states of Maine, Vermont, New Hampshire, Massachusetts, Connecticut and Rhode Island).
usGasTotalTsibble %>%
  filter(state %in% c("Maine", "Vermont", "New Hampshire",
                      "Massachusetts", "Connecticut", "Rhode Island")) %>%
  autoplot(.vars = y)

2.5

  1. Download tourism.xlsx from the book website and read it into R using readxl::read_excel().
xlstourism <- readxl::read_excel("tourism.xlsx")
  2. Create a tsibble which is identical to the tourism tsibble from the tsibble package.
xlstourismTsibble <- xlstourism %>%
  mutate(Quarter = yearquarter(Quarter)) %>%
  as_tsibble(index = Quarter, key = c(Region, State, Purpose))
  3. Find what combination of Region and Purpose had the maximum number of overnight trips on average.
kableExtra::kable(head(xlstourismTsibble %>%
  as_tibble() %>%
  group_by(Region, Purpose) %>%
  dplyr::summarise(maxTrips = max(Trips)) %>%
  arrange(desc(maxTrips))))
Region           Purpose   maxTrips
Melbourne        Visiting  985.2784
Sydney           Business  948.1294
Sydney           Visiting  920.7759
South Coast      Holiday   914.7728
North Coast NSW  Holiday   905.8450
Sydney           Holiday   828.3171

Ranked by the largest single-quarter value, Melbourne Visiting comes out on top. Note that the exercise asks for the highest number of trips on average, so a mean-based variant is sketched below.
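
For reference, a sketch of the mean-based version (summarising with mean(Trips) instead of max(Trips)); its output is not reproduced here:

xlstourismTsibble %>%
  as_tibble() %>%
  group_by(Region, Purpose) %>%
  summarise(avgTrips = mean(Trips), .groups = "drop") %>%
  arrange(desc(avgTrips)) %>%
  head()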

  4. Create a new tsibble which combines the Purposes and Regions, and just has total trips by State.
stateTourism <- xlstourismTsibble %>%
  group_by(State) %>%
  summarize(TotalTrips = sum(Trips))
stateTourism %>% autoplot(.vars = TotalTrips)

2.8

Monthly Australian retail data is provided in aus_retail. Select one of the time series as follows (but choose your own seed value):

set.seed(12345678-4321)
myseries <- aus_retail %>%
  filter(`Series ID` == sample(aus_retail$`Series ID`,1))

This seed happens to select a clothing retailing series.
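
Which series the seed drew can be confirmed by inspecting the key columns (a quick check, not part of the exercise):

myseries %>% as_tibble() %>% distinct(State, Industry)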

Explore your chosen retail time series using the following functions:

autoplot(), gg_season(), gg_subseries(), gg_lag(), ACF() %>% autoplot()

autoplot(myseries, .vars = Turnover)

gg_season(myseries, y = Turnover)

gg_subseries(myseries, y = Turnover)

gg_lag(myseries, y = Turnover, lags = 1:12)

ACF(myseries, Turnover) %>% autoplot()

  1. Can you spot any seasonality, cyclicity and trend?

We see strong seasonality, which is not surprising for a product as season-driven as clothing. There is a large holiday/summer boost (Christmas falls in the Australian summer) and a more moderate winter boost, with a consistent trough every February.

Over the span of the series, December has grown from a good month for retailers into a mammoth one.

A lag of 12 months is highly predictive, indicating a strong yearly pattern, so comparing a month with the same month one year earlier is very informative; six-month comparisons would also be revealing, though less so. On top of the seasonality there is a consistent upward trend: February 2018 (generally the weakest month) outperforms December 2002 (generally the strongest).
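
To make the trend and seasonal components explicit, an STL decomposition can be added (a sketch using feasts, which is already loaded; not required by the exercise):

myseries %>%
  model(STL(Turnover)) %>%  # season-trend decomposition using LOESS
  components() %>%
  autoplot()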

  2. What do you learn about the series?

The data has strong time components: a clear within-year seasonal pattern and a steady year-over-year trend. The growth of the December peak may also reflect a cultural shift that has made clothing gifts more acceptable or practical.