Objective:

Please submit exercises 2.1, 2.2, 2.3, 2.4, 2.5 and 2.8 from Hyndman & Athanasopoulos, Forecasting: Principles and Practice (3rd edition, online).

2.1

Explore the following four time series: Bricks from aus_production, Lynx from pelt, Close from gafa_stock, Demand from vic_elec.

Use ? (or help()) to find out about the data in each series.

library(fpp3)
## Warning: package 'fpp3' was built under R version 4.4.3
## Registered S3 method overwritten by 'tsibble':
##   method               from 
##   as_tibble.grouped_df dplyr
## ── Attaching packages ──────────────────────────────────────────── fpp3 1.0.2 ──
## ✔ tibble      3.2.1     ✔ tsibble     1.1.6
## ✔ dplyr       1.1.4     ✔ tsibbledata 0.4.1
## ✔ tidyr       1.3.1     ✔ feasts      0.4.2
## ✔ lubridate   1.9.3     ✔ fable       0.4.1
## ✔ ggplot2     3.5.1
## Warning: package 'tsibble' was built under R version 4.4.3
## Warning: package 'tsibbledata' was built under R version 4.4.3
## Warning: package 'feasts' was built under R version 4.4.3
## Warning: package 'fabletools' was built under R version 4.4.3
## Warning: package 'fable' was built under R version 4.4.3
## ── Conflicts ───────────────────────────────────────────────── fpp3_conflicts ──
## ✖ lubridate::date()    masks base::date()
## ✖ dplyr::filter()      masks stats::filter()
## ✖ tsibble::intersect() masks base::intersect()
## ✖ tsibble::interval()  masks lubridate::interval()
## ✖ dplyr::lag()         masks stats::lag()
## ✖ tsibble::setdiff()   masks base::setdiff()
## ✖ tsibble::union()     masks base::union()
?aus_production
## starting httpd help server ... done
?pelt
?gafa_stock
?vic_elec

What is the time interval of each series?

Bricks (aus_production) is quarterly, Lynx (pelt) is annual, Close (gafa_stock) is daily (trading days only, so the index is irregular), and Demand (vic_elec) is half-hourly.

Use autoplot() to produce a time plot of each series. For the last plot, modify the axis labels and title.
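Before plotting, the declared interval of each tsibble can also be cross-checked programmatically with tsibble's interval(); a minimal sketch (gafa_stock reports an irregular interval because it contains trading days only):

interval(aus_production)  # quarterly
interval(pelt)            # annual
interval(gafa_stock)      # irregular (daily trading days)
interval(vic_elec)        # half-hourly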

autoplot(aus_production %>% select(Bricks)) + labs(title="Quarterly Clay Brick Production in Australia")
## Plot variable not specified, automatically selected `.vars = Bricks`
## Warning: Removed 20 rows containing missing values or values outside the scale range
## (`geom_line()`).

autoplot(pelt %>% select(Lynx)) + labs(title="Annual Lynx Trappings by Hudson’s Bay Company")
## Plot variable not specified, automatically selected `.vars = Lynx`

autoplot(gafa_stock %>% filter(Symbol == "AAPL") %>% select(Close)) + labs(title="Daily Closing Price: Apple (AAPL)")
## Plot variable not specified, automatically selected `.vars = Close`

autoplot(vic_elec %>% select(Demand)) +
  labs(
    title="Half-hourly Electricity Demand in Victoria, 2012",
    x="Date",
    y="Megawatts"
  )
## Plot variable not specified, automatically selected `.vars = Demand`
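The half-hourly series is very dense when plotted over its full span, so a zoomed-in view of a single month makes the daily and weekly demand patterns easier to see. A sketch, with the month chosen arbitrarily:

# Zoom in on one month of half-hourly demand
vic_elec %>%
  filter(yearmonth(Time) == yearmonth("2013 Jan")) %>%
  autoplot(Demand) +
  labs(title = "Half-hourly Electricity Demand, Victoria (January 2013)",
       x = "Date", y = "Demand (MWh)")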

2.2

Use filter() to find what days corresponded to the peak closing price for each of the four stocks in gafa_stock.

gafa_stock %>%
  group_by(Symbol) %>%
  filter(near(Close, max(Close, na.rm = TRUE))) %>% # near to avoid floating point equality issues
  select(Symbol, Date, Close) %>%
  arrange(Symbol, Date)
## # A tsibble: 4 x 3 [!]
## # Key:       Symbol [4]
## # Groups:    Symbol [4]
##   Symbol Date       Close
##   <chr>  <date>     <dbl>
## 1 AAPL   2018-10-03  232.
## 2 AMZN   2018-09-04 2040.
## 3 FB     2018-07-25  218.
## 4 GOOG   2018-07-26 1268.
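An equivalent way to find the peaks, avoiding the floating-point comparison entirely, is to drop the tsibble index and use dplyr's slice_max(); a sketch:

# Alternative: work on a plain tibble and take the top closing price per stock
gafa_stock %>%
  as_tibble() %>%
  group_by(Symbol) %>%
  slice_max(Close, n = 1) %>%
  select(Symbol, Date, Close)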

2.3

Download the file tute1.csv from the book website, open it in Excel (or some other spreadsheet application), and review its contents. You should find four columns of information. Columns B through D each contain a quarterly series, labelled Sales, AdBudget and GDP. Sales contains the quarterly sales for a small company over the period 1981-2005. AdBudget is the advertising budget and GDP is the gross domestic product. All series have been adjusted for inflation.

a. You can read the data into R with the following script:

tute1 <- readr::read_csv("tute1.csv")
## Rows: 100 Columns: 4
## ── Column specification ────────────────────────────────────────────────────────
## Delimiter: ","
## dbl  (3): Sales, AdBudget, GDP
## date (1): Quarter
## 
## ℹ Use `spec()` to retrieve the full column specification for this data.
## ℹ Specify the column types or set `show_col_types = FALSE` to quiet this message.
View(tute1)

b. Convert the data to time series

mytimeseries <- tute1 |>
  mutate(Quarter = yearquarter(Quarter)) |>
  as_tsibble(index = Quarter)

c. Construct time series plots of each of the three series

mytimeseries |>
  pivot_longer(-Quarter) |>
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line() +
  facet_grid(name ~ ., scales = "free_y")

Check what happens when you don’t include facet_grid().

mytimeseries |>
  pivot_longer(-Quarter) |>
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line()

Without facet_grid(), all three series are drawn in a single panel with a common y-axis. Because Sales takes much larger values than AdBudget and GDP, the shared scale compresses the other two series, so their variation is hard to read and the plot is far less informative than the faceted version with free y-scales.
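If a single panel is wanted anyway, one option (a sketch, assuming it is acceptable to rescale the data) is to index each series to 100 at its first quarter so that their shapes become comparable:

# Index each series to 100 at the first quarter so all three share a common scale
mytimeseries |>
  pivot_longer(-Quarter) |>
  group_by(name) |>
  mutate(index = 100 * value / first(value)) |>
  ungroup() |>
  ggplot(aes(x = Quarter, y = index, colour = name)) +
  geom_line()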

2.4

The USgas package contains data on the demand for natural gas in the US.

a. Install the USgas package.

library(USgas)
## Warning: package 'USgas' was built under R version 4.4.3

b. Create a tsibble from us_total with year as the index and state as the key.

us_total_ts <- us_total |>
  as_tsibble(index = year, key = state)
us_total_ts
## # A tsibble: 1,266 x 3 [1Y]
## # Key:       state [53]
##     year state        y
##    <int> <chr>    <int>
##  1  1997 Alabama 324158
##  2  1998 Alabama 329134
##  3  1999 Alabama 337270
##  4  2000 Alabama 353614
##  5  2001 Alabama 332693
##  6  2002 Alabama 379343
##  7  2003 Alabama 350345
##  8  2004 Alabama 382367
##  9  2005 Alabama 353156
## 10  2006 Alabama 391093
## # ℹ 1,256 more rows

c. Plot the annual natural gas consumption by state for the New England area (comprising the states of Maine, Vermont, New Hampshire, Massachusetts, Connecticut and Rhode Island).

ngc_states<- c("Maine","Vermont","New Hampshire",
               "Massachusetts","Connecticut","Rhode Island")
us_total_ts |>
  filter(state %in% ngc_states) |>
  autoplot(y) +
  labs(title = "New England Annual Natural Gas Consumption",
       x = "Year", y = "Million in cubic feet")

2.5

a. Download tourism.xlsx from the book website and read it into R using readxl::read_excel().

tourism_xl <- readxl::read_excel("tourism.xlsx")

b. Create a tsibble which is identical to the tourism tsibble from the tsibble package.

tourism_ts <- tourism_xl |>
  mutate(Quarter=yearquarter(Quarter)) |>
  as_tsibble(index= Quarter, key = c(Region, State, Purpose))
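To confirm the rebuilt tsibble matches the packaged version, a quick sketch of a comparison (assuming the tourism object from the tsibble package is available, as it is once fpp3 is attached):

# Compare the rebuilt tsibble with the packaged tourism data
all.equal(tourism_ts, tourism)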

c. Find what combination of Region and Purpose had the maximum number of overnight trips on average.

tourism_ts |>
  group_by(Region, Purpose) |>
  summarise(avg_trips = mean(Trips, na.rm = TRUE)) |>
  slice_max(avg_trips, n = 1, with_ties = TRUE)
## # A tsibble: 76 x 4 [1Q]
## # Key:       Region, Purpose [76]
## # Groups:    Region [76]
##    Region                     Purpose  Quarter avg_trips
##    <chr>                      <chr>      <qtr>     <dbl>
##  1 Adelaide                   Visiting 2017 Q1     270. 
##  2 Adelaide Hills             Visiting 2002 Q4      81.1
##  3 Alice Springs              Holiday  1998 Q3      76.5
##  4 Australia's Coral Coast    Holiday  2014 Q3     198. 
##  5 Australia's Golden Outback Business 2017 Q3     174. 
##  6 Australia's North West     Business 2016 Q3     297. 
##  7 Australia's South West     Holiday  2016 Q1     612. 
##  8 Ballarat                   Visiting 2004 Q1     103. 
##  9 Barkly                     Holiday  1998 Q3      37.9
## 10 Barossa                    Holiday  2006 Q1      51.0
## # ℹ 66 more rows
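Note that summarise() on a tsibble keeps the Quarter index, so the table above shows a maximum per Region rather than the single best Region and Purpose combination overall. A sketch of one way to get that single combination, by dropping the index before averaging:

# Drop the tsibble index, average Trips over all quarters, then take the top row
tourism_ts |>
  as_tibble() |>
  group_by(Region, Purpose) |>
  summarise(avg_trips = mean(Trips, na.rm = TRUE), .groups = "drop") |>
  slice_max(avg_trips, n = 1)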

d. Create a new tsibble which combines the Purposes and Regions, and just has total trips by State.

library(fabletools)
total_state_trips <- tourism_ts |>
  aggregate_key(State, Trips = sum(Trips))
total_state_trips
## # A tsibble: 720 x 3 [1Q]
## # Key:       State [9]
##    Quarter State         Trips
##      <qtr> <chr*>        <dbl>
##  1 1998 Q1 <aggregated> 23182.
##  2 1998 Q2 <aggregated> 20323.
##  3 1998 Q3 <aggregated> 19827.
##  4 1998 Q4 <aggregated> 20830.
##  5 1999 Q1 <aggregated> 22087.
##  6 1999 Q2 <aggregated> 21458.
##  7 1999 Q3 <aggregated> 19914.
##  8 1999 Q4 <aggregated> 20028.
##  9 2000 Q1 <aggregated> 22339.
## 10 2000 Q2 <aggregated> 19941.
## # ℹ 710 more rows
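Because aggregate_key() also adds an <aggregated> total level, the first rows printed above show <aggregated> rather than individual states. If only the per-State totals are needed, a plain group_by()/summarise() on the tsibble is an alternative sketch:

# Total trips by State and Quarter, collapsing Region and Purpose
state_trips <- tourism_ts |>
  group_by(State) |>
  summarise(Trips = sum(Trips))
state_trips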

2.8

Use the following graphics functions: autoplot(), gg_season(), gg_subseries(), gg_lag(), ACF() and explore features from the following time series: “Total Private” Employed from us_employment, Bricks from aus_production, Hare from pelt, “H02” Cost from PBS, and Barrels from us_gasoline.

# “Total Private” Employed
emp <- us_employment %>%
  filter(Title == "Total Private") %>%
  select(Month, Employed) %>%
  drop_na()

autoplot(emp, Employed) + labs(title = "US Employment — Total Private")

gg_season(emp, Employed) + labs(title = "Seasonal plot (Monthly)")
## Warning: `gg_season()` was deprecated in feasts 0.4.2.
## ℹ Please use `ggtime::gg_season()` instead.
## This warning is displayed once every 8 hours.
## Call `lifecycle::last_lifecycle_warnings()` to see where this warning was
## generated.

gg_subseries(emp, Employed) + labs(title = "Subseries plot")
## Warning: `gg_subseries()` was deprecated in feasts 0.4.2.
## ℹ Please use `ggtime::gg_subseries()` instead.
## This warning is displayed once every 8 hours.
## Call `lifecycle::last_lifecycle_warnings()` to see where this warning was
## generated.

gg_lag(emp, Employed, lags = 1:12) + labs(title = "Lag plot (1–12)")
## Warning: `gg_lag()` was deprecated in feasts 0.4.2.
## ℹ Please use `ggtime::gg_lag()` instead.
## This warning is displayed once every 8 hours.
## Call `lifecycle::last_lifecycle_warnings()` to see where this warning was
## generated.

ACF(emp, Employed) %>% autoplot() + labs(title = "ACF")

# Bricks
bricks <- aus_production %>%
  select(Bricks) %>%
  drop_na()

autoplot(bricks, Bricks) + labs(title = "Australia Clay Bricks (Quarterly)")

gg_season(bricks, Bricks) + labs(title = "Seasonal plot (Quarterly)")

gg_subseries(bricks, Bricks) + labs(title = "Subseries plot")

gg_lag(bricks, Bricks, lags = 1:8) + labs(title = "Lag plot (1–8)")

ACF(bricks, Bricks) %>% autoplot() + labs(title = "ACF")

# Hare
hare <- pelt %>%
  select(Hare) %>%
  drop_na()

autoplot(hare, Hare) + labs(title = "Hare Pelts (Annual)")

gg_lag(hare, Hare, lags = 1:10) + labs(title = "Lag plot (1–10)")

ACF(hare, Hare) %>% autoplot() + labs(title = "ACF")
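The roughly ten-year cycle in the hare series becomes clearer if the ACF is extended beyond the default number of lags, since peaks should recur near multiples of the cycle length; a sketch:

ACF(hare, Hare, lag_max = 30) %>% autoplot() + labs(title = "ACF (up to 30 lags)")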

# H02
h02 <- PBS %>%
  filter(ATC2 == "H02") %>%
  summarise(Cost = sum(Cost, na.rm = TRUE))

autoplot(h02, Cost) + labs(title = "PBS Cost — ATC2 H02 (Monthly Total)")

gg_season(h02, Cost) + labs(title = "Seasonal plot (Monthly)")

gg_subseries(h02, Cost) + labs(title = "Subseries plot")

gg_lag(h02, Cost, lags = 1:24) + labs(title = "Lag plot (1–24)")

ACF(h02, Cost) %>% autoplot() + labs(title = "ACF")

# Barrels
gas <- us_gasoline %>%
  select(Barrels) %>%
  drop_na()


autoplot(gas, Barrels) + labs(title = "US Gasoline Barrels (Weekly)")

gg_season(gas, Barrels) + labs(title = "Seasonal plot (Weekly)")

gg_subseries(gas, Barrels) + labs(title = "Subseries plot")

gg_lag(gas, Barrels, lags = 1:52) + labs(title = "Lag plot (1–52)")

ACF(gas, Barrels, lag_max = 104) %>% autoplot() + labs(title = "ACF (up to 104 lags)")

Can you spot any seasonality, cyclicity and trend?

US Employment (monthly): a strong upward trend overall, interrupted by clearly visible dips during recessions; the dips trace out multi-year business cycles.
Australian Bricks (quarterly): clear quarterly seasonality, an upward trend into the late 1970s and early 1980s, and multi-year cycles after that.
Hare pelts (annual): no persistent trend, but pronounced cycles of roughly ten years.
H02 (monthly): strong monthly seasonality on top of a consistent upward trend, with some multi-year variation.
US Barrels (weekly): a fairly steady upward trend from the 1990s to the mid-2000s, with an annual seasonal pattern.

What do you learn about the series?

US Employment (monthly): trends upward, with recessions interrupting the series.
Australian Bricks (quarterly): quarterly seasonality with multi-year construction cycles and level changes in the 1980s and 1990s.
Hare pelts (annual): no seasonality (the data are annual); any apparent trend is dominated by the roughly ten-year cycle.
H02 (monthly): an overall upward trend combined with seasonality and multi-year swings.
US Barrels (weekly): an overall upward trend, slower than in the other series, with large seasonal highs and lows.

What can you say about the seasonal patterns?

US Employment (monthly): little visible seasonal pattern relative to the trend.
Australian Bricks (quarterly): Q1 tends to be the lowest quarter and Q3 the highest.
Hare pelts (annual): no seasonal pattern.
H02 (monthly): lows around February/March and highs around November/December.
US Barrels (weekly): lows in winter (January/February) and highs in summer (June–August).

Can you identify any unusual years?

US Employment (monthly): 2008–2009 and 2020 are the most unusual, with sharp drops reflecting major shocks to the US labor market (the Global Financial Crisis and COVID-19).
Australian Bricks (quarterly): 1982–1983, 1991–1992 and the early 2000s show the lowest points.
Hare pelts (annual): an extreme high in 1863, followed by the lowest point in the series in 1924.
H02 (monthly): an extreme high in December 2004, while the seasonal peaks before 1995 are noticeably lower.
US Barrels (weekly): 2008–2009 is the most unusual period, with a clear dip consistent with the Global Financial Crisis.