library(tsibble)
## Warning: package 'tsibble' was built under R version 4.3.3
## Registered S3 method overwritten by 'tsibble':
##   method               from 
##   as_tibble.grouped_df dplyr
## 
## Attaching package: 'tsibble'
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, union
library(tsibbledata)
library(tidyverse)
## ── Attaching core tidyverse packages ──────────────────────── tidyverse 2.0.0 ──
## ✔ dplyr     1.1.4     ✔ readr     2.1.5
## ✔ forcats   1.0.0     ✔ stringr   1.5.1
## ✔ ggplot2   3.5.1     ✔ tibble    3.2.1
## ✔ lubridate 1.9.3     ✔ tidyr     1.3.1
## ✔ purrr     1.0.2
## ── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
## ✖ dplyr::filter()       masks stats::filter()
## ✖ lubridate::interval() masks tsibble::interval()
## ✖ dplyr::lag()          masks stats::lag()
## ℹ Use the conflicted package (<http://conflicted.r-lib.org/>) to force all conflicts to become errors
library(feasts) 
## Loading required package: fabletools
library(lubridate)
library(patchwork)

Data 624 HW-1

2.10 Exercises

#1 Explore the following four time series: Bricks from aus_production, Lynx from pelt, Close from gafa_stock, Demand from vic_elec.

data("aus_production")
data("pelt")
data("gafa_stock")
data("vic_elec")

head(aus_production)
## # A tsibble: 6 x 7 [1Q]
##   Quarter  Beer Tobacco Bricks Cement Electricity   Gas
##     <qtr> <dbl>   <dbl>  <dbl>  <dbl>       <dbl> <dbl>
## 1 1956 Q1   284    5225    189    465        3923     5
## 2 1956 Q2   213    5178    204    532        4436     6
## 3 1956 Q3   227    5297    208    561        4806     7
## 4 1956 Q4   308    5681    197    570        4418     6
## 5 1957 Q1   262    5577    187    529        4339     5
## 6 1957 Q2   228    5651    214    604        4811     7
head(pelt)
## # A tsibble: 6 x 3 [1Y]
##    Year  Hare  Lynx
##   <dbl> <dbl> <dbl>
## 1  1845 19580 30090
## 2  1846 19600 45150
## 3  1847 19610 49150
## 4  1848 11990 39520
## 5  1849 28040 21230
## 6  1850 58000  8420
head(gafa_stock)
## # A tsibble: 6 x 8 [!]
## # Key:       Symbol [1]
##   Symbol Date        Open  High   Low Close Adj_Close    Volume
##   <chr>  <date>     <dbl> <dbl> <dbl> <dbl>     <dbl>     <dbl>
## 1 AAPL   2014-01-02  79.4  79.6  78.9  79.0      67.0  58671200
## 2 AAPL   2014-01-03  79.0  79.1  77.2  77.3      65.5  98116900
## 3 AAPL   2014-01-06  76.8  78.1  76.2  77.7      65.9 103152700
## 4 AAPL   2014-01-07  77.8  78.0  76.8  77.1      65.4  79302300
## 5 AAPL   2014-01-08  77.0  77.9  77.0  77.6      65.8  64632400
## 6 AAPL   2014-01-09  78.1  78.1  76.5  76.6      65.0  69787200
head(vic_elec)
## # A tsibble: 6 x 5 [30m] <Australia/Melbourne>
##   Time                Demand Temperature Date       Holiday
##   <dttm>               <dbl>       <dbl> <date>     <lgl>  
## 1 2012-01-01 00:00:00  4383.        21.4 2012-01-01 TRUE   
## 2 2012-01-01 00:30:00  4263.        21.0 2012-01-01 TRUE   
## 3 2012-01-01 01:00:00  4049.        20.7 2012-01-01 TRUE   
## 4 2012-01-01 01:30:00  3878.        20.6 2012-01-01 TRUE   
## 5 2012-01-01 02:00:00  4036.        20.4 2012-01-01 TRUE   
## 6 2012-01-01 02:30:00  3866.        20.2 2012-01-01 TRUE

Use ? (or help()) to find out about the data in each series.

?aus_production
?pelt
?gafa_stock
?vic_elec

What is the time interval of each series?

aus_production: “Quarterly estimates of selected indicators of manufacturing production in Australia.” This has a quarterly time interval.

pelt: “Hudson Bay Company trading records for Snowshoe Hare and Canadian Lynx furs from 1845 to 1935.” This has a yearly time interval.

gafa_stock: “Historical stock prices from 2014-2018 for Google, Amazon, Facebook and Apple.” This has a daily time interval, but only for trading days (days the stock market is open), so the tsibble reports the interval as irregular ([!]).

vic_elec: “Half-hourly electricity demand for Victoria, Australia.” This has a half-hour time interval.
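
These intervals can also be confirmed programmatically with tsibble's interval() helper (written with an explicit namespace below, since lubridate::interval() masks it per the conflicts shown above):

tsibble::interval(aus_production)  # quarterly [1Q]
tsibble::interval(pelt)            # annual [1Y]
tsibble::interval(gafa_stock)      # irregular [!] (trading days only)
tsibble::interval(vic_elec)        # half-hourly [30m]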

Use autoplot() to produce a time plot of each series. For the last plot, modify the axis labels and title.

autoplot(aus_production, Bricks)
## Warning: Removed 20 rows containing missing values or values outside the scale range
## (`geom_line()`).
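
The warning arises because Bricks is missing (NA) for the most recent quarters of aus_production. An optional tidy-up, filtering out those rows before plotting, avoids it:

aus_production %>%
  filter(!is.na(Bricks)) %>%  # drop quarters with no Bricks observation
  autoplot(Bricks)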

autoplot(pelt, Lynx)

autoplot(gafa_stock, Close)

autoplot(vic_elec, Demand) +  labs(title = "Electricity Demand in Victoria", x = "Year", y = "Demand") 

#2 Use filter() to find what days corresponded to the peak closing price for each of the four stocks in gafa_stock.

gafa_stock %>% 
  group_by(Symbol) %>%
  filter(Close == max(Close)) %>% 
  select(Symbol, Date, Close)
## # A tsibble: 4 x 3 [!]
## # Key:       Symbol [4]
## # Groups:    Symbol [4]
##   Symbol Date       Close
##   <chr>  <date>     <dbl>
## 1 AAPL   2018-10-03  232.
## 2 AMZN   2018-09-04 2040.
## 3 FB     2018-07-25  218.
## 4 GOOG   2018-07-26 1268.

#3 Download the file tute1.csv from the book website, open it in Excel (or some other spreadsheet application), and review its contents. You should find four columns of information. Columns B through D each contain a quarterly series, labelled Sales, AdBudget and GDP. Sales contains the quarterly sales for a small company over the period 1981-2005. AdBudget is the advertising budget and GDP is the gross domestic product. All series have been adjusted for inflation.

  a. You can read the data into R with the following script:
tute1 <- readr::read_csv("tute1.csv") 
## Rows: 100 Columns: 4
## ── Column specification ────────────────────────────────────────────────────────
## Delimiter: ","
## dbl  (3): Sales, AdBudget, GDP
## date (1): Quarter
## 
## ℹ Use `spec()` to retrieve the full column specification for this data.
## ℹ Specify the column types or set `show_col_types = FALSE` to quiet this message.
head(tute1)
## # A tibble: 6 × 4
##   Quarter    Sales AdBudget   GDP
##   <date>     <dbl>    <dbl> <dbl>
## 1 1981-03-01 1020.     659.  252.
## 2 1981-06-01  889.     589   291.
## 3 1981-09-01  795      512.  291.
## 4 1981-12-01 1004.     614.  292.
## 5 1982-03-01 1058.     647.  279.
## 6 1982-06-01  944.     602   254
  b. Convert the data to a time series.
mytimeseries <- tute1 |>
  mutate(Quarter = yearquarter(Quarter)) |>
  as_tsibble(index = Quarter)
  c. Construct time series plots of each of the three series.
mytimeseries |>
  pivot_longer(-Quarter) |>
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line() +
  facet_grid(name ~ ., scales = "free_y")

Plot without facet grid

mytimeseries %>%
  pivot_longer(-Quarter) %>%
  ggplot(aes(x = Quarter, y = value, colour = name)) +
  geom_line()

Plotting without facet_grid makes it much more difficult to interpret the results, especially because the three series are on different scales.

#4 The USgas package contains data on the demand for natural gas in the US.

Install the USgas package. Create a tsibble from us_total with year as the index and state as the key.

library("USgas")
data("usgas")
head(usgas)
##         date                 process state state_abb      y
## 1 1973-01-01  Commercial Consumption  U.S.      U.S. 392315
## 2 1973-01-01 Residential Consumption  U.S.      U.S. 843900
## 3 1973-02-01  Commercial Consumption  U.S.      U.S. 394281
## 4 1973-02-01 Residential Consumption  U.S.      U.S. 747331
## 5 1973-03-01  Commercial Consumption  U.S.      U.S. 310799
## 6 1973-03-01 Residential Consumption  U.S.      U.S. 648504
us_total_tsibble <- us_total %>%
  as_tsibble(index = year, key = state)

head(us_total_tsibble)
## # A tsibble: 6 x 3 [1Y]
## # Key:       state [1]
##    year state        y
##   <int> <chr>    <int>
## 1  1997 Alabama 324158
## 2  1998 Alabama 329134
## 3  1999 Alabama 337270
## 4  2000 Alabama 353614
## 5  2001 Alabama 332693
## 6  2002 Alabama 379343

Plot the annual natural gas consumption by state for the New England area (comprising the states of Maine, Vermont, New Hampshire, Massachusetts, Connecticut and Rhode Island).

new_england_states <- c("Maine", "Vermont", "New Hampshire", "Massachusetts", "Connecticut", "Rhode Island")

new_england_data <- us_total_tsibble %>%
  filter(state %in% new_england_states)

autoplot(new_england_data) +
  labs(title = "Natural Gas Consumption in New England States", x = "Year", y = "Consumption") +
  theme_minimal()
## Plot variable not specified, automatically selected `.vars = y`
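
Specifying the plotted variable explicitly (y, the consumption column in us_total) avoids the automatic-selection message; this is a minor variant of the same plot:

autoplot(new_england_data, y) +
  labs(title = "Natural Gas Consumption in New England States", x = "Year", y = "Consumption") +
  theme_minimal()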

#5 a. Download tourism.xlsx from the book website and read it into R using readxl::read_excel().

library(readxl)
tourism_data <- read_excel("/Users/zigcah/Downloads/tourism.xlsx")
head(tourism_data)
## # A tibble: 6 × 5
##   Quarter    Region   State           Purpose  Trips
##   <chr>      <chr>    <chr>           <chr>    <dbl>
## 1 1998-01-01 Adelaide South Australia Business  135.
## 2 1998-04-01 Adelaide South Australia Business  110.
## 3 1998-07-01 Adelaide South Australia Business  166.
## 4 1998-10-01 Adelaide South Australia Business  127.
## 5 1999-01-01 Adelaide South Australia Business  137.
## 6 1999-04-01 Adelaide South Australia Business  200.
  b. Create a tsibble which is identical to the tourism tsibble from the tsibble package.
tourism_tsibble <- tourism_data %>%
  mutate(Quarter = yearquarter(Quarter)) %>%
  as_tsibble(index = Quarter, key = c(Region, Purpose))

head(tourism_tsibble)
## # A tsibble: 6 x 5 [1Q]
## # Key:       Region, Purpose [1]
##   Quarter Region   State           Purpose  Trips
##     <qtr> <chr>    <chr>           <chr>    <dbl>
## 1 1998 Q1 Adelaide South Australia Business  135.
## 2 1998 Q2 Adelaide South Australia Business  110.
## 3 1998 Q3 Adelaide South Australia Business  166.
## 4 1998 Q4 Adelaide South Australia Business  127.
## 5 1999 Q1 Adelaide South Australia Business  137.
## 6 1999 Q2 Adelaide South Australia Business  200.
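
One small difference from the package's tourism tsibble: its key is Region, State and Purpose. Since each Region belongs to a single State, the tsibble above behaves the same, but a version built with the full key would be:

tourism_tsibble <- tourism_data %>%
  mutate(Quarter = yearquarter(Quarter)) %>%
  as_tsibble(index = Quarter, key = c(Region, State, Purpose))
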
  c. Find what combination of Region and Purpose had the maximum number of overnight trips on average.
max_trips <- tourism_data |>
  group_by(Region, Purpose) |>
  summarize(avg_trips = mean(Trips, na.rm = TRUE)) |>
  arrange(desc(avg_trips)) |>
  head(1) 
## `summarise()` has grouped output by 'Region'. You can override using the
## `.groups` argument.
print(max_trips)
## # A tibble: 1 × 3
## # Groups:   Region [1]
##   Region Purpose  avg_trips
##   <chr>  <chr>        <dbl>
## 1 Sydney Visiting      747.
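
An equivalent way to get the same answer, sketched with dplyr::slice_max() and with the grouping dropped explicitly so summarise() emits no message:

tourism_data |>
  group_by(Region, Purpose) |>
  summarise(avg_trips = mean(Trips, na.rm = TRUE), .groups = "drop") |>
  slice_max(avg_trips, n = 1)
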
  d. Create a new tsibble which combines the Purposes and Regions, and just has total trips by State.
state_tsibble <- tourism_tsibble %>%
    index_by(Quarter) %>%
    group_by(State) %>%
    summarize(total_trips = sum(Trips, na.rm = TRUE)) 

print(state_tsibble)
## # A tsibble: 640 x 3 [1Q]
## # Key:       State [8]
##    State Quarter total_trips
##    <chr>   <qtr>       <dbl>
##  1 ACT   1998 Q1        551.
##  2 ACT   1998 Q2        416.
##  3 ACT   1998 Q3        436.
##  4 ACT   1998 Q4        450.
##  5 ACT   1999 Q1        379.
##  6 ACT   1999 Q2        558.
##  7 ACT   1999 Q3        449.
##  8 ACT   1999 Q4        595.
##  9 ACT   2000 Q1        600.
## 10 ACT   2000 Q2        557.
## # ℹ 630 more rows

This is a plot of the final tsibble, which combines Purposes and Regions and shows total trips by State.

autoplot(state_tsibble, total_trips) +
  labs(title = "Total Trips by State", x = "Quarter", y = "Total Trips") +
  theme_minimal()

#8 Use the following graphics functions: autoplot(), gg_season(), gg_subseries(), gg_lag(), ACF() and explore features from the following time series: “Total Private” Employed from us_employment, Bricks from aus_production, Hare from pelt, “H02” Cost from PBS, and Barrels from us_gasoline.

library(fpp3)
## Warning: package 'fpp3' was built under R version 4.3.3
## ── Attaching packages ──────────────────────────────────────────── fpp3 1.0.0 ──
## ✔ fable 0.3.4
## ── Conflicts ───────────────────────────────────────────────── fpp3_conflicts ──
## ✖ lubridate::date()     masks base::date()
## ✖ dplyr::filter()       masks stats::filter()
## ✖ tsibble::intersect()  masks base::intersect()
## ✖ lubridate::interval() masks tsibble::interval()
## ✖ dplyr::lag()          masks stats::lag()
## ✖ tsibble::setdiff()    masks base::setdiff()
## ✖ tsibble::union()      masks base::union()
data("us_employment")
data("aus_production")
data("pelt")
data("PBS")
data("us_gasoline")
employment_data <- us_employment %>% filter(Title == "Total Private")

# Total Private Employed - Autoplot
autoplot(employment_data, Employed) +
  labs(title = "Total Private Employment in US", x = "Year", y = "Employed")

# Seasonal Plot 
gg_season(employment_data, Employed)

# Subseries Plot
gg_subseries(employment_data, Employed)

# Lag Plot
gg_lag(employment_data, Employed)

# ACF Plot
ACF(employment_data, Employed) %>% autoplot()

Can you spot any seasonality, cyclicity and trend? Yes, there is seasonality: the pattern shows more hiring in the summer.

What do you learn about the series? The number of people employed in the U.S. has trended upward over time.

What can you say about the seasonal patterns? The number of people employed seems to peak during the summer months, followed by a downturn during the winter months.

Can you identify any unusual years? There was a sharp decline in the number of people employed around the global recession of 2008-2010.
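
A quick check of that claim, zooming into 2007-2011 with tsibble's filter_index() (a sketch; the window is an arbitrary choice):

employment_data %>%
  filter_index("2007" ~ "2011") %>%
  autoplot(Employed) +
  labs(title = "Total Private Employment, 2007-2011", x = "Month", y = "Employed")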

brick_data <- aus_production %>% filter(!is.na(Bricks)) # drop quarters where Bricks is missing

# Bricks - Autoplot
autoplot(brick_data, Bricks) +
  labs(title = "Brick Production in Australia", x = "Year", y = "Bricks")

# Seasonal Plot
gg_season(brick_data, Bricks)

# Subseries Plot
gg_subseries(brick_data, Bricks)

# Lag Plot
gg_lag(brick_data, Bricks)

# ACF Plot
ACF(brick_data, Bricks) %>% autoplot()

Can you spot any seasonality, cyclicity and trend? Yes, there is seasonality: the first quarter is usually when the fewest bricks are produced.

What do you learn about the series? The level of brick production peaked in 1980 and has been on a downtrend ever since.

What can you say about the seasonal patterns? The first quarter usually has the lowest level of brick production, which may have to do with the weather. The third quarter is usually the busiest, followed by the second and fourth quarters of the year.

Can you identify any unusual years? Brick production peaked in 1980, and there was a significant decline a few years later. Production sharply increased toward the end of the decade, in 1989, but has not recovered since.

head(pelt)
## # A tsibble: 6 x 3 [1Y]
##    Year  Hare  Lynx
##   <dbl> <dbl> <dbl>
## 1  1845 19580 30090
## 2  1846 19600 45150
## 3  1847 19610 49150
## 4  1848 11990 39520
## 5  1849 28040 21230
## 6  1850 58000  8420
# Autoplot
autoplot(pelt, Hare) +
  labs(title = "Hare Population", y = "Hare Count", x = "Year")

# Seasonal Plot - gg_season() errors here because pelt is annual data,
# so there is no within-year seasonal period to plot
# gg_season(pelt, Hare) +
#  labs(title = "Seasonal Plot: Hare Population", y = "Count", x = "Year")

# Subseries Plot
gg_subseries(pelt, Hare) +
  labs(title = "Subseries Plot: Hare Population", y = "Count", x = "Year")

# ACF Plot
ACF(pelt, Hare) %>% autoplot() +
  labs(title = "ACF: Hare Population")

# Lag Plot
gg_lag(pelt, Hare, geom = "point") +
  labs(title = "Lag Plot: Hare Population")

Can you spot any seasonality, cyclicity and trend? Hares definitely follow a cycle, rising and falling roughly every decade. A quick Google search on hares turned up information that agrees with the charts. Google stated: “Snowshoe hare populations cycle every 8–11 years, with population densities fluctuating 5–25 times during a cycle. The average time between population peaks is about 10 years.” This gave me a great perspective on the data I was looking at.

What do you learn about the series? I can see that Hare pelts were far more common in the 1800s and started to decrease in the 1900s.

What can you say about the seasonal patterns? It is hard to discern any seasonal patterns because the data is annual and doesn't show fluctuations within the years themselves.

Can you identify any unusual years? 1865 and 1885 were the peak years for Hare pelts; those levels were never reached again.

h02_data <- PBS %>% filter(ATC2 == "H02")

# H02 Cost - AutoPlot
autoplot(h02_data, Cost) +
  labs(title = "H02 Drug Cost", x = "Year", y = "Cost")

# Seasonal Plot
gg_season(h02_data, Cost)

# Subseries Plot
gg_subseries(h02_data, Cost)

# Lag Plot - gg_lag() errors here because h02_data still contains multiple
# keys (Concession and Type), and gg_lag() expects a single series
# gg_lag(h02_data, Cost)

# ACF Plot
ACF(h02_data, Cost) %>% autoplot()
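
One possible workaround for the lag plot (a sketch, assuming it is acceptable to total Cost across the Concession and Type keys) is to collapse to a single monthly series before calling gg_lag():

h02_total <- h02_data %>%
  summarise(Cost = sum(Cost))  # aggregates over the remaining keys for each Month

gg_lag(h02_total, Cost, geom = "point") +
  labs(title = "Lag Plot: Total H02 Cost")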

Can you spot any seasonality, cyclicity and trend? The Concessional and General Safety Net have lower costs from February to July, while Concessional Co-payments have higher costs during those months.

What do you learn about the series? General Co-payments appear to have the most stable costs year-round. The other forms of prescription insurance have costs that vary throughout the year.

What can you say about the seasonal patterns? Concessional Safety Net and General Safety Net appear to follow the same pattern of lower costs between February and July and higher costs in the months thereafter. Concessional Co-payments appear to show the opposite pattern, with higher costs from February to July and lower costs in the months thereafter.

Can you identify any unusual years? There doesn’t appear to be any year with an unusual increase or decrease in costs.

gasoline_data <- us_gasoline %>% select(Barrels)

# Barrels - AutoPlot
autoplot(gasoline_data) +
  labs(title = "US Gasoline Consumption", x = "Year", y = "Barrels")
## Plot variable not specified, automatically selected `.vars = Barrels`

# Seasonal Plot
gg_season(gasoline_data, Barrels)

# Subseries Plot
gg_subseries(gasoline_data, Barrels)

# Lag Plot
gg_lag(gasoline_data, Barrels)

# ACF Plot
ACF(gasoline_data, Barrels) %>% autoplot()

Can you spot any seasonality, cyclicity and trend? Gasoline production appears to increase during the Spring and Summer, and decrease during the Fall and Winter.

What do you learn about the series? The number of barrels produced steadily increased from 1991 to around 2007, before declining around the global recession of 2008-2009. Production has since recovered, with a peak that slightly exceeds the pre-decline peak.

What can you say about the seasonal patterns? Production tends to pick up during the warm weather months in the Spring and Summer, and lag during cold weather months in the Fall and Winter.

Can you identify any unusual years? After a mostly upward trend during the 1990s, there was a significant drop in production around 2001, which may have coincided with the September 11th attacks in the U.S. and the disruption that followed.