#install.packages('googlesheets')
library(googlesheets)
## Warning: package 'googlesheets' was built under R version 3.2.5

Check out available Google Sheets

# sheets = gs_ls()
# sheets[1,]

Read in WorldMap data

world.map <- gs_title("Aug Data")
## Sheet successfully identified: "Aug Data"

Read the data

## Accessing worksheet titled 'WorldMap'.
## No encoding supplied: defaulting to UTF-8.
## Accessing worksheet titled 'TS Costs'.
## No encoding supplied: defaulting to UTF-8.
##    user  system elapsed 
##   0.000   0.000  10.006
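
The chunks that produced the log above were not echoed. The ~10 s elapsed times with near-zero CPU suggest a deliberate pause between API calls; a hedged reconstruction (the variable names and the Sys.sleep(10) are assumptions) might look like:

world.clicks <- gs_read(world.map, ws = "WorldMap")
Sys.sleep(10)  # pause between API calls; would explain the ~10 s elapsed timings
ts.costs <- gs_read(world.map, ws = "TS Costs")

The remaining unechoed reads below ('Landing Pages', 'CPC_CTR_Clicks', 'Timeseries') presumably follow the same pattern.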
behaviour.all <- gs_read(world.map, ws="Behaviour All Pages")
## Accessing worksheet titled 'Behaviour All Pages'.
## No encoding supplied: defaulting to UTF-8.
exit.pages <- gs_read(world.map, ws = "Exit Pages")
## Accessing worksheet titled 'Exit Pages'.
## No encoding supplied: defaulting to UTF-8.
##    user  system elapsed 
##       0       0      10
## Accessing worksheet titled 'Landing Pages'.
## No encoding supplied: defaulting to UTF-8.
## Accessing worksheet titled 'CPC_CTR_Clicks'.
## No encoding supplied: defaulting to UTF-8.
##    user  system elapsed 
##   0.001   0.000  10.002
## Accessing worksheet titled 'Timeseries'.
## No encoding supplied: defaulting to UTF-8.

Plotly requires a country code for each country in order to draw Choropleth Maps.

Let’s first create a new variable, country.code.

# install.packages("countrycode")
library(countrycode)
# 'df' presumably holds the country-level click data (the 'WorldMap'
# worksheet read in an unechoed chunk above)
df$country.code <- countrycode(df$Country, "country.name", "wb")
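
The "wb" destination returns World Bank three-letter codes, which largely coincide with the ISO-3 codes Plotly’s choropleth expects; countrycode warns about any country names it cannot match.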

Choropleth Map of Worldwide Clicks

For a larger version click here
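
The map itself is an interactive Plotly graph; a minimal sketch of the underlying call (assuming the pre-4.0 non-formula plot_ly interface used elsewhere in this post, and a Clicks column in df) would be:

library(plotly)
# Choropleth of clicks per country, keyed by the country codes created above
plot_ly(df, z = Clicks, locations = country.code, type = "choropleth",
        text = Country, colors = "Blues")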

AdWords Analysis

# Load data (these reads already ran in an earlier, unechoed chunk,
# hence commented out here)
aug.dat <- world.map
#data <- gs_read(aug.dat, ws = 'CPC_CTR_Clicks')
#data.time <- gs_read(aug.dat, ws = "Timeseries")

# Keep only days with at least one click and parse the dates
time <- subset(data.time, Clicks > 0)
time$Day <- as.Date(time$Day, format = "%d/%m/%Y")

Time Series Plot of the Clicks

August 14th was the day I started creating new Campaigns and AdGroups, focusing on decreasing Animal clicks while increasing AI/Suffering/Cooperation clicks.
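
The plot is again interactive; a hedged sketch of the call behind it (assuming the pre-4.0 plot_ly interface and a Campaign column in the Timeseries sheet) might be:

# One line of daily clicks per campaign
plot_ly(time, x = Day, y = Clicks, color = Campaign,
        type = "scatter", mode = "lines")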

Time Series Plot of the Costs

Changes regarding Cost and Clicks from August 14th on

At first there was a huge increase in AI and Ethics, which then suddenly collapsed. The most likely reason: after a few days I realised I should create a LOT of different Ads per AdGroup, with different headlines and text, to experiment with which Ads would be most effective. Google automatically rotates through the Ads to find the most promising ones; once it has found the most-clicked Ads, it keeps displaying those most of the time. This could explain the irregularities. After approximately 7 days I stopped creating new Ads en masse, which would explain why the costs/clicks have kept steadily increasing since August 24th. My guess is that every major break in any of those lines is due to adding new AdGroups/Ads.

Time Series of Cost per Campaign

The drop in Wild is also explained by the fact that I excluded all countries except for 25, most of them taken from Brian’s blog post. After a few days I started to copy this change to the other Campaigns as well. That’s why AI peaks on Aug 16th with 307 clicks and never reaches that maximum again. It does reach its cost maximum later, though, most likely because clicks in Western countries are more expensive.

Pie Chart of Number of Sessions by Country (Google Analytics Data Aug 18 - Aug 31)
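
A sketch of how such a pie chart could be built (the sessions.by.country frame and its column names are assumptions):

# Share of sessions per country
plot_ly(sessions.by.country, labels = Country, values = Sessions, type = "pie")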

Time Series of Cost per Day by Country (Threshold: >$10/day)

Cost by Country in US$

Aug 01 - Aug 31

Aug 17 - Aug 31

User Behaviour

I kept only the top 10 pages for both Landing Pages and Exit Pages. Log into Google Analytics for more information.

Top 25 Landing Pages

NOTE: “/” represents the start page (www.foundational-research.org)

How long do people stay on the page? And how many of them go on different sections/pages of the website?

For all pageviews to the page, Exit Rate is the percentage that were the last in the session. For all sessions that start with the page, Bounce Rate is the percentage that were the only one of the session. Bounce Rate for a page is based only on sessions that start with that page.
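For example, if a page gets 100 pageviews and 40 of them were the last pageview of their session, its Exit Rate is 40%; if 50 sessions start on that page and 10 of them consist of only that pageview, its Bounce Rate is 20%.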

So the average session duration is very low; most visitors leave within seconds. Though, as explained below, we don’t know how long people who enter and leave on the same page actually stay on it.

Exit rate plotted onto Entrances


From http://help.analyticsedge.com/googleanalytics/misunderstood-metrics-time-on-page-session-duration/

TL;DR: The Google Analytics metric Avg Time on Page is a good indication of the time users spent looking at a page on your site if the page has a low % Exit. Do not use Avg Session Duration as a key performance indicator, as it is heavily influenced by Pages / Session, Bounce Rate, and Sessions count.

[…]Google can’t measure the time a user spent looking at the last page of their visit to your site.

[…]This happens because Google uses the time of the next page view to determine the time you spent looking at the current page. On the last page, there is no next page recorded, so the Time on Page is unknown (recorded as 0) and the Session Duration ends when they opened the last page.

[…]For sessions where the user only looked at one page (a “bounce”) […] the Time on Page and the Session Duration is 0. This isn’t because Google knows they left right away — it is because they didn’t have any indication of when the user left so they couldn’t calculate the Time on Page, and they consider the lack of a value means 0.

[…]If a page does not have a high exit rate (% Exits), then the Avg Time on Page is a pretty good reflection of the real average. With a higher exit rate, you should have less confidence in the average metric because the average is based on a portion of total users.
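
A concrete example: a user opens page A at 12:00:00, opens page B at 12:01:30, and then leaves. GA records 90 seconds of Time on Page for A, 0 seconds for B (there is no next pageview), and a Session Duration of 90 seconds.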

Avg. Time on Page in seconds, plotted against ascending Exit Rate

plot_ly(dplyr::arrange(behaviour.all, `% Exit`),
        x = Page, y = `% Exit`, type = "bar",
        text = paste("Avg. Time on Page (sec):", `Avg. Time on Page`))
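
Arranging by % Exit first makes the bars ascend by exit rate; the average time on page is surfaced as hover text rather than as a second axis.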

Avg. Time on Page in seconds

Search keywords text mining

Finding the most frequent words/terms

##    user  system elapsed 
##   0.000   0.000  10.003
## Accessing worksheet titled 'Search Keywords'.
## No encoding supplied: defaulting to UTF-8.

Random sample of the search terms

## [1] "Total of 973 search terms. Random sample of 25 terms:"
## # A tibble: 24 × 1
##                            `Search keyword`
##                                       <chr>
## 1                            AI development
## 2  artificial intelligence computer program
## 3                  reporting animal cruelty
## 4                   ethics and intelligence
## 5                                   site ai
## 6                           learn from news
## 7                                 making ai
## 8                    +Importance +Knowledge
## 9         artificial intelligence computers
## 10                    utilitarian questions
## # ... with 14 more rows

Frequency

## Loading required package: NLP
## 
## Attaching package: 'NLP'
## The following object is masked from 'package:ggplot2':
## 
##     annotate
## Loading required package: RColorBrewer
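
The messages above come from loading tm (which pulls in NLP) and wordcloud (which pulls in RColorBrewer). A minimal sketch of the frequency count, assuming the keywords were read into a search.keywords frame with the `Search keyword` column shown in the sample above:

library(tm)
library(wordcloud)

# Build a corpus from the raw search terms and normalise it
corpus <- Corpus(VectorSource(search.keywords$`Search keyword`))
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeWords, stopwords("english"))

# Term frequencies, most frequent first
tdm  <- TermDocumentMatrix(corpus)
freq <- sort(rowSums(as.matrix(tdm)), decreasing = TRUE)
head(freq, 10)

# Word cloud of the most frequent terms
wordcloud(names(freq), freq, max.words = 50, colors = brewer.pal(8, "Dark2"))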