Research Methods & Data Analysis Workbook

Author

Jacob Walton

December 28, 2024

Lecture Notes

Week 1

Notes

  • Aims of the module –> look at the problem and find the solution using the right tool for the job. The first tool may not be the correct one, but it can be learned from.

  • Data Analysis –> this aspect of the module focuses on the basics; less is more!!

  • Research Methods –> covers the theoretical and philosophical aspects of doing science; we need to make sense of the science, which requires the crucial reading and writing skills for doing research.

  • Finally, week 1 introduced Quarto and R, along with the expectations of the module and the assignments at the end of it.

Week 1 Extension Task and Quarto Published Website

  • My first attempt with R followed the guide set out in the first chapter of the book “Statistics in R for Biodiversity Conservation” (Smith et al., 2020).

The book provides data for its examples; the data used here were on blue tits and were imported into R.
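
A minimal import sketch (the file name “blue_tit.csv” and the use of read.csv() are my assumptions; the book supplies the actual file):

Code
# sketch: import the blue tit data from a CSV in the working directory
# (the file name below is hypothetical)
cyan <- read.csv("blue_tit.csv", stringsAsFactors = FALSE)
head(cyan)  # check the first few rows imported correctly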

An overview of the data was obtained using the following:

Code
str(cyan)
'data.frame':   438 obs. of  7 variables:
 $ id    : chr  "B1" "B10" "B111" "B112" ...
 $ zone  : chr  "B" "B" "B" "B" ...
 $ year  : int  2002 2001 2002 2002 2001 2001 2002 2002 2002 2002 ...
 $ multi : chr  "yes" "yes" "no" "no" ...
 $ height: num  2.7 2 2 1.9 2.1 2.5 1.9 2.2 2.4 2.3 ...
 $ day   : int  8 25 10 6 28 23 10 10 11 7 ...
 $ depth : num  0.33 0.25 0.2 0.25 0.33 0.4 0.2 0.2 0.2 0.33 ...

The following code was used to find any missing data, which can be problematic for data analysis (colSums() and is.na() are base R functions; the “psych” package is loaded here alongside others for later use):

library(psych)
library(palmerpenguins)
library(tidyverse)
Code
colSums(is.na(cyan))
    id   zone   year  multi height    day  depth 
     0      0      0      0      0      0      0 

Then a quick overview and box-plot of the data was created as practice using and running this program:

Code
par(mfrow = c(1,1), mar = c(6,6,2,2), cex.lab=1.5)

boxplot(depth ~ zone,
        ylab = "Nest depth",
        xlab = "Woodland zone",
        data = cyan,
        col = "lightgreen",
        outpch = 16,
        las=1 )

Box plot showing a data outlier in nest depth by woodland zone

Week 2

Notes

  • Tuesday’s lecture was taken by Antonio, who went through ethics.

  • Thursday's lecture was back with Felipe and looked at exploring the data more, encouraging us to have some fun with the data and R so we can learn and practice, as that is the only way to get better at it!

Ethics:

  • Antonio taught us about the importance of ethics in research.

  • Research requires a scientific method, which is a systematic approach to increasing, gaining or improving knowledge and information on a subject.

  • Ethics changes with time and society; as society's views change, so does ethics.

  • Morality –> is a set of principles concerning the distinction between right and wrong, good and bad

  • Ethics –> acts on a social level and governs human behaviour or the conducting of an activity.

    The 3 R’s of Ethics

    • Replacement –> replace or avoid using something that will damage the site or animal of interest.

    • Reduction –> reducing the amount of interaction or disturbance to the site, or the number of animals used.

    • Refinement –> refining scientific procedures and other factors affecting the animals or site of interest to reduce the impact or disturbance to them.

I am using this section below to show some of what Felipe highlighted in the Thursday session. I will be working with the penguins data set now, using some of the functions from the book mentioned above that I ran on the blue tit data, and advancing on this.

Tip

This will be shown in Week 2 Extension

Week 2 Extension

The package-loading code is HERE!!!

Code
#| include: true
#| warning: false
library(kableExtra)

Attaching package: 'kableExtra'
The following object is masked from 'package:dplyr':

    group_rows
Code
library(palmerpenguins)
library(tidyverse)
library(gt)
library(vtable)
library(dplyr)
library(knitr)
library(ggplot2)
library(readxl)
library(gmodels)
library(gplots)
Registered S3 method overwritten by 'gplots':
  method         from 
  reorder.factor gdata

Attaching package: 'gplots'
The following object is masked from 'package:stats':

    lowess
Code
library(graphics)
library(vcd)
Loading required package: grid
Code
library(corrplot)
corrplot 0.95 loaded
Code
library(ggpubr)
library(gapminder)
Warning: package 'gapminder' was built under R version 4.4.2
Code
library(PairedData)
Warning: package 'PairedData' was built under R version 4.4.2
Loading required package: MASS

Attaching package: 'MASS'
The following object is masked from 'package:dplyr':

    select
Loading required package: gld
Warning: package 'gld' was built under R version 4.4.2
Loading required package: mvtnorm
Warning: package 'mvtnorm' was built under R version 4.4.2
Loading required package: lattice

Attaching package: 'PairedData'
The following object is masked from 'package:base':

    summary
Code
library(car)
Warning: package 'car' was built under R version 4.4.2
Loading required package: carData

Attaching package: 'car'
The following object is masked from 'package:psych':

    logit
The following object is masked from 'package:dplyr':

    recode
The following object is masked from 'package:purrr':

    some
Code
library(caret)
Warning: package 'caret' was built under R version 4.4.2

Attaching package: 'caret'
The following object is masked from 'package:purrr':

    lift
Code
library(mlbench)
Warning: package 'mlbench' was built under R version 4.4.2
Code
library(ISLR2)
Warning: package 'ISLR2' was built under R version 4.4.2

Attaching package: 'ISLR2'
The following object is masked from 'package:MASS':

    Boston
The following object is masked from 'package:vcd':

    Hitters
Code
library(AER)
Warning: package 'AER' was built under R version 4.4.2
Loading required package: lmtest
Loading required package: zoo

Attaching package: 'zoo'
The following objects are masked from 'package:base':

    as.Date, as.Date.numeric
Loading required package: sandwich
Warning: package 'sandwich' was built under R version 4.4.2
Loading required package: survival
Warning: package 'survival' was built under R version 4.4.2

Attaching package: 'survival'
The following object is masked from 'package:caret':

    cluster
Code
library(FactoMineR)
Warning: package 'FactoMineR' was built under R version 4.4.2
Code
library(factoextra)
Warning: package 'factoextra' was built under R version 4.4.2
Welcome! Want to learn more? See two factoextra-related books at https://goo.gl/ve3WBa
Conflicting packages!!

I am now getting conflicting functions in R due to the number of packages. Using AI, I was told to use the following code to fix this, mainly for the select() function in the dplyr package:

Code
library(conflicted)
conflict_prefer("select", "dplyr")
conflict_prefer("penguins", "palmerpenguins", "modeldata")
conflict_prefer("filter", "dplyr", "stats")

The penguins data is summarised below:

Code
glimpse(penguins)
Rows: 344
Columns: 8
$ species           <fct> Adelie, Adelie, Adelie, Adelie, Adelie, Adelie, Adel…
$ island            <fct> Torgersen, Torgersen, Torgersen, Torgersen, Torgerse…
$ bill_length_mm    <dbl> 39.1, 39.5, 40.3, NA, 36.7, 39.3, 38.9, 39.2, 34.1, …
$ bill_depth_mm     <dbl> 18.7, 17.4, 18.0, NA, 19.3, 20.6, 17.8, 19.6, 18.1, …
$ flipper_length_mm <int> 181, 186, 195, NA, 193, 190, 181, 195, 193, 190, 186…
$ body_mass_g       <int> 3750, 3800, 3250, NA, 3450, 3650, 3625, 4675, 3475, …
$ sex               <fct> male, female, female, NA, female, male, female, male…
$ year              <int> 2007, 2007, 2007, 2007, 2007, 2007, 2007, 2007, 2007…

The output above shows that some values are recorded as NA; the code below shows how many of these there are in each column of this data set:

Code
colSums(is.na(penguins))
          species            island    bill_length_mm     bill_depth_mm 
                0                 0                 2                 2 
flipper_length_mm       body_mass_g               sex              year 
                2                 2                11                 0 

Looking at this result, to me this shows that where 0 is present, all data is known in the column; unlike in bill length, where there are 2 values marked as NA.
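
One way to handle these NA values (a sketch using na.omit() and dplyr's filter(), both already loaded above; the object names are my own):

Code
# drop every row containing any NA (whether to do this depends on the analysis)
penguins_complete <- na.omit(penguins)
colSums(is.na(penguins_complete))  # should now be all zeros

# or drop rows only where one specific column is NA, keeping the rest
penguins_sex_known <- penguins %>% filter(!is.na(sex))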

Week 3

Notes

DA - Data Wrangling

  • Felipe began to introduce the importance of data wrangling in R and introduced us to some important functions and tricks that will be useful to know when manipulating and working with data.

  • For example, the pipe operator –> which passes the result of the code on its left into the function on its right –> and the select() function for data. I attempted to run the pipe code before having the packages installed in R!!!

  • The pipe operator comes with the tidyverse (magrittr) packages, which must be installed and loaded before use; on Windows, RTools may also be needed to build some packages from source. See below for the code:

    Code
    penguins %>% 
     dplyr:: select(1:5)
    # A tibble: 344 × 5
       species island    bill_length_mm bill_depth_mm flipper_length_mm
       <fct>   <fct>              <dbl>         <dbl>             <int>
     1 Adelie  Torgersen           39.1          18.7               181
     2 Adelie  Torgersen           39.5          17.4               186
     3 Adelie  Torgersen           40.3          18                 195
     4 Adelie  Torgersen           NA            NA                  NA
     5 Adelie  Torgersen           36.7          19.3               193
     6 Adelie  Torgersen           39.3          20.6               190
     7 Adelie  Torgersen           38.9          17.8               181
     8 Adelie  Torgersen           39.2          19.6               195
     9 Adelie  Torgersen           34.1          18.1               193
    10 Adelie  Torgersen           42            20.2               190
    # ℹ 334 more rows
Important

Data wrangling must be completed before any data analysis can begin

  • Another function shown to us was the summary() function:
Code
summary(penguins)
      species          island    bill_length_mm  bill_depth_mm  
 Adelie   :152   Biscoe   :168   Min.   :32.10   Min.   :13.10  
 Chinstrap: 68   Dream    :124   1st Qu.:39.23   1st Qu.:15.60  
 Gentoo   :124   Torgersen: 52   Median :44.45   Median :17.30  
                                 Mean   :43.92   Mean   :17.15  
                                 3rd Qu.:48.50   3rd Qu.:18.70  
                                 Max.   :59.60   Max.   :21.50  
                                 NA's   :2       NA's   :2      
 flipper_length_mm  body_mass_g       sex           year     
 Min.   :172.0     Min.   :2700   female:165   Min.   :2007  
 1st Qu.:190.0     1st Qu.:3550   male  :168   1st Qu.:2007  
 Median :197.0     Median :4050   NA's  : 11   Median :2008  
 Mean   :200.9     Mean   :4202                Mean   :2008  
 3rd Qu.:213.0     3rd Qu.:4750                3rd Qu.:2009  
 Max.   :231.0     Max.   :6300                Max.   :2009  
 NA's   :2         NA's   :2                                 
  • There are 2 types of variables:
  • Categorical:

Ordinal –> categories that maintain an order

Nominal –> categories with no order ranking

Binary –> nominal variables with 2 categories

  • Quantitative:

Discrete –> data can only take certain values or integers

Continuous –> numbered values that are measured and can be any number in a range

  • Simple table using vtable:
Code
penguins %>%
  vtable(lush = TRUE)
Name               Class    Values                         Missing  Summary
species            factor   'Adelie' 'Chinstrap' 'Gentoo'  0        nuniq: 3
island             factor   'Biscoe' 'Dream' 'Torgersen'   0        nuniq: 3
bill_length_mm     numeric  Num: 32.1 to 59.6              2        mean: 43.922, sd: 5.46, nuniq: 164
bill_depth_mm      numeric  Num: 13.1 to 21.5              2        mean: 17.151, sd: 1.975, nuniq: 80
flipper_length_mm  integer  Num: 172 to 231                2        mean: 200.915, sd: 14.062, nuniq: 55
body_mass_g        integer  Num: 2700 to 6300              2        mean: 4201.754, sd: 801.955, nuniq: 94
sex                factor   'female' 'male'                11       nuniq: 2
year               integer  Num: 2007 to 2009              0        mean: 2008.029, sd: 0.818, nuniq: 3
  • Summarising the mean and standard deviation of some of the penguins data:
Code
penguins %>%
  na.omit() %>%
  summarise(Bill_length_mean = mean(bill_length_mm),
            Bill_depth_mean = mean(bill_depth_mm),
            Length_sd = sd(bill_length_mm),
            Depth_sd = sd(bill_depth_mm),
            n = n())
# A tibble: 1 × 5
  Bill_length_mean Bill_depth_mean Length_sd Depth_sd     n
             <dbl>           <dbl>     <dbl>    <dbl> <int>
1             44.0            17.2      5.47     1.97   333

RM - Research Questions

  • Felipe introduced research questions to us in the first of the half-and-half lectures.

  • We were introduced to Inductivism and Deductivism

Inductivism: observation –> pattern –> hypothesis –> theory

Deductivism: theory –> hypothesis –> observation –> confirmation

Important

Research Questions must NOT have simple YES or NO answers.

Good questions MUST develop the knowledge further.

Good questions are NEVER based on bad questions.

Week 3 Extension

These next exercises are from chapter 6.6.1.

The following code is written out - not copied!! - from this chapter:

Code
diamonds %>%
  group_by(color, clarity) %>%
  mutate(price200 = mean(price)) %>%
  ungroup() %>%
  mutate(random10 = 10 + price) %>%
  select(cut, color, clarity, price, price200, random10) %>%
  arrange(color) %>%
  group_by(cut) %>%
  mutate(dis = n_distinct(price),
         rowID = row_number()) %>%
  ungroup()
# A tibble: 53,940 × 8
   cut       color clarity price price200 random10   dis rowID
   <ord>     <ord> <ord>   <int>    <dbl>    <dbl> <int> <int>
 1 Very Good D     VS2       357    2587.      367  5840     1
 2 Very Good D     VS1       402    3030.      412  5840     2
 3 Very Good D     VS2       403    2587.      413  5840     3
 4 Good      D     VS2       403    2587.      413  3086     1
 5 Good      D     VS1       403    3030.      413  3086     2
 6 Premium   D     VS2       404    2587.      414  6014     1
 7 Premium   D     SI1       552    2976.      562  6014     2
 8 Ideal     D     SI1       552    2976.      562  7281     1
 9 Ideal     D     SI1       552    2976.      562  7281     2
10 Very Good D     VVS1      553    2948.      563  5840     4
# ℹ 53,930 more rows

Problem A

Code
library(tidyverse)
midwest %>%               # this line uses the midwest dataset
  group_by(state) %>%     # groups data by the state variable
  summarise(poptotalmean = mean(poptotal),   # mean of the poptotal variable
            poptotalmed = median(poptotal),  # median of the poptotal variable
            popmax = max(poptotal),          # max value of poptotal
            popmin = min(poptotal),          # min value of poptotal
            popdistinct = n_distinct(poptotal), # number of distinct poptotal values
            popfirst = first(poptotal),      # first poptotal value within each state group
            popany = any(poptotal < 5000),   # TRUE if any county has fewer than 5000 people
            popany2 = any(poptotal > 2000000)) %>%  # TRUE if any county has more than 2,000,000
  ungroup()  # ungroups the data
# A tibble: 5 × 9
  state poptotalmean poptotalmed  popmax popmin popdistinct popfirst popany
  <chr>        <dbl>       <dbl>   <int>  <int>       <int>    <int> <lgl> 
1 IL         112065.      24486. 5105067   4373         101    66090 TRUE  
2 IN          60263.      30362.  797159   5315          92    31095 FALSE 
3 MI         111992.      37308  2111687   1701          83    10145 TRUE  
4 OH         123263.      54930. 1412140  11098          88    25371 FALSE 
5 WI          67941.      33528   959275   3890          72    15682 TRUE  
# ℹ 1 more variable: popany2 <lgl>

Problem B

Code
midwest %>%       # uses midwest data
  group_by(state) %>%    # groups data by state
  summarise(num5k = sum(poptotal < 5000),   # number of counties in each state with pop under 5000
            num2mil = sum(poptotal > 2000000), # number of counties in each state with pop over 2 mil
            numrows = n()) %>% # number of rows (counties) in each state group
  ungroup()
# A tibble: 5 × 4
  state num5k num2mil numrows
  <chr> <int>   <int>   <int>
1 IL        1       1     102
2 IN        0       0      92
3 MI        1       1      83
4 OH        0       0      88
5 WI        2       0      72

Problem C Part 1

Code
midwest %>%   # midwest data set
  group_by(county) %>%  # grouped by the county variable
  summarise(x = n_distinct(state))%>%  # new variable x: the number of distinct states containing a county with that name
  arrange(desc(x))%>%  # arranged from highest to lowest, so county names shared by the most states come first
  ungroup()
# A tibble: 320 × 2
   county         x
   <chr>      <int>
 1 CRAWFORD       5
 2 JACKSON        5
 3 MONROE         5
 4 ADAMS          4
 5 BROWN          4
 6 CLARK          4
 7 CLINTON        4
 8 JEFFERSON      4
 9 LAKE           4
10 WASHINGTON     4
# ℹ 310 more rows

Part 2

n() counts the total number of rows, including duplicates; n_distinct() counts duplicate values only once. The two are different but can show the same results: the example in part 1 matches the following because we are grouping by county name, and no county name appears more than once within a state, so the row count equals the distinct-state count.

Code
midwest %>%
  group_by(county) %>%
  summarise(x = n()) %>%
  ungroup()
# A tibble: 320 × 2
   county        x
   <chr>     <int>
 1 ADAMS         4
 2 ALCONA        1
 3 ALEXANDER     1
 4 ALGER         1
 5 ALLEGAN       1
 6 ALLEN         2
 7 ALPENA        1
 8 ANTRIM        1
 9 ARENAC        1
10 ASHLAND       2
# ℹ 310 more rows

Part 3

Not sure what part 3 is looking at or for - the questions are worded a bit oddly to me.

Problem D

Code
diamonds %>%     # uses diamonds data set
  group_by(clarity) %>%   # groups by the clarity variable
  summarise(a = n_distinct(color),   # number of distinct colours for each clarity
            b = n_distinct(price),   # number of different prices for each clarity
            c = n()) %>%     # number of diamonds in each clarity category
  ungroup()
# A tibble: 8 × 4
  clarity     a     b     c
  <ord>   <int> <int> <int>
1 I1          7   632   741
2 SI2         7  4904  9194
3 SI1         7  5380 13065
4 VS2         7  5051 12258
5 VS1         7  3926  8171
6 VVS2        7  2409  5066
7 VVS1        7  1623  3655
8 IF          7   902  1790

Problem E Part 1

Code
diamonds %>%    # uses diamonds data
  group_by(color, cut) %>%    # groups data by both colour and cut (colour first)
  summarise(m = mean(price),  # mean price for each colour-and-cut combination
            s = sd(price)) %>%   # standard deviation of price for each combination
  ungroup()
# A tibble: 35 × 4
   color cut           m     s
   <ord> <ord>     <dbl> <dbl>
 1 D     Fair      4291. 3286.
 2 D     Good      3405. 3175.
 3 D     Very Good 3470. 3524.
 4 D     Premium   3631. 3712.
 5 D     Ideal     2629. 3001.
 6 E     Fair      3682. 2977.
 7 E     Good      3424. 3331.
 8 E     Very Good 3215. 3408.
 9 E     Premium   3539. 3795.
10 E     Ideal     2598. 2956.
# ℹ 25 more rows

Part 2

Code
diamonds %>%
  group_by(cut, color) %>%
  summarise(m = mean(price),
            s = sd(price)) %>%
  ungroup()
# A tibble: 35 × 4
   cut   color     m     s
   <ord> <ord> <dbl> <dbl>
 1 Fair  D     4291. 3286.
 2 Fair  E     3682. 2977.
 3 Fair  F     3827. 3223.
 4 Fair  G     4239. 3610.
 5 Fair  H     5136. 3886.
 6 Fair  I     4685. 3730.
 7 Fair  J     4976. 4050.
 8 Good  D     3405. 3175.
 9 Good  E     3424. 3331.
10 Good  F     3496. 3202.
# ℹ 25 more rows

This is the same principle as part 1; however, the grouping is by cut first, then colour.

Part 3

Code
diamonds %>%   # diamonds data
  group_by(cut, color, clarity) %>%   # grouped by cut, color, and clarity
  summarise(m = mean(price),  # mean price
            s = sd(price),    # price standard deviation
            msale = m * 0.80) %>%   # mean value with a 20% sale/reduction
  ungroup()
# A tibble: 276 × 6
   cut   color clarity     m     s msale
   <ord> <ord> <ord>   <dbl> <dbl> <dbl>
 1 Fair  D     I1      7383  5899. 5906.
 2 Fair  D     SI2     4355. 3260. 3484.
 3 Fair  D     SI1     4273. 3019. 3419.
 4 Fair  D     VS2     4513. 3383. 3610.
 5 Fair  D     VS1     2921. 2550. 2337.
 6 Fair  D     VVS2    3607  3629. 2886.
 7 Fair  D     VVS1    4473  5457. 3578.
 8 Fair  D     IF      1620.  525. 1296.
 9 Fair  E     I1      2095.  824. 1676.
10 Fair  E     SI2     4172. 3055. 3338.
# ℹ 266 more rows

Problem F: this problem shows code with placeholder names, for example summarise(potato = mean(depth)); here I have changed these names to more appropriate ones.

Code
diamonds %>%
  group_by(cut) %>%
  summarise(D = mean(depth),
            P = mean(price),
            Width= median(y),
            DP = D-P,
            MeansSQD = DP ^ 2,
            n = n()) %>%
  ungroup()
# A tibble: 5 × 7
  cut           D     P Width     DP  MeansSQD     n
  <ord>     <dbl> <dbl> <dbl>  <dbl>     <dbl> <int>
1 Fair       64.0 4359.  6.1  -4295. 18444586.  1610
2 Good       62.4 3929.  5.99 -3866. 14949811.  4906
3 Very Good  61.8 3982.  5.77 -3920. 15365942. 12082
4 Premium    61.3 4584.  6.06 -4523. 20457466. 13791
5 Ideal      61.7 3458.  5.26 -3396. 11531679. 21551

Problem G Part 1

Code
diamonds %>%    #diamonds data
  group_by(color) %>%     #grouping data using color variable
  summarise(m = mean(price)) %>%    # creating a mean price for each color variable
  mutate(x1 = str_c("Diamond color ", color),   #creating a new column with each row having diamond color conmbine together with the color variable
         x2 = 5) %>%  #creating a new column where all values are 5
  ungroup()
# A tibble: 7 × 4
  color     m x1                 x2
  <ord> <dbl> <chr>           <dbl>
1 D     3170. Diamond color D     5
2 E     3077. Diamond color E     5
3 F     3725. Diamond color F     5
4 G     3999. Diamond color G     5
5 H     4487. Diamond color H     5
6 I     5092. Diamond color I     5
7 J     5324. Diamond color J     5

Part 2

Code
diamonds %>%
  group_by(color) %>%
  summarise(m = mean(price)) %>%
  ungroup() %>%
  mutate(x1 = str_c("Diamond color ", color),
         x2 = 5)
# A tibble: 7 × 4
  color     m x1                 x2
  <ord> <dbl> <chr>           <dbl>
1 D     3170. Diamond color D     5
2 E     3077. Diamond color E     5
3 F     3725. Diamond color F     5
4 G     3999. Diamond color G     5
5 H     4487. Diamond color H     5
6 I     5092. Diamond color I     5
7 J     5324. Diamond color J     5

The first ungroup() here means the code is no longer working per colour group but on the whole data set. There is no difference in the output, because the final mutate() works row by row on the color and m columns, which grouping does not affect.

Problem H Part 1

Code
diamonds %>%
  group_by(color) %>%
  mutate(x1 = price * 0.5) %>%  #creating a half price column
  summarise(m = mean(x1)) %>%   #finding the means of the half price values
  ungroup()
# A tibble: 7 × 2
  color     m
  <ord> <dbl>
1 D     1585.
2 E     1538.
3 F     1862.
4 G     2000.
5 H     2243.
6 I     2546.
7 J     2662.

Part 2

The chunk options below were used for this section because the code produces an error and the workbook can't render it:

#| code-fold: true
#| include: true
#| warning: true
#| label: Problem H part 2

diamonds %>%
  group_by(color) %>%
  mutate(x1 = price * 0.5) %>%
  ungroup()
  summarise(m = mean(x1))

The error comes from the missing pipe between ungroup() and summarise(): the summarise() line runs on its own, where the new variable x1 does not exist. Even with the pipe added, the ungrouped data would produce a single overall mean rather than one per colour.

Questions

Why is grouping data necessary? It allows us to perform data analysis on the variables we want to test rather than having to perform it on the whole data set; for me, this also helps keep the data set organised.

Why is ungrouping necessary? If ungroup() is not used, later code in the chain will still use the earlier group_by(), which can and will cause issues in later data analysis when you want to move on to another variable.

When should you ungroup? When it is needed, depending on what you are looking to analyse; as a rule of thumb, if group_by() is used, always ungroup() at the end of the code chunk! (See the sketch below.)
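
A tiny sketch of that rule of thumb on the penguins data loaded earlier (the column name mass_centered is my own):

Code
penguins %>%
  group_by(species) %>%   # group to centre body mass within each species
  mutate(mass_centered = body_mass_g - mean(body_mass_g, na.rm = TRUE)) %>%
  ungroup() %>%           # drop the grouping once it has done its job
  summarise(overall = mean(mass_centered, na.rm = TRUE))  # now summarises the whole data set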

Extra Practice!

Code
view(diamonds)
Code
diamonds %>%
  arrange(desc(price))
# A tibble: 53,940 × 10
   carat cut       color clarity depth table price     x     y     z
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>
 1  2.29 Premium   I     VS2      60.8    60 18823  8.5   8.47  5.16
 2  2    Very Good G     SI1      63.5    56 18818  7.9   7.97  5.04
 3  1.51 Ideal     G     IF       61.7    55 18806  7.37  7.41  4.56
 4  2.07 Ideal     G     SI2      62.5    55 18804  8.2   8.13  5.11
 5  2    Very Good H     SI1      62.8    57 18803  7.95  8     5.01
 6  2.29 Premium   I     SI1      61.8    59 18797  8.52  8.45  5.24
 7  2.04 Premium   H     SI1      58.1    60 18795  8.37  8.28  4.84
 8  2    Premium   I     VS1      60.8    59 18795  8.13  8.02  4.91
 9  1.71 Premium   F     VS2      62.3    59 18791  7.57  7.53  4.7 
10  2.15 Ideal     G     SI2      62.6    54 18791  8.29  8.35  5.21
# ℹ 53,930 more rows
Code
diamonds %>%
  arrange(price)
# A tibble: 53,940 × 10
   carat cut       color clarity depth table price     x     y     z
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>
 1  0.23 Ideal     E     SI2      61.5    55   326  3.95  3.98  2.43
 2  0.21 Premium   E     SI1      59.8    61   326  3.89  3.84  2.31
 3  0.23 Good      E     VS1      56.9    65   327  4.05  4.07  2.31
 4  0.29 Premium   I     VS2      62.4    58   334  4.2   4.23  2.63
 5  0.31 Good      J     SI2      63.3    58   335  4.34  4.35  2.75
 6  0.24 Very Good J     VVS2     62.8    57   336  3.94  3.96  2.48
 7  0.24 Very Good I     VVS1     62.3    57   336  3.95  3.98  2.47
 8  0.26 Very Good H     SI1      61.9    55   337  4.07  4.11  2.53
 9  0.22 Fair      E     VS2      65.1    61   337  3.87  3.78  2.49
10  0.23 Very Good H     VS1      59.4    61   338  4     4.05  2.39
# ℹ 53,930 more rows
Code
diamonds %>%
  arrange(price, cut)
# A tibble: 53,940 × 10
   carat cut       color clarity depth table price     x     y     z
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>
 1  0.21 Premium   E     SI1      59.8    61   326  3.89  3.84  2.31
 2  0.23 Ideal     E     SI2      61.5    55   326  3.95  3.98  2.43
 3  0.23 Good      E     VS1      56.9    65   327  4.05  4.07  2.31
 4  0.29 Premium   I     VS2      62.4    58   334  4.2   4.23  2.63
 5  0.31 Good      J     SI2      63.3    58   335  4.34  4.35  2.75
 6  0.24 Very Good J     VVS2     62.8    57   336  3.94  3.96  2.48
 7  0.24 Very Good I     VVS1     62.3    57   336  3.95  3.98  2.47
 8  0.22 Fair      E     VS2      65.1    61   337  3.87  3.78  2.49
 9  0.26 Very Good H     SI1      61.9    55   337  4.07  4.11  2.53
10  0.23 Very Good H     VS1      59.4    61   338  4     4.05  2.39
# ℹ 53,930 more rows
Code
diamonds %>%
  arrange(desc(price), desc(cut))
# A tibble: 53,940 × 10
   carat cut       color clarity depth table price     x     y     z
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>
 1  2.29 Premium   I     VS2      60.8    60 18823  8.5   8.47  5.16
 2  2    Very Good G     SI1      63.5    56 18818  7.9   7.97  5.04
 3  1.51 Ideal     G     IF       61.7    55 18806  7.37  7.41  4.56
 4  2.07 Ideal     G     SI2      62.5    55 18804  8.2   8.13  5.11
 5  2    Very Good H     SI1      62.8    57 18803  7.95  8     5.01
 6  2.29 Premium   I     SI1      61.8    59 18797  8.52  8.45  5.24
 7  2.04 Premium   H     SI1      58.1    60 18795  8.37  8.28  4.84
 8  2    Premium   I     VS1      60.8    59 18795  8.13  8.02  4.91
 9  2.15 Ideal     G     SI2      62.6    54 18791  8.29  8.35  5.21
10  1.71 Premium   F     VS2      62.3    59 18791  7.57  7.53  4.7 
# ℹ 53,930 more rows
Code
diamonds %>%
  arrange(price, clarity)
# A tibble: 53,940 × 10
   carat cut       color clarity depth table price     x     y     z
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>
 1  0.23 Ideal     E     SI2      61.5    55   326  3.95  3.98  2.43
 2  0.21 Premium   E     SI1      59.8    61   326  3.89  3.84  2.31
 3  0.23 Good      E     VS1      56.9    65   327  4.05  4.07  2.31
 4  0.29 Premium   I     VS2      62.4    58   334  4.2   4.23  2.63
 5  0.31 Good      J     SI2      63.3    58   335  4.34  4.35  2.75
 6  0.24 Very Good J     VVS2     62.8    57   336  3.94  3.96  2.48
 7  0.24 Very Good I     VVS1     62.3    57   336  3.95  3.98  2.47
 8  0.26 Very Good H     SI1      61.9    55   337  4.07  4.11  2.53
 9  0.22 Fair      E     VS2      65.1    61   337  3.87  3.78  2.49
10  0.23 Very Good H     VS1      59.4    61   338  4     4.05  2.39
# ℹ 53,930 more rows
Code
diamonds %>%
  mutate(salePrice = price - 250)
# A tibble: 53,940 × 11
   carat cut       color clarity depth table price     x     y     z salePrice
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>     <dbl>
 1  0.23 Ideal     E     SI2      61.5    55   326  3.95  3.98  2.43        76
 2  0.21 Premium   E     SI1      59.8    61   326  3.89  3.84  2.31        76
 3  0.23 Good      E     VS1      56.9    65   327  4.05  4.07  2.31        77
 4  0.29 Premium   I     VS2      62.4    58   334  4.2   4.23  2.63        84
 5  0.31 Good      J     SI2      63.3    58   335  4.34  4.35  2.75        85
 6  0.24 Very Good J     VVS2     62.8    57   336  3.94  3.96  2.48        86
 7  0.24 Very Good I     VVS1     62.3    57   336  3.95  3.98  2.47        86
 8  0.26 Very Good H     SI1      61.9    55   337  4.07  4.11  2.53        87
 9  0.22 Fair      E     VS2      65.1    61   337  3.87  3.78  2.49        87
10  0.23 Very Good H     VS1      59.4    61   338  4     4.05  2.39        88
# ℹ 53,930 more rows
Code
diamonds %>%
  select(1:7)
# A tibble: 53,940 × 7
   carat cut       color clarity depth table price
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int>
 1  0.23 Ideal     E     SI2      61.5    55   326
 2  0.21 Premium   E     SI1      59.8    61   326
 3  0.23 Good      E     VS1      56.9    65   327
 4  0.29 Premium   I     VS2      62.4    58   334
 5  0.31 Good      J     SI2      63.3    58   335
 6  0.24 Very Good J     VVS2     62.8    57   336
 7  0.24 Very Good I     VVS1     62.3    57   336
 8  0.26 Very Good H     SI1      61.9    55   337
 9  0.22 Fair      E     VS2      65.1    61   337
10  0.23 Very Good H     VS1      59.4    61   338
# ℹ 53,930 more rows
Code
diamonds %>%
  group_by(cut) %>%
  summarise(n()) %>%
  ungroup()
# A tibble: 5 × 2
  cut       `n()`
  <ord>     <int>
1 Fair       1610
2 Good       4906
3 Very Good 12082
4 Premium   13791
5 Ideal     21551
Code
diamonds %>%
  mutate(totalNum = n())
# A tibble: 53,940 × 11
   carat cut       color clarity depth table price     x     y     z totalNum
   <dbl> <ord>     <ord> <ord>   <dbl> <dbl> <int> <dbl> <dbl> <dbl>    <int>
 1  0.23 Ideal     E     SI2      61.5    55   326  3.95  3.98  2.43    53940
 2  0.21 Premium   E     SI1      59.8    61   326  3.89  3.84  2.31    53940
 3  0.23 Good      E     VS1      56.9    65   327  4.05  4.07  2.31    53940
 4  0.29 Premium   I     VS2      62.4    58   334  4.2   4.23  2.63    53940
 5  0.31 Good      J     SI2      63.3    58   335  4.34  4.35  2.75    53940
 6  0.24 Very Good J     VVS2     62.8    57   336  3.94  3.96  2.48    53940
 7  0.24 Very Good I     VVS1     62.3    57   336  3.95  3.98  2.47    53940
 8  0.26 Very Good H     SI1      61.9    55   337  4.07  4.11  2.53    53940
 9  0.22 Fair      E     VS2      65.1    61   337  3.87  3.78  2.49    53940
10  0.23 Very Good H     VS1      59.4    61   338  4     4.05  2.39    53940
# ℹ 53,930 more rows

Week 4

Pre-Session Task

Following the R Graphics Cookbook chapter 2 and the mtcars data set, which is built into R already:

Scatter plot

Code
data(mtcars)

plot(mtcars$wt, mtcars$mpg)

ggplot()

Code
mtcars %>%
ggplot(aes(x = wt, y = mpg)) +
geom_point()

Line Graph

Code
plot(pressure$temperature, pressure$pressure, type = "l")
points(pressure$temperature, pressure$pressure)

lines(pressure$temperature, pressure$pressure/2, col = "red")
points(pressure$temperature, pressure$pressure/2, col = "red")

Bar Graph

Code
barplot(BOD$demand, names.arg=BOD$Time, col="purple")

Code
barplot(table(mtcars$cyl))

Histograms

Code
hist(mtcars$mpg)

Code
hist(mtcars$mpg, breaks=10)

Code
ggplot(mtcars, aes(x = mpg)) +
geom_histogram()

Code
ggplot(mtcars, aes(x = mpg)) +
  geom_histogram(binwidth = 4)

Box plot

Code
plot(ToothGrowth$supp, ToothGrowth$len)

Function Curve

Code
curve(x^3 - 5*x, from = -4, to = 4)

Session Activities

This will be using the data collected during the session on how we are feeling about the module, on a scale of 1-10!!

Adding the data using:

Code
#| code-fold: true
satisfaction <- read_xlsx("C:\\Users\\jacob\\Downloads\\scale of satisfaction(Planilha1).xlsx")

satisfaction
# A tibble: 79 × 2
   Person value
    <dbl> <dbl>
 1      1  12  
 2      2  20  
 3      3  15  
 4      4   5  
 5      5   4  
 6      6   6  
 7      7   6  
 8      8   4  
 9      9   4  
10     10   7.5
# ℹ 69 more rows

First a histogram will be made:

Code
satisfaction %>%
  na.omit() %>%
  ggplot(aes(x = value)) +   # histogram of the satisfaction scores themselves
  geom_histogram(bins = 15, fill = "darkred")

Second, Boxplot:

Code
satisfaction %>%
  ggplot(aes(x = value)) +
  geom_boxplot()

Distributions:

Code
satisfaction %>%
  na.omit() %>%
  ggplot(aes(x = value)) +
  geom_density()

Week 4 Post Session Activities

Data Analysis

Following the video, the section below uses the graph and data set from there.

Code
library(tidyverse)
library(modeldata)

data("crickets")
crickets
# A tibble: 31 × 3
   species           temp  rate
   <fct>            <dbl> <dbl>
 1 O. exclamationis  20.8  67.9
 2 O. exclamationis  20.8  65.1
 3 O. exclamationis  24    77.3
 4 O. exclamationis  24    78.7
 5 O. exclamationis  24    79.4
 6 O. exclamationis  24    80.4
 7 O. exclamationis  26.2  85.8
 8 O. exclamationis  26.2  86.6
 9 O. exclamationis  26.2  87.5
10 O. exclamationis  26.2  89.1
# ℹ 21 more rows

Scatter plot:

Code
crickets %>%
  ggplot(aes(x = temp, y = rate,
             color = species))+
  geom_point()

Code
labs(x = "Temperature",
     y = "Chirp rate",
     title = "Cricket chirps",
     caption = "Source: McDonald (2009)",
     color = "Species")
Code
scale_color_brewer(palette = "Dark2")

Run in their own chunks like this, labs() just prints a labels object and scale_color_brewer() just prints a ScaleDiscrete ggproto object; neither modifies the plot. They only take effect when added to the ggplot call with +, as in the combined chunk further below.

Modifying the Graph:

Code
crickets %>%
  ggplot(aes(x = temp, y = rate,
             color = species))+
  geom_point(color = "red",
             size = 1.5,
             alpha = 0.5,
             shape = "square")

Code
labs(x = "Temperature",
     y = "Chirp rate",
     title = "Cricket chirps",
     caption = "Source: McDonald (2009)",
     color = "Species")
Code
scale_color_brewer(palette = "Dark2")

As above, these only print their objects; they need to be added to the ggplot call with + to change the plot.

Adding more to the graph and data:

Code
crickets %>%
  ggplot(aes(x = temp, y = rate,
             colour = species))+
  geom_point() +
  geom_smooth(method = "lm", se = FALSE) +
labs(x = "Temperature",
       y = "Chirp rate",
       title = "Cricket chirps",
       caption = "Source: McDonald (2009)",
     color = "Species") +
scale_color_brewer(palette = "Dark2")

Other plots:

Histogram:

Code
crickets %>%
  ggplot(aes(x = rate)) +
  geom_histogram(bins = 15)

Frequency Polygon

Code
crickets %>%
  ggplot(aes(x = rate)) +
  geom_freqpoly(bins = 15)

Bar Chart:

Code
crickets %>%
  ggplot(aes(x = species)) +
  geom_bar(color = "darkblue", fill="lightblue")

Code
crickets %>%
  ggplot(aes(x = species, fill=species)) +
  geom_bar(show.legend = FALSE)

Box Plot:

Code
crickets %>%
  ggplot(aes(x = species, y = rate, fill = species)) +
  geom_boxplot(show.legend = FALSE) +
theme_minimal()

Faceting:

Code
crickets %>%
  ggplot(aes(x = rate, fill =species)) +
  geom_histogram(bins = 15)

Code
crickets %>%
  ggplot(aes(x = rate, fill = species)) +
  geom_histogram(bins = 15, show.legend = FALSE) +
facet_wrap(~species) +
theme_minimal()

Code
crickets %>%
  ggplot(aes(x = rate, fill = species)) +
  geom_histogram(bins = 15, show.legend = FALSE) +
facet_wrap(~species, ncol = 1) +
theme_minimal()

Research Methods Definition

A research hypothesis should be a short statement that refers to the expected results of an experiment or test. It must follow the scientific method, which looks to establish facts and correct any errors or mistakes that arise. A good hypothesis is an “educated guess” that can then be shown to be correct or incorrect using scientific research and methods. Furthermore, the hypothesis should be clear, should follow reproducible methods, and should always be achievable.

Note

My interpretation of a good research hypothesis, which may not be correct!

Week 5

Pre-session

The paper by Mishra et al. (2019), titled “Selection of Appropriate Statistical Methods for Data Analysis,” addresses the challenges of choosing suitable statistical techniques for data analysis in research. The authors focus on the importance of selecting the correct methods based on the nature of the data and the research objectives, as inappropriate choices can lead to misleading conclusions.

Key Points:

1. Understanding Data Types: The authors emphasize the importance of identifying the type of data—whether nominal, ordinal, interval, or ratio—as this determines which statistical tests are appropriate.

2. Objective-Based Approach: Mishra et al. suggest that the statistical method should align with the research objectives, such as describing data, testing relationships, or making predictions.

3. Common Statistical Methods:

Descriptive Statistics: For summarizing and organizing data (e.g., mean, median, mode, standard deviation).

Inferential Statistics: To make predictions or inferences about a population based on a sample (e.g., t-tests, chi-square tests, ANOVA).

Regression Analysis: Useful for predicting relationships between dependent and independent variables (e.g., linear, logistic regression).

Non-parametric Tests: Recommended when data do not meet the assumptions of parametric tests (e.g., Mann-Whitney U test, Kruskal-Wallis test).

4. Assumptions in Statistical Testing: The authors highlight the need to check assumptions like normality, homoscedasticity, and independence before applying parametric tests. Violation of these assumptions may require alternative methods or transformations.

5. Software and Tools: Mishra et al. discuss the role of statistical software in helping researchers perform complex analyses and encourage familiarity with tools like SPSS, R, and SAS for applying these methods.

6. Decision Flowchart: The paper includes a flowchart to guide researchers in selecting the appropriate statistical method based on the type of data, research design, and hypothesis.

Conclusion:

The paper serves as a guide for researchers in selecting the correct statistical methods, emphasizing that an understanding of both the data type and research goals is crucial. The authors advocate for a methodical approach, cautioning against blindly applying techniques without considering underlying assumptions and the research context.

This summary captures the essence of Mishra et al.’s paper, highlighting its practical guidance for statistical method selection in research.

The 2010 paper by Parab and Bhalerao, titled “Choosing Statistical Test”, serves as a practical guide for researchers in selecting the appropriate statistical tests based on their data type and research objectives. The paper is geared towards helping non-statisticians understand when and how to use various statistical tests.

Key Points:

1. Importance of Choosing the Right Test:

The authors emphasize that selecting the correct statistical test is crucial for ensuring valid and reliable results. Incorrect selection can lead to false conclusions.

2. Classification of Data:

Qualitative (Categorical) Data: Includes nominal and ordinal data, where the numbers represent categories or ranks but do not convey numerical meaning.

Quantitative (Numerical) Data: Includes interval and ratio data, where numbers represent measurable quantities with consistent intervals.

3. Understanding Data Distribution:

Parab and Bhalerao highlight the importance of checking whether the data follow a normal distribution. This will influence whether parametric (for normally distributed data) or non-parametric (for non-normally distributed data) tests should be used.

4. Selection of Statistical Tests Based on Research Questions:

Comparing Means (Differences):

Two groups: If data are normally distributed, use a t-test (independent or paired). If not, use the Mann-Whitney U test.

More than two groups: For normally distributed data, use ANOVA. For non-normal data, use the Kruskal-Wallis test.

Comparing Proportions:

Use the Chi-square test for categorical data or the Fisher’s exact test for small sample sizes.

Testing Correlations:

For normally distributed interval/ratio data, use Pearson’s correlation. For non-normally distributed or ordinal data, use Spearman’s rank correlation.

5. Checklist for Choosing a Test:

Parab and Bhalerao provide a simplified decision process:

1. Identify the type of data (categorical vs. numerical).

2. Determine the distribution (normal vs. non-normal).

3. Establish the research objective (e.g., comparing means, assessing relationships).

4. Choose parametric tests for normal data and non-parametric tests for non-normal data.

6. Common Mistakes:

The authors warn about common pitfalls, such as using parametric tests on non-normal data or misinterpreting results when assumptions are violated. They stress the need to check assumptions before applying tests.

Conclusion:

The paper provides a concise framework for researchers to follow when selecting statistical tests, focusing on the nature of the data and the research question. It serves as a practical, easy-to-understand guide for non-statisticians, emphasizing that understanding the type of data and its distribution is crucial for choosing the right test.

This summary encapsulates the main points from Parab and Bhalerao (2010) on how to choose statistical tests effectively.

Here’s a step-by-step guide based on the principles outlined in Mishra et al.’s paper (2019), presented as a logical flow that can serve as a decision-making process for selecting statistical methods:

Step 1: Identify the Type of Data

What kind of data do you have?

Nominal (Categorical): Data represent categories without a natural order (e.g., gender, ethnicity).

Ordinal: Data represent categories with a clear order but no consistent difference between them (e.g., rankings).

Interval: Data are numeric, with meaningful intervals but no true zero (e.g., temperature in Celsius).

Ratio: Data are numeric with a true zero point (e.g., height, weight).

Next Step: After identifying the data type, move on to define the objective.


Step 2: Define the Research Objective

What is your goal?

1. Descriptive Analysis: Summarizing the characteristics of the data.

Methods: Mean, median, mode, standard deviation (for interval/ratio data), frequencies, and percentages (for categorical/ordinal data).

2. Testing Relationships or Differences:

Are you comparing groups?

Yes → Move to Step 3.

Are you testing associations between variables?

Yes → Move to Step 4.

3. Prediction: Do you want to predict an outcome based on one or more predictors?

Yes → Move to Step 5.


Step 3: Testing for Differences Between Groups

How many groups are you comparing?

Two Groups:

For interval/ratio data with a normal distribution: t-test (Independent t-test or Paired t-test).

For ordinal or non-normally distributed interval/ratio data: Mann-Whitney U Test or Wilcoxon Signed-Rank Test.

For nominal data: Chi-square test or Fisher’s exact test (if small sample size).

More than Two Groups:

For normally distributed interval/ratio data: ANOVA.

For non-normally distributed interval/ratio data: Kruskal-Wallis test.

For ordinal data: Friedman test.

For nominal data: Chi-square test (or Fisher’s exact for small samples).


Step 4: Testing Associations Between Variables

What types of variables are you testing for associations?

Both variables are nominal: Use a Chi-square test.

Ordinal variables or ranks: Use Spearman’s rank correlation.

Interval/ratio variables:

For normally distributed data: Use Pearson’s correlation.

For non-normally distributed data: Use Spearman’s correlation.
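
A quick sketch of these correlation tests on the iris data built into R (my choice of variables, purely for illustration):

Code
# Pearson assumes roughly normal interval/ratio data;
# Spearman ranks the values first, so it suits non-normal or ordinal data
cor.test(iris$Sepal.Length, iris$Petal.Length, method = "pearson")
cor.test(iris$Sepal.Length, iris$Petal.Length, method = "spearman")  # may warn about ties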


Step 5: Predicting Outcomes (Regression Analysis)

What kind of outcome are you predicting?

Binary (Yes/No) outcome: Use Logistic Regression.

Continuous outcome: Use Linear Regression.

Categorical outcome with more than two levels: Use Multinomial Logistic Regression.

Count outcome: Use Poisson regression.
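
A minimal sketch of the binary case using glm() on the built-in mtcars data (the outcome and predictors are my own choices for illustration):

Code
# logistic regression: am (0/1 transmission type) as a stand-in binary outcome
fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
summary(fit)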


Step 6: Checking Assumptions

Before running any parametric tests (e.g., t-test, ANOVA, Pearson’s correlation), ensure that assumptions like normality, homoscedasticity, and independence are met. If these assumptions are violated:

Consider using non-parametric alternatives like the Mann-Whitney U test or Kruskal-Wallis test.
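
A small sketch of this step's logic with made-up data (shapiro.test() as the normality check; the 0.05 cut-off is the usual convention):

Code
set.seed(1)
x <- rnorm(30, mean = 5)   # group 1 (hypothetical values)
y <- rnorm(30, mean = 6)   # group 2 (hypothetical values)

# check the normality assumption before choosing a parametric test
if (shapiro.test(x)$p.value > 0.05 && shapiro.test(y)$p.value > 0.05) {
  t.test(x, y)        # parametric: both groups look normal
} else {
  wilcox.test(x, y)   # non-parametric alternative (Mann-Whitney U)
}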


Step 7: Use Statistical Software

Finally, implement your chosen method using statistical software (e.g., SPSS, R, SAS) that will guide you through computations.


Example Flowchart (text representation):

What type of data?

Nominal → Chi-square, Fisher’s exact.

Ordinal → Spearman correlation, Mann-Whitney U, Kruskal-Wallis.

Interval/Ratio:

Normal → Pearson correlation, t-test, ANOVA.

Non-normal → Spearman correlation, Mann-Whitney U, Kruskal-Wallis.

What is your objective?

Descriptive → Use mean, median, standard deviation, or frequencies.

Testing differences → Compare 2 or more groups using t-tests, ANOVA, or non-parametric tests.

Predicting → Use regression analysis (logistic, linear, or Poisson based on the outcome type).

Session Notes

Creating the session graphs from the slides

What is the relationship between bill length and flipper length?

Code
penguins %>%
  na.omit() %>%
  ggplot(aes(
    x= bill_length_mm,
    y= flipper_length_mm,
    color= species
  )) +
geom_point() -> num_x_num 

num_x_num

We can see from this graph that there is a relationship between bill length and flipper length within each species.

Can body mass predict sex?

Code
penguins %>%
  na.omit() %>%
  ggplot(aes(
    x= body_mass_g,
    y= sex,
    color= species
  )) +
  geom_point() -> num_x_cat

num_x_cat

This graph shows that there are differences in body mass between the sexes and species to be explored, especially in the Gentoo females compared with the Chinstrap and Adelie penguins. In the males, Gentoos again show the largest body mass, but the Adelie and Chinstrap ranges overlap considerably and would need to be explored further.

Count and species bar plot

Code
penguins %>%
  na.omit() %>%
  ggplot(aes(
    x= species,
    fill= sex
  )) +
  geom_bar(position = "dodge") -> cat_x_cat

cat_x_cat

This bar chart shows that the numbers of males and females of each penguin species are very similar.

Bill Length for each species

Code
penguins %>%
  na.omit() %>%
  ggplot(aes(
    x= species,
    y= bill_length_mm,
    fill= sex
  )) +
  geom_boxplot(position = "dodge") -> cat_x_num

cat_x_num

Week 5 Post Session Activities

Code
data("iris")

iris %>%
  na.omit() %>%
  ggplot(aes(
    x = Species,
    y = Sepal.Length,
    color = Species
  )) +
  geom_boxplot()

I think this should use an ANOVA test, as the x-axis (the independent/predictor variable) is categorical and the y-axis (the outcome/dependent variable) is quantitative. As there are 3 different species and only 1 outcome variable, I believe ANOVA is the most applicable test for this data.
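
A sketch of that ANOVA using base R's aov() (my own addition, not from the session):

Code
# one-way ANOVA: does mean sepal length differ among the three species?
fit <- aov(Sepal.Length ~ Species, data = iris)
summary(fit)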

Code
iris %>%
  na.omit() %>%
  ggplot(aes(
    x= Petal.Length,
    fill= Species
  )) +
  geom_density(alpha = 0.5)

I think this should use a multiple regression test, as the independent variable is quantitative and the dependent variable is also quantitative. As there is more than one predictor variable (species and petal length), a multiple regression is more applicable than a single regression test.

Code
iris %>%
  na.omit() %>%
  ggplot(aes(
    x= Petal.Length,
    y= Petal.Width
  )) +
  geom_point(mapping = aes(color = Species, shape = Species)) +
  geom_smooth(method = "lm")

I think this set will also use the multiple regression statistical test, as both the x and y axes are quantitative, and again petal length and species provide 2 predictor variables, suiting the multiple regression method.

Code
iris %>%
  na.omit() %>%
  mutate(size=ifelse(Sepal.Length < median(Sepal.Length),
  "small", "big")) %>%
  ggplot(aes(
    x= Species,
    fill= size
  )) +
    geom_bar(position = "dodge")

I think the best test for this data is the Chi-square test, as the x-axis uses categorical values. I was first thrown off the scent, thinking maybe ANOVA was needed because the y-axis was a quantitative value, being numbers, but realised in the end that it was actually a frequency count, making the Chi-square test more applicable!
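
A sketch of that Chi-square test, rebuilding the size variable from the chunk above (the object name iris_size is my own):

Code
iris_size <- iris %>%
  mutate(size = ifelse(Sepal.Length < median(Sepal.Length), "small", "big"))
chisq.test(table(iris_size$Species, iris_size$size))  # tests independence of species and size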

Week 6

Pre-session Tasks

Contingency Tables

  • A frequency table is a list of values and how often each appears. The frequency is the number of times a specific data value occurs in your data set, helping to understand which values are most or least commonly observed.

  • A contingency table (or two-way frequency table) is a table with at least 2 rows and 2 columns, used in statistics to present categorical data in terms of frequency counts. (A quick sketch of each follows.)
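
A quick sketch of each using base R's table() on the penguins data loaded earlier:

Code
# frequency table: counts of each penguin species
table(penguins$species)

# contingency (two-way) table: species by island
table(penguins$species, penguins$island)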

Proportions

The following tip chunks show how to create proportions from tabled data sets:

Code: prop.table() - with a table inside the brackets, this gives the proportions for the data.

Code: prop.table() * 100

This gives the proportions as percentages for easier reading.
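
A minimal sketch of both on the diamonds data already loaded above (the object name cut_counts is my own):

Code
cut_counts <- table(diamonds$cut)  # frequency table of diamond cuts
prop.table(cut_counts)             # proportions of each cut
prop.table(cut_counts) * 100       # the same proportions as percentages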

CrossTable example

The following example is a cross table using the diamonds data, with notes on what it all means:

Code
CrossTable(diamonds$cut, diamonds$color)

 
   Cell Contents
|-------------------------|
|                       N |
| Chi-square contribution |
|           N / Row Total |
|           N / Col Total |
|         N / Table Total |
|-------------------------|

 
Total Observations in Table:  53940 

 
             | diamonds$color 
diamonds$cut |         D |         E |         F |         G |         H |         I |         J | Row Total | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
        Fair |       163 |       224 |       312 |       314 |       303 |       175 |       119 |      1610 | 
             |     7.607 |    16.009 |     2.596 |     1.575 |    12.268 |     1.071 |    14.772 |           | 
             |     0.101 |     0.139 |     0.194 |     0.195 |     0.188 |     0.109 |     0.074 |     0.030 | 
             |     0.024 |     0.023 |     0.033 |     0.028 |     0.036 |     0.032 |     0.042 |           | 
             |     0.003 |     0.004 |     0.006 |     0.006 |     0.006 |     0.003 |     0.002 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
        Good |       662 |       933 |       909 |       871 |       702 |       522 |       307 |      4906 | 
             |     3.403 |     1.973 |     1.949 |    23.708 |     3.758 |     1.688 |    10.427 |           | 
             |     0.135 |     0.190 |     0.185 |     0.178 |     0.143 |     0.106 |     0.063 |     0.091 | 
             |     0.098 |     0.095 |     0.095 |     0.077 |     0.085 |     0.096 |     0.109 |           | 
             |     0.012 |     0.017 |     0.017 |     0.016 |     0.013 |     0.010 |     0.006 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
   Very Good |      1513 |      2400 |      2164 |      2299 |      1824 |      1204 |       678 |     12082 | 
             |     0.014 |    19.258 |     0.333 |    20.968 |     0.697 |     0.090 |     3.823 |           | 
             |     0.125 |     0.199 |     0.179 |     0.190 |     0.151 |     0.100 |     0.056 |     0.224 | 
             |     0.223 |     0.245 |     0.227 |     0.204 |     0.220 |     0.222 |     0.241 |           | 
             |     0.028 |     0.044 |     0.040 |     0.043 |     0.034 |     0.022 |     0.013 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
     Premium |      1603 |      2337 |      2331 |      2924 |      2360 |      1428 |       808 |     13791 | 
             |     9.634 |    11.245 |     4.837 |     0.473 |    26.432 |     1.257 |    11.300 |           | 
             |     0.116 |     0.169 |     0.169 |     0.212 |     0.171 |     0.104 |     0.059 |     0.256 | 
             |     0.237 |     0.239 |     0.244 |     0.259 |     0.284 |     0.263 |     0.288 |           | 
             |     0.030 |     0.043 |     0.043 |     0.054 |     0.044 |     0.026 |     0.015 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
       Ideal |      2834 |      3903 |      3826 |      4884 |      3115 |      2093 |       896 |     21551 | 
             |     5.972 |     0.032 |     0.049 |    30.745 |    12.390 |     2.479 |    45.486 |           | 
             |     0.132 |     0.181 |     0.178 |     0.227 |     0.145 |     0.097 |     0.042 |     0.400 | 
             |     0.418 |     0.398 |     0.401 |     0.433 |     0.375 |     0.386 |     0.319 |           | 
             |     0.053 |     0.072 |     0.071 |     0.091 |     0.058 |     0.039 |     0.017 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
Column Total |      6775 |      9797 |      9542 |     11292 |      8304 |      5422 |      2808 |     53940 | 
             |     0.126 |     0.182 |     0.177 |     0.209 |     0.154 |     0.101 |     0.052 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|

 

Cell contents

  • N = the count of observations in each cell

  • Chi-square contribution = the contribution of each cell to the overall Chi-square statistic.

  • N/Row Total = the proportion of observations in each cell relative to the total number of observations in that row.

  • N/Col Total = The proportion of observations in each cell relative to the total number of observations in that column.

  • N/Table Total = the proportion of observations in each cell relative to the total number of observations in the table

Total Observations

  • The total number of observations in the table

Row and Column Totals

  • The sum of counts for each row and column

Goodness of Fit Test

To perform this test there are a few steps to follow:

  1. First, state the null and alternative hypotheses.
  2. Identify the appropriate stats test.
  3. Run the appropriate stats test.
  4. Calculate the p-value. Each stats test has a degrees of freedom, or df; for a goodness of fit test this is always one less than the number of categories being tested. If 7 categories are being tested then df = 6 (see the sketch below).
  5. Accept or reject the hypothesis. We reject the null hypothesis if the p-value is less than the significance level of the test, which is generally 0.05 unless otherwise stated.
  6. If the p-value is less than 0.05, we reject the null in favour of the alternative; if the p-value is greater, we accept the null hypothesis.
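
As a quick illustration of steps 4 to 6, assuming a made-up chi-square statistic of 14.2 across 7 categories (so df = 6):

Code
# Hypothetical worked example of turning a test statistic into a p-value
stat <- 14.2
p_value <- pchisq(stat, df = 7 - 1, lower.tail = FALSE)
p_value   # ~0.027, below 0.05, so we would reject the null hypothesis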

Chi-Squared testing for independent variables

This follows this video and the example from it:

Code
volunteers <- matrix(c(111, 96, 48, 96, 133, 61, 91, 150, 53), byrow = TRUE, nrow = 3)

volunteers
     [,1] [,2] [,3]
[1,]  111   96   48
[2,]   96  133   61
[3,]   91  150   53

The basic code above has created a matrix of the data. The next chunk adds row and column names to the data.

Code
rownames(volunteers) <- c("Community College", "Four Year", "Nonstudent") 
colnames(volunteers) <- c("1-3 Hours", "4-6 Hours", "7-9 Hours")
volunteers
                  1-3 Hours 4-6 Hours 7-9 Hours
Community College       111        96        48
Four Year                96       133        61
Nonstudent               91       150        53

Now the table is labelled! Next, run the Chi-Squared test:

Code
model <- chisq.test(volunteers)
model

    Pearson's Chi-squared test

data:  volunteers
X-squared = 12.991, df = 4, p-value = 0.01132

Now we are going to look at the expected cell counts for this data:

Code
model$expected
                  1-3 Hours 4-6 Hours 7-9 Hours
Community College  90.57211  115.1907  49.23719
Four Year         103.00358  131.0012  55.99523
Nonstudent        104.42431  132.8081  56.76758

And… the Pearson residuals:

Code
model$residuals
                   1-3 Hours  4-6 Hours  7-9 Hours
Community College  2.1464772 -1.7880604 -0.1763148
Four Year         -0.6900708  0.1746359  0.6688187
Nonstudent        -1.3136852  1.4918030 -0.5000487

Another way to run the Chi-squared test:

Code
vol_table <- as.table(volunteers)
summary(vol_table)
Number of cases in table: 839 
Number of factors: 2 
Test for independence of all factors:
    Chisq = 12.991, df = 4, p-value = 0.01132

Session Notes

Data Analysis

This DA session went through new proportion and stats code that could be useful on data. As the data were just examples, I have decided not to copy and paste the code.

Research Methods

What makes a good title?
  • A good title is generally short, explains the paper in a few words, and needs to be enticing to the reader, informative and jargon-free, using keywords.

There are several different types of titles that can be used for research papers:

  • Descriptive
  • Methodological
  • "Spoiler"
  • Interrogative

Post Session

The first set of post-session activities were taken from this site

Running One-Proportion Z-Tests in R

This test is used to compare the observed proportions to a theoretical one, when there are only 2 categories.

Example: in this example there are an equal number of male and female mice being observed (p = 0.5 = 50%). Of a sample of mice diagnosed with cancer (n = 160), it was found that 95 were male and 65 were female.

We want to find out if the cancer affects more male mice than female mice. In this example:

  • The number of successes (males with cancer) is 95.

  • The observed proportion (po) of male is 95/160

  • The observed proportion (q) of female is 1−po

  • The expected proportion (pe) of male is 0.5 (50%)

  • The number of observations (n) is 160.

Research questions and statistical hypotheses: Typical research questions are:

  • Whether the observed proportion of male (po) is equal to the expected proportion (pe)?
  • Whether the observed proportion of male (po) is less than the expected proportion (pe)?
  • Whether the observed proportion of male (p) is greater than the expected proportion (pe)?

In statistics, we can define the corresponding null hypothesis (H0) as follows:

  • H0:po=pe
  • H0:po≤pe
  • H0:po≥pe

The corresponding alternative hypotheses (Ha) are as follows:

  • Ha:po≠pe (different) Note this is a two-tailed test
  • Ha:po>pe (greater) Note this is a one-tailed test
  • Ha:po<pe (less) Note this is a one-tailed test
Compute one proportion z-test

binom.test() and prop.test():

  • binom.test(): computes an exact binomial test. Recommended when the sample size is small.
  • prop.test(): can be used when the sample size is large (N > 30). It uses a normal approximation to the binomial.

binom.test(x, n, p = 0.5, alternative = "two.sided")
prop.test(x, n, p = NULL, alternative = "two.sided", correct = TRUE)

  • x: the number of successes
  • n: the total number of trials
  • p: the probability to test against
  • correct: a logical indicating whether Yates' continuity correction should be applied where possible
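
The exact binomial version is never run below, so as a sketch on the same mouse counts (output omitted):

Code
# Exact binomial test; recommended for small samples
binom.test(x = 95, n = 160, p = 0.5, alternative = "two.sided")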

Running the test using the mice data from above, to test whether males are more likely to get cancer than females:

Code
mice <- prop.test(x=95, n=160, p = 0.5, correct = FALSE)

mice

    1-sample proportions test without continuity correction

data:  95 out of 160, null probability 0.5
X-squared = 5.625, df = 1, p-value = 0.01771
alternative hypothesis: true p is not equal to 0.5
95 percent confidence interval:
 0.5163169 0.6667870
sample estimates:
      p 
0.59375 
Note

The values returned from the above code show:

  • the value of Pearson’s chi-squared test statistic
  • a p-value
  • a 95% confidence interval
  • an estimated probability of success (the proportion of males with cancer)
Tip

To test if the proportion of males with cancer is less than 0.5 (one-tailed), use:

Code
mice2 <- prop.test(x = 95, n = 160, p = 0.5, correct = FALSE, alternative = "less")

mice2

    1-sample proportions test without continuity correction

data:  95 out of 160, null probability 0.5
X-squared = 5.625, df = 1, p-value = 0.9911
alternative hypothesis: true p is less than 0.5
95 percent confidence interval:
 0.0000000 0.6555425
sample estimates:
      p 
0.59375 

To test if the proportion of male with cancer is greater than 0.5 (one-tailed test), use:

Code
mice3 <- prop.test(x = 95, n = 160, p = 0.5, correct = FALSE, alternative = "greater")

mice3

    1-sample proportions test without continuity correction

data:  95 out of 160, null probability 0.5
X-squared = 5.625, df = 1, p-value = 0.008853
alternative hypothesis: true p is greater than 0.5
95 percent confidence interval:
 0.5288397 1.0000000
sample estimates:
      p 
0.59375 

Interpreting the results: the p-value generated from the test is 0.01771, which is less than the significance level of 0.05. We can therefore conclude that the proportion of males with cancer is significantly different from 0.5.

The results from the prop.test can be used with:

  • statistic - the value of the chi-squared test statistic

  • parameter - the degrees of freedom

  • p.value - the p-value of the test

  • conf.int - a confidence interval for the probability of successes

  • estimate - the estimated probability of successes

These components can be extracted with the $ operator, for example:

mice$p.value
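
A couple more extractions from the same saved object (these are standard components of the htest object that prop.test() returns):

Code
mice$estimate   # estimated proportion of males with cancer (0.59375)
mice$conf.int   # 95% confidence interval for that proportion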

Two-Proportions Z-test in R

What is two-proportions z-test? This test is used to compare two observed proportions.

For example:

  • Group A, with lung cancer: n = 500
  • Group B, healthy people: n = 500

The number of smokers in each group is as follows:

  • Group A: 490 smokers, pA = 490/500 = 0.98 (98%)
  • Group B: 400 smokers, pB = 400/500 = 0.80 (80%)

In this setting:

  • the overall proportion of smokers is p = (490 + 400)/(500 + 500) = 0.89 (89%)
  • the overall proportion of non-smokers is q = 1 − p = 0.11 (11%)
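
As a quick arithmetic check of those pooled values:

Code
# Pooled proportions across both groups
p_pooled <- (490 + 400) / (500 + 500)   # 0.89
q_pooled <- 1 - p_pooled                # 0.11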

We want to know, whether the proportions of smokers are the same in the two groups of individuals?

Typical research questions:

  1. whether the observed proportion of smokers in group A (pA) is equal to the observed proportion of smokers in group B (pB)?

  2. whether the observed proportion of smokers in group A (pA) is less than the observed proportion of smokers in group B (pB)?

  3. whether the observed proportion of smokers in group A (pA) is greater than the observed proportion of smokers in group B (pB)?

Code for 2-proportions z-test

prop.test(x, n, p = NULL, alternative = "two.sided", correct = TRUE)

  • x: a vector of counts of successes
  • n: a vector of counts of trials
  • alternative: a character string specifying the alternative hypothesis
  • correct: a logical indicating whether Yates’ continuity correction should be applied where possible

Running the code: using the data of the smokers above, the following code is used:

Code
people <- prop.test(x = c(490, 400), n = c(500, 500))

people

    2-sample test for equality of proportions with continuity correction

data:  c(490, 400) out of c(500, 500)
X-squared = 80.909, df = 1, p-value < 2.2e-16
alternative hypothesis: two.sided
95 percent confidence interval:
 0.1408536 0.2191464
sample estimates:
prop 1 prop 2 
  0.98   0.80 

The returned values are:

  • the value of Pearson's chi-squared test statistic
  • a p-value
  • a 95% confidence interval
  • an estimated probability of success (the proportion of smokers in the 2 groups)

Tip

To test whether the observed proportion of smokers in group A (pA) is less than that of group B (pB), use:

Code
people2 <- prop.test(x = c(490, 400), n = c(500, 500), alternative = "less")

people2

    2-sample test for equality of proportions with continuity correction

data:  c(490, 400) out of c(500, 500)
X-squared = 80.909, df = 1, p-value = 1
alternative hypothesis: less
95 percent confidence interval:
 -1.0000000  0.2131742
sample estimates:
prop 1 prop 2 
  0.98   0.80 

To test whether the observed proportion of smokers in group A (pA) is greater than that of group B (pB), use:

Code
people3 <- prop.test(x = c(490, 400), n = c(500, 500), alternative = "greater")

people3

    2-sample test for equality of proportions with continuity correction

data:  c(490, 400) out of c(500, 500)
X-squared = 80.909, df = 1, p-value < 2.2e-16
alternative hypothesis: greater
95 percent confidence interval:
 0.1468258 1.0000000
sample estimates:
prop 1 prop 2 
  0.98   0.80 

Interpretation of the above results for the two-proportion z-test:

The p-value of the test is 2.363 × 10^-19, which is less than the significance level of 0.05. We can conclude that the proportion of smokers is significantly different between the two groups.

Tip

For a 2 × 2 table, the standard chi-square test (chisq.test()) is exactly the same as prop.test(), but it works with data in matrix form.

Also note that the results from these tests can be used with the additional components seen in the one-proportion z-test.

Chi-Squared Goodness of fit test

The chi-square goodness of fit test is used to compare an observed distribution to an expected distribution, in situations where we have two or more categories of discrete data. Essentially, it compares multiple observed proportions to expected probabilities.

Example:

In a collection of wild tulips, 81 were red, 50 yellow and 27 white.

Questions:

  1. Are these colours equally common? (If they were equally distributed, the expected proportion would be 1/3 for each colour.)
  2. Is there any significant difference between the observed proportions and the expected proportions? (If, in the region where the data were collected, the ratio of red, yellow and white tulips is 3:2:1 (3 + 2 + 1 = 6), the expected proportions are:
  • 3/6 for red

  • 2/6 for yellow

  • 1/6 for white

Statistical hypotheses
  • null hypothesis (H0): There is no significant difference between the observed and the expected value.

  • Alternative hypothesis (Ha): There is a significant difference between the observed and the expected value.

Code in R

To run this code use:

chisq.test(x, p)

where:

  • x = a numeric vector

  • p = a vector of probabilities of the same length as x

Question 1 Answer

Are the colours equally common?

Code
tulips <- c(81, 50, 27)

tulip1 <- chisq.test(tulips, p = c(1/3, 1/3, 1/3))

tulip1

    Chi-squared test for given probabilities

data:  tulips
X-squared = 27.886, df = 2, p-value = 8.803e-07

The returned values of this test are the chi-square test statistic ("X-squared") and a p-value.

The p-value of the test is 8.8 × 10^-7, which is less than the significance level of 0.05. We can conclude that the colours are not equally common; the observed proportions differ significantly from the expected 1/3 each.

The expected counts (under equal proportions) can be calculated with:

Code
tulip1$expected
[1] 52.66667 52.66667 52.66667
Answer to question 2

Comparing the observed to expected proportions.

Using the tulips data set we set up earlier, the following code can be used:

Code
tulip2 <- chisq.test(tulips, p = c(1/2, 1/3, 1/6))

tulip2

    Chi-squared test for given probabilities

data:  tulips
X-squared = 0.20253, df = 2, p-value = 0.9037

The p-value of this test is 0.9037, which is greater than the significance level of 0.05; therefore we can conclude that the observed proportions are not significantly different from the expected proportions.

Additional functions to use with the chi-square test in R:

  • statistic: the value of the chi-squared test statistic.

  • parameter: the degrees of freedom

  • p.value: the p-value of the test

  • observed: the observed count

  • expected: the expected count

These are used in the format, for example:

tulip1$p.value

Chi-Squared test of independence

This test is used to analyse the frequency table (contingency table) formed by 2 categorical variables. It evaluates whether there is a significant association between the categories of the two variables.

Importing the data
housetasks <- read.table(file = "http://www.sthda.com/sthda/RDoc/data/housetasks.txt", header = TRUE)
housetasks
           Wife Alternating Husband Jointly
Laundry     156          14       2       4
Main_meal   124          20       5       4
Dinner       77          11       7      13
Breakfeast   82          36      15       7
Tidying      53          11       1      57
Dishes       32          24       4      53
Shopping     33          23       9      55
Official     12          46      23      15
Driving      10          51      75       3
Finances     13          13      21      66
Insurance     8           1      53      77
Repairs       0           3     160       2
Holidays      0           1       6     153
Displaying Contingency tables
Code
library(gplots)
htct <- as.table(as.matrix(housetasks))
balloonplot(t(htct), main ="housetasks", xlab ="", ylab="",
            label = FALSE, show.margins = FALSE)

Code
library("graphics")

mosaicplot(htct, shade = TRUE, las=2, main = "Housetasks")

  • the argument shade is used to colour the graph

  • the argument las=2 rotates the axis labels to be perpendicular to the axes

  • blue colour indicates that the observed value is higher than the expected value if the data were random

  • red colour specifies that the observed value is lower than the expected value if the data were random

From this mosaic plot, it can be seen that the household tasks Laundry, Main_meal, Dinner and Breakfeast are mainly done by the wife, as shown by the blue colour.

Code
library(vcd)

assoc(head(htct, 5), shade = TRUE, las=3)

Chi-Square test basics

This test examines whether rows and columns of a contingency table are statistically significantly associated.

  • Null hypothesis (H0): the row and column variables of the contingency table are independent.

  • Alternative hypothesis (H1): row and column variables are dependent.

To run this test, the expected value of each cell is first calculated under the null hypothesis. Once the chi-squared statistic has been calculated, it is compared with the critical value (obtained from statistical tables) at p = 0.05 with df = (r − 1)(c − 1) degrees of freedom, where r is the number of rows and c the number of columns in the contingency table.
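
As a sketch of that critical-value lookup in R (the housetasks table has 13 rows and 4 columns, so df = 36):

Code
df_ht <- (13 - 1) * (4 - 1)   # degrees of freedom = 36
qchisq(0.95, df = df_ht)      # 5% critical value, roughly 51.0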

Running chi-square test
htcsq <- chisq.test(housetasks)

htcsq

    Pearson's Chi-squared test

data:  housetasks
X-squared = 1944.5, df = 36, p-value < 2.2e-16

In this example the row and column variables are statistically significantly associated, as the p-value is < 2.2e-16 (effectively zero).

The observed and expected counts can be seen using:

  • Observed:
htcsq$observed
           Wife Alternating Husband Jointly
Laundry     156          14       2       4
Main_meal   124          20       5       4
Dinner       77          11       7      13
Breakfeast   82          36      15       7
Tidying      53          11       1      57
Dishes       32          24       4      53
Shopping     33          23       9      55
Official     12          46      23      15
Driving      10          51      75       3
Finances     13          13      21      66
Insurance     8           1      53      77
Repairs       0           3     160       2
Holidays      0           1       6     153
  • Expected:
round(htcsq$expected,2)
            Wife Alternating Husband Jointly
Laundry    60.55       25.63   38.45   51.37
Main_meal  52.64       22.28   33.42   44.65
Dinner     37.16       15.73   23.59   31.52
Breakfeast 48.17       20.39   30.58   40.86
Tidying    41.97       17.77   26.65   35.61
Dishes     38.88       16.46   24.69   32.98
Shopping   41.28       17.48   26.22   35.02
Official   33.03       13.98   20.97   28.02
Driving    47.82       20.24   30.37   40.57
Finances   38.88       16.46   24.69   32.98
Insurance  47.82       20.24   30.37   40.57
Repairs    56.77       24.03   36.05   48.16
Holidays   55.05       23.30   34.95   46.70

To check which cells provided the greatest contribution to the chi-square test score, the following code can be used:

round(htcsq$residuals, 3)
             Wife Alternating Husband Jointly
Laundry    12.266      -2.298  -5.878  -6.609
Main_meal   9.836      -0.484  -4.917  -6.084
Dinner      6.537      -1.192  -3.416  -3.299
Breakfeast  4.875       3.457  -2.818  -5.297
Tidying     1.702      -1.606  -4.969   3.585
Dishes     -1.103       1.859  -4.163   3.486
Shopping   -1.289       1.321  -3.362   3.376
Official   -3.659       8.563   0.443  -2.459
Driving    -5.469       6.836   8.100  -5.898
Finances   -4.150      -0.852  -0.742   5.750
Insurance  -5.758      -4.277   4.107   5.720
Repairs    -7.534      -4.290  20.646  -6.651
Holidays   -7.419      -4.620  -4.897  15.556

This can be visualised using:

library(corrplot)

corrplot(htcsq$residuals, is.cor = FALSE)

For each given cell the size of the circle shows how much the cell contributed to the chi-square test score.

Interpreting the association between rows and columns is important:

Important
  1. Positive residuals are in blue. Positive values in cells specify an attraction (positive association) between the corresponding row and column variables.
  • In the graph above, it's evident that there is an association between the column Wife and the rows Laundry and Main_meal.

  • While for the column Husband there is an association with the row Repairs.

  2. Negative residuals are in red. This implies a repulsion (negative association) between the corresponding row and column variables. For example, the column Wife is negatively associated (~"not associated") with the row Repairs. There is a repulsion between the column Husband and the rows Laundry and Main_meal.

Contribution in %

percht <- 100*htcsq$residuals^2/htcsq$statistic
round(percht, 3)
            Wife Alternating Husband Jointly
Laundry    7.738       0.272   1.777   2.246
Main_meal  4.976       0.012   1.243   1.903
Dinner     2.197       0.073   0.600   0.560
Breakfeast 1.222       0.615   0.408   1.443
Tidying    0.149       0.133   1.270   0.661
Dishes     0.063       0.178   0.891   0.625
Shopping   0.085       0.090   0.581   0.586
Official   0.688       3.771   0.010   0.311
Driving    1.538       2.403   3.374   1.789
Finances   0.886       0.037   0.028   1.700
Insurance  1.705       0.941   0.868   1.683
Repairs    2.919       0.947  21.921   2.275
Holidays   2.831       1.098   1.233  12.445

Visualizing the contribution

corrplot(percht, is.cor = FALSE)

The relative contribution of each cell to the total chi-square score gives some indication of the nature of the dependency between rows and columns of the contingency table.

It shows:

  1. The column “wife” is strongly associated with Laundry, main meal and dinner
  2. The column “Husband” is strongly associated with the row repairs
  3. The column jointly is frequently associated with Holidays

From the above results it can be seen that the most contributing cells to the chi-square are wife/laundry (7.74%), Wife/Main meal (4.98%), Husband/Repairs (21.9%), Jointly/Holidays (12.44%).

These cells contribute about 47.06% to the total Chi-Square score and thus account for most of the difference between expected and observed values.

Contingency Tables

This next section follows the information and codes from this site.

dplyr & tidyr

The site suggests using dplyr to produce summary statistics, as the code then flows seamlessly into the next tasks, and suggests the group_by(), summarise() and spread() commands to look at the data in question.

This code requires the ggplot2, dplyr, tidyr and knitr packages, which have already been loaded into this document, and will use the mpg data already provided by R.

First, we want to get the total number of cars within each class and the number of cylinders they have using the group_by and summarise functions.

Code
mpg %>%
  group_by(class, cyl) %>%
  summarise(n=n()) %>%
  kable()
|class      | cyl|  n|
|:----------|---:|--:|
|2seater    |   8|  5|
|compact    |   4| 32|
|compact    |   5|  2|
|compact    |   6| 13|
|midsize    |   4| 16|
|midsize    |   6| 23|
|midsize    |   8|  2|
|minivan    |   4|  1|
|minivan    |   6| 10|
|pickup     |   4|  3|
|pickup     |   6| 10|
|pickup     |   8| 20|
|subcompact |   4| 21|
|subcompact |   5|  2|
|subcompact |   6|  7|
|subcompact |   8|  5|
|suv        |   4|  8|
|suv        |   6| 16|
|suv        |   8| 38|
dplyr and tidyr: crosstabs

Summary data can be converted into crosstabs or contingency tables; we need variable A (class) to be listed by row, and variable B (cyl) to be listed by column.

This can be achieved by including the spread() command, to create columns for each cyl value, with n as the crosstab response value.

Code
mpg %>%
  group_by(class, cyl) %>%
  summarise(n=n()) %>%
  spread(cyl, n) %>%
  kable()
|class      |  4|  5|  6|  8|
|:----------|--:|--:|--:|--:|
|2seater    | NA| NA| NA|  5|
|compact    | 32|  2| 13| NA|
|midsize    | 16| NA| 23|  2|
|minivan    |  1| NA| 10| NA|
|pickup     |  3| NA| 10| 20|
|subcompact | 21|  2|  7|  5|
|suv        |  8| NA| 16| 38|
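
A side note of my own (not from the site): in current tidyr, spread() is superseded by pivot_wider(), so an equivalent sketch would be:

Code
mpg %>%
  count(class, cyl) %>%                              # shorthand for group_by() + summarise(n = n())
  pivot_wider(names_from = cyl, values_from = n) %>%
  kable()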
Summary statistics other than frequency

summarise() allows us to determine more than just frequency values; we can look at means, for example:

Code
mpg %>%
  group_by(class, cyl) %>%
  summarise(mean_cty=mean(cty)) %>%
  spread(cyl, mean_cty) %>%
  kable()
|class      |        4|  5|        6|        8|
|:----------|--------:|--:|--------:|--------:|
|2seater    |       NA| NA|       NA| 15.40000|
|compact    | 21.37500| 21| 16.92308|       NA|
|midsize    | 20.50000| NA| 17.78261| 16.00000|
|minivan    | 18.00000| NA| 15.60000|       NA|
|pickup     | 16.00000| NA| 14.50000| 11.80000|
|subcompact | 22.85714| 20| 17.00000| 14.80000|
|suv        | 18.00000| NA| 14.50000| 12.13158|

Or the maximum city mileage (cty) by class and cyl:

Code
mpg %>%
  group_by(class, cyl) %>%
  summarise(max_cty=max(cty)) %>%
  spread(cyl, max_cty)%>%
  kable()
|class      |  4|  5|  6|  8|
|:----------|--:|--:|--:|--:|
|2seater    | NA| NA| NA| 16|
|compact    | 33| 21| 18| NA|
|midsize    | 23| NA| 19| 16|
|minivan    | 18| NA| 17| NA|
|pickup     | 17| NA| 16| 14|
|subcompact | 35| 20| 18| 15|
|suv        | 20| NA| 17| 14|
dplyr and tidyr proportions

Proportions can be found by using the mutate() function:

Code
mpg %>%
  group_by(class) %>%
  summarise(n=n()) %>%
  mutate(prop=n/sum(n)) %>%
  kable()
|class      |  n|      prop|
|:----------|--:|---------:|
|2seater    |  5| 0.0213675|
|compact    | 47| 0.2008547|
|midsize    | 41| 0.1752137|
|minivan    | 11| 0.0470085|
|pickup     | 33| 0.1410256|
|subcompact | 35| 0.1495726|
|suv        | 62| 0.2649573|

Creating a contingency table of proportion values by applying the same spread command:

Code
mpg %>%
  group_by(class, cyl) %>%
  summarise(n=n()) %>%
  mutate(prop=n/sum(n)) %>%
  subset(select=c("class", "cyl", "prop")) %>%
  spread(class, prop) %>%
  kable()
|cyl | 2seater|   compact|   midsize|   minivan|    pickup| subcompact|       suv|
|:---|-------:|---------:|---------:|---------:|---------:|----------:|---------:|
|4   |      NA| 0.6808511| 0.3902439| 0.0909091| 0.0909091|  0.6000000| 0.1290323|
|5   |      NA| 0.0425532|        NA|        NA|        NA|  0.0571429|        NA|
|6   |      NA| 0.2765957| 0.5609756| 0.9090909| 0.3030303|  0.2000000| 0.2580645|
|8   |       1|        NA| 0.0487805|        NA| 0.6060606|  0.1428571| 0.6129032|
table()

This function is a quick way to pull together row/column frequencies and proportions for categorical variables. Using the table() command we can get a contingency table of vehicle class by number of cylinders.

table(mpg$class, mpg$cyl)
            
              4  5  6  8
  2seater     0  0  0  5
  compact    32  2 13  0
  midsize    16  0 23  2
  minivan     1  0 10  0
  pickup      3  0 10 20
  subcompact 21  2  7  5
  suv         8  0 16 38
Table, column, and row frequencies

The table frequency can be called by using the ftable() command:

mpg_table <- table(mpg$class, mpg$cyl)
ftable(mpg_table)
             4  5  6  8
                       
2seater      0  0  0  5
compact     32  2 13  0
midsize     16  0 23  2
minivan      1  0 10  0
pickup       3  0 10 20
subcompact  21  2  7  5
suv          8  0 16 38

For row frequencies, use margin.table() command with the 1 argument:

margin.table(mpg_table, 1)

   2seater    compact    midsize    minivan     pickup subcompact        suv 
         5         47         41         11         33         35         62 

For column frequencies, use the margin.table() command with the 2 argument:

margin.table(mpg_table, 2)

 4  5  6  8 
81  4 79 70 
Table, Column and row proportions

For proportions of the entire table, use the prop.table() command:

prop.table(mpg_table)
            
                       4           5           6           8
  2seater    0.000000000 0.000000000 0.000000000 0.021367521
  compact    0.136752137 0.008547009 0.055555556 0.000000000
  midsize    0.068376068 0.000000000 0.098290598 0.008547009
  minivan    0.004273504 0.000000000 0.042735043 0.000000000
  pickup     0.012820513 0.000000000 0.042735043 0.085470085
  subcompact 0.089743590 0.008547009 0.029914530 0.021367521
  suv        0.034188034 0.000000000 0.068376068 0.162393162

For row proportions, we use the prop.table() command, with the argument 1:

prop.table(mpg_table, 1)
            
                      4          5          6          8
  2seater    0.00000000 0.00000000 0.00000000 1.00000000
  compact    0.68085106 0.04255319 0.27659574 0.00000000
  midsize    0.39024390 0.00000000 0.56097561 0.04878049
  minivan    0.09090909 0.00000000 0.90909091 0.00000000
  pickup     0.09090909 0.00000000 0.30303030 0.60606061
  subcompact 0.60000000 0.05714286 0.20000000 0.14285714
  suv        0.12903226 0.00000000 0.25806452 0.61290323

For the column proportions use the prop.table() command with the 2 argument:

prop.table(mpg_table, 2)
            
                      4          5          6          8
  2seater    0.00000000 0.00000000 0.00000000 0.07142857
  compact    0.39506173 0.50000000 0.16455696 0.00000000
  midsize    0.19753086 0.00000000 0.29113924 0.02857143
  minivan    0.01234568 0.00000000 0.12658228 0.00000000
  pickup     0.03703704 0.00000000 0.12658228 0.28571429
  subcompact 0.25925926 0.50000000 0.08860759 0.07142857
  suv        0.09876543 0.00000000 0.20253165 0.54285714

gmodels: CrossTable()

The CrossTable() command from the gmodels package produces frequencies and table, row and column proportions with a single command. The values are not as easily drawn into tables of their own, or further manipulated, as they are with the dplyr/tidyr tables, but this is a handy command:

Running the command for the mpg data:

CrossTable(mpg$class, mpg$cyl)

 
   Cell Contents
|-------------------------|
|                       N |
| Chi-square contribution |
|           N / Row Total |
|           N / Col Total |
|         N / Table Total |
|-------------------------|

 
Total Observations in Table:  234 

 
             | mpg$cyl 
   mpg$class |         4 |         5 |         6 |         8 | Row Total | 
-------------|-----------|-----------|-----------|-----------|-----------|
     2seater |         0 |         0 |         0 |         5 |         5 | 
             |     1.731 |     0.085 |     1.688 |     8.210 |           | 
             |     0.000 |     0.000 |     0.000 |     1.000 |     0.021 | 
             |     0.000 |     0.000 |     0.000 |     0.071 |           | 
             |     0.000 |     0.000 |     0.000 |     0.021 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
     compact |        32 |         2 |        13 |         0 |        47 | 
             |    15.210 |     1.782 |     0.518 |    14.060 |           | 
             |     0.681 |     0.043 |     0.277 |     0.000 |     0.201 | 
             |     0.395 |     0.500 |     0.165 |     0.000 |           | 
             |     0.137 |     0.009 |     0.056 |     0.000 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
     midsize |        16 |         0 |        23 |         2 |        41 | 
             |     0.230 |     0.701 |     6.059 |     8.591 |           | 
             |     0.390 |     0.000 |     0.561 |     0.049 |     0.175 | 
             |     0.198 |     0.000 |     0.291 |     0.029 |           | 
             |     0.068 |     0.000 |     0.098 |     0.009 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
     minivan |         1 |         0 |        10 |         0 |        11 | 
             |     2.070 |     0.188 |    10.641 |     3.291 |           | 
             |     0.091 |     0.000 |     0.909 |     0.000 |     0.047 | 
             |     0.012 |     0.000 |     0.127 |     0.000 |           | 
             |     0.004 |     0.000 |     0.043 |     0.000 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
      pickup |         3 |         0 |        10 |        20 |        33 | 
             |     6.211 |     0.564 |     0.117 |    10.391 |           | 
             |     0.091 |     0.000 |     0.303 |     0.606 |     0.141 | 
             |     0.037 |     0.000 |     0.127 |     0.286 |           | 
             |     0.013 |     0.000 |     0.043 |     0.085 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
  subcompact |        21 |         2 |         7 |         5 |        35 | 
             |     6.515 |     3.284 |     1.963 |     2.858 |           | 
             |     0.600 |     0.057 |     0.200 |     0.143 |     0.150 | 
             |     0.259 |     0.500 |     0.089 |     0.071 |           | 
             |     0.090 |     0.009 |     0.030 |     0.021 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
         suv |         8 |         0 |        16 |        38 |        62 | 
             |     8.444 |     1.060 |     1.162 |    20.403 |           | 
             |     0.129 |     0.000 |     0.258 |     0.613 |     0.265 | 
             |     0.099 |     0.000 |     0.203 |     0.543 |           | 
             |     0.034 |     0.000 |     0.068 |     0.162 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|
Column Total |        81 |         4 |        79 |        70 |       234 | 
             |     0.346 |     0.017 |     0.338 |     0.299 |           | 
-------------|-----------|-----------|-----------|-----------|-----------|

 

Running mpg data in contingency tables:

This next part combines the sections above from week 6, using the mpg dataset to create the diagrams and graphs from the other exercise!

To create a balloon plot of the mpg data (using AI help, as I got confused on the steps!!), a contingency table first needs to be created. For this example I will use cyl and class, for consistency with the previous exercises!

Data:

Code
mpg
# A tibble: 234 × 11
   manufacturer model      displ  year   cyl trans drv     cty   hwy fl    class
   <chr>        <chr>      <dbl> <int> <int> <chr> <chr> <int> <int> <chr> <chr>
 1 audi         a4           1.8  1999     4 auto… f        18    29 p     comp…
 2 audi         a4           1.8  1999     4 manu… f        21    29 p     comp…
 3 audi         a4           2    2008     4 manu… f        20    31 p     comp…
 4 audi         a4           2    2008     4 auto… f        21    30 p     comp…
 5 audi         a4           2.8  1999     6 auto… f        16    26 p     comp…
 6 audi         a4           2.8  1999     6 manu… f        18    26 p     comp…
 7 audi         a4           3.1  2008     6 auto… f        18    27 p     comp…
 8 audi         a4 quattro   1.8  1999     4 manu… 4        18    26 p     comp…
 9 audi         a4 quattro   1.8  1999     4 auto… 4        16    25 p     comp…
10 audi         a4 quattro   2    2008     4 manu… 4        20    28 p     comp…
# ℹ 224 more rows
Balloon Plot
Code
mpgt <- table(mpg$class, mpg$cyl)

balloonplot(t(mpgt), main = "Number of cars by class and cylinder count", xlab = "", ylab = "", label = FALSE, show.margins = FALSE)

Mosaic plot
Code
mosaicplot(mpgt, shade = TRUE, las=2)

Association plot
Code
assoc(head(mpgt, 5), shade = TRUE, las=3)

Chi-sq test
Code
csqmpg <- chisq.test(mpgt)
csqmpg

    Pearson's Chi-squared test

data:  mpgt
X-squared = 138.03, df = 18, p-value < 2.2e-16

After finding the results of the chi-square test, I am following the website and now looking into the residuals, which show how each cell contributes to the test statistic:

round(csqmpg$residuals, 3)
            
                  4      5      6      8
  2seater    -1.316 -0.292 -1.299  2.865
  compact     3.900  1.335 -0.720 -3.750
  midsize     0.480 -0.837  2.462 -2.931
  minivan    -1.439 -0.434  3.262 -1.814
  pickup     -2.492 -0.751 -0.342  3.224
  subcompact  2.553  1.812 -1.401 -1.691
  suv        -2.906 -1.029 -1.078  4.517
corrplot(csqmpg$residuals, is.cor = FALSE)

Contribution of each cell in %

permpg <- 100*csqmpg$residuals^2/csqmpg$statistic

round(permpg, 3)
            
                  4      5      6      8
  2seater     1.254  0.062  1.223  5.948
  compact    11.020  1.291  0.375 10.186
  midsize     0.167  0.508  4.390  6.224
  minivan     1.500  0.136  7.709  2.384
  pickup      4.500  0.409  0.085  7.528
  subcompact  4.720  2.379  1.422  2.070
  suv         6.117  0.768  0.842 14.782

Visualizing this table above

corrplot(permpg, is.cor = FALSE)

Week 7

Pre-Session

Video run through:

Single sample t-test:

summary(gapminder)
        country        continent        year         lifeExp     
 Afghanistan:  12   Africa  :624   Min.   :1952   Min.   :23.60  
 Albania    :  12   Americas:300   1st Qu.:1966   1st Qu.:48.20  
 Algeria    :  12   Asia    :396   Median :1980   Median :60.71  
 Angola     :  12   Europe  :360   Mean   :1980   Mean   :59.47  
 Argentina  :  12   Oceania : 24   3rd Qu.:1993   3rd Qu.:70.85  
 Australia  :  12                  Max.   :2007   Max.   :82.60  
 (Other)    :1632                                                
      pop              gdpPercap       
 Min.   :6.001e+04   Min.   :   241.2  
 1st Qu.:2.794e+06   1st Qu.:  1202.1  
 Median :7.024e+06   Median :  3531.8  
 Mean   :2.960e+07   Mean   :  7215.3  
 3rd Qu.:1.959e+07   3rd Qu.:  9325.5  
 Max.   :1.319e+09   Max.   :113523.1  
                                       

For this first example we are looking at whether the mean life expectancy in Africa differs from 50 years.

H0: the mean life expectancy is 50 years

H1: the mean life expectancy is not 50 years

Observation: sample data provides mean life expectancy of 48.9. Is this statistically significant?

Code
gapminder %>%
  filter(continent == "Africa") %>%
  select(lifeExp) %>%
  t.test(mu = 50)

    One Sample t-test

data:  .
t = -3.0976, df = 623, p-value = 0.002038
alternative hypothesis: true mean is not equal to 50
95 percent confidence interval:
 48.14599 49.58467
sample estimates:
mean of x 
 48.86533 

Here we can reject the null hypothesis and accept the alternative hypothesis, as the p-value is less than 0.05.

Two-sample t-test: next, comparing the mean life expectancy of two continents (Africa and Europe):

Code
gapminder %>%
  filter(continent %in% c("Africa", "Europe")) %>%
  t.test(lifeExp ~ continent, data = ., alternative = "two.sided")

    Welch Two Sample t-test

data:  lifeExp by continent
t = -49.551, df = 981.2, p-value < 2.2e-16
alternative hypothesis: true difference in means between group Africa and group Europe is not equal to 0
95 percent confidence interval:
 -23.95076 -22.12595
sample estimates:
mean in group Africa mean in group Europe 
            48.86533             71.90369 

Looking at whether there is a significant difference in life expectancy between 2 countries (Ireland and Switzerland). H0: no significant difference (a difference of 0). H1: there is a significant difference.

Code
gapminder %>%
  filter(country %in% c("Ireland", "Switzerland")) %>%
  t.test(lifeExp ~ country, data = .,
         alternative = "less", conf.level = 0.95)

    Welch Two Sample t-test

data:  lifeExp by country
t = -1.6337, df = 21.77, p-value = 0.05835
alternative hypothesis: true difference in means between group Ireland and group Switzerland is less than 0
95 percent confidence interval:
      -Inf 0.1313697
sample estimates:
    mean in group Ireland mean in group Switzerland 
                 73.01725                  75.56508 

Paired t-test:

The code provided in the video is now outdated and doesn't work:

Code
gapminder %>%
  filter(year %in% c(1957, 2007) &
           continent == "Africa") %>%
  mutate(year = factor(year, levels = c(2007, 1957))) %>%
  t.test(lifeExp ~ year, data = ., paired = TRUE)
Error in t.test.formula(lifeExp ~ year, data = ., paired = TRUE): cannot use 'paired' in formula method

My attempt:

Code
africa_data <- gapminder %>%
  filter(continent == "Africa" & year %in% c(1957, 2007))

tt_Africa <- t.test(lifeExp ~ year, data = africa_data)

tt_Africa

    Welch Two Sample t-test

data:  lifeExp by year
t = -8.7561, df = 82.126, p-value = 2.175e-13
alternative hypothesis: true difference in means between group 1957 and group 2007 is not equal to 0
95 percent confidence interval:
 -16.61575 -10.46364
sample estimates:
mean in group 1957 mean in group 2007 
          41.26635           54.80604 

Code check in session

Code
gapminder %>%
  filter(continent == "Africa") %>%
  mutate(year2 = as.factor(year)) %>%
  filter(year %in% c("1957", "2007")) %>%
  select(country, year2, lifeExp) %>%
  spread(year2, lifeExp) -> africa

africa
# A tibble: 52 × 3
   country                  `1957` `2007`
   <fct>                     <dbl>  <dbl>
 1 Algeria                    45.7   72.3
 2 Angola                     32.0   42.7
 3 Benin                      40.4   56.7
 4 Botswana                   49.6   50.7
 5 Burkina Faso               34.9   52.3
 6 Burundi                    40.5   49.6
 7 Cameroon                   40.4   50.4
 8 Central African Republic   37.5   44.7
 9 Chad                       39.9   50.7
10 Comoros                    42.5   65.2
# ℹ 42 more rows
t.test(A=africa$'1957', africa$'2007', paired = TRUE, alternative = "less")  
Error in t.test.default(A = africa$"1957", africa$"2007", paired = TRUE, : 'y' is missing for paired test

Also can't get this to work! Never mind!! (Although, looking at the error, I think I can see the fix; see the sketch below.)
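
The error says 'y' is missing, which points at the stray A = in the call (it stops the first vector being matched to x). A version that should run, as a sketch:

Code
# Drop the stray "A =" and use backticks for the numeric column names
t.test(africa$`1957`, africa$`2007`, paired = TRUE, alternative = "less")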

ANOVA video

ANOVA stands for analysis of variance

A t-test is used to compare the means of two groups, for example the life expectancy of people in 2 different continents; however, if you wanted to test a third continent, this cannot be done without an ANOVA test.

If we are testing the life expectancy of Europe, the Americas and Asia, for ANOVA we assume that the mean life expectancy of each continent is the same (H0); our alternative hypothesis is therefore that they have significantly different life expectancies.

We have to set the alpha level, or significance level, before doing the test to avoid p-value hacking, which is bad science. Generally we use an alpha value of 0.05 (5%).

Code
gapdata <- gapminder %>%
  filter(year == 2007 & 
           continent %in% c("Americas", "Europe", "Asia")) %>%
  select(continent, lifeExp)
gapdata
# A tibble: 88 × 2
   continent lifeExp
   <fct>       <dbl>
 1 Asia         43.8
 2 Europe       76.4
 3 Americas     75.3
 4 Europe       79.8
 5 Asia         75.6
 6 Asia         64.1
 7 Europe       79.4
 8 Americas     65.6
 9 Europe       74.9
10 Americas     72.4
# ℹ 78 more rows

Looking at the distribution of data:

Code
gapdata %>%
  group_by(continent) %>%
  summarise(Mean_life = mean(lifeExp)) %>%
  arrange(Mean_life)
# A tibble: 3 × 2
  continent Mean_life
  <fct>         <dbl>
1 Asia           70.7
2 Americas       73.6
3 Europe         77.6

Running ANOVA

Code
gapdata %>%
  aov(lifeExp ~ continent, data = .) %>%
  summary()
            Df Sum Sq Mean Sq F value   Pr(>F)    
continent    2  755.6   377.8   11.63 3.42e-05 ***
Residuals   85 2760.3    32.5                     
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Pr(>F) here is the p-value we are looking for.

As the p-value is less than 0.05, we reject the null hypothesis and accept the alternative: the life expectancies are significantly different from one another.
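
If you want that p-value programmatically, a small sketch (the indexing assumes the standard summary.aov() structure, and fit is just a name for the saved model):

Code
fit <- aov(lifeExp ~ continent, data = gapdata)
summary(fit)[[1]][["Pr(>F)"]][1]   # p-value for the continent term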

Looking to see which continent is significantly different.

Code
gapdata %>%
  aov(lifeExp ~ continent, data = .) %>%
  TukeyHSD()
  Tukey multiple comparisons of means
    95% family-wise confidence level

Fit: aov(formula = lifeExp ~ continent, data = .)

$continent
                     diff        lwr        upr     p adj
Asia-Americas   -2.879635 -6.4839802  0.7247099 0.1432634
Europe-Americas  4.040480  0.3592746  7.7216854 0.0279460
Europe-Asia      6.920115  3.4909215 10.3493088 0.0000189

Here the p adj column is of most interest, as this is the p-value for each pairwise comparison. We can see that Asia-Americas is not statistically different, as its p-value is above our 0.05 significance level, while the other two comparisons are below 0.05, so we can confirm that those pairs are statistically different.

Session Notes

Data Analysis

This session looked at means testing in R

We went though:

  • One sample t-test

  • 2 independent variables

  • testing for normal distribution

  • testing for equal variances

  • Unpaired 2 sample t-test

  • More than 2 samples (ANOVA test)

  • Paired test with 2 samples and more than 2 samples

Research Methods

This part of the session looked at writing abstracts as preparation for the formative 2 task which I have completed notes on below.

Post-Session

This post-session information is taken from this site.

Comparing Means in R

One sample t-test

This test is used to compare the mean of one sample to a known standard mean.

The theoretical mean comes from:

  • A previous experiment

  • from an experiment where you have control and treatment conditions.

Warning

This test can only be used if the data are known to be normally distributed; this can be checked in R using the Shapiro-Wilk test.

Research questions and statistical hypotheses

Typical research questions are:

  1. whether the mean (m) of the sample is equal to the theoretical mean (μ)?

  2. whether the mean (m) of the sample is less than the theoretical mean (μ)?

  3. whether the mean (m) of the sample is greater than the theoretical mean (μ)?

In statistics, we can define the corresponding null hypothesis (H0) as follows:

  1. H0: m = μ

  2. H0: m ≤ μ

  3. H0: m ≥ μ

The corresponding alternative hypotheses (Ha) are as follows:

  1. Ha: m ≠ μ (different)

  2. Ha: m > μ (greater)

  3. Ha: m < μ (less)

Note that:

  • Hypotheses 1) are called two-tailed tests

  • Hypotheses 2) and 3) are called one-tailed tests

Visualizing and running the t-test

This test requires the ggpubr package to be installed and loaded, which has been added to the load-packages code!

Running a simple t-test:

t.test(x, mu = 0, alternative = "two.sided")

  • x: a numeric vector containing your data values

  • mu: the theoretical mean. Default is 0 but you can change it.

  • alternative: the alternative hypothesis. Allowed value is one of “two.sided” (default), “greater” or “less”.

This example for the t-test will use a made up dataset of 10 mice:

set.seed(1234)
mice <- data.frame(
  nme = paste0(rep("M_", 10), 1:10),
  weight = round(rnorm(10, 20, 2), 1)
)
mice
    nme weight
1   M_1   17.6
2   M_2   20.6
3   M_3   22.2
4   M_4   15.3
5   M_5   20.9
6   M_6   21.0
7   M_7   18.9
8   M_8   18.9
9   M_9   18.9
10 M_10   18.2

Checking the data:

this checks the first 10 rows

head(mice, 10)
    nme weight
1   M_1   17.6
2   M_2   20.6
3   M_3   22.2
4   M_4   15.3
5   M_5   20.9
6   M_6   21.0
7   M_7   18.9
8   M_8   18.9
9   M_9   18.9
10 M_10   18.2

Summary of the data:

summary(mice$weight)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
  15.30   18.38   18.90   19.25   20.82   22.20 

Visualising data in a boxplot:

ggboxplot(mice$weight,
          ylab = "Weight (g)", xlab = FALSE,
          ggtheme = theme_minimal())

Preliminary test to check one-sample t-test assumptions
  1. Is this a large sample? No, n < 30.
  2. Since the sample size is not large enough, we need to check whether the distribution is normal.

Running a Shapiro-Wilk test

shapiro.test(mice$weight)

    Shapiro-Wilk normality test

data:  mice$weight
W = 0.9526, p-value = 0.6993

As the p-value from the test output is greater than the significance level of 0.05, this implies that the distribution of the data is not significantly different from the normal distribution. We can then assume normality.

Visual inspection of the data's normality using a q-q plot (quantile-quantile plot). A q-q plot draws the correlation between a given sample and the normal distribution.

ggqqplot(mice$weight, ylab = "Weight", ggtheme = theme_minimal())

Tip

If the data is not normally distributed, it is recommended to use the non parametric one-sample Wilcoxon rank test

Running test code:

We want to know if the average weight of the mice differs from 25 g (two-tailed test):

mttest <- t.test(mice$weight, mu = 25)

mttest

    One Sample t-test

data:  mice$weight
t = -9.0783, df = 9, p-value = 7.953e-06
alternative hypothesis: true mean is not equal to 25
95 percent confidence interval:
 17.8172 20.6828
sample estimates:
mean of x 
    19.25 

In the result above :

  • t is the t-test statistic value (t = -9.078),

  • df is the degrees of freedom (df= 9),

  • p-value is the significance level of the t-test (p-value = 7.953 × 10^-6).

  • conf.int is the confidence interval of the mean at 95% (conf.int = [17.8172, 20.6828]);

  • sample estimates is the mean value of the sample (mean = 19.25).

This code can also be used with alternative = "less" or "greater" after mu = 25, if you want to test whether the mean weight is less than 25 g or more than 25 g.

As the p-value of the test is less than 0.05 we can conclude that the weight of mice is significantly different than 25g.

The following function can be used with the test results to extract information.

  • statistic: the value of the t test statistics

  • parameter: the degrees of freedom for the t test statistics

  • p.value: the p-value for the test

  • conf.int: a confidence interval for the mean appropriate to the specified alternative hypothesis.

  • estimate: the means of the two groups being compared (in the case of independent t test) or difference in means (in the case of paired t test).

table_name$p.value (e.g. mttest$p.value for the test above)

One-Sample Wilcoxon Signed Ranked Test

The one-sample Wilcoxon signed rank test is a non-parametric alternative to the one-sample t-test when the data cannot be assumed to be normally distributed. It's used to determine whether the median of the sample is equal to a known standard value (e.g. a theoretical value).

Research questions and statistical hypotheses

Typical research questions are:

  1. whether the median (m) of the sample is equal to the theoretical value (m0)?

  2. whether the median (m) of the sample is less than the theoretical value (m0)?

  3. whether the median (m) of the sample is greater than the theoretical value (m0)?

In statistics, we can define the corresponding null hypothesis (H0) as follows:

  1. H0: m = m0

  2. H0: m ≤ m0

  3. H0: m ≥ m0

The corresponding alternative hypotheses (Ha) are as follows:

  1. Ha: m ≠ m0 (different)

  2. Ha: m > m0 (greater)

  3. Ha: m < m0 (less)

Note that:

  • Hypotheses 1) are called two-tailed tests

  • Hypotheses 2) and 3) are called one-tailed tests

Running code for the Wilcoxon test:

wilcox.test(x, mu = 0, alternative = "two.sided")

  • x: a numeric vector containing your data values

  • mu: the theoretical mean/median value. Default is 0 but you can change it.

  • alternative: the alternative hypothesis. Allowed value is one of “two.sided” (default), “greater” or “less”.

Running the test

Using the 10 mice data used above

head(mice, 10)
    nme weight
1   M_1   17.6
2   M_2   20.6
3   M_3   22.2
4   M_4   15.3
5   M_5   20.9
6   M_6   21.0
7   M_7   18.9
8   M_8   18.9
9   M_9   18.9
10 M_10   18.2

We want to know if the median weight of the mice differs from 25 g (two-tailed test):

miceWT <- wilcox.test(mice$weight, mu = 25)
Warning in wilcox.test.default(mice$weight, mu = 25): cannot compute exact
p-value with ties
miceWT

    Wilcoxon signed rank test with continuity correction

data:  mice$weight
V = 0, p-value = 0.005793
alternative hypothesis: true location is not equal to 25

As the p-value of the test is less than the significance level of 0.05, we can reject the null hypothesis and conclude that the median weight of the mice is significantly different from 25 g.

Again, to test whether the median weight of the mice is less than 25 g use alternative = "less" after mu = 25, or to test whether it is greater than 25 g use alternative = "greater", as sketched below.
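
As sketches (outputs omitted):

Code
# One-tailed versions of the same test
wilcox.test(mice$weight, mu = 25, alternative = "less")
wilcox.test(mice$weight, mu = 25, alternative = "greater")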

Comparing the means of two independent groups

Unpaired two samples t-test (parametric)

The unpaired two-sample t-test is used to compare the mean of two independent groups.

E.g. suppose the weights of 100 individuals were measured: 50 women (Group A) and 50 men (Group B). We want to know if the mean weight of women (mA) is significantly different from that of men (mB).

In this case we have 2 unrelated (independent or unpaired) groups of samples. Therefore, it’s possible to use an independent t-test to evaluate whether the means are different.

Note

Unpaired 2-samples t-test can only be used when:

  • two groups of samples being compared are normally distributed. This can be checked using the Shapiro-Wilk test.

  • the variances of the two groups are equal. This can be checked using the F-test.

Research questions and statistical hypotheses

Typical research questions are:

  1. whether the mean of group A (mA) is equal to the mean of group B (mB)?

  2. whether the mean of group A (mA) is less than the mean of group B (mB)?

  3. whether the mean of group A (mA) is greater than the mean of group B (mB)?

In statistics, we can define the corresponding null hypothesis (H0) as follows:

  1. H0: mA = mB

  2. H0: mA ≤ mB

  3. H0: mA ≥ mB

The corresponding alternative hypotheses (Ha) are as follows:

  1. Ha: mA ≠ mB (different)

  2. Ha: mA > mB (greater)

  3. Ha: mA < mB (less)

Note that:

  • Hypotheses 1) are called two-tailed tests

  • Hypotheses 2) and 3) are called one-tailed tests

Function code for running 2-samples t-test

t.test(x, y, alternative = "two.sided", var.equal = FALSE)

  • x,y: numeric vectors

  • alternative: the alternative hypothesis. Allowed value is one of “two.sided” (default), “greater” or “less”.

  • var.equal: a logical variable indicating whether to treat the two variances as being equal. If TRUE then the pooled variance is used to estimate the variance otherwise the Welch test is used.

Running the code:

Example: made-up data for 18 people, 9 women and 9 men:

women_wt <- c(38.9, 61.2, 73.3, 21.8, 63.4, 64.6, 48.4, 48.8, 48.5)
men_wt <- c(67.8, 60, 63.4, 76, 89.4, 73.3, 67.3, 61.3, 62.4)

Weight <- data.frame(
  group = rep(c("Women", "Man"), each = 9), 
  weight = c(women_wt, men_wt)
)
print(Weight)
   group weight
1  Women   38.9
2  Women   61.2
3  Women   73.3
4  Women   21.8
5  Women   63.4
6  Women   64.6
7  Women   48.4
8  Women   48.8
9  Women   48.5
10   Man   67.8
11   Man   60.0
12   Man   63.4
13   Man   76.0
14   Man   89.4
15   Man   73.3
16   Man   67.3
17   Man   61.3
18   Man   62.4
group_by(Weight, group) %>%
  summarise(
    count = n(),
    mean = mean(weight, na.rm = TRUE),
    sd = sd(weight, na.rm = TRUE)
  )
# A tibble: 2 × 4
  group count  mean    sd
  <chr> <int> <dbl> <dbl>
1 Man       9  69.0  9.38
2 Women     9  52.1 15.6 
Visualising and running code:
ggboxplot(Weight, x = "group", y = "weight",
          color = "group", palette = c("#00AFBB", "#E7B800"),
          ylab = "Weight", xlab = "Groups")

Preliminary test to check independent t-test assumptions
  1. Assumption 1: Are the 2 samples independent? Yes, since the samples from men and women are not related.
  2. Assumption 2: Is the data from each of the 2 groups normally distributed? Use the Shapiro-Wilk normality test to find out.
with(Weight, shapiro.test(weight[group == "Man"]))

    Shapiro-Wilk normality test

data:  weight[group == "Man"]
W = 0.86425, p-value = 0.1066
# The same test would be run for the Women group; I am getting a code error there, but the p-value would otherwise be 0.6

From this output we can see that the two p-values are greater than the significance level of 0.05, implying that the distributions of the data are not significantly different from the normal distribution. Basically, the data are normal!

Warning

If the data is not normally distributed it is recommended to use the non-parametric 2-samples Wilcoxon rank test!

  3. Assumption 3: Do the 2 populations have the same variances?

The F-test is used to test for homogeneity of variances, using the var.test() function:

res.ftest <- var.test(weight ~ group, data = Weight)
res.ftest

    F test to compare two variances

data:  weight by group
F = 0.36134, num df = 8, denom df = 8, p-value = 0.1714
alternative hypothesis: true ratio of variances is not equal to 1
95 percent confidence interval:
 0.08150656 1.60191315
sample estimates:
ratio of variances 
         0.3613398 

The p-value of the F-test is 0.1714, which is greater than the significance level of 0.05, so there is no significant difference between the variances of the 2 data sets. Therefore, we can use the classic t-test, which assumes equality of the 2 variances.

Running the two-sample t-test
  1. Method 1, the data are saved in 2 different numeric vectors:
test1 <- t.test(women_wt, men_wt, var.equal = TRUE)

test1

    Two Sample t-test

data:  women_wt and men_wt
t = -2.7842, df = 16, p-value = 0.01327
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -29.748019  -4.029759
sample estimates:
mean of x mean of y 
 52.10000  68.98889 
  2. Method 2, the data are saved in a data frame:
test2 <- t.test(weight ~ group, data = Weight, var.equal = TRUE)

test2

    Two Sample t-test

data:  weight by group
t = 2.7842, df = 16, p-value = 0.01327
alternative hypothesis: true difference in means between group Man and group Women is not equal to 0
95 percent confidence interval:
  4.029759 29.748019
sample estimates:
  mean in group Man mean in group Women 
           68.98889            52.10000 

In the result above :

  • t is the t-test statistic value (t = 2.784),

  • df is the degrees of freedom (df= 16),

  • p-value is the significance level of the t-test (p-value = 0.01327).

  • conf.int is the confidence interval of the mean at 95% (conf.int = [4.0298, 29.748]);

  • sample estimates are the mean values of the two groups (68.99 and 52.10).

As always, you can use alternative = "less" or alternative = "greater" if you want to check whether men's average weight is less than, or greater than, women's average weight.

As the p-value of the test, 0.01327, is less than the significance level of 0.05, we can conclude that men's average weight is significantly different from women's average weight.

Again, the following components can be extracted from the saved test object (see the sketch after this list):

  • statistic: the value of the t test statistics

  • parameter: the degrees of freedom for the t test statistics

  • p.value: the p-value for the test

  • conf.int: a confidence interval for the mean appropriate to the specified alternative hypothesis.

  • estimate: the means of the two groups being compared (in the case of independent t test) or difference in means (in the case of paired t test).
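As a minimal sketch, these components can be pulled from the saved object (test2 from Method 2 above):

test2$statistic   # t value
test2$parameter   # degrees of freedom
test2$p.value     # p-value
test2$conf.int    # 95% confidence interval
test2$estimate    # group means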

Unpaired Two-Samples Wilcoxon Test (non-parametric)

The unpaired 2-samples Wilcoxon test (also known as the Wilcoxon rank-sum test or Mann-Whitney test) is a non-parametric alternative to the unpaired two-samples t-test. It compares two independent groups of samples and is used when the data are not normally distributed.

R code for Wilcoxon Test:

wilcox.test(x, y, alternative = "two.sided")

x, y = numeric vectors

alternative: the alternative hypothesis. Allowed value is one of "two.sided" (default), "greater" or "less".

Wilcoxon test:

This example will use the weight data used in the previous tests again!

We want to know whether the median women's weight differs from the median men's weight.

Computing summary statistics by groups:

Code
group_by(Weight, group) %>%
  summarise(
    count = n(),
    median = median(weight, na.rm = TRUE),
    IQR = IQR(weight, na.rm = TRUE)
  )
# A tibble: 2 × 4
  group count median   IQR
  <chr> <int>  <dbl> <dbl>
1 Man       9   67.3  10.9
2 Women     9   48.8  15  

Question: Is there any significant difference between women's and men's weights?

Compute the 2-samples Wilcoxon test:

Method 1: The data are saved in two different numeric vectors. (Note: the code below actually uses the data-frame formula, the same as Method 2; the vector form would be wilcox.test(women_wt, men_wt, exact = FALSE).)

Code
wilcox1 <- wilcox.test(weight ~ group, data = Weight, exact = FALSE)

wilcox1

    Wilcoxon rank sum test with continuity correction

data:  weight by group
W = 66, p-value = 0.02712
alternative hypothesis: true location shift is not equal to 0

Method 2: The data are saved in a data frame.

Code
wilcox2 <- wilcox.test(weight ~ group, data = Weight, exact = FALSE)

wilcox2

    Wilcoxon rank sum test with continuity correction

data:  weight by group
W = 66, p-value = 0.02712
alternative hypothesis: true location shift is not equal to 0

Again, as the p-value is less than the significance level of 0.05, we can conclude that men's median weight is significantly different from women's median weight.

Again, alternative = "greater" or alternative = "less" can be used for one-sided tests.

Paired Samples T-test

The paired samples t-test is used to compare the means between two related groups of samples. In this case, you have 2 values for the same samples.

E.g.

20 mice received a treatment X for 3 months. We want to know whether the treatment has an impact on the weight of the mice. Each mouse was weighed before and after the treatment, giving us 20 values before treatment and 20 values after treatment from measuring the weight of each mouse twice.

In such situations, paired t-test can be used to compare the mean weights before and after treatment.

The test is performed as follows:

  1. Calculate the difference (d) between each pair of values.
  2. Compute the mean (m) and the standard deviation (s) of d.
  3. Compare the average difference to 0. If there is any significant difference between the 2 pairs of samples, then the mean of d (m) is expected to be far from 0.
Warning

The Paired T-test can only be used when the difference d is normally distributed. This can be tested using the Shapiro-Wilk Test.

Typical research questions
  1. Whether the mean difference is equal to 0
  2. whether the mean difference is less than 0
  3. whether the mean difference is greater than 0

In statistics, we can define the corresponding null hypotheses (H0) as follows:

  1. H0: m = 0

  2. H0: m ≤ 0

  3. H0: m ≥ 0

The corresponding alternative hypotheses (Ha) are as follows:

  1. Ha: m ≠ 0 (different)

  2. Ha: m > 0 (greater)

  3. Ha: m < 0 (less)

Test code in R

t.test(x, y, paired = TRUE, alternative = "two.sided")

x, y: are numeric vectors

paired: a logical value specifying that we want to compute a paired t-test.

alternative: the alternative hypothesis. Allowed value is one of “two.sided” (default), “greater” or “less”.

Performing an example

Using the mice scenario mentioned earlier the data set is as follows:

Code
before <- c(200.1, 190.9, 192.7, 213, 241.4, 196.9, 172.2, 185.5, 205.2, 193.7)
after <- c(392.9, 393.2, 345.1, 393, 434, 427.9, 422, 383.9, 392.3, 352.2)

MiceX <- data.frame(
  group = rep(c("before", "after"), each = 10),
  weight = c(before, after)
)
MiceX
    group weight
1  before  200.1
2  before  190.9
3  before  192.7
4  before  213.0
5  before  241.4
6  before  196.9
7  before  172.2
8  before  185.5
9  before  205.2
10 before  193.7
11  after  392.9
12  after  393.2
13  after  345.1
14  after  393.0
15  after  434.0
16  after  427.9
17  after  422.0
18  after  383.9
19  after  392.3
20  after  352.2

We want to know if there is any significant difference in the mean weights before and after treatment.

Calculating the summary of the data:

Code
group_by(MiceX, group) %>%
  summarise(
    count = n(),
    mean = mean(weight, na.rm = TRUE),
    sd = sd(weight, na.rm = TRUE)
  )
# A tibble: 2 × 4
  group  count  mean    sd
  <chr>  <int> <dbl> <dbl>
1 after     10  394.  29.4
2 before    10  199.  18.5

Visualising the data:

Code
MiceX$group <- factor(MiceX$group, levels = c("before", "after"))
MiceX %>%
  ggplot(aes(x = group, y = weight, color = group)) +
  geom_boxplot() +
  labs(y = "Weight", x = "Groups")

Paired data plot:

Code
library(PairedData)  # paired() comes from the PairedData package
pdMiceX <- paired(before, after)
plot(pdMiceX, type = "profile") + theme_bw()

Preliminary test to check paired t-test assumptions
  1. Assumption 1: Are the two samples paired?

Yes, since the data were collected by measuring the weight of the same mice twice.

  2. Assumption 2: Is this a large sample?

No, because n < 30. Since the sample size is small, we need to check whether the differences of the pairs follow a normal distribution.

Checking for normality:

We need to perform the Shapiro-Wilk test:

  • Null hypothesis: the data are normally distributed

  • Alternative hypothesis: the data are not normally distributed

Norm <- with(MiceX,
             weight[group =="before"] - weight[group == "after"])
shapiro.test(Norm)

    Shapiro-Wilk normality test

data:  Norm
W = 0.94536, p-value = 0.6141

As the p-value from the Shapiro-Wilk test is greater than 0.05, we fail to reject the null hypothesis: the distribution of the differences (d) is not significantly different from the normal distribution, so we can assume normality.

Running the code in R
  1. Method 1: The data are saved in 2 different numeric vectors
Code
MX1 <- t.test(after, before, paired = TRUE)

MX1

    Paired t-test

data:  after and before
t = 20.883, df = 9, p-value = 6.2e-09
alternative hypothesis: true mean difference is not equal to 0
95 percent confidence interval:
 173.4219 215.5581
sample estimates:
mean difference 
         194.49 
  2. Method 2: The data are saved in a data frame.
Code
MX2 <- t.test(weight ~ group, data = MiceX, paired = TRUE)
Error in t.test.formula(weight ~ group, data = MiceX, paired = TRUE): cannot use 'paired' in formula method
Code
MX2
Error: object 'MX2' not found

Couldn't get Method 2 to work!
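A hedged workaround, assuming the error comes from recent R versions (4.4+) removing paired = TRUE from the formula interface: use the Pair() syntax on wide-format data (or simply the two vectors, as in Method 1):

# Reshape to wide format and use the Pair() syntax instead
MiceWide <- data.frame(before = before, after = after)
t.test(Pair(after, before) ~ 1, data = MiceWide)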

Both methods should produce the same results.

In the result above :

  • t is the t-test statistic value (t = 20.88),

  • df is the degrees of freedom (df= 9),

  • p-value is the significance level of the t-test (p-value = 6.2 x 10^-9).

  • conf.int is the confidence interval (conf.int) of the mean differences at 95% is also shown (conf.int= [173.42, 215.56])

  • sample estimates is the mean differences between pairs (mean = 194.49).

Again the code can be used with alternative = “less” and alternative = “greater” to test for average weight before treatment is less than the average weight after treatment or the average weight before treatment is greater than the average weight after.

The p-value of the test is 6.2 x 10^-9, which is less than the significance level 0.05. We can therefore reject the null hypothesis and conclude that the average weight of the mice before treatment is significantly different from the average weight after treatment.
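As a quick cross-check of the three manual steps listed at the start of this section, using the vectors defined above:

d <- after - before               # step 1: difference for each pair
m <- mean(d); s <- sd(d)          # step 2: mean and sd of d
m / (s / sqrt(length(d)))         # t statistic, ~20.88 as in the output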

Again the following can be used to interpret the results:

  • statistic: the value of the t test statistics

  • parameter: the degrees of freedom for the t test statistics

  • p.value: the p-value for the test

  • conf.int: a confidence interval for the mean appropriate to the specified alternative hypothesis.

  • estimate: the means of the two groups being compared (in the case of independent t test) or difference in means (in the case of paired t test).

One-way ANOVA test in R

The one-way analysis of variance (ANOVA), also known as one-factor ANOVA, is used to compare the means in a situation where there are more than 2 groups. In one-way ANOVA, the data are organised into several groups based on a single grouping variable.

Assumptions of ANOVA

This test can only be run when:

  • The observations are obtained independently and randomly from the population defined by the factor levels

  • The data of each factor level are normally distributed.

  • These normal populations have a common variance, tested by Levene’s test.

How ANOVA works

Assume that we have 3 groups (A, B, C) to compare:

  1. Compute the common variance, which is called variance within samples (S²within) or residual variance.
  2. Compute the variance between sample means as follows:
  • Compute the mean of each group

  • Compute the variance between sample means (S²between)

  3. Produce the F-statistic as the ratio S²between / S²within

Note that a lower ratio (ratio < 1) indicates that there is no significant difference between the means of the samples being compared, whereas a higher ratio implies that the variation among group means is significant.
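As a worked check against the PlantGrowth output further below: F = S²between / S²within = 1.8832 / 0.3886 ≈ 4.85, matching the F value in the summary table.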

Here we use the plant growth data (PlantGrowth), one of R's standard built-in data sets.

# checking 15 random samples
set.seed(1234)
dplyr::sample_n(PlantGrowth, 15)
   weight group
1    6.15  trt2
2    3.83  trt1
3    5.29  trt2
4    5.12  trt2
5    4.50  ctrl
6    4.17  trt1
7    5.87  trt1
8    5.33  ctrl
9    5.26  trt2
10   4.61  ctrl
11   5.80  trt2
12   6.11  ctrl
13   5.58  ctrl
14   5.17  ctrl
15   6.31  trt2

In terms of R, the column "group" is called a factor and the different categories (ctrl, trt1, trt2) are named factor levels; the levels are ordered alphabetically.

levels(PlantGrowth$group)
[1] "ctrl" "trt1" "trt2"

If the levels are not in the correct order the following code can be used:

Plants <- PlantGrowth

Plants$group <- ordered(Plants$group, levels = c("ctrl", "trt1", "trt2"))

Computing the summary statistics:

Code
group_by(Plants, group) %>%
  summarise(
    count = n(),
    mean = mean(weight, na.rm = TRUE),
    sd = sd(weight, na.rm = TRUE)
  )
# A tibble: 3 × 4
  group count  mean    sd
  <ord> <int> <dbl> <dbl>
1 ctrl     10  5.03 0.583
2 trt1     10  4.66 0.794
3 trt2     10  5.53 0.443

Visualising the Data:

Box plot:

Code
ggboxplot(Plants, x = "group", y = "weight",
          color = "group", palette = c("#00AFBB", "#E7B800", "#FC4E07"),
          order = c("ctrl", "trt1", "trt2"),
          ylab = "Weight", xlab = "Treatment")

Mean plot:

Code
ggline(Plants, x = "group", y = "weight",
       add = c("mean_se", "jitter"),
               order = c("ctrl", "trt1", "trt2"),
       ylab = "Weight", xlab = "Treatment")

Running 1-way ANOVA test

We want to know if there is any significant difference between the average weights of plants in the 3 experimental conditions.

The R function aov() can be used. The function summary.aov() is used to summarise the analysis of variance model:

plant.aov <- aov(weight ~ group, data = Plants)

summary(plant.aov)
            Df Sum Sq Mean Sq F value Pr(>F)  
group        2  3.766  1.8832   4.846 0.0159 *
Residuals   27 10.492  0.3886                 
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The output includes the columns F value and Pr(>F) corresponding to the p-value of the test.

Interpreting the results

As the p-value is less than the significance level 0.05, we can conclude that there are significant differences between the groups highlighted with “*” in the model summary.

Multiple pairwise-comparison between the means of groups

In the test, a significant p-value indicates that some of the group means are different, but which ones?

Tukey Multiple pairwise-comparisons

As the ANOVA test is significant, Tukey HSD (Tukey Honest Significant Differences) can be used to compare the groups:

TukeyHSD(plant.aov)
  Tukey multiple comparisons of means
    95% family-wise confidence level

Fit: aov(formula = weight ~ group, data = Plants)

$group
            diff        lwr       upr     p adj
trt1-ctrl -0.371 -1.0622161 0.3202161 0.3908711
trt2-ctrl  0.494 -0.1972161 1.1852161 0.1979960
trt2-trt1  0.865  0.1737839 1.5562161 0.0120064
  • Diff: difference between means of the two groups

  • lwr, upr: the lower and upper end point of the confidence interval at 95%

  • p adj: p-value after adjustment for the multiple comparisons.

It can be seen from the output that the only difference between trt2 and trt1 is significant with a p-value of 0.012.

There are a couple of other methods that can be used for these comparisons, but this seems the simplest!

The ANOVA test assumes that the data are normally distributed and that the variance across groups is homogeneous; both assumptions can be checked!

Checking the homogeneity of variance assumption

The residuals versus fits plot can be used to check the homogeneity of variances.

The plot below shows no evident relationships between residuals and fitted values (the mean of each group), which is good, so we can assume the homogeneity of variances.

plot(plant.aov, 1)

Here points 17, 15 and 4 are detected as outliers, which can affect normality and homogeneity of variance. It can be useful to remove outliers to meet the test assumptions.

Bartlett’s test or Levene’s test can be used to check homogeneity of variances.

Levene’s test

Found in the car package, it can be run using:

library(car)  # for leveneTest()
leveneTest(weight ~ group, data = Plants)
Levene's Test for Homogeneity of Variance (center = median)
      Df F value Pr(>F)
group  2  1.1192 0.3412
      27               

The test here shows that the p-value is not less than the significance level of 0.05. This means that there is no evidence to suggest that the variance across groups is statistically significantly different. Therefore, we can assume the homogeneity of variances in the different treatment groups.

Relaxing the homogeneity of variance assumption

ANOVA requires an assumption of equal variances for all groups. In our example homogeneity of variance assumption turned out to be fine.

How do we save our ANOVA test in a situation where the homogeneity of variance assumption is violated?

This can be done with the Welch one-way test, which does not require that assumption.

  • ANOVA test with no assumption of equal variance
oneway.test(weight ~ group, data = Plants)

    One-way analysis of means (not assuming equal variances)

data:  weight and group
F = 5.181, num df = 2.000, denom df = 17.128, p-value = 0.01739
  • Pairwise t-test with no assumption of equal variance
pairwise.t.test(Plants$weight, Plants$group,
                p.adjust.method = "BH", pool.sd = FALSE)

    Pairwise comparisons using t tests with non-pooled SD 

data:  Plants$weight and Plants$group 

     ctrl  trt1 
trt1 0.250 -    
trt2 0.072 0.028

P value adjustment method: BH 
Checking Normality assumption

In the normality plot of residuals, shown below, the quantiles of the residuals are plotted against the quantiles of the normal distribution, with a 45-degree reference line.

The normal probability plot of residuals is used to check the assumption that the residuals are normally distributed.

plot(plant.aov, 2)

As all the points fall around/on the reference line, we assume normality.

The conclusion above is supported by the Shapiro-Wilk test on the ANOVA residuals, which finds no indication that normality is violated:

aov_residuals <- residuals(object = plant.aov)

shapiro.test(x = aov_residuals)

    Shapiro-Wilk normality test

data:  aov_residuals
W = 0.96607, p-value = 0.4379
Kruskal-Wallis Rank Sum Test

This can be used as a non-parametric alternative to one way ANOVA and can be used when the ANOVA assumptions are not met.

kruskal.test(weight ~ group, data = Plants)

    Kruskal-Wallis rank sum test

data:  weight by group
Kruskal-Wallis chi-squared = 7.9882, df = 2, p-value = 0.01842

Two-way ANOVA Test

Two-way ANOVA is used to evaluate simultaneously the effect of two grouping variables (A and B) on a response variable.

Grouping variables are known as factors; the different categories (groups) of a factor are called levels. The number of levels can vary between factors. The level combinations of factors are called cells.

2 way test Hypotheses
  1. There is no difference in the means of factor A
  2. There is no difference in means of factor B
  3. There is no interaction between factors A and B

The alternative hypothesis for 1 and 2 is the means are not equal.

The alternative hypothesis for 3 is there is an interaction between A and B.

2 way ANOVA Assumptions

Like all ANOVA tests, it assumes that the observations within each cell are normally distributed and have equal variances.

2 way ANOVA balanced designs

Balanced designs correspond to the situation where we have equal sample sizes within the levels of our independent grouping variables.

Example

This example will use the R data set ToothGrowth.

Tooth <- ToothGrowth

Tooth$dose <- factor(Tooth$dose, 
                     levels = c(0.5, 1, 2),
                     labels = c("D0.5", "D1", "D2"))
head(Tooth)
   len supp dose
1  4.2   VC D0.5
2 11.5   VC D0.5
3  7.3   VC D0.5
4  5.8   VC D0.5
5  6.4   VC D0.5
6 10.0   VC D0.5

Question: We want to know if tooth length depends on supp and dose.

Creating frequency table:

table(Tooth$supp, Tooth$dose)
    
     D0.5 D1 D2
  OJ   10 10 10
  VC   10 10 10

The table shows us that there are 10 subjects in each cell, so here we have a balanced design.

Visualising data

Box plot:

Code
ggboxplot(Tooth, x = "dose", y = "len", color = "supp",
          palette = c("#00AFBB", "#E7B800"))

Line plot with multiple groups:

Code
ggline(Tooth, x = "dose", y = "len", color = "supp",
       add = c("mean_se", "dotplot"),
       palette = c("#00AFBB", "#E7B800"))

Running the 2 way ANOVA test

We want to know if tooth length depends on supp and dose

The function aov() can be used to answer this question. The function summary.aov() is used to summarise the analysis of variance model.

Tooth.aov <- aov(len ~ supp + dose, data = Tooth)
summary (Tooth.aov)
            Df Sum Sq Mean Sq F value   Pr(>F)    
supp         1  205.4   205.4   14.02 0.000429 ***
dose         2 2426.4  1213.2   82.81  < 2e-16 ***
Residuals   56  820.4    14.7                     
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

From the output of the test we can conclude that both supp and dose are statistically significant. Dose is the most significant factor variable, due to its lower p-value. These results suggest that changing the delivery method (supp) or the dose of vitamin C will significantly impact the mean tooth length.

Note

Note, the method above is called an additive model. It assumes that the 2 factor variables are independent. If the 2 variables might interact to create a synergistic effect, replace the + with * (see the sketch below).
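A minimal sketch of the multiplicative model (this is where the supp:dose interaction p-value quoted below comes from; the same model appears again in the unbalanced-designs section):

# Interaction (multiplicative) model: supp and dose allowed to interact
Tooth.aov2 <- aov(len ~ supp * dose, data = Tooth)
summary(Tooth.aov2)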

Interpreting the results:

From the test we can conclude the following (based on the p-values and significance level of 0.05):

  • The p-value of supp is 0.000429 (significant), which indicates that the levels of supp are associated with significant different tooth length.

  • The p-value of dose is < 2 x10 -16 (significant), which indicates that the levels of dose are associated with significant different tooth length.

  • The p-value for the supp:dose interaction (from the multiplicative model) is 0.02 (significant), which indicates that the relationship between dose and tooth length depends on the supp method.

Summary statistics

Mean and sd by groups:

group_by(Tooth, supp, dose) %>%
  summarise(
    count = n(),
    mean = mean(len, na.rm = TRUE),
    sd = sd(len, na.rm = TRUE)
  )
`summarise()` has grouped output by 'supp'. You can override using the
`.groups` argument.
# A tibble: 6 × 5
# Groups:   supp [2]
  supp  dose  count  mean    sd
  <fct> <fct> <int> <dbl> <dbl>
1 OJ    D0.5     10 13.2   4.46
2 OJ    D1       10 22.7   3.91
3 OJ    D2       10 26.1   2.66
4 VC    D0.5     10  7.98  2.75
5 VC    D1       10 16.8   2.52
6 VC    D2       10 26.1   4.80
Tukey HSD test

Testing to see which group differences are significant. In this example we don't need to test the "supp" variable, as it has only 2 levels, which have already been shown to be significantly different by ANOVA.

TukeyHSD(Tooth.aov, which = "dose")
  Tukey multiple comparisons of means
    95% family-wise confidence level

Fit: aov(formula = len ~ supp + dose, data = Tooth)

$dose
          diff       lwr       upr p adj
D1-D0.5  9.130  6.215909 12.044091 0e+00
D2-D0.5 15.495 12.580909 18.409091 0e+00
D2-D1    6.365  3.450909  9.279091 7e-06
Checking the ANOVA assumptions
Homogeneity of variance

The residuals versus fits plot is used to check homogeneity of variances. There is no evident relationship between the residuals and the fitted values (the mean of each group), which is good, so we can assume homogeneity of variances.

plot(Tooth.aov, 1)

Levene’s test

Checking the homogeneity of variances:

leveneTest(len ~ supp*dose, data = Tooth)
Levene's Test for Homogeneity of Variance (center = median)
      Df F value Pr(>F)
group  5  1.7086 0.1484
      54               

The p-value here (0.1484) is not less than the significance level of 0.05, meaning that there is no evidence to suggest that the variance across the groups is statistically significantly different. Therefore, we can assume homogeneity of variances in the different treatment groups.

Normality assumption

In the normality plot of residuals, the quantiles of the residuals are plotted against the quantiles of the normal distribution, with a 45-degree reference line. The residuals should approximately follow the straight line, and here they do!

plot(Tooth.aov, 2)

As they roughly follow a straight line, we can assume normality. This conclusion is supported by the Shapiro-Wilk test on the ANOVA residuals, which finds no indication that normality is violated:

tooth_residuals <- residuals(object = Tooth.aov)

shapiro.test(x = tooth_residuals)

    Shapiro-Wilk normality test

data:  tooth_residuals
W = 0.96168, p-value = 0.05687

Two way ANOVA for unbalanced designs

An unbalanced design has unequal numbers of subjects in each group.

There are 3 fundamentally different ways to run an ANOVA on unbalanced data. We shall use only type-III sums of squares, via Anova() from the car package:

Tooth_ANOVA <- aov(len ~ supp * dose, data = Tooth)

Anova(Tooth_ANOVA, type = "III")
Anova Table (Type III tests)

Response: len
             Sum Sq Df F value    Pr(>F)    
(Intercept) 1750.33  1 132.730 3.603e-16 ***
supp         137.81  1  10.450  0.002092 ** 
dose         885.26  2  33.565 3.363e-10 ***
supp:dose    108.32  2   4.107  0.021860 *  
Residuals    712.11 54                      
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

MANOVA Test

A MANOVA can be used in a situation where there are multiple response variables that can be tested simultaneously.

MANOVA stands for multivariate analysis of variance.

MANOVA Assumptions

MANOVA can be used under certain conditions:

  • The dependent variables should be normally distributed within groups.

  • Homogeneity of variances across the range of predictors

  • Linearity between all pairs of dependent variables, all pairs of covariates, and all dependent variable-covariate pairs in each cell.

Example

This example will use the R data Iris

Iris <- iris

We want to know if there is any significant difference, in sepal and petal length, between the different species.

sepl <- Iris$Sepal.Length
petl <- Iris$Petal.Length

iris.man <- manova(cbind(Sepal.Length, Petal.Length) ~ Species, data = iris)
summary(iris.man)
           Df Pillai approx F num Df den Df    Pr(>F)    
Species     2 0.9885   71.829      4    294 < 2.2e-16 ***
Residuals 147                                            
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
summary.aov(iris.man)
 Response Sepal.Length :
             Df Sum Sq Mean Sq F value    Pr(>F)    
Species       2 63.212  31.606  119.26 < 2.2e-16 ***
Residuals   147 38.956   0.265                      
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

 Response Petal.Length :
             Df Sum Sq Mean Sq F value    Pr(>F)    
Species       2 437.10 218.551  1180.2 < 2.2e-16 ***
Residuals   147  27.22   0.185                      
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

From the output it can be seen that the 2 variables are highly significantly different among species.

Kruskal-Wallis Test

This test is a non-parametric alternative to the one-way ANOVA test; it extends the two-samples Wilcoxon test to situations where there are more than 2 groups. It should be used when the assumptions of the one-way ANOVA test are not met.

See the section week 7 post sessions tasks, one-way ANOVA test at the end for the example.

Week 7 Formative 2 notes

An abstract is crucial for a scientific paper: it highlights the key points to intrigue those looking at the paper, but also informs them whether it will be valuable to their own research and findings!

Video notes

Titles and abstracts are very important: they must supply enough information about the paper and may be the only part of the paper that gets read. However, they should not give away the answer or findings of the paper; if they do, it is guaranteed that your paper won't be read!

Title

This has to be descriptive, but not too much: overly wordy titles can put readers off the paper. But it can't be too vague either, as it may not show enough detail for researchers.

You need to think about the audience of this paper and the details you have in your paper and who is interested in the work you have done.

Titles are like Goldilocks: they have to be just right.

With your title, think about:

  1. key focus of the paper, e.g. species, environment or process.
  2. Geographic region or habitat type
  3. type of analysis
  4. theory or theme
Abstract

An abstract shouldn't be a condensed introduction page!

  1. What is the context? (briefly)
  2. What did you do?
  3. How did you do it?
  4. What did you find out? (include results)
  5. What does it mean?
Formative notes

The following are notes made for the formative assessment 2 about the paper being studied:

With the increasing anthropogenic pressures on wildlife, biodiversity and habitats, conservation and land management are key to combating this; utilising degraded land is another crucial method for reducing biodiversity loss.

One strategy: land sharing, which requires and encourages farmers to adopt wildlife-friendly practices, thereby increasing and improving the habitats available to wildlife.

Having quality habitat rather than degraded habitat benefits wildlife, as animals can fully perform energy-gaining activities such as feeding, foraging and resting, while in degraded areas energy is expended travelling for food or evading predators.

This paper focuses on the critically endangered Western Santa Cruz Galapagos Tortoise (Chelonoidis porteri); use this in the abstract!!

These tortoises travel for foraging when the low-lying areas become depleted, moving to the higher, humid areas for quality habitat.

Now these humid highlands are highly modified habitats, the majority being farmland, which has ongoing and unknown impacts on tortoise migratory behaviour.

This paper looks to support the migratory cycle of giant tortoises and was conducted in the agricultural zones of Santa Cruz.

Aims:

Quantify the activity patterns (eating, walking, resting) in agricultural areas

Examine the time spent in agricultural areas in relation to season, sex and vegetation characteristics

Examine the time spent doing various activities on farms

Methods

Location: Santa Cruz Island in the Galapagos

Behavioral observations

Over 3-month periods, a total of 242 behavioural observations of tortoises were made in 2019: 114 in the wet season and 128 in the dry season.

Each observation was conducted over 30 mins at a distance, with all activities recorded.

After 30 mins, the vegetation was characterised in a 1 m² area by the tortoise, including the density of vegetation and its mean height.

Tortoises were measured and their sex recorded: 109 male, 81 female and 52 unknown or juvenile.

A thermal image of each tortoise was taken to gather the maximum, minimum and mean temperature of the animal, and the same was done for the ground around the tortoise.

The ratio of time spent eating, walking and resting was recorded for each animal, as these relate to energy acquisition and expenditure.

They also tested whether vegetation characteristics influence the tortoises' behavioural patterns or activity.

Results

Resting was observed the most (51%), followed by eating and walking (24% and 10% respectively).

They also recorded other activities including the interaction with tourists, vehicles or livestock.

Land-use type had a strong impact on tortoise activity. Tortoises spent the most time eating in the touristic areas, and rested significantly longer in the abandoned areas than in the livestock or touristic areas.

Walking was found to not be influenced by land-use type or temperature.

Eating was influenced by vegetation characteristics, height and density/cover of vegetation had an impact.

Tortoises were less likely to walk as vegetation height and density increased; vegetation density, however, did not significantly impact resting behaviours.

The impact of tourist interaction on tortoises is unknown and requires research!

Further research is needed into farming practices and preferred tortoise survival conditions to ensure both parties benefit from this land use.

Limitations

Land use was categorised only broadly and needs more detail on the agricultural usage.

Observation windows tended to fall in the first half of the day, thus missing any activities performed in the latter half.

Assumptions were made about how energy is gained or expended in each activity (eating, resting, walking); this needs further verification.

Conclusion

Agricultural areas are very important for these critically endangered tortoises. Study gives further understanding to how tortoises now use these areas.

Land-use type and vegetation characteristic influence the thermal signature of the tortoise and therefore aided in determining the behavioural patterns in agricultural areas.

This study found that tortoises are more likely to rest more in areas where the habitat is unfavourable to them, walk and move in areas with shorter or less vegetation and eat more where vegetation quality, cover and density is greatest all contributing to energy saving techniques.

Future research should look at how time spent in the humid highlands affects the migratory decision, time spent in particular habitats, and animal well-being.

Week 8

Pre Session

#| error: true
survey <- read.csv("ED101surveyfall21.csv", header = TRUE)
head(survey)

Unfortunately the pre-session tasks cannot be completed yet, as the data file no longer seems to exist for the code I used above!

Here are the notes on correlation from the document instead.

Positive correlation is when 2 variables tend to change in the same direction

Negative correlation is when 2 variables tend to change in different directions.

A strong correlation is when the two variables change closely in accordance with one another.

The correlation Coefficient

Also known as Pearson's r, the correlation coefficient is an index expressing the direction and strength of the relationship; it is bounded between -1 and 1 for any pair of variables.

The sign of the correlation coefficient indicates the direction of the correlation, while the absolute value of the correlation coefficient indicates its strength. This can be tested with:

cor.test(variable1, variable2)

When run in R, the number presented under cor in the results is the correlation coefficient.
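A minimal sketch using the built-in cars data:

# Pearson correlation between speed and stopping distance
cor.test(cars$speed, cars$dist)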

Values of 1 or -1 indicate a perfect correlation.

The correlation coefficient expresses the degree of relationship; it tells us whether it is possible to predict one variable from the other, but not how to make that prediction.

The r value is only meaningful for linear relationships and can be influenced by outliers.

The scale of the axes also has no impact on the correlation of the data.

Session notes

DA

Correlations

This session looked at correlations in data and R.

Correlations are useful to look at when you don’t expect effects, when you can explain effects and when you want to reduce dimensions of your data.

Pearson's correlation can be used on normally distributed data and will give a value between 1 and -1 describing how the data relate: the sign gives the direction (positive or negative) and the absolute value gives the strength of the correlation.

Spearman’s Correlation can be used on non-normal data and will give a rho value

corrplot() (from the corrplot package) produces correlation plots, a very useful tool for visualising data.
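A minimal sketch, assuming the corrplot package is installed:

library(corrplot)
M <- cor(mtcars)                # correlation matrix of a numeric data set
corrplot(M, method = "circle")  # visualise all pairwise correlations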

Correlations are used to describe data structure and understand complex phenomena.

Correlation must not be confused with causation!

Outliers will have an impact on your data!

RM

How to write introductions

Writing introductions in science use the funnel of scientific writing.

Imagine a funnel for the introduction you need to start broad, at the wide part of the funnel and define the research territory, then move through the funnel and identify the gap, narrowing the focus, by the end at the smallest width of the funnel you state your current aims and occupy the gap.

The discussion, on the other hand, works the other way: starting at the narrowest part of the funnel with the major findings of the study, widening out slightly with the findings in the context of other studies, to the widest part of the funnel with the implications and generalisability.

An introduction should follow the MEAL plan (similar to the PEEL plan: point, evidence, explain, link):

  • Main Idea: the main idea of your paragraph should be stated in the opening sentence, and everything in the paragraph should support this idea.

  • Evidence: the specific information that supports your main idea.

  • Analysis: your interpretation of the evidence.

  • Link: ties everything together and also leads to the next paragraph.

Remember to plan your paragraphs and writing and clearly identify each step of the MEAL plan you have written.

Points for a good intro:

  • Hook the reader from the very first sentence

  • Easy to read

  • clearly presented

  1. Theory
  2. Gap
  3. Hypothesis
  4. Implications

Post Session

Understanding Correlations

Using this website, the following questions are completed:

  1. -0.77 is the strongest, as it is the closest to 1 or -1; it can't be 1.05, as the scale does not go beyond 1.
  2. 0.55
  3. Non Linear
  4. a = c, b = d, c = e, d = a, e = b
  5. Outliers
  6. a) positive, b) generally negative, c) zero (I say none as people can be good at one and bad at the other), d) positive, e) zero, f) negative, g) positive, h) positive
  7. This should be a perfect negative correlation, r = -1, as a small car age corresponds to a higher year of production.
  8. There would be no change in the correlation by transforming the x to t-scores
  9. If r = 1 and Zx is -0.5, you would expect Zy to be -0.5. If r = -1 and Zx is 0.8, you would expect Zy to be -0.8.
  10. I would guess that between 0.2 and 0.4 and between 0.5 and 0.7, the difference in the correlation would be approximately the same as the r value is equally different, the spacings in the correlation would also be equally spread, comparatively.
  11. 1? As there are only 2 recordings, the correlation would be 1: a line always fits 2 points perfectly.
  12. At first glance I think it's e, as one is a positive and one a negative correlation, but I think it could be b; I'm not fully sure! After asking AI to explain it, it said to compare the coefficients of determination (r squared), which come out as 0.04 for study one and 0.16 for study two, therefore b is the correct answer: 4 times.
  13. ?
  14. I would guess c and maybe b
  15. ?
  16. This question is not clear to me?

Week 9

Pre-Session

Linear Regression Models

Video 1

Supposedly the easiest thing in the world to do!

Running the test:

cars %>%
  lm(dist ~ speed, data = .) %>%
  summary()

Call:
lm(formula = dist ~ speed, data = .)

Residuals:
    Min      1Q  Median      3Q     Max 
-29.069  -9.525  -2.272   9.215  43.201 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -17.5791     6.7584  -2.601   0.0123 *  
speed         3.9324     0.4155   9.464 1.49e-12 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 15.38 on 48 degrees of freedom
Multiple R-squared:  0.6511,    Adjusted R-squared:  0.6438 
F-statistic: 89.57 on 1 and 48 DF,  p-value: 1.49e-12

In the code above, lm() is the linear model function. The dependent variable is entered first; here it is dist (distance), while speed is the independent variable.

data = . is needed to tell the pipe to place the incoming data frame into lm()'s data argument, since lm() does not take the data frame as its first argument.

Residuals: each model has a line of best fit, which tells us what we would expect to find on the y-axis for any given x value. In practice, the data points lie away from the line of best fit; the distance between an observation and the model is the residual.

Coefficients: the Estimate in the (Intercept) row is the y-intercept. The other coefficient, speed in this example, is the slope of the line of best fit. At the other end of the speed row is the p-value for the slope, which needs to be smaller than our significance level of 0.05.

Finally, the R-squared value tells us the proportion of the variation in y that is explained by x.

Important

Basically:

you need these four values to report the linear regression model:

y intercept

slope

p-value

r-squared

With this data you can extract the residuals and plot them, for instance as a histogram:

mod <- lm(dist ~ speed, data = cars)
hist(mod$residuals)

This can be built on to predict values at certain points on the x-axis; in this case we will look at 10, 15, 20 and 50:

new_speeds <- data.frame(speed = c(10, 15, 20, 50))
predict(mod, new_speeds) %>% round(1)
    1     2     3     4 
 21.7  41.4  61.1 179.0 

Therefore this model predicts that at 10 mph it would take 21.7 ft to stop, and so on up to 179.0 ft at 50 mph (the cars data records stopping distance in feet).

This can also be done this way:

cars %>%
  lm(dist ~ speed, data = .) %>%
  predict(data.frame(speed = c(10, 15, 20, 100))) %>%
  round(1)
    1     2     3     4 
 21.7  41.4  61.1 375.7 
Video 2
  • Regression is the means of exploring the variation in some quantity.

  • The variation is separated into explained and unexplained components

A sample regression line is what is known as the line of best fit. It can be used to estimate y values from an x value.

Degrees of freedom

We need to ask ourselves this question:

What is the minimum number of data points needed to run the regression model?

It can't be 2 points, as there would then be no possibility for error, and a regression needs a possibility for error. This only exists with at least 3 points, which gives you one degree of freedom; it's something, but not much. The error becomes possible because the line of best fit can now miss the observations.

Now add another variable to the model: for example, if y is the number of ice creams sold in a day, one explanatory variable could be the weather and another could be the month. We get a model such as y = m1x1 + m2x2 + c. This creates a graph with two x-axes, so we are essentially plotting in 3D space. A plane of best fit can be drawn through any 3 points in that 3D space, so again we are left with 0 error, or an R-squared of 1. This changes with a 4th observation, which creates error and gives a single degree of freedom. We have an equation for this:

df = n - k - 1

n= number of observations

k= number of x variables.
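For the ice-cream example above, n = 4 observations and k = 2 explanatory variables give df = 4 - 2 - 1 = 1.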

By adding more explanatory variables (things that explain your results; for the ice creams this could be time of year, weather, school holidays, location etc.), degrees of freedom are reduced, and with them the opportunity for "error" in the model.

Adjusted R-squared

If an R-squared value is 0.58, this tells you that 58% of the variation in the results is explained by that variable. When a second variable is added, the degrees of freedom drop by one and the R-squared will likely be bigger; it might now be, for example, 0.74, meaning 74% of the variation can be explained using both variables. If a third explanatory variable is added, this again pushes up the R-squared value and again decreases the degrees of freedom. But beware: adding more variables into the model will cause the R-squared value to increase even if the variable is unrelated to the data being collected.

We can then use the adjusted R-squared, which factors in the loss of degrees of freedom: adding variables inflates R-squared simply by reducing the room for error, and adjusted R-squared corrects for this.
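For reference, the standard formula, using n and k as above, is: adjusted R² = 1 - (1 - R²) x (n - 1) / (n - k - 1).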

Writing Methods video

This is to explain what you have done to answer the research question.

There is a 4-step process: acknowledge the critical point, identify the alternatives, select an option and explain your choice. This process can be used at every stage of the Methods section: type of study, data collection, study site etc.

This process then shows the reader and you, what you have and haven’t done and why you have done it that way!

How to explain:

  1. use the aims and objectives
  2. conceptual framework
  3. past research, justifying our methods.
  4. General Methods Knowledge
  5. context of the research
  6. constraints

Session

DA

Linear Models

Linear models are a useful tool for looking at:

  • When you expect effects

  • when you are able to explain effects

  • when wanting to predict values

It is again important to know if your data is normally distributed.

As seen in the pre-session tasks, linear models can have more than one x-axis (explanatory variable). This can be visualised as a 3D graph of the data.

Remember that you can use the Shapiro-Wilk test to test for normality in the data, and it is important to look at the residuals of the linear regression model, as these can tell you more about your data and how well the model has worked.

New values can be predicted from the model using the predict() function, with an example run in the pre-session activities.

ANCOVA

ANCOVA, or analysis of covariance, is a general linear model that combines ANOVA and regression. It evaluates whether the means of a dependent variable are equal across levels of one or more categorical independent variables, while controlling for one or more continuous variables.

Running an ANCOVA example

In this example we shall use the mtcars data and examine the effects of the number of cylinders (cyl) on miles per gallon (mpg), while controlling for the weight of the car (wt).

The code below shows mpg as the dependent variable, cyl as the independent variable, and wt as the covariate.

ancova_mtcars <- aov(mpg ~ cyl + wt, data = mtcars)
summary(ancova_mtcars)
            Df Sum Sq Mean Sq F value   Pr(>F)    
cyl          1  817.7   817.7  124.04 5.42e-12 ***
wt           1  117.2   117.2   17.77 0.000222 ***
Residuals   29  191.2     6.6                     
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Then we have to check the assumptions of the test, such as homogeneity of regression slopes and normality of residuals.

#checking homogeneity of regression slopes
regression <- aov(mpg ~ cyl * wt, data = mtcars)
summary(regression)
            Df Sum Sq Mean Sq F value   Pr(>F)    
cyl          1  817.7   817.7   145.9 1.28e-12 ***
wt           1  117.2   117.2    20.9 8.94e-05 ***
cyl:wt       1   34.2    34.2     6.1   0.0199 *  
Residuals   28  157.0     5.6                     
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#checking normality
shapiro.test(residuals(ancova_mtcars))

    Shapiro-Wilk normality test

data:  residuals(ancova_mtcars)
W = 0.93745, p-value = 0.06341
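(A note of caution: the significant cyl:wt interaction above, p = 0.0199, suggests the homogeneity-of-regression-slopes assumption may be violated here, so the ANCOVA results should be interpreted carefully; the residuals, at least, show no significant departure from normality, p = 0.063.)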
Interpreting the results
summary(ancova_mtcars)
            Df Sum Sq Mean Sq F value   Pr(>F)    
cyl          1  817.7   817.7  124.04 5.42e-12 ***
wt           1  117.2   117.2   17.77 0.000222 ***
Residuals   29  191.2     6.6                     
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
  1. df:

    • cyl: 1 df; cyl is treated here as a numeric variable rather than a factor, hence a single df for its slope

    • wt: 1 df, representing the covariate

    • residuals: 29 df which is the total number of observations minus the number of parameters estimated.

  2. Sum of Square (sum sq):

    • cyl: 817.7, the variability in mpg explained by the number of cylinders

    • wt: 117.2, the variability in mpg explained by the weight of the car

    • residuals: 191.2, the variability in mpg not explained by the model (error term).

  3. Mean Squares (Mean sq):

    • calculated by dividing the sum of squares by the corresponding degrees of freedom
  4. F-value: the ratio of the mean square of the factor to the mean square of the residuals

  5. Pr(>F) or p-value:

  • This is the probability of observing an F-value as extreme as, or more extreme than, the observed value under the null hypothesis.

  • In this test, cyl had a p-value of 5.42 x 10^-12, indicating a highly significant effect of the number of cylinders on mpg

  • wt: 0.000222, indicating a highly significant effect of weight on mpg.

Both cyl and wt have significant effects on mpg, as indicated by their p-values being less than the significance level of 0.05.

RM

Methods

Methods vs Methodology

The method refers to the techniques and procedures used to collect and analyse data, while methodology encompasses the overall research design, including the theoretical framework, research questions, and research approach.

This section must be clear to the reader about what is going on and make it easily reproducible.

Methods require a study area: the location, its characteristics and any useful details that are relevant to the research; do not put in facts just because they are interesting.

When it comes to the data analysis part, be descriptive: don't just list the tests used, state why you used them, e.g. to find associations or test for significant differences.

Post-Session

DA

For this post-session we need to answer the following questions (I had actually done some of this earlier using the ANCOVA test in the session notes, before seeing this, with the mtcars data!!)

Using a linear model, what is the effect of the weight of the vehicle on its efficiency (mpg)?

Calculate and interpret all coefficients using the summary and anova function.

#Running the Linear regression model:
wt_mpg <- lm(mpg ~ wt, data = mtcars)
summary(wt_mpg)

Call:
lm(formula = mpg ~ wt, data = mtcars)

Residuals:
    Min      1Q  Median      3Q     Max 
-4.5432 -2.3647 -0.1252  1.4096  6.8727 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)  37.2851     1.8776  19.858  < 2e-16 ***
wt           -5.3445     0.5591  -9.559 1.29e-10 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 3.046 on 30 degrees of freedom
Multiple R-squared:  0.7528,    Adjusted R-squared:  0.7446 
F-statistic: 91.38 on 1 and 30 DF,  p-value: 1.294e-10

Interpreting the results of this linear model test:

Firstly, it is evident that weight has a significant impact on the mpg of a vehicle, as the test produced a p-value of 1.294 x 10^-10, lower than our significance level (alpha) of 0.05. The model also tells us that the y-intercept is at 37.29 and that the regression line has a slope of -5.3445. The negative slope shows that as weight increases on the x-axis, mpg decreases on the y-axis (wt is measured in units of 1000 lbs, so each extra 1000 lbs is associated with a drop of about 5.3 mpg).

hist(wt_mpg$residuals)

wt_mpg_anova <- anova(wt_mpg)
wt_mpg_anova
Analysis of Variance Table

Response: mpg
          Df Sum Sq Mean Sq F value    Pr(>F)    
wt         1 847.73  847.73  91.375 1.294e-10 ***
Residuals 30 278.32    9.28                      
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The ANOVA provided the same p-value as the linear regression model, with 30 df for the residuals. Finally, for good practice, I will check the normality of the residuals using the Shapiro-Wilk test:

shapiro.test(residuals(wt_mpg))

    Shapiro-Wilk normality test

data:  residuals(wt_mpg)
W = 0.94508, p-value = 0.1044

This Shapiro-Wilk test shows that the residuals of the linear regression model are normally distributed, as the p-value is greater than the significance level of 0.05.

RM

The paper I have chosen for this post session activity is: Comparing multispectral and hyperspectral UAV data for detecting peatland vegetation patterns.

I chose this paper as I have a background in, and interest in, peatland environments (the subject of my undergraduate research project), and specifically because of its use of UAVs, an area I am looking to potentially take into my career, and of hyperspectral imagery, which I have used in some preliminary research and a SPUR project with NTU with a hyperspectral camera, and may use for my master's project!! Worked out nicely.

I was unaware of this when I first saw the paper, but it actually has something of a graphical design for its methods (see below), more of a flow chart; after reading the full methods, I am going to use this as the basis for my "attempt" at a graphical method.

My attempt: it's not great, is pretty similar to theirs, is messier and probably not strictly correct! It wouldn't win any art competitions:

Week 10

Pre-session

Session

DA

This session is looking at linear logistic models.

For correlations we have numerical values on both axes; the same happens with linear models.

For logistic models, the x-axis is numerical and the y-axis is categorical.

For example we can have 1-0, a-b, male-female, yes-no on the y-axis.

On these graphs you will get a curved regression line; the steepness of the line is key to showing differences in the probabilities in the data.

Knowing the type of categorical variable is important!

It can be:

  • ordinal: Categories that maintain an order. High-low

  • Nominal: categories with no order ranking. Male-female

  • Binary: Nominal variable with 2 categories. 1-0

When visualising the data, the curve should show an S shape; the steeper the S, the stronger the probabilities.

This uses the glm() function; the main difference from linear models is the family argument, which specifies the family of the error distribution used in the model:

penguins %>%
  mutate(prop=ifelse(sex=="male",1,0)) -> penguins2
m1 <- glm(prop~body_mass_g,family = "binomial", data = penguins2)
m1

Call:  glm(formula = prop ~ body_mass_g, family = "binomial", data = penguins2)

Coefficients:
(Intercept)  body_mass_g  
   -5.16254      0.00124  

Degrees of Freedom: 332 Total (i.e. Null);  331 Residual
  (11 observations deleted due to missingness)
Null Deviance:      461.6 
Residual Deviance: 396.6    AIC: 400.6
summary(m1)

Call:
glm(formula = prop ~ body_mass_g, family = "binomial", data = penguins2)

Coefficients:
              Estimate Std. Error z value Pr(>|z|)    
(Intercept) -5.1625416  0.7243906  -7.127 1.03e-12 ***
body_mass_g  0.0012398  0.0001727   7.177 7.10e-13 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 461.61  on 332  degrees of freedom
Residual deviance: 396.64  on 331  degrees of freedom
  (11 observations deleted due to missingness)
AIC: 400.64

Number of Fisher Scoring iterations: 4

We interpret the results much as for a linear model, with the added bonus of the deviance to take into account. Looking at this test, we can see that body mass can significantly predict penguin sex, as the p-value is smaller than the significance level.

Coefficients

In the output above, the (Intercept) coefficient under Estimate is the y-intercept, at -5.16. For this example it is hypothetical and not too interesting for us. The second coefficient, here for body mass, is the slope: the log-odds of being male (our 1 value) increase by about 0.0012 for every gram of body mass. Being positive, it gives an S shape on the graph: low at the bottom left, high at the top right.

To write the model out: log-odds(male) = -5.16 + 0.0012 x body mass
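A minimal sketch of converting the model's log-odds to a probability, using plogis() (the inverse logit) and a hypothetical 4000 g penguin:

# Probability of being male for a 4000 g penguin, from the fitted coefficients
plogis(-5.1625416 + 0.0012398 * 4000)   # ~0.45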

If a penguin weighed 0 g, the model would give strongly negative log-odds of being male, i.e. a probability near 0; the intercept is purely hypothetical here.

Deviance and Pseudo-R squared.

There are 2 deviances in the output: null and residual. A pseudo-R-squared is calculated from them as (null - residual) / null on the log-likelihood scale; to get the log-likelihoods in R you use model$null.deviance / -2 and model$deviance / -2.

This acts as the overall effect size of our model.

The output of this calculation will always be between 0 and 1; this is what we call the pseudo-R-squared. It shows the proportion of the variation in the outcome that can be explained by what you are testing.
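A minimal sketch of this calculation for the model m1 fitted above:

ll.null <- m1$null.deviance / -2     # log-likelihood of the null model
ll.proposed <- m1$deviance / -2      # log-likelihood of the fitted model
(ll.null - ll.proposed) / ll.null    # pseudo-R-squared, ~0.14 here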

When you call summary() on the glm you get a summary similar to the linear model one; the Estimate column gives the coefficients, with the (Intercept) row being the y-intercept (on the log-odds scale).

Again, for this test only take the variables in the data that are relevant to each other (mainly the numerical variables); with the penguins data this can be the mass, flipper length and bill length. The pseudo-R-squared then comes out at about 0.65, meaning these 3 variables explain about 65% of the variation in sex.

Now visualising the penguins data to see why observing a higher weight gives a greater probability of that penguin being male rather than female.

penguins %>%
  mutate(size=ifelse(body_mass_g>4202, "large", "small")) ->penguins2

xtabs(~size+sex, data=penguins2)
       sex
size    female male
  large     53   92
  small    112   76

We can create ratios from this table by dividing the number of small females by small males, and large females by large males, then dividing the large proportion by the small one: this gives an odds ratio (here a number below 1, since large penguins are less likely to be female).
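A minimal sketch of that calculation from the table above:

odds_large <- 53 / 92     # odds of female vs male among large penguins
odds_small <- 112 / 76    # odds of female vs male among small penguins
odds_large / odds_small   # odds ratio, ~0.39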

Probabilities can be modeled as well to show these in a graph.

Visualising the data and knowing what type of data you have is the first and probably the most important part of statistical analysis!!

RM

Results section

The best way is to structure the results using the same order as the methods, it will make the paper easier to follow.

How you present your results is important: overly big tables should be avoided; it is better to use summary tables rather than the full data set, which can go in the Appendices if needed.

Remember that all tables and figures need a caption and number to allow for track back later on!

Avoid just describing the figure or table you are using, comment about what and why this is useful and highlight important results.

Remember to include all the features on figures, tables and graphs, such as units and labels on the axes.

Figure captions go at the bottom; table captions go at the top. Titles can be excluded on graphs; they are rare and unusual, as the information should be included in the figure caption.

Use clear and nice figures, R studio is a good program to make clear graphs.

Look at the ggplot gallery online to see what is available to use!

Remember not to discuss in the results section; that belongs in the discussion section!

Logistic Regression graph!

Use the video at the end, which shows the code for plotting a logistic regression probability graph: an s-shaped curve for a categorical dataset with 2 levels!

Post-session

Logistic Regression

Caution

I found this logistic regression website difficult to use: it kept creating new data files with the same names as before, and I got confused about whether the plan was to overwrite them or not. I think that caused some mistakes in the code later on, as some of my values were different from the example!

Logistic regression is used to predict the class/category of individuals based on one or multiple predictor variables (x). Use it to model a binary outcome, that is a variable, which can have only 2 possible values: 0 or 1, yes or no etc.

Logistic regression is part of Generalised linear Model (GLM), developed for extending the linear regression model to other situations.

Logistic regression does not return the class of observations directly; it lets us estimate the probability of class membership, which will range between 0 and 1. You need to decide the threshold probability at which the category flips from one to the other; the default used here is 0.5.

Example:

In this example a diabetes dataset from R will be used. PimaIndiansDiabetes2 in mlbench package.

Logistic regression works for data that contains continuous and/or categorical predictor variables.

Performing these steps may improve the accuracy of your model:

  • Remove potential outliers

  • Make sure that the predictor variables are normally distributed. If not, a log, root or Box-Cox transformation can help.

  • Remove highly correlated predictors to minimize overfitting. The presence of highly correlated predictors might lead to an unstable model solution.

# Loading data and removing NAs

data("PimaIndiansDiabetes2")

PimaIndiansDiabetes2 %>%
  na.omit() -> diabetes
summary(diabetes)
    pregnant         glucose         pressure         triceps     
 Min.   : 0.000   Min.   : 56.0   Min.   : 24.00   Min.   : 7.00  
 1st Qu.: 1.000   1st Qu.: 99.0   1st Qu.: 62.00   1st Qu.:21.00  
 Median : 2.000   Median :119.0   Median : 70.00   Median :29.00  
 Mean   : 3.301   Mean   :122.6   Mean   : 70.66   Mean   :29.15  
 3rd Qu.: 5.000   3rd Qu.:143.0   3rd Qu.: 78.00   3rd Qu.:37.00  
 Max.   :17.000   Max.   :198.0   Max.   :110.00   Max.   :63.00  
    insulin            mass          pedigree           age        diabetes 
 Min.   : 14.00   Min.   :18.20   Min.   :0.0850   Min.   :21.00   neg:262  
 1st Qu.: 76.75   1st Qu.:28.40   1st Qu.:0.2697   1st Qu.:23.00   pos:130  
 Median :125.50   Median :33.20   Median :0.4495   Median :27.00            
 Mean   :156.06   Mean   :33.09   Mean   :0.5230   Mean   :30.86            
 3rd Qu.:190.00   3rd Qu.:37.10   3rd Qu.:0.6870   3rd Qu.:36.00            
 Max.   :846.00   Max.   :67.10   Max.   :2.4200   Max.   :81.00            
# Inspecting the data:
sample_n(diabetes, 3)
    pregnant glucose pressure triceps insulin mass pedigree age diabetes
188        1     128       98      41      58 32.0    1.321  33      pos
639        7      97       76      32      91 40.9    0.871  32      pos
162        7     102       74      40     105 37.2    0.204  45      neg
# Splitting the data into training and test set:
set.seed(123)
training.samples <- diabetes$diabetes %>%
  createDataPartition(p = 0.8, list = FALSE)
train.data <- diabetes[training.samples, ]
test.data <- diabetes[-training.samples, ]

Computing logistic regression

The function glm() can be used to compute logistic regression. You need to specify the option family = binomial, which tells R that we want to fit logistic regression.

Quick start R code

# Fit the model
model <- glm(diabetes ~., data = train.data, family = binomial)

# Summarise the model
summary(model)

Call:
glm(formula = diabetes ~ ., family = binomial, data = train.data)

Coefficients:
              Estimate Std. Error z value Pr(>|z|)    
(Intercept) -1.053e+01  1.440e+00  -7.317 2.54e-13 ***
pregnant     1.005e-01  6.127e-02   1.640  0.10092    
glucose      3.710e-02  6.486e-03   5.719 1.07e-08 ***
pressure    -3.876e-04  1.383e-02  -0.028  0.97764    
triceps      1.418e-02  1.998e-02   0.710  0.47800    
insulin      5.940e-04  1.508e-03   0.394  0.69371    
mass         7.997e-02  3.180e-02   2.515  0.01190 *  
pedigree     1.329e+00  4.823e-01   2.756  0.00585 ** 
age          2.718e-02  2.020e-02   1.346  0.17840    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 398.80  on 313  degrees of freedom
Residual deviance: 267.18  on 305  degrees of freedom
AIC: 285.18

Number of Fisher Scoring iterations: 5
# Making predictions:
probabilities <- model %>%
  predict(test.data, type = "response")
predicted.classes <- ifelse(probabilities > 0.5, "pos", "neg")

# Model accuracy:
mean(predicted.classes == test.data$diabetes)
[1] 0.7564103

Simple logistic regression

The simple logistic regression is used to predict the probability of class membership based on one single predictor variable.

The following R code builds a model to predict the probability of being diabetes-positive based on the plasma glucose concentration:

model2 <- glm(diabetes ~ glucose, data = train.data, family = binomial)
summary (model2)$coef
               Estimate  Std. Error   z value     Pr(>|z|)
(Intercept) -6.15882009 0.700096646 -8.797100 1.403974e-18
glucose      0.04327234 0.005341133  8.101716 5.418949e-16

The output above shows the estimate of the regression beta coefficients and their significance levels. The intercept (b0) is -6.16 and the coefficient of glucose variable is 0.043.

Plugging each new plasma glucose concentration into the logistic equation, with these values for the intercept and slope, lets you predict the probability of that individual being diabetes-positive.

Predictions can be made easily using the function predict(); use the option type = "response" to obtain the probabilities directly.

newdata <- data.frame(glucose = c(20, 180))
probabilities <- model2 %>%
  predict(newdata, type = "response")
predicted.classes <- ifelse(probabilities > 0.5, "pos", "neg")
predicted.classes
    1     2 
"neg" "pos" 

The logistic function gives an s-shaped probability curve illustrated as follows:

train.data  %>%
  mutate(prob = ifelse(diabetes == "pos", 1, 0)) %>%
  ggplot(aes(glucose, prob)) +
  geom_point(alpha = 0.2) +
  geom_smooth(method = "glm", method.args = list(family = "binomial")) +
  labs(
    title = "Logsitic Regression Model",
    x = "Plasma Glucose Concentration",
    y = "probability of being diabete-pos"
  )
`geom_smooth()` using formula = 'y ~ x'

Multiple logistic regression

The multiple logistic regression is used to predict the probability of class membership based on multiple predictor variables:

model3 <- glm(diabetes ~ glucose + mass + pregnant,
              data = train.data, family = binomial)
summary(model3)$coef
               Estimate  Std. Error   z value     Pr(>|z|)
(Intercept) -9.32369818 1.125997285 -8.280391 1.227711e-16
glucose      0.03886154 0.005404219  7.190962 6.433636e-13
mass         0.09458458 0.023529905  4.019760 5.825738e-05
pregnant     0.14466661 0.045125729  3.205857 1.346611e-03

Here we want to include all predictor variables available in the data set; this is done by using ~. in the formula:

model4 <- glm(diabetes ~., data = train.data, family = binomial)
summary(model4)$coef
                 Estimate  Std. Error     z value     Pr(>|z|)
(Intercept) -1.053400e+01 1.439679266 -7.31690975 2.537464e-13
pregnant     1.005031e-01 0.061266974  1.64041157 1.009196e-01
glucose      3.709621e-02 0.006486093  5.71934633 1.069346e-08
pressure    -3.875933e-04 0.013826185 -0.02803328 9.776356e-01
triceps      1.417771e-02 0.019981885  0.70952823 4.779967e-01
insulin      5.939876e-04 0.001508231  0.39383055 6.937061e-01
mass         7.997447e-02 0.031798907  2.51500698 1.190300e-02
pedigree     1.329149e+00 0.482291020  2.75590704 5.852963e-03
age          2.718224e-02 0.020199295  1.34570257 1.783985e-01

From the output above, the coefficients table shows the beta coefficient estimates and their significance levels. Columns are:

  • Estimate: the intercept (b0) and the beta coefficient estimates associated with each predictor variable.

  • Std.Error: the standard error of the coefficient estimates. This represents the accuracy of the coefficients. The larger the standard error, the less confident we are about the estimate.

  • z value: the z-statistic, which is the coefficient estimate divided by its standard error.

  • Pr(>|z|): The p-value corresponding to the z-statistic. The smaller the p-value, the more significant the estimate is.

We can also use the coef() function to get these results:

coef(model4)
  (Intercept)      pregnant       glucose      pressure       triceps 
-1.053400e+01  1.005031e-01  3.709621e-02 -3.875933e-04  1.417771e-02 
      insulin          mass      pedigree           age 
 5.939876e-04  7.997447e-02  1.329149e+00  2.718224e-02 

However, I think the summary version gives more!

Interpretation

From our results it can be seen that 4 of the 8 predictors are significantly associated with the outcome: glucose, pregnant, mass and pedigree.

As the coefficient estimate of the variable glucose is positive, an increase in glucose is associated with an increased probability of being diabetes-positive. The coefficient for the variable pressure is negative, meaning an increase in blood pressure would be associated with a decreased probability of being diabetes-positive (though note that pressure is nowhere near significant here).

An important concept for interpreting the logistic coefficients is the odds ratio. An odds ratio measures the association between a predictor variable (x) and the outcome variable (y). It represents the ratio of the odds that an event will occur given the presence of the predictor x, compared to the odds of the event occurring in its absence.

For a given predictor (say x1), the associated beta coefficient (b1) in the logistic regression function corresponds to the log of the odds ratio for that predictor.

If the odds ratio is 2, then the odds that the event occurs are two times higher when the predictor x is present versus when x is absent.

For example, the regression coefficient for glucose is 0.042; this indicates that a one unit increase in glucose concentration multiplies the odds of being diabetes-positive by exp(0.042) ≈ 1.04.
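As a quick check on our own fit, the odds ratios can be read straight off the model by exponentiating the coefficients:

#odds ratios for the simple model: exp() of the log-odds coefficients
exp(coef(model2))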

From the logistic regression results, it can be noticed that some variables (triceps, insulin and age) are not statistically significant. Keeping them in the model may contribute to overfitting, so they should be removed. This can be done automatically using statistical techniques such as stepwise regression and penalised regression methods, or manually if the data set is small enough.

model5 <- glm(diabetes ~ pregnant + glucose + pressure + mass + pedigree, data = train.data, family = binomial)
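For the automatic route mentioned above, one option (my own addition, not from the website) is stepAIC() from the MASS package:

#stepwise selection by AIC, starting from the full model
#(note: loading MASS masks dplyr's select())
library(MASS)
model.step <- stepAIC(model, direction = "both", trace = FALSE)
summary(model.step)$coef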

Making predictions

We can make predictions using the test data in order to evaluate the performance of our logistic regression model.

The procedure is as follows:

  1. predict the class membership probabilities of observations based on predictor variables
  2. Assign the observations to the class with the highest probability score, e.g. above 0.5.
Predict the probabilities of being diabetes-positive:
probs <- model %>%
  predict(test.data, type = "response")
head(probs)
       19        21        32        55        64        71 
0.1926284 0.4852623 0.6625272 0.7986815 0.2780734 0.1458773 

The output is the probability that the diabetes test will be positive. We know these values correspond to the probability of the test being positive, rather than negative, because the contrasts() function indicates that R has created a dummy variable with a 1 for "pos" and 0 for "neg". The probabilities always refer to the class dummy-coded as "1".

This can be checked using:

contrasts(test.data$diabetes)
    pos
neg   0
pos   1
Predict the class of individuals

This then categorises the probabilities: if the value is over 0.5 it is positive, if below it is negative:

predict.class <- ifelse(probs > 0.5, "pos", "neg")
head(predict.class)

That code didn't work at first: I had written ifelse(probabilities > 0.5, ...), but probabilities had been overwritten by the two-value newdata prediction above, so head() only showed entries 1 and 2 ("neg" "pos"). Using probs instead, the classes should match the probabilities above (neg, neg, pos, pos, neg, neg for observations 19, 21, 32, 55, 64, 71); the website example showed observations 21, 25, 28, 29, 32 and 36 with neg, pos, neg, pos, pos, neg respectively below them.

Assessing model accuracy

The model accuracy is measured as the proportion of observations that have been correctly classified. Inversely, the classification error is defined as the proportion of observations that have been misclassified.

Proportion of correctly classified observations:

mean(predicted.classes == test.data$diabetes)
[1] 0.4230769
Warning

So as I mentioned at the top, I found this page a bit messy and wasn't too clear on some things. The main and clear issue here: repeating the code, the website gets 0.756, or 76% of data correctly classified, while mine came to only 0.423, or 42%. I now think the cause is the same name reuse as above: predicted.classes still held only the two newdata predictions ("neg", "pos"), which R recycled against the whole test set. Recomputing predicted.classes <- ifelse(probs > 0.5, "pos", "neg") first should reproduce the ~0.76 accuracy.

Recreating the above in mtcars dataset

Looking at what we want to find:

Using the mtcars data, we are going to look at the data first and see what we want to find out about it:

summary(mtcars)
      mpg             cyl             disp             hp       
 Min.   :10.40   Min.   :4.000   Min.   : 71.1   Min.   : 52.0  
 1st Qu.:15.43   1st Qu.:4.000   1st Qu.:120.8   1st Qu.: 96.5  
 Median :19.20   Median :6.000   Median :196.3   Median :123.0  
 Mean   :20.09   Mean   :6.188   Mean   :230.7   Mean   :146.7  
 3rd Qu.:22.80   3rd Qu.:8.000   3rd Qu.:326.0   3rd Qu.:180.0  
 Max.   :33.90   Max.   :8.000   Max.   :472.0   Max.   :335.0  
      drat             wt             qsec             vs        
 Min.   :2.760   Min.   :1.513   Min.   :14.50   Min.   :0.0000  
 1st Qu.:3.080   1st Qu.:2.581   1st Qu.:16.89   1st Qu.:0.0000  
 Median :3.695   Median :3.325   Median :17.71   Median :0.0000  
 Mean   :3.597   Mean   :3.217   Mean   :17.85   Mean   :0.4375  
 3rd Qu.:3.920   3rd Qu.:3.610   3rd Qu.:18.90   3rd Qu.:1.0000  
 Max.   :4.930   Max.   :5.424   Max.   :22.90   Max.   :1.0000  
       am              gear            carb      
 Min.   :0.0000   Min.   :3.000   Min.   :1.000  
 1st Qu.:0.0000   1st Qu.:3.000   1st Qu.:2.000  
 Median :0.0000   Median :4.000   Median :2.000  
 Mean   :0.4062   Mean   :3.688   Mean   :2.812  
 3rd Qu.:1.0000   3rd Qu.:4.000   3rd Qu.:4.000  
 Max.   :1.0000   Max.   :5.000   Max.   :8.000  

From the summary I am going to look at the probability of having a car with an above-average mpg (the code below uses the mean, 20.09 mpg, as the cut-off rather than the 22.8 I first noted) based on the cylinder number, weight, quarter-mile time (qsec) and horsepower.

Preparing the data:
#loading data and removing NAs
cars.na <- na.omit(mtcars)

#creating a binary variable for mpg:
cars.na$mpg_binary <- ifelse(cars.na$mpg > mean(cars.na$mpg), 1, 0)

#looking at the data:
sample_n(cars.na, 5)
               mpg cyl  disp  hp drat    wt  qsec vs am gear carb mpg_binary
Merc 280C     17.8   6 167.6 123 3.92 3.440 18.90  1  0    4    4          0
Mazda RX4     21.0   6 160.0 110 3.90 2.620 16.46  0  1    4    4          1
Honda Civic   30.4   4  75.7  52 4.93 1.615 18.52  1  1    4    2          1
Merc 450SE    16.4   8 275.8 180 3.07 4.070 17.40  0  0    3    3          0
Mazda RX4 Wag 21.0   6 160.0 110 3.90 2.875 17.02  0  1    4    4          1
#obtaining training and testing sets of data
cars.samples <- cars.na$mpg_binary %>%
  createDataPartition(p = 0.8, list = FALSE)
cars.training <- cars.na[cars.samples, ]
cars.test <- cars.na[-cars.samples, ]

Quick start R code:

#fitting the model
glm1 <- glm(mpg_binary ~., data = cars.training, family = binomial)
Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
#summary of the model
summary(glm1)

Call:
glm(formula = mpg_binary ~ ., family = binomial, data = cars.training)

Coefficients:
              Estimate Std. Error z value Pr(>|z|)
(Intercept)  3.163e+02  2.848e+06       0        1
mpg          3.709e+00  5.984e+04       0        1
cyl         -5.866e+01  2.704e+05       0        1
disp         7.136e-01  4.813e+03       0        1
hp           2.515e-01  9.932e+03       0        1
drat        -1.333e+01  5.109e+05       0        1
wt          -2.268e+01  7.143e+05       0        1
qsec        -1.076e+01  1.909e+05       0        1
vs           5.553e+01  2.711e+05       0        1
am          -3.536e+01  7.140e+05       0        1
gear         1.008e+01  4.064e+05       0        1
carb         5.082e+00  3.297e+05       0        1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 3.5426e+01  on 25  degrees of freedom
Residual deviance: 5.2059e-10  on 14  degrees of freedom
AIC: 24

Number of Fisher Scoring iterations: 25
#Making predictions:
probs1 <- predict(glm1, newdata = cars.test, type = "response")
#recoding to 1/0 so the labels match the 0/1 coding of mpg_binary
predicted1 <- ifelse(probs1 > 0.5, 1, 0)

#model accuracy:
mean(predicted1 == cars.test$mpg_binary)

(My first attempt coded the predictions as "pos"/"neg", which can never equal the 0/1 values in mpg_binary, so the accuracy came out as exactly 0.)
Simple logistic regression

A simple regression to see if above-average mpg is affected by hp:

cars.simple <- glm(mpg_binary ~ hp, data = cars.training, family = binomial)
Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
summary(cars.simple)$coef
              Estimate Std. Error   z value  Pr(>|z|)
(Intercept) 21.9469129 13.4171460  1.635736 0.1018948
hp          -0.1926461  0.1198617 -1.607237 0.1080024

Something was going wrong: I wasn't getting the same output as the AI showed me, so either it was wrong or I was. Looking at the warnings, I think the problem is perfect separation: the full model included mpg itself, the very variable mpg_binary was built from, so the fitted probabilities become exactly 0 or 1, the standard errors blow up and every p-value comes out as 1. The training set is also tiny (26 rows for 12 coefficients), which makes these fits unstable.

Final attempt at this!

Due to my issues I'm now recreating the code you went through in the tutorial session on regression etc.; this should give the same results!

penguins %>%
  mutate(prop=ifelse(sex=="male",1,0)) ->penguins2

m1 <- glm(prop~body_mass_g, family = "binomial", data = penguins2)
summary(m1)

Call:
glm(formula = prop ~ body_mass_g, family = "binomial", data = penguins2)

Coefficients:
              Estimate Std. Error z value Pr(>|z|)    
(Intercept) -5.1625416  0.7243906  -7.127 1.03e-12 ***
body_mass_g  0.0012398  0.0001727   7.177 7.10e-13 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 461.61  on 332  degrees of freedom
Residual deviance: 396.64  on 331  degrees of freedom
  (11 observations deleted due to missingness)
AIC: 400.64

Number of Fisher Scoring iterations: 4

Okay, so that works and I have now understood the ifelse code. This tells R to look at the column in question and assign it a categorical value. So if you want to separate the penguins data by the mean body mass, you can use ifelse(body_mass_g > mean(body_mass_g, na.rm = TRUE), 1, 0): with >, values higher than the mean become 1 and the rest 0 (my first draft used <, which would code it the other way round).
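A quick sketch of that idea on the penguins data (my own example; heavy is a made-up column name):

#1 = heavier than the mean body mass, 0 = lighter (NAs stay NA)
penguins %>%
  mutate(heavy = ifelse(body_mass_g > mean(body_mass_g, na.rm = TRUE), 1, 0)) %>%
  count(heavy)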

Above we have only used one predictor, this next one will use a second predictor:

m2 <- glm(prop~body_mass_g + species, family = "binomial", data=penguins2)
summary(m2)

Call:
glm(formula = prop ~ body_mass_g + species, family = "binomial", 
    data = penguins2)

Coefficients:
                   Estimate Std. Error z value Pr(>|z|)    
(Intercept)      -2.713e+01  2.998e+00  -9.049   <2e-16 ***
body_mass_g       7.373e-03  8.141e-04   9.056   <2e-16 ***
speciesChinstrap -2.559e-01  4.293e-01  -0.596    0.551    
speciesGentoo    -1.018e+01  1.195e+00  -8.520   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 461.61  on 332  degrees of freedom
Residual deviance: 212.09  on 329  degrees of freedom
  (11 observations deleted due to missingness)
AIC: 220.09

Number of Fisher Scoring iterations: 6

The above shows the additive version of logistic models; we can also have multiplicative models:

This is the important one to look at:

m3 <- glm(prop~body_mass_g+species+body_mass_g*species, family = "binomial", data = penguins2)
summary(m3)

Call:
glm(formula = prop ~ body_mass_g + species + body_mass_g * species, 
    family = "binomial", data = penguins2)

Coefficients:
                               Estimate Std. Error z value Pr(>|z|)    
(Intercept)                  -28.749639   4.819688  -5.965 2.45e-09 ***
body_mass_g                    0.007814   0.001312   5.958 2.56e-09 ***
speciesChinstrap              12.425763   6.489305   1.915   0.0555 .  
speciesGentoo                -26.283727  12.608883  -2.085   0.0371 *  
body_mass_g:speciesChinstrap  -0.003427   0.001757  -1.951   0.0511 .  
body_mass_g:speciesGentoo      0.003074   0.002657   1.157   0.2474    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 461.61  on 332  degrees of freedom
Residual deviance: 203.76  on 327  degrees of freedom
  (11 observations deleted due to missingness)
AIC: 215.76

Number of Fisher Scoring iterations: 7

The output of the additive model shows only 2 of the 3 species because R takes the first level of the data (alphabetically or not, here Adelie) as the reference. The negative Gentoo coefficient is saying that, at a given body mass, a penguin is less likely to be male if it is a Gentoo rather than an Adelie. That isn't really useful here, as we aren't looking for the difference between males and females but for differences in the inclinations (slopes) between species, i.e. how well body mass predicts sex across the species, as in the second graph made below.

Again, in the interaction model above, 2 of the 3 species are shown due to Adelie being used first.

The bottom 2 rows of the output are the important ones, as they are the effect of the interaction, comparing whether there is a joint effect between body mass and species. The model compares the coefficients of each combination of the interaction: first the Chinstrap slope against the Adelie slope, then the Gentoo slope against the Adelie slope, asking whether these two differ from each other. We find an almost significant effect for Chinstrap, which slightly differs from the other species.

We can see this from the graph below, as the slope is shallower for Chinstrap; this probably means the inclination for Chinstrap is smaller, i.e. body mass is a weaker predictor of sex for that species.

Visualising the additive effects:

This will help to show the difference between the additive and multiplicative:

The first shows a comparison using just the body mass of the penguins; the second takes species into account and gives far different results from the first. It shows how sex can be predicted much more easily when the data are split by species as well as body mass, due to the steepness of the curves.

penguins2 %>%
  ggplot(aes(body_mass_g, prop))+
  geom_jitter(height = 0.1)+
  stat_smooth(method="glm", se=FALSE, fullrange = TRUE, method.args = list(family=binomial))
`geom_smooth()` using formula = 'y ~ x'
Warning: Removed 11 rows containing non-finite outside the scale range
(`stat_smooth()`).
Warning: Removed 11 rows containing missing values or values outside the scale range
(`geom_point()`).

penguins2 %>%
  ggplot(aes(body_mass_g, prop, colour=species))+
  geom_jitter(height = 0.1)+
  stat_smooth(method="glm", se=FALSE, fullrange = TRUE, method.args = list(family=binomial))
`geom_smooth()` using formula = 'y ~ x'
Warning: Removed 11 rows containing non-finite outside the scale range
(`stat_smooth()`).
Warning: Removed 11 rows containing missing values or values outside the scale range
(`geom_point()`).

The first of these shows just the effect of mass on sex, without the species!

Now we are going to test the species rather than the body mass:

m4 <- glm(prop~species, family="binomial", data=penguins2)
summary(m4)

Call:
glm(formula = prop ~ species, family = "binomial", data = penguins2)

Coefficients:
                  Estimate Std. Error z value Pr(>|z|)
(Intercept)      4.384e-16  1.655e-01   0.000    1.000
speciesChinstrap 1.026e-15  2.936e-01   0.000    1.000
speciesGentoo    5.043e-02  2.470e-01   0.204    0.838

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 461.61  on 332  degrees of freedom
Residual deviance: 461.56  on 330  degrees of freedom
  (11 observations deleted due to missingness)
AIC: 467.56

Number of Fisher Scoring iterations: 3

This shows that sex is not influenced by species in this model, given the p-values we got.

Presenting the data a little better
penguins2 %>%
  ggplot(aes(body_mass_g, prop))+
  geom_jitter(height = 0.1)+
  stat_smooth(method="glm", se=FALSE, fullrange = TRUE, method.args = list(family=binomial))+
  facet_wrap(.~species)
`geom_smooth()` using formula = 'y ~ x'
Warning: Removed 11 rows containing non-finite outside the scale range
(`stat_smooth()`).
Warning: Removed 11 rows containing missing values or values outside the scale range
(`geom_point()`).

As our test above showed, there is more overlap for Chinstrap compared with the other 2 species.

penguins2 %>%
  na.omit %>%
  ggplot(aes(species, body_mass_g, colour = sex))+
  geom_boxplot()

penguins2 %>%
  na.omit %>%
  ggplot(aes(species, body_mass_g, colour = sex))+
  geom_jitter()

To test this we can use this sort of code to conclude:

summary(lm(body_mass_g~species+sex+sex*species, data=penguins2))

Call:
lm(formula = body_mass_g ~ species + sex + sex * species, data = penguins2)

Residuals:
    Min      1Q  Median      3Q     Max 
-827.21 -213.97   11.03  206.51  861.03 

Coefficients:
                         Estimate Std. Error t value Pr(>|t|)    
(Intercept)               3368.84      36.21  93.030  < 2e-16 ***
speciesChinstrap           158.37      64.24   2.465  0.01420 *  
speciesGentoo             1310.91      54.42  24.088  < 2e-16 ***
sexmale                    674.66      51.21  13.174  < 2e-16 ***
speciesChinstrap:sexmale  -262.89      90.85  -2.894  0.00406 ** 
speciesGentoo:sexmale      130.44      76.44   1.706  0.08886 .  
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 309.4 on 327 degrees of freedom
  (11 observations deleted due to missingness)
Multiple R-squared:  0.8546,    Adjusted R-squared:  0.8524 
F-statistic: 384.3 on 5 and 327 DF,  p-value: < 2.2e-16

This gives us significance that the male-female difference is smallest for Chinstrap compared with Adelie. In this case it is a 2-way ANOVA.

The important part is testing the effect of one term against another using term1 + term2 + term1*term2.

This shows how term 1 influences what you are testing, while controlling for term 2.

Two-way ANOVA test:

anova(lm(body_mass_g~species+sex+sex*species, data=penguins2))
Analysis of Variance Table

Response: body_mass_g
             Df    Sum Sq  Mean Sq F value    Pr(>F)    
species       2 145190219 72595110 758.358 < 2.2e-16 ***
sex           1  37090262 37090262 387.460 < 2.2e-16 ***
species:sex   2   1676557   838278   8.757 0.0001973 ***
Residuals   327  31302628    95727                      
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

This is 2-way as it tests species, sex and the interaction between species and sex. It is important to know that if the interaction between 2 variables is significant you can almost forget the other two terms, as I assume they would also be significant. They can still be reported on, but the significance of the interaction is the more important result.

Looking at interactions of variables and their significance is the most important part of the logistic regression.

Week 11

Pre session

Polynomial regression

Quadratic Models

Polynomial regression is a special case of linear regression where the relationship between x and y is modelled using a polynomial rather than a straight line, for when x and y have a non-linear relationship.

Poisson Distribution

Looking at the Bike share data:

glimpse(Bikeshare)
Rows: 8,645
Columns: 15
$ season     <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,…
$ mnth       <fct> Jan, Jan, Jan, Jan, Jan, Jan, Jan, Jan, Jan, Jan, Jan, Jan,…
$ day        <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,…
$ hr         <fct> 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 1…
$ holiday    <dbl> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,…
$ weekday    <dbl> 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6,…
$ workingday <dbl> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,…
$ weathersit <fct> clear, clear, clear, clear, clear, cloudy/misty, clear, cle…
$ temp       <dbl> 0.24, 0.22, 0.22, 0.24, 0.24, 0.24, 0.22, 0.20, 0.24, 0.32,…
$ atemp      <dbl> 0.2879, 0.2727, 0.2727, 0.2879, 0.2879, 0.2576, 0.2727, 0.2…
$ hum        <dbl> 0.81, 0.80, 0.80, 0.75, 0.75, 0.75, 0.80, 0.86, 0.75, 0.76,…
$ windspeed  <dbl> 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0896, 0.0000, 0.0…
$ casual     <dbl> 3, 8, 5, 3, 0, 0, 2, 1, 1, 8, 12, 26, 29, 47, 35, 40, 41, 1…
$ registered <dbl> 13, 32, 27, 10, 1, 1, 0, 2, 7, 6, 24, 30, 55, 47, 71, 70, 5…
$ bikers     <dbl> 16, 40, 32, 13, 1, 1, 2, 3, 8, 14, 36, 56, 84, 94, 106, 110…

We are looking at the number of bikers as the response variable. The linking variables are temp, work day, and weather. This data looks at bike usage every hour.

Looking at the data:

ggplot(Bikeshare, aes(y= bikers,
                      x= temp)) +
  geom_jitter(alpha = 0.2) + theme_minimal() + facet_wrap(~weathersit)

ggplot(Bikeshare, aes(y= bikers,
                      x= temp)) +
  geom_jitter(alpha = 0.2) + theme_minimal() + facet_wrap(~workingday)

Looking at these graphs it is clear that a linear regression will not work, as the line would most likely predict negative numbers of bike rentals; obviously in the real world this couldn't happen, so a Poisson regression model is best.

We should also note the scale of the data being used: here temp only goes from 0 to 1, which is not a real-world temperature for the whole year. Looking at the data description, this variable is the normalised temperature for each reading, calculated using a formula.

Poisson regression is used to model counts, for instance the number of bike rentals. The bike share model has one quantitative variable, temp, and 2 categorical ones, working day and weather. Working day is simple, as it can only be 1 = working day or 0 = not a working day; we effectively get 2 fitted models, one for working days and one for non-working days. Weather is less simple: weathersit has several levels, so R creates 3 dummy variables (cloudy/misty, light rain/snow, heavy rain/snow), each coded as 1 = true, 0 = false. If all three are 0 the weather was clear; if one is 1, that was the weather.

Overall our model will need 6 coefficients (intercept, temp, working day and the 3 weather dummies).

bike.m1 <- glm(bikers ~ temp + workingday+ weathersit, data = Bikeshare, family = "poisson")
summary(bike.m1)

Call:
glm(formula = bikers ~ temp + workingday + weathersit, family = "poisson", 
    data = Bikeshare)

Coefficients:
                           Estimate Std. Error  z value Pr(>|z|)    
(Intercept)                3.885010   0.003231 1202.490  < 2e-16 ***
temp                       2.129054   0.004789  444.587  < 2e-16 ***
workingday                -0.008888   0.001943   -4.574 4.78e-06 ***
weathersitcloudy/misty    -0.042219   0.002132  -19.805  < 2e-16 ***
weathersitlight rain/snow -0.432090   0.004023 -107.412  < 2e-16 ***
weathersitheavy rain/snow -0.760995   0.166681   -4.566 4.98e-06 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 1052921  on 8644  degrees of freedom
Residual deviance:  816754  on 8639  degrees of freedom
AIC: 869804

Number of Fisher Scoring iterations: 5

The coefficient table shows the intercept and the coefficients of the variables.

Creating nice tables in R

bike.m1 %>%
  broom::tidy() %>%
  gt::gt()
term                        estimate      std.error    statistic     p.value
(Intercept)                  3.885009924  0.003230805  1202.489746   0.000000e+00
temp                         2.129054163  0.004788834   444.587172   0.000000e+00
workingday                  -0.008888259  0.001943032    -4.574427   4.775243e-06
weathersitcloudy/misty      -0.042218993  0.002131712   -19.805205   2.684687e-87
weathersitlight rain/snow   -0.432089576  0.004022716  -107.412409   0.000000e+00
weathersitheavy rain/snow   -0.760994643  0.166681086    -4.565573   4.981322e-06

lambda = e^(3.89) × e^(2.13·x1) × e^(-0.01·x2) × e^(-0.04·x3) × e^(-0.43·x4) × e^(-0.76·x5)

When all of the x variables are zero (clear weather, non-working day, temp of 0), the equation reduces to e to the power of 3.89, the intercept from the table above, which gives about 48.9 expected rentals per hour.

From the table we can also see that for each additional unit of temp, the expected number of bikers is multiplied by e to the power of 2.13. This is a large factor, but our temp scale runs from 0 to 1, from the very minimum temperature to the very maximum.

On a working day, the expected number of bikers changes by a factor of e to the power of -0.01, i.e. about a 1% decrease. The p-value is really small here, showing working day is a significant explainer of the number of bikes rented, but its practical importance is limited: it changes the number of bikes rented by only about 1%.

Similarly, cloudy or misty weather multiplies the expected count by about 0.96 (a roughly 4% drop), light rain or snow by about 0.65, and heavy rain or snow by about 0.47, but that last one isn't concrete as we only had 1 reading in that weather condition.
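To put numbers on these multiplicative effects, we can exponentiate the Poisson coefficients from the bike.m1 object fitted above:

#multiplicative effect of each term on the expected hourly count
exp(coef(bike.m1))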

Visualising the model:
ggplot(Bikeshare, aes(y= bikers,
                      x= temp,
                      colour = weathersit)) +
  geom_smooth(method = "glm", se = FALSE, method.args = list(family = "poisson")) +
  geom_jitter(alpha = 0.2) + theme_minimal() + facet_wrap(~workingday)
`geom_smooth()` using formula = 'y ~ x'

This model shows us that as temp increases, on both working and non-working days, the number of bike rentals increases. Colour has been added for the weather type as well.

Session

DA

Non-Linear Models

In algebra a model is y = f(x).

f is a function, so it makes x into whatever you need it to be. As we know, for a line all you need is y = mx + c. Non-linear models follow the same idea and look something like this:

y = a + B1·x1 + B2·x2 + … + Bn·xn

It goes on for the number of variables.

Units of observations are important for these models.

For example with the penguins data: if the data are collected by camera traps that only record when 5 individuals are detected, then each data collection has n = 5 units, and from those 5 the counts of males and females are recorded.

With the penguins data, looking at the probability that a penguin is male, we get the function y = -5.16 (c) + 0.00124 (m) × body mass, where y is on the log-odds scale.

For non-linear data we need to change the family function in the code line:

glm(formula=sex~body_mass_g, family = ????, data = penguins)

When looking at the distribution family: if month of the year were on the x-axis and number of observations on the y, you might expect higher counts in the summer months and lower in the winter, creating an n-shaped graph. If we fitted a straight line we could expect a high p-value, wrongly suggesting month is not significant; we need to see the spread of the data first, as a straight line would not work here. (For a binary outcome like sex, the family would be binomial.)

For GLMs:

  • the range of y is restricted (e.g. counts, proportions, binary outcomes, durations)

  • effects are not additive (interactions are present)

  • the variance depends on the mean (the larger the mean, the larger the variance).

GLM specify a non-linear link function and variance function to allow for such things, while maintaining the simple interpretation of linear models.

One of the first things we need to check with the data is the shape of the correlation. This ensures the correct analysis is used on the data.

Code to show how to plot a curved line of best fit using AI help!!!

penguins %>%
  ggplot(aes(x = bill_length_mm, y = bill_depth_mm)) +
  geom_point() +
  geom_smooth(method = "loess", se = FALSE) +
  labs(x = "Bill length (mm)",
       y = "Bill Depth (mm)")
`geom_smooth()` using formula = 'y ~ x'
Warning: Removed 2 rows containing non-finite outside the scale range
(`stat_smooth()`).
Warning: Removed 2 rows containing missing values or values outside the scale range
(`geom_point()`).

Poisson Distribution

This distribution is used to model count data, where our response variable takes nonnegative integer values: Y ~ Poi(lambda) with lambda > 0.

You cannot fit a normal distribution to a Poisson distribution. Even if it looks normal on a histogram, a Shapiro-Wilk test will tell you otherwise!

If you find it to be poisson distributed then the family function used here is poisson:

glm(response~explanatory, family=poisson, data)

Overdispersion

This can be modelled in R. Overdispersion occurs when the observed variance in the data is greater than what the model expects; this commonly happens with Poisson models. It can be combated in the glm by using the quasipoisson family, which estimates an extra dispersion parameter to account for the overdispersion. The coefficient estimates stay the same as with the poisson family, but the standard errors are scaled up to reflect the extra variance, so the test statistics shrink and the p-values grow.
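As a minimal sketch, mirroring the placeholder call above (response, explanatory and mydata are placeholders, not a real data set):

#same model as before, but the dispersion parameter is estimated from the data
glm(response ~ explanatory, family = quasipoisson, data = mydata)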

This session showed that not all data are normally distributed and that we need to check the data before performing stats, as R will fit anything you ask it to. We have to fit the right test to the data we are looking at. I'm guessing this will be crucial for our summative data!

Post session

Polynomial models

1) Create a fictional dataframe or tibble using a quadratic function. Think of a question: imagine what kind of data would behave like a quadratic function.

2) make a graph using ‘ggplot’ to visualise the data

3) Fit a quadratic model

Section 1:

Quadratic formula:

y = ax² + bx + c

Here:

  • a = 3

  • b = 5

  • c = -6

#creating data for a quadratic formula:
x <- seq(-15, 25, by = 1)
y <- 3 * x^2 + 5 * x - 6

#creating the data frame and tibble for this:
Quad <- data.frame(x = x, y = y)
quad <- as_tibble(Quad)

# Summary of the data:
summary(quad)
       x             y       
 Min.   :-15   Min.   :  -8  
 1st Qu.: -5   1st Qu.:  72  
 Median :  5   Median : 302  
 Mean   :  5   Mean   : 514  
 3rd Qu.: 15   3rd Qu.: 744  
 Max.   : 25   Max.   :1994  

Section 2:

#visualising the data:
quad %>%
ggplot(aes(x, y)) +
  geom_smooth(se = FALSE, colour = "blue", alpha = 0.2) +
  geom_point(colour = "darkred", size = 2) +
  labs(
    x = "X-Values",
    y = "Y-Values"
  ) +
  theme_minimal()
`geom_smooth()` using method = 'loess' and formula = 'y ~ x'

This sort of data could show… (the AI helped me here with business profits, saying the more produce you have to sell (x), the more profit you would make). I can't seem to think of much else that would work; a negative quadratic on a scale from 0-100 could roughly show the ages people live to, but I don't know.

Section 3:

# Creating an x squared column in the data:
quad %>%
  mutate(x_squared = x^2) -> quad2

#adding the x_squared to the lm()

quad.lm <- lm(y ~ x + x_squared, data = quad2)

summary(quad.lm)
Warning in summary.lm(quad.lm): essentially perfect fit: summary may be
unreliable

Call:
lm(formula = y ~ x + x_squared, data = quad2)

Residuals:
       Min         1Q     Median         3Q        Max 
-3.436e-13 -1.959e-14  3.000e-15  3.511e-14  1.389e-13 

Coefficients:
              Estimate Std. Error    t value Pr(>|t|)    
(Intercept) -6.000e+00  1.729e-14 -3.470e+14   <2e-16 ***
x            5.000e+00  1.414e-15  3.536e+15   <2e-16 ***
x_squared    3.000e+00  9.716e-17  3.088e+16   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 7.784e-14 on 38 degrees of freedom
Multiple R-squared:      1, Adjusted R-squared:      1 
F-statistic: 1.057e+33 on 2 and 38 DF,  p-value: < 2.2e-16

Here we can see that this lm() worked, as the estimated coefficients match the values I made up in the original formula, and as the p-values are less than 0.05 we can reject the null hypothesis and be confident in saying that both x and x squared are strong influencers of the y values. (The 'essentially perfect fit' warning is expected: the data were generated straight from the formula with no noise added.)
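As an aside (my own note), the same quadratic fit can be had without creating the helper column, by using I() inside the formula:

#I() lets us square x inside the formula itself
quad.lm2 <- lm(y ~ x + I(x^2), data = quad)
coef(quad.lm2)  #should recover c = -6, b = 5, a = 3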

Poisson Models

1) Load the data  ‘RecreationDemand’ of the package AER

2) Have a look at the data description using the “help” in Rstudio

3) What is the best model to predict the number of boat trips using a poisson model? Compare different models that you create.

4) Check for overdispersion of models

5) Draw a conclusion

Section 1:

The package has been imported into R studio in the Loading packages chunk.

Section 2:

Looking at the data using:

?RecreationDemand

This data looks at the number of recreational boating trips on Lake Somerville in Texas

659 observations were taken over 8 variables:

  • trips - number of recreational boating trips

  • quality - facility’s subjective quality ranking on a scale of 1 to 5

  • ski - factor, was the boat user water-skiing?

  • income - annual household income of the respondent in 1,000 USD

  • userfee - factor, did the individual pay an annual user fee at Lake Somerville

  • costC - expenditure when visiting Lake Conroe

  • costS - expenditure when visiting Lake Somerville

  • costH - expenditure when visiting Lake Houston.

Section 3:

Simple Poisson Model:

data("RecreationDemand")
glimpse(RecreationDemand)
Rows: 659
Columns: 8
$ trips   <dbl> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,…
$ quality <dbl> 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,…
$ ski     <fct> yes, no, yes, no, yes, yes, no, yes, no, no, no, yes, no, no, …
$ income  <dbl> 4, 9, 5, 2, 3, 5, 1, 5, 2, 3, 2, 2, 2, 1, 4, 5, 3, 1, 3, 5, 6,…
$ userfee <fct> no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no…
$ costC   <dbl> 67.59, 68.86, 58.12, 15.79, 24.02, 129.46, 30.13, 31.29, 127.6…
$ costS   <dbl> 68.620, 70.936, 59.465, 13.750, 34.033, 137.377, 42.450, 36.79…
$ costH   <dbl> 76.800, 84.780, 72.110, 23.680, 34.547, 137.850, 44.100, 24.80…

trips ~ quality

boat1 <- glm(trips ~ quality, family = poisson, data = RecreationDemand)

summary(boat1)

Call:
glm(formula = trips ~ quality, family = poisson, data = RecreationDemand)

Coefficients:
            Estimate Std. Error z value Pr(>|z|)    
(Intercept) -0.51378    0.05751  -8.933   <2e-16 ***
quality      0.54696    0.01518  36.022   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 3294.4  on 657  degrees of freedom
AIC: 4051.5

Number of Fisher Scoring iterations: 7

This shows that the quality of the lake has an influence on the number of trips taken.

checking for overdispersion:

#pearson residuals
boat1.residuals <- residuals(boat1, type = "pearson")

#dispersion statistic: sum of squared pearson residuals / degrees of freedom

b1.dispersion <- sum(boat1.residuals^2) / df.residual(boat1)

#printing the stats:
b1.dispersion
[1] 14.00975

This shows there is likely overdispersion, as the value (≈14) is far greater than 1. So we should use a different model: quasipoisson!

b1.quasi <- glm(trips ~ quality, family = quasipoisson, data = RecreationDemand)

summary(b1.quasi)

Call:
glm(formula = trips ~ quality, family = quasipoisson, data = RecreationDemand)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.51378    0.21527  -2.387   0.0173 *  
quality      0.54696    0.05683   9.624   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for quasipoisson family taken to be 14.00976)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 3294.4  on 657  degrees of freedom
AIC: NA

Number of Fisher Scoring iterations: 7

trips ~ income

boat2 <- glm(trips ~ income, family = poisson, data = RecreationDemand)

summary(boat2)

Call:
glm(formula = trips ~ income, family = poisson, data = RecreationDemand)

Coefficients:
            Estimate Std. Error z value Pr(>|z|)    
(Intercept)  1.17740    0.06063  19.419  < 2e-16 ***
income      -0.09994    0.01547  -6.462 1.03e-10 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 4805.2  on 657  degrees of freedom
AIC: 5562.2

Number of Fisher Scoring iterations: 7

trips ~ income + quality

boat3 <- glm(trips ~ income + quality, family = poisson, data = RecreationDemand)

summary(boat3)

Call:
glm(formula = trips ~ income + quality, family = poisson, data = RecreationDemand)

Coefficients:
            Estimate Std. Error z value Pr(>|z|)    
(Intercept)  0.00864    0.08176   0.106    0.916    
income      -0.14593    0.01736  -8.404   <2e-16 ***
quality      0.55299    0.01519  36.408   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 3217.6  on 656  degrees of freedom
AIC: 3976.6

Number of Fisher Scoring iterations: 7

checking for overdispersion:

b3.residuals <- residuals(boat3, type = "pearson")

b3.dispersion <- sum(b3.residuals^2)/df.residual(boat3)

b3.dispersion
[1] 12.42427

Again high, so we would use quasipoisson here too!

trips ~ income + quality + their interaction

boat4 <- glm(trips ~ income + quality + income * quality, family = poisson, data = RecreationDemand)

summary(boat4)

Call:
glm(formula = trips ~ income + quality + income * quality, family = poisson, 
    data = RecreationDemand)

Coefficients:
                Estimate Std. Error z value Pr(>|z|)    
(Intercept)     0.095825   0.130858   0.732    0.464    
income         -0.171935   0.035460  -4.849 1.24e-06 ***
quality         0.524922   0.036306  14.458  < 2e-16 ***
income:quality  0.008202   0.009681   0.847    0.397    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 3216.8  on 655  degrees of freedom
AIC: 3977.9

Number of Fisher Scoring iterations: 7

Checking for overdispersion:

b4.residuals <- residuals(boat4, type = "pearson")

b4.dispersion <- sum(b4.residuals^2) / df.residual(boat4)

b4.dispersion
[1] 12.41296

The value is high again so I will use quasipoisson:

b4.quasi <- glm(trips ~ income + quality + income * quality, family = quasipoisson, data = RecreationDemand)

summary(b4.quasi)

Call:
glm(formula = trips ~ income + quality + income * quality, family = quasipoisson, 
    data = RecreationDemand)

Coefficients:
                Estimate Std. Error t value Pr(>|t|)    
(Intercept)     0.095825   0.461045   0.208    0.835    
income         -0.171935   0.124935  -1.376    0.169    
quality         0.524922   0.127913   4.104 4.58e-05 ***
income:quality  0.008202   0.034109   0.240    0.810    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for quasipoisson family taken to be 12.41321)

    Null deviance: 4849.7  on 658  degrees of freedom
Residual deviance: 3216.8  on 655  degrees of freedom
AIC: NA

Number of Fisher Scoring iterations: 7

This shows that the interaction between quality and income does not influence the number of boat trips; indeed, once overdispersion is accounted for, quality is the only predictor that remains significant.

Week 12

Pre-session

Basic principles of principal component analysis

Let's say we have a group of cars. They may look the same but they may come from different groups, say 3 of them; from the outset we can't observe the differences, so we grab some data on each one, like length, height, width etc.

If we plot 2 of the variables against each other we might find a correlation.

With 3 variables, we can compare the 3rd variable with 1 and 2 to see if there are any similar correlations. This can be plotted in 3D.

But with 4 or more variables we cannot do this, so we draw a principal component analysis (PCA) plot instead. This converts the correlations (or lack of them) among all the cars into a 2D graph; cars that are highly correlated cluster together, and the clusters in the graph help us see the differences between the groups.

The axes are ranked in order of importance: differences along the first principal component axis (the x-axis) are more important than differences along the second principal component axis (the y-axis).
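As a hedged sketch of what this looks like in R (my own addition, using the iris measurements from the session below in place of the car data):

#PCA on the four iris measurements; scaling gives each variable equal weight
pca <- prcomp(iris[, 1:4], scale. = TRUE)
summary(pca)        #variance explained by each principal component
head(pca$x[, 1:2])  #coordinates of each flower on PC1 and PC2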

Session

Multivariate Analysis

This method helps uncover patterns, trends and dependencies in the data, aiding in finding groupings and any links between factors. It also helps reduce the dimensions of the data so it can be drawn on a 2D graph (these plots are often drawn with the two axes crossing at the origin).

This method is important as you can test multiple variables. E.g. a univariate example can look at level of education against life satisfaction; in multivariate analysis we can look at level of education against life satisfaction and job satisfaction etc.

Position on the graph is important: groups of data that sit closer together show similar results compared with other groups, hinting at links between variables and the relationships between them.

Cluster dendrograms are useful for having a first look at the data, though they may not be the best option in all cases.

Cluster analysis:

Using the iris data about the iris plant, with its three species, we are going to look at petal length and width:

iris %>%
  ggplot(aes(x=Petal.Length, y=Petal.Width, colour = Species)) +
  geom_point()

We can see a correlation here: as the length increases, the width also increases.

clustering the data

set.seed(55)
cluster.iris <- kmeans(iris[, 3:4], 3, nstart = 20)
cluster.iris
K-means clustering with 3 clusters of sizes 50, 52, 48

Cluster means:
  Petal.Length Petal.Width
1     1.462000    0.246000
2     4.269231    1.342308
3     5.595833    2.037500

Clustering vector:
  [1] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 [38] 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
 [75] 2 2 2 3 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 2 3 3 3 3
[112] 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3
[149] 3 3

Within cluster sum of squares by cluster:
[1]  2.02200 13.05769 16.29167
 (between_SS / total_SS =  94.3 %)

Available components:

[1] "cluster"      "centers"      "totss"        "withinss"     "tot.withinss"
[6] "betweenss"    "size"         "iter"         "ifault"      

We can use the result of this clustering to further explore the data; for example a boxplot can be drawn from the cluster means.

Here we need to create a table to see which species are in which group:

table(cluster.iris$cluster, iris$Species)
   
    setosa versicolor virginica
  1     50          0         0
  2      0         48         4
  3      0          2        46

Distance Methods

Euclidean:

This computes the pairwise Euclidean distances, creating the lower-triangle distance matrix:

dist(iris[, 1:4])
[dist() output truncated: it prints a lower-triangular matrix of the pairwise Euclidean distances between all 150 iris observations, far too large to show in full.]
121 5.1215232 5.1903757 5.3488316 5.2297227 5.1643005 4.7275787 5.2744668
122 4.0286474 4.0024992 4.1436699 3.9862263 4.0607881 3.7483330 4.0620192
123 6.2112801 6.2617889 6.4467046 6.3229740 6.2657801 5.8360946 6.3992187
124 4.1097445 4.1060930 4.2813549 4.1436699 4.1605288 3.8013156 4.2284749
125 4.9699095 5.0428167 5.1942276 5.0695167 5.0079936 4.5760245 5.1137071
126 5.3122500 5.3898052 5.5587768 5.4387499 5.3591044 4.9173163 5.4963624
127 3.9774364 3.9812058 4.1496988 4.0124805 4.0249224 3.6633318 4.0902323
128 4.0074930 4.0311289 4.1856899 4.0472213 4.0472213 3.6742346 4.1121770
129 4.8404545 4.8518038 5.0149776 4.8733972 4.8836462 4.5066617 4.9477268
130 5.0970580 5.1584882 5.3385391 5.2172790 5.1497573 4.7222876 5.2886671
131 5.5461698 5.5919585 5.7775427 5.6550862 5.6017854 5.1788030 5.7314920
132 6.0141500 6.1546730 6.3126856 6.2153037 6.0572271 5.5596762 6.2401923
133 4.8805737 4.8918299 5.0537115 4.9132474 4.9234135 4.5453273 4.9849774
134 4.1605288 4.1689327 4.3416587 4.1988094 4.2083251 3.8457769 4.2871902
135 4.5705580 4.5475268 4.7169906 4.5552168 4.6141088 4.2883563 4.6626173
136 5.7887823 5.8600341 6.0406953 5.9321160 5.8438001 5.3916602 5.9883220
137 4.8918299 4.9598387 5.0921508 4.9628621 4.9203658 4.5022217 4.9939964
138 4.6065171 4.6508064 4.8062459 4.6690470 4.6454279 4.2473521 4.7318073
139 3.8961519 3.9153544 4.0669399 3.9268308 3.9344631 3.5693137 3.9912404
140 4.7968740 4.8600412 5.0269275 4.9101935 4.8445846 4.4124823 4.9618545
141 5.0199602 5.0724747 5.2287666 5.1048996 5.0616203 4.6411206 5.1526692
142 4.6368092 4.7021272 4.8682646 4.7602521 4.6861498 4.2497059 4.8031240
143 4.2083251 4.1809090 4.3347434 4.1773197 4.2461747 3.9255573 4.2638011
144 5.2573758 5.3207142 5.4753995 5.3497664 5.2971691 4.8682646 5.3972215
145 5.1361464 5.2067264 5.3535035 5.2325902 5.1730069 4.7391982 5.2678269
146 4.6540305 4.7000000 4.8641546 4.7455242 4.7010637 4.2848571 4.7968740
147 4.2766810 4.2497059 4.4305756 4.2883563 4.3301270 3.9887341 4.3840620
148 4.4598206 4.4988888 4.6615448 4.5332108 4.5044423 4.1024383 4.5934736
149 4.6508064 4.7180504 4.8487112 4.7191101 4.6786750 4.2649736 4.7497368
150 4.1400483 4.1533119 4.2988371 4.1496988 4.1737274 3.8183766 4.2178193
            8         9        10        11        12        13        14
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9   0.7874008                                                            
10  0.3316625 0.5567764                                                  
11  0.5000000 1.2845233 0.7874008                                        
12  0.2236068 0.6708204 0.3464102 0.6782330                              
13  0.4690416 0.4242641 0.1732051 0.9327379 0.4582576                    
14  0.9055385 0.3464102 0.7280110 1.3674794 0.8185353 0.5830952          
15  1.0440307 1.7916473 1.3114877 0.5830952 1.2328828 1.4317821 1.8083141
16  1.2369317 1.9974984 1.5556349 0.7874008 1.3638182 1.6941074 2.0420578
17  0.7000000 1.4317821 1.0099505 0.3464102 0.8602325 1.1269428 1.4662878
18  0.2000000 0.9273618 0.5000000 0.3872983 0.3872983 0.6164414 1.0099505
19  0.8366600 1.6124515 1.1000000 0.3872983 0.9949874 1.2569805 1.7320508
20  0.4242641 1.1489125 0.7549834 0.3316625 0.5196152 0.8831761 1.2165525
21  0.4472136 1.1575837 0.6244998 0.3605551 0.6082763 0.7874008 1.3190906
22  0.3741657 1.0862780 0.7000000 0.3605551 0.4795832 0.8246211 1.1747340
23  0.6708204 0.8306624 0.7745967 0.9486833 0.6633250 0.7549834 0.6855655
24  0.3872983 0.9110434 0.5291503 0.6164414 0.4472136 0.6557439 1.1180340
25  0.4472136 0.8124038 0.5196152 0.7810250 0.3000000 0.6480741 1.0295630
26  0.4123106 0.6403124 0.2000000 0.8124038 0.4472136 0.3000000 0.8660254
27  0.2236068 0.8306624 0.4472136 0.5477226 0.2828427 0.5744563 0.9949874
28  0.2236068 1.0049876 0.5099020 0.2828427 0.4242641 0.6557439 1.1090537
29  0.2236068 0.9433981 0.4472136 0.3741657 0.4472136 0.5744563 1.0344080
30  0.3741657 0.4690416 0.2645751 0.8660254 0.2236068 0.3162278 0.6782330
31  0.3741657 0.4898979 0.1732051 0.8544004 0.3000000 0.2449490 0.7211103
32  0.4472136 1.1401754 0.6557439 0.3605551 0.6403124 0.7874008 1.2727922
33  0.7348469 1.4491377 1.0440307 0.4582576 0.8185353 1.1747340 1.4764823
34  0.9486833 1.7029386 1.2609520 0.5196152 1.0816654 1.3928388 1.7262677
35  0.3162278 0.5477226 0.1000000 0.7810250 0.3316625 0.2000000 0.7348469
36  0.3605551 0.7000000 0.3464102 0.7071068 0.4898979 0.3605551 0.7416198
37  0.5477226 1.2569805 0.7549834 0.3000000 0.7681146 0.8717798 1.3190906
38  0.2645751 0.8660254 0.5099020 0.5291503 0.3162278 0.6082763 0.9000000
39  0.7483315 0.1414214 0.5567764 1.2369317 0.6403124 0.4242641 0.2449490
40  0.1000000 0.8660254 0.3741657 0.4242641 0.3162278 0.5196152 0.9848858
41  0.2449490 0.8602325 0.5000000 0.5000000 0.3872983 0.5830952 0.9055385
42  1.2288206 0.6244998 0.9380832 1.6792856 1.1832160 0.7937254 0.7810250
43  0.6633250 0.3162278 0.5567764 1.1357817 0.5385165 0.4690416 0.3162278
44  0.4242641 0.9591663 0.6557439 0.6082763 0.4582576 0.7615773 1.1135529
45  0.6082763 1.2609520 0.8831761 0.5477226 0.6164414 1.0344080 1.4177447
46  0.4690416 0.4242641 0.2645751 0.9327379 0.4582576 0.2000000 0.6164414
47  0.4242641 1.1575837 0.7416198 0.3316625 0.5000000 0.8831761 1.2409674
48  0.4582576 0.3605551 0.3464102 0.9486833 0.3464102 0.3000000 0.4795832
49  0.4242641 1.2083046 0.7280110 0.1000000 0.5916080 0.8717798 1.2884099
50  0.1414214 0.7211103 0.2645751 0.5744563 0.3000000 0.3741657 0.8246211
51  3.9648455 4.3794977 4.0435133 3.8065733 3.9912404 4.1785165 4.6882833
52  3.5623026 3.9230090 3.6359318 3.4554305 3.5637059 3.7643060 4.2391037
53  4.1170378 4.4977772 4.1856899 3.9824616 4.1327957 4.3162484 4.8135226
54  2.9866369 3.0886890 2.9478806 3.0708305 2.9444864 3.0298515 3.4322005
55  3.7296112 4.0435133 3.7709415 3.6496575 3.7336309 3.8897301 4.3692105
56  3.3256578 3.5383612 3.3421550 3.3331667 3.2848135 3.4496377 3.8729833
57  3.7282704 4.0767634 3.8065733 3.6290495 3.7188708 3.9344631 4.3931765
58  2.2113344 2.1794495 2.1307276 2.4124676 2.1307276 2.1886069 2.5238859
59  3.6918830 4.0360872 3.7389838 3.5916570 3.7013511 3.8639358 4.3577517
60  2.7802878 2.8930952 2.7748874 2.8705400 2.7166155 2.8618176 3.2295511
61  2.5690465 2.4939928 2.4556058 2.7730849 2.5000000 2.5019992 2.8390139
62  3.1543621 3.4336569 3.2031235 3.1176915 3.1336879 3.3181320 3.7589892
63  3.0545049 3.2326460 3.0133038 3.0822070 3.0463092 3.1064449 3.5707142
64  3.6249138 3.9012818 3.6619667 3.5791060 3.6041643 3.7788887 4.2308392
65  2.4959968 2.7367864 2.5258662 2.5099801 2.4698178 2.6324893 3.0643107
66  3.5818989 3.9711459 3.6523965 3.4496377 3.6027767 3.7828561 4.2836900
67  3.3481338 3.5707142 3.3852622 3.3496268 3.3015148 3.4942810 3.9000000
68  2.9206164 3.1511903 2.9223278 2.9257478 2.8948230 3.0315013 3.4856850
69  3.6837481 3.8768544 3.6687873 3.6851052 3.6742346 3.7643060 4.2154478
70  2.7820855 2.9427878 2.7586228 2.8372522 2.7477263 2.8530685 3.2832910
71  3.7815341 4.0570926 3.8457769 3.7349699 3.7483330 3.9623226 4.3794977
72  3.0049958 3.2969683 3.0364453 2.9597297 3.0033315 3.1511903 3.6235342
73  3.9686270 4.2083251 3.9799497 3.9370039 3.9547440 4.0877867 4.5442271
74  3.5791060 3.8457769 3.6027767 3.5411862 3.5580894 3.7188708 4.1773197
75  3.3555923 3.6905284 3.4014703 3.2695565 3.3630343 3.5242020 4.0124805
76  3.5454196 3.9102430 3.6055513 3.4322005 3.5608988 3.7322915 4.2272923
77  3.9912404 4.3324358 4.0348482 3.8858718 4.0049969 4.1581246 4.6551047
78  4.1892720 4.5287967 4.2497059 4.0841156 4.1928511 4.3737855 4.8507731
79  3.4554305 3.7229021 3.4942810 3.4190642 3.4336569 3.6083237 4.0521599
80  2.4020824 2.6134269 2.3874673 2.4372115 2.3874673 2.4879711 2.9478806
81  2.7110883 2.8337255 2.6720778 2.7928480 2.6720778 2.7586228 3.1764760
82  2.5942244 2.7184554 2.5495098 2.6795522 2.5573424 2.6362853 3.0610456
83  2.8089144 3.0413813 2.8178006 2.8142495 2.7892651 2.9240383 3.3749074
84  4.0509258 4.2720019 4.0718546 4.0348482 4.0174619 4.1797129 4.6076024
85  3.3181320 3.5085610 3.3496268 3.3436507 3.2588341 3.4539832 3.8379682
86  3.4583233 3.7920970 3.5425979 3.3778692 3.4365681 3.6687873 4.1060930
87  3.8613469 4.2320208 3.9293765 3.7389838 3.8729833 4.0583248 4.5486262
88  3.5383612 3.7656341 3.5284558 3.5199432 3.5369478 3.6304270 4.1012193
89  2.9137605 3.1543621 2.9495762 2.9154759 2.8740216 3.0610456 3.4828150
90  2.9189039 3.0561414 2.9000000 2.9849623 2.8757608 2.9899833 3.3970576
91  3.2093613 3.3615473 3.1984371 3.2603681 3.1575307 3.2954514 3.7013511
92  3.5242020 3.8183766 3.5707142 3.4684290 3.5057096 3.6905284 4.1448764
93  2.9206164 3.1320920 2.9189039 2.9359837 2.8982753 3.0215890 3.4684290
94  2.2561028 2.2293497 2.1679483 2.4494897 2.1863211 2.2248595 2.5748786
95  3.0577770 3.2449961 3.0626786 3.0886890 3.0166206 3.1638584 3.5818989
96  2.9899833 3.2465366 3.0248967 2.9782545 2.9546573 3.1400637 3.5749126
97  3.0397368 3.2771939 3.0675723 3.0380915 3.0049958 3.1780497 3.6083237
98  3.2771939 3.5860842 3.3181320 3.2140317 3.2726136 3.4380227 3.9115214
99  1.9697716 2.0049938 1.9104973 2.1424285 1.9157244 1.9748418 2.3452079
100 2.9698485 3.1937439 2.9883106 2.9782545 2.9376862 3.0951575 3.5270384
101 5.2191953 5.4972721 5.2924474 5.1487863 5.1874849 5.4092513 5.8189346
102 4.1206796 4.3104524 4.1436699 4.1243181 4.0779897 4.2449971 4.6454279
103 5.2478567 5.5821143 5.3113087 5.1332251 5.2488094 5.4350713 5.9059292
104 4.6162756 4.8795492 4.6583259 4.5628938 4.5891176 4.7738873 5.2105662
105 4.9899900 5.2706736 5.0467812 4.9183331 4.9689033 5.1633323 5.5982140
106 6.0448325 6.3953108 6.1081912 5.9118525 6.0506198 6.2353829 6.7186308
107 3.4741906 3.5028560 3.4525353 3.5972211 3.3882149 3.5256205 3.8379682
108 5.5803226 5.9143892 5.6329388 5.4635154 5.5812185 5.7584720 6.2401923
109 4.9749372 5.2316345 4.9979996 4.9173163 4.9618545 5.1097945 5.5668663
110 5.5973208 5.9757845 5.6973678 5.4497706 5.5982140 5.8283788 6.2872888
111 4.3000000 4.6292548 4.3749286 4.2023803 4.2918527 4.4977772 4.9487372
112 4.4474712 4.7053161 4.4821870 4.3965896 4.4305756 4.5934736 5.0378567
113 4.7968740 5.1176166 4.8600412 4.6968074 4.7937459 4.9809638 5.4415071
114 4.0975602 4.2485292 4.1060930 4.1255303 4.0521599 4.1988094 4.5858478
115 4.3358967 4.5276926 4.3760713 4.3324358 4.2953463 4.4743715 4.8559242
116 4.5661800 4.8692915 4.6411206 4.4833024 4.5497253 4.7592016 5.1894123
117 4.5793013 4.8774994 4.6324939 4.5011110 4.5628938 4.7528939 5.2048055
118 6.2040309 6.6174013 6.3071388 6.0282667 6.2112801 6.4459289 6.9260378
119 6.4420494 6.7557383 6.4876806 6.3300869 6.4459289 6.6075714 7.0851958
120 4.0472213 4.2071368 4.0286474 4.0681691 4.0162171 4.1231056 4.5497253
121 5.0695167 5.4074023 5.1468437 4.9547957 5.0665570 5.2706736 5.7271284
122 3.9395431 4.1158231 3.9686270 3.9560081 3.8897301 4.0669399 4.4474712
123 6.1587336 6.4984614 6.2112801 6.0315835 6.1660360 6.3364028 6.8242216
124 4.0373258 4.2965102 4.0706265 3.9912404 4.0236799 4.1809090 4.6281746
125 4.9142650 5.2488094 4.9919936 4.8062459 4.9030603 5.1176166 5.5686623
126 5.2621288 5.6258333 5.3329167 5.1283526 5.2649786 5.4635154 5.9455866
127 3.9051248 4.1677332 3.9446166 3.8600518 3.8884444 4.0558600 4.4977772
128 3.9357337 4.2083251 3.9874804 3.8858718 3.9115214 4.1024383 4.5354162
129 4.7686476 5.0259327 4.8114447 4.7148701 4.7465777 4.9234135 5.3572381
130 5.0447993 5.4009258 5.1029403 4.9173163 5.0517324 5.2316345 5.7227616
131 5.4927225 5.8300943 5.5443665 5.3721504 5.5009090 5.6683331 6.1554854
132 5.9849812 6.4265076 6.0917978 5.7887823 6.0041652 6.2337790 6.7305275
133 4.8093659 5.0645829 4.8538644 4.7560488 4.7874837 4.9648766 5.3953684
134 4.0865633 4.3588989 4.1194660 4.0336088 4.0681691 4.2355637 4.6904158
135 4.4833024 4.6968074 4.4933284 4.4665423 4.4463468 4.6021734 5.0338852
136 5.7463032 6.1155539 5.8180753 5.5991071 5.7645468 5.9447456 6.4342832
137 4.8311489 5.1322510 4.9142650 4.7486840 4.8052055 5.0338852 5.4497706
138 4.5398238 4.8383882 4.5978256 4.4631827 4.5188494 4.7191101 5.1643005
139 3.8223030 4.0853396 3.8729833 3.7815341 3.7947332 3.9862263 4.4124823
140 4.7455242 5.0892043 4.8176758 4.6292548 4.7486840 4.9416596 5.4092513
141 4.9628621 5.2735187 5.0338852 4.8682646 4.9537864 5.1526692 5.5955339
142 4.5902070 4.9386233 4.6690470 4.4698993 4.6000000 4.7906158 5.2545219
143 4.1206796 4.3104524 4.1436699 4.1243181 4.0779897 4.2449971 4.6454279
144 5.2009614 5.5235858 5.2744668 5.0970580 5.1903757 5.3972215 5.8455111
145 5.0823223 5.4064776 5.1652686 4.9779514 5.0714889 5.2867760 5.7245087
146 4.5989129 4.9142650 4.6669048 4.5033321 4.5978256 4.7843495 5.2354560
147 4.2000000 4.4294469 4.2201896 4.1701319 4.1844952 4.3243497 4.7644517
148 4.3977267 4.7010637 4.4575778 4.3162484 4.3874822 4.5760245 5.0259327
149 4.5891176 4.8887626 4.6722586 4.5110974 4.5617979 4.7916594 5.2057660
150 4.0607881 4.3023250 4.1060930 4.0323690 4.0224371 4.2178193 4.6314145
           15        16        17        18        19        20        21
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16  0.5477226                                                            
17  0.4690416 0.6164414                                                  
18  0.8888194 1.0908712 0.5196152                                        
19  0.5567764 0.6403124 0.5196152 0.7348469                              
20  0.7937254 0.8544004 0.3872983 0.3162278 0.6324555                    
21  0.8774964 1.0816654 0.6708204 0.4472136 0.5099020 0.5477226          
22  0.8426150 0.9219544 0.4123106 0.2449490 0.6480741 0.1414214 0.5099020
23  1.2806248 1.4628739 0.9273618 0.6557439 1.3228757 0.7416198 1.0816654
24  1.1489125 1.2727922 0.7874008 0.4123106 0.8062258 0.5744563 0.4358899
25  1.3601471 1.4177447 1.0049876 0.6000000 1.0099505 0.6480741 0.6324555
26  1.3416408 1.5811388 1.0488088 0.5567764 1.0723805 0.8185353 0.5744563
27  1.0954451 1.2247449 0.7071068 0.2645751 0.8185353 0.4358899 0.4582576
28  0.8366600 1.0488088 0.5291503 0.1732051 0.6244998 0.3316625 0.3000000
29  0.8717798 1.1401754 0.5830952 0.1732051 0.7141428 0.4358899 0.3605551
30  1.4177447 1.5779734 1.0535654 0.5477226 1.1747340 0.7348469 0.7348469
31  1.4035669 1.5968719 1.0630146 0.5477226 1.1489125 0.7745967 0.6782330
32  0.8062258 1.0440307 0.5385165 0.3464102 0.5477226 0.5099020 0.2828427
33  0.6855655 0.6557439 0.4582576 0.6480741 0.6480741 0.3741657 0.7615773
34  0.4123106 0.3605551 0.3872983 0.8124038 0.5477226 0.5830952 0.8602325
35  1.3076697 1.5394804 0.9848858 0.4690416 1.0862780 0.7348469 0.6164414
36  1.1313708 1.4352700 0.8366600 0.3872983 1.0535654 0.6855655 0.6708204
37  0.5916080 0.9643651 0.4582576 0.4242641 0.5477226 0.5477226 0.4242641
38  1.0099505 1.1747340 0.6633250 0.3000000 0.9000000 0.3605551 0.6244998
39  1.7233688 1.9313208 1.3601471 0.8717798 1.5811388 1.0862780 1.1489125
40  0.9695360 1.1832160 0.6480741 0.1732051 0.7549834 0.4123106 0.3605551
41  0.9539392 1.1618950 0.5744563 0.1414214 0.8602325 0.3741657 0.5830952
42  2.1447611 2.4289916 1.8384776 1.3453624 1.9621417 1.6278821 1.4798649
43  1.6155494 1.7916473 1.2369317 0.7745967 1.4899664 0.9486833 1.0954451
44  1.1000000 1.1618950 0.6708204 0.3741657 0.8246211 0.4472136 0.5830952
45  1.0295630 0.9380832 0.6782330 0.5916080 0.6403124 0.4123106 0.5744563
46  1.4317821 1.6703293 1.0908712 0.5830952 1.2409674 0.8602325 0.7874008
47  0.8306624 0.8774964 0.4795832 0.3741657 0.6164414 0.1414214 0.5099020
48  1.4560220 1.6431677 1.0862780 0.5916080 1.2922848 0.7937254 0.8774964
49  0.6557439 0.8306624 0.3605551 0.3162278 0.4690416 0.2449490 0.3741657
50  1.0816654 1.3228757 0.7549834 0.2449490 0.9165151 0.5291503 0.5099020
51  3.9711459 3.7907783 3.9509493 3.9749214 3.5014283 3.9268308 3.6110940
52  3.6851052 3.4842503 3.5972211 3.5818989 3.1827661 3.5341194 3.2511536
53  4.1713307 3.9874804 4.1303753 4.1340053 3.6891733 4.0902323 3.7775654
54  3.4684290 3.3926391 3.2664966 3.0594117 2.9291637 3.1080541 2.7784888
55  3.8961519 3.7443290 3.8105118 3.7589892 3.3896903 3.7429935 3.4161382
56  3.6810325 3.5171011 3.5142567 3.3852622 3.1368774 3.3704599 3.0822070
57  3.8665230 3.6400549 3.7643060 3.7496667 3.3615473 3.6905284 3.4322005
58  2.9017236 2.8705400 2.6191602 2.3130067 2.3769729 2.3937418 2.1095023
59  3.8236109 3.6715120 3.7603191 3.7215588 3.3211444 3.6972963 3.3630343
60  3.2832910 3.1464265 3.0397368 2.8478062 2.7404379 2.8618176 2.6095977
61  3.2511536 3.2572995 2.9949958 2.6758176 2.7313001 2.7820855 2.4494897
62  3.4205263 3.2403703 3.2680269 3.1890437 2.8930952 3.1638584 2.8896367
63  3.4292856 3.3970576 3.3015148 3.1224990 2.9034462 3.1796226 2.7802878
64  3.8716921 3.6945906 3.7483330 3.6687873 3.3436507 3.6414283 3.3436507
65  2.8670542 2.7349589 2.6720778 2.5396850 2.3302360 2.5436195 2.2605309
66  3.6469165 3.4785054 3.5972211 3.5958309 3.1606961 3.5594943 3.2419130
67  3.6905284 3.4899857 3.5071356 3.3985291 3.1511903 3.3660065 3.1192948
68  3.2771939 3.1654384 3.1304952 2.9849623 2.7331301 2.9916551 2.6551836
69  3.9974992 3.9115214 3.8704005 3.7349699 3.4770677 3.7696154 3.4073450
70  3.2233523 3.1416556 3.0413813 2.8530685 2.6795522 2.8879058 2.5495098
71  4.0211939 3.7854986 3.8665230 3.8131352 3.5014283 3.7603191 3.5298725
72  3.2526912 3.1272992 3.1304952 3.0413813 2.7294688 3.0413813 2.7110883
73  4.2284749 4.0914545 4.1158231 4.0162171 3.7054015 4.0162171 3.6810325
74  3.8444766 3.6878178 3.7282704 3.6318040 3.3120990 3.6124784 3.2939338
75  3.5199432 3.3749074 3.4365681 3.3852622 3.0099834 3.3674916 3.0364453
76  3.6496575 3.4899857 3.5860842 3.5651087 3.1543621 3.5369478 3.2140317
77  4.1036569 3.9572718 4.0521599 4.0187063 3.6097091 3.9987498 3.6565011
78  4.3011626 4.1109610 4.2284749 4.2107007 3.8065733 4.1725292 3.8716921
79  3.7188708 3.5425979 3.5791060 3.4957117 3.1906112 3.4727511 3.1843367
80  2.8106939 2.7568098 2.6419690 2.4637370 2.2737634 2.5079872 2.1470911
81  3.1968735 3.1336879 3.0000000 2.7874720 2.6551836 2.8372522 2.4959968
82  3.0886890 3.0397368 2.8948230 2.6739484 2.5475478 2.7294688 2.3769729
83  3.1591138 3.0495901 3.0000000 2.8618176 2.6210685 2.8757608 2.5475478
84  4.3474130 4.1689327 4.2047592 4.1024383 3.8144462 4.0828911 3.7907783
85  3.7067506 3.5014283 3.5014283 3.3749074 3.1638584 3.3421550 3.1128765
86  3.6400549 3.3955854 3.5057096 3.4813790 3.1272992 3.4146742 3.1874755
87  3.9446166 3.7603191 3.8858718 3.8794329 3.4539832 3.8379682 3.5312887
88  3.8196859 3.7403208 3.7134889 3.5888717 3.3015148 3.6193922 3.2434549
89  3.2649655 3.0886890 3.0822070 2.9647934 2.7221315 2.9410882 2.6776856
90  3.3749074 3.2726136 3.1733263 2.9866369 2.8319605 3.0166206 2.7055499
91  3.6455452 3.5114100 3.4568772 3.2832910 3.0951575 3.2893768 2.9899833
92  3.7536649 3.5679126 3.6318040 3.5637059 3.2280025 3.5298725 3.2403703
93  3.2863353 3.1843367 3.1272992 2.9782545 2.7477263 2.9983329 2.6627054
94  2.9291637 2.9154759 2.6608269 2.3558438 2.4062419 2.4474477 2.1377558
95  3.4554305 3.3166248 3.2710854 3.1192948 2.9103264 3.1224990 2.8266588
96  3.3181320 3.1448370 3.1543621 3.0430248 2.7748874 3.0166206 2.7386128
97  3.3808283 3.2171416 3.2109189 3.0919250 2.8390139 3.0757113 2.7928480
98  3.4914181 3.3391616 3.3837849 3.3136083 2.9698485 3.2954514 2.9765752
99  2.6057628 2.5903668 2.3302360 2.0493902 2.0928450 2.1400935 1.8439089
100 3.3271610 3.1827661 3.1543621 3.0232433 2.7856777 3.0199338 2.7239677
101 5.3916602 5.1215232 5.2602281 5.2421370 4.8928519 5.1749396 4.9598387
102 4.4485953 4.2555846 4.2766810 4.1689327 3.9166312 4.1496988 3.8858718
103 5.3282267 5.1156622 5.2678269 5.2668776 4.8456166 5.2191953 4.9295030
104 4.8352870 4.6238512 4.7180504 4.6572524 4.3162484 4.6162756 4.3393548
105 5.1623638 4.9325450 5.0507425 5.0179677 4.6583259 4.9699095 4.7095647
106 6.0835845 5.8711157 6.0522723 6.0646517 5.6124861 6.0116553 5.7113921
107 4.0249224 3.8652296 3.7603191 3.5510562 3.4828150 3.5623026 3.3391616
108 5.6595053 5.4598535 5.6187187 5.6089215 5.1749396 5.5623736 5.2516664
109 5.1749396 5.0059964 5.0852729 5.0169712 4.6636895 4.9989999 4.6765372
110 5.6053546 5.3347915 5.5479726 5.5991071 5.1468437 5.5181519 5.2848841
111 4.4249294 4.1952354 4.3243497 4.3162484 3.9306488 4.2626283 4.0062451
112 4.6636895 4.4799554 4.5486262 4.4833024 4.1496988 4.4609416 4.1641326
113 4.9091751 4.6968074 4.8270074 4.8155997 4.4192760 4.7717921 4.4911023
114 4.4654227 4.2918527 4.2778499 4.1484937 3.9331921 4.1460825 3.8768544
115 4.6357308 4.4192760 4.4508426 4.3680659 4.1206796 4.3428102 4.1133928
116 4.7138095 4.4698993 4.5934736 4.5814845 4.2201896 4.5265881 4.2906876
117 4.7476310 4.5343136 4.6497312 4.6119410 4.2391037 4.5661800 4.2860238
118 6.1562976 5.8855756 6.1400326 6.2088646 5.7105166 6.1163715 5.8694122
119 6.5169011 6.3253458 6.4768820 6.4668385 6.0398675 6.4311741 6.1139185
120 4.4056782 4.2883563 4.2602817 4.1109610 3.8704005 4.1303753 3.7920970
121 5.1487863 4.9122296 5.0705029 5.0813384 4.6690470 5.0239427 4.7644517
122 4.2906876 4.0853396 4.0951190 3.9849718 3.7603191 3.9623226 3.7255872
123 6.2080593 6.0133186 6.1822326 6.1830413 5.7349804 6.1392182 5.8215118
124 4.2649736 4.0951190 4.1436699 4.0718546 3.7496667 4.0570926 3.7549967
125 5.0159745 4.7686476 4.9295030 4.9325450 4.5265881 4.8672374 4.6162756
126 5.3103672 5.0892043 5.2706736 5.2829916 4.8321838 5.2220686 4.9325450
127 4.1376322 3.9572718 4.0074930 3.9382737 3.6207734 3.9179076 3.6290495
128 4.1641326 3.9547440 4.0274061 3.9686270 3.6455452 3.9306488 3.6674242
129 4.9769469 4.7696960 4.8569538 4.8020829 4.4654227 4.7686476 4.4922155
130 5.1068581 4.9132474 5.0734604 5.0705029 4.6249324 5.0229473 4.7085029
131 5.5587768 5.3721504 5.5226805 5.5163394 5.0803543 5.4781384 5.1584882
132 5.8932164 5.6364883 5.9016947 5.9849812 5.4607692 5.8940648 5.6338264
133 5.0159745 4.8062459 4.8928519 4.8404545 4.5066617 4.8072861 4.5354162
134 4.3116122 4.1340053 4.2035699 4.1303753 3.7894591 4.1036569 3.7973675
135 4.7801674 4.6054316 4.6551047 4.5453273 4.2449971 4.5232732 4.2166337
136 5.7471732 5.5434646 5.7227616 5.7532599 5.2915026 5.7061370 5.4055527
137 4.9809638 4.7085029 4.8528342 4.8476799 4.4877611 4.7770284 4.5672749
138 4.7138095 4.4877611 4.6086874 4.5727453 4.2035699 4.5199558 4.2532341
139 4.0693980 3.8600518 3.9217343 3.8561639 3.5482390 3.8196859 3.5623026
140 4.8238988 4.6076024 4.7528939 4.7581509 4.3428102 4.7095647 4.4317040
141 5.0813384 4.8476799 4.9819675 4.9769469 4.5945620 4.9264592 4.6722586
142 4.6518813 4.4384682 4.5760245 4.5923850 4.1821047 4.5486262 4.2790186
143 4.4485953 4.2555846 4.2766810 4.1689327 3.9166312 4.1496988 3.8858718
144 5.3047149 5.0616203 5.2172790 5.2182373 4.8176758 5.1584882 4.9040799
145 5.1807335 4.9254441 5.0813384 5.0921508 4.7000000 5.0289164 4.7947888
146 4.7138095 4.5011110 4.6173586 4.6097722 4.2296572 4.5705580 4.3023250
147 4.4530888 4.2976738 4.3255058 4.2379240 3.9370039 4.2355637 3.9242834
148 4.5530210 4.3416587 4.4485953 4.4204072 4.0521599 4.3794977 4.1060930
149 4.7507894 4.4799554 4.6162756 4.6065171 4.2544095 4.5365185 4.3289722
150 4.3335897 4.1133928 4.1785165 4.1024383 3.8065733 4.0607881 3.8118237
           22        23        24        25        26        27        28
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23  0.7416198                                                            
24  0.4582576 0.9591663                                                  
25  0.6164414 0.9433981 0.4795832                                        
26  0.7416198 0.9380832 0.4472136 0.5385165                              
27  0.3316625 0.7745967 0.2000000 0.4123106 0.4472136                    
28  0.3000000 0.7874008 0.4242641 0.5744563 0.5477226 0.3162278          
29  0.3872983 0.7483315 0.4472136 0.6403124 0.4898979 0.3464102 0.1414214
30  0.6782330 0.7280110 0.5196152 0.3741657 0.3605551 0.4123106 0.5916080
31  0.7071068 0.8062258 0.4795832 0.4242641 0.2236068 0.4123106 0.5744563
32  0.4242641 0.9848858 0.3872983 0.7483315 0.6082763 0.4123106 0.3000000
33  0.5099020 0.9327379 0.9219544 0.9055385 1.1269428 0.7937254 0.6082763
34  0.6782330 1.1532563 1.0723805 1.1747340 1.3152946 0.9848858 0.7681146
35  0.6633250 0.7681146 0.4582576 0.5099020 0.1732051 0.3872983 0.5000000
36  0.6244998 0.6000000 0.6000000 0.7549834 0.4472136 0.4898979 0.4690416
37  0.5291503 0.9539392 0.6708204 0.9273618 0.7681146 0.6244998 0.3605551
38  0.3872983 0.5099020 0.6164414 0.5567764 0.6480741 0.4242641 0.3464102
39  1.0295630 0.7000000 0.9110434 0.8246211 0.6708204 0.8062258 0.9643651
40  0.3605551 0.7348469 0.3741657 0.5000000 0.4242641 0.2449490 0.1414214
41  0.3162278 0.5196152 0.5000000 0.6480741 0.5916080 0.3316625 0.3000000
42  1.5394804 1.3416408 1.2489996 1.2922848 0.9165151 1.2489996 1.4071247
43  0.9055385 0.5385165 0.8660254 0.7483315 0.7000000 0.7280110 0.8774964
44  0.3162278 0.8306624 0.2645751 0.5477226 0.6403124 0.2236068 0.4582576
45  0.4123106 1.0677078 0.5477226 0.5385165 0.8831761 0.5099020 0.5477226
46  0.7745967 0.7549834 0.5567764 0.6480741 0.3000000 0.5000000 0.6557439
47  0.2449490 0.8062258 0.5916080 0.5830952 0.8062258 0.4582576 0.3316625
48  0.7416198 0.5656854 0.6633250 0.5744563 0.4898979 0.5291503 0.6782330
49  0.2828427 0.8660254 0.5744563 0.7071068 0.7681146 0.4795832 0.2236068
50  0.4690416 0.6403124 0.4358899 0.5477226 0.3605551 0.3000000 0.3000000
51  3.8858718 4.5880279 3.6646964 3.7629775 3.8845849 3.8275318 3.8742741
52  3.4856850 4.1641326 3.2465366 3.3241540 3.4785054 3.4088121 3.4957117
53  4.0459857 4.7370877 3.8105118 3.8974351 4.0249224 3.9749214 4.0373258
54  3.0298515 3.5651087 2.6627054 2.7055499 2.7766887 2.8337255 2.9983329
55  3.6864617 4.3474130 3.4088121 3.4971417 3.6027767 3.5805028 3.6715120
56  3.3136083 3.9127995 3.0149627 3.0232433 3.1859065 3.1733263 3.3090784
57  3.6441734 4.3162484 3.4132096 3.4727511 3.6537652 3.5707142 3.6674242
58  2.3086793 2.7313001 1.9131126 1.9000000 1.9748418 2.0639767 2.2759613
59  3.6482873 4.3197222 3.3852622 3.4626579 3.5749126 3.5524639 3.6249138
60  2.7874720 3.3196385 2.4535688 2.4677925 2.6191602 2.6115130 2.8000000
61  2.6944387 3.1000000 2.2781571 2.2803509 2.2912878 2.4351591 2.6324893
62  3.1032241 3.7389838 2.8248894 2.8896367 3.0430248 2.9899833 3.1176915
63  3.1096624 3.6823905 2.7495454 2.8160256 2.8354894 2.9257478 3.0364453
64  3.5888717 4.2272923 3.3120990 3.3496268 3.5028560 3.4741906 3.5846897
65  2.4718414 3.0757113 2.1587033 2.2338308 2.3622024 2.3280893 2.4779023
66  3.5114100 4.2023803 3.2710854 3.3749074 3.4899857 3.4380227 3.5014283
67  3.3090784 3.9115214 3.0298515 3.0413813 3.2341923 3.1843367 3.3316662
68  2.9342802 3.5355339 2.6191602 2.6400758 2.7604347 2.7820855 2.8982753
69  3.6972963 4.2965102 3.3555923 3.4423829 3.4899857 3.5355339 3.6578682
70  2.8178006 3.3808283 2.4677925 2.5019992 2.5903668 2.6362853 2.7802878
71  3.7067506 4.3416587 3.4568772 3.4957117 3.6945906 3.6124784 3.7456642
72  2.9782545 3.6193922 2.6795522 2.7694765 2.8670542 2.8530685 2.9597297
73  3.9560081 4.5825757 3.6496575 3.7080992 3.8105118 3.8209946 3.9319207
74  3.5623026 4.1928511 3.2771939 3.3000000 3.4438351 3.4380227 3.5411862
75  3.3136083 3.9786933 3.0413813 3.1272992 3.2357379 3.2109189 3.2939338
76  3.4856850 4.1665333 3.2310989 3.3301652 3.4409301 3.4000000 3.4727511
77  3.9484174 4.6216880 3.6823905 3.7696154 3.8678159 3.8522721 3.9217343
78  4.1218928 4.7979162 3.8704005 3.9534795 4.0865633 4.0373258 4.1231056
79  3.4146742 4.0484565 3.1320920 3.1843367 3.3331667 3.2969683 3.4190642
80  2.4351591 3.0166206 2.0832667 2.1563859 2.2135944 2.2583180 2.3874673
81  2.7622455 3.3015148 2.3958297 2.4310492 2.5019992 2.5651511 2.7202941
82  2.6551836 3.1906112 2.2847319 2.3173260 2.3790755 2.4535688 2.6038433
83  2.8089144 3.4146742 2.4859606 2.5475478 2.6495283 2.6570661 2.7856777
84  4.0261644 4.6411206 3.7336309 3.7589892 3.9115214 3.8961519 4.0249224
85  3.2848135 3.8652296 3.0033315 2.9949958 3.2031235 3.1527766 3.3136083
86  3.3674916 4.0261644 3.1416556 3.1874755 3.3955854 3.2939338 3.4073450
87  3.7907783 4.4766059 3.5496479 3.6373067 3.7682887 3.7148351 3.7868192
88  3.5524639 4.1653331 3.2202484 3.3045423 3.3511192 3.3985291 3.5028560
89  2.8827071 3.4899857 2.5961510 2.6172505 2.7964263 2.7531800 2.8948230
90  2.9427878 3.4971417 2.5942244 2.6305893 2.7331301 2.7622455 2.9240383
91  3.2280025 3.7907783 2.9034462 2.8948230 3.0413813 3.0610456 3.2109189
92  3.4785054 4.1243181 3.2109189 3.2526912 3.4132096 3.3719431 3.4799425
93  2.9308702 3.5270384 2.6000000 2.6551836 2.7495454 2.7712813 2.9017236
94  2.3600847 2.7892651 1.9544820 1.9621417 2.0049938 2.1118712 2.3151674
95  3.0577770 3.6414283 2.7386128 2.7622455 2.9017236 2.9017236 3.0495901
96  2.9631065 3.5791060 2.6814175 2.6944387 2.8722813 2.8372522 2.9647934
97  3.0166206 3.6262929 2.7221315 2.7495454 2.9103264 2.8827071 3.0182777
98  3.2403703 3.8923001 2.9614186 3.0298515 3.1543621 3.1288976 3.2264532
99  2.0445048 2.5039968 1.6401219 1.7088007 1.7406895 1.8083141 2.0174241
100 2.9563491 3.5594943 2.6476405 2.6870058 2.8266588 2.8124722 2.9512709
101 5.1244512 5.7680153 4.8918299 4.9355851 5.1410116 5.0467812 5.1759057
102 4.0865633 4.6850827 3.7907783 3.8236109 3.9837169 3.9534795 4.1048752
103 5.1710734 5.8506410 4.9284886 5.0059964 5.1487863 5.0941143 5.1797683
104 4.5661800 5.2057660 4.3011626 4.3301270 4.5011110 4.4609416 4.5760245
105 4.9173163 5.5686623 4.6636895 4.7180504 4.8877398 4.8259714 4.9426713
106 5.9699246 6.6580778 5.7367238 5.8051701 5.9472683 5.9000000 5.9690870
107 3.4885527 3.9749214 3.1559468 3.1352831 3.3045423 3.3045423 3.5128336
108 5.5208695 6.1991935 5.2773099 5.3310412 5.4726593 5.4396691 5.5108983
109 4.9446941 5.5874860 4.6583259 4.7106263 4.8311489 4.8270074 4.9295030
110 5.4763126 6.1692787 5.2782573 5.3600373 5.5443665 5.4350713 5.5190579
111 4.2107007 4.8805737 3.9724048 4.0509258 4.2166337 4.1352146 4.2402830
112 4.4022721 5.0428167 4.1194660 4.1833001 4.3162484 4.2883563 4.4056782
113 4.7191101 5.3907328 4.4698993 4.5530210 4.6968074 4.6368092 4.7349762
114 4.0755368 4.6540305 3.7603191 3.8039453 3.9420807 3.9268308 4.0914545
115 4.2731721 4.8713448 3.9887341 4.0546270 4.2154478 4.1533119 4.3185646
116 4.4710178 5.1283526 4.2308392 4.3092923 4.4833024 4.3931765 4.5144213
117 4.5177428 5.1749396 4.2638011 4.3092923 4.4743715 4.4249294 4.5276926
118 6.0868711 6.7926431 5.9076222 5.9674115 6.1595454 6.0580525 6.1139185
119 6.3827894 7.0590368 6.1261734 6.2016127 6.3206012 6.2952363 6.3741666
120 4.0644803 4.6486557 3.7296112 3.7656341 3.8587563 3.9000000 4.0336088
121 4.9739320 5.6524331 4.7423623 4.8270074 4.9869831 4.9061186 5.0029991
122 3.8961519 4.4821870 3.6041643 3.6386811 3.8118237 3.7643060 3.9306488
123 6.0967204 6.7808554 5.8532043 5.9203040 6.0481402 6.0183054 6.0844063
124 3.9949969 4.6335731 3.7054015 3.7815341 3.9025633 3.8768544 3.9962482
125 4.8218254 5.4954527 4.5956501 4.6551047 4.8373546 4.7539457 4.8518038
126 5.1836281 5.8719673 4.9598387 5.0169712 5.1768716 5.1185936 5.1865210
127 3.8561639 4.4944410 3.5721142 3.6455452 3.7788887 3.7416574 3.8652296
128 3.8742741 4.5144213 3.6083237 3.6619667 3.8288379 3.7709415 3.8961519
129 4.7116876 5.3525695 4.4395946 4.4966654 4.6486557 4.6054316 4.7275787
130 4.9829710 5.6674509 4.7455242 4.8052055 4.9436828 4.9071377 4.9699095
131 5.4323107 6.1139185 5.1826634 5.2583267 5.3795911 5.3497664 5.4203321
132 5.8668561 6.5825527 5.6947344 5.7671483 5.9439044 5.8455111 5.8847260
133 4.7486840 5.3888774 4.4766059 4.5398238 4.6904158 4.6432747 4.7686476
134 4.0521599 4.6936127 3.7749172 3.8131352 3.9585351 3.9382737 4.0435133
135 4.4743715 5.0842895 4.1844952 4.1785165 4.3370497 4.3416587 4.4575778
136 5.6586217 6.3553127 5.4267854 5.5335341 5.6524331 5.5955339 5.6630381
137 4.7265209 5.3786615 4.5022217 4.5585085 4.7634021 4.6572524 4.7822589
138 4.4732538 5.1283526 4.2261093 4.2626283 4.4429720 4.3840620 4.4899889
139 3.7616486 4.3954522 3.4928498 3.5454196 3.7148351 3.6551334 3.7868192
140 4.6583259 5.3394756 4.4192760 4.5122057 4.6551047 4.5858478 4.6765372
141 4.8713448 5.5371473 4.6281746 4.7148701 4.8723711 4.7937459 4.9050994
142 4.4911023 5.1730069 4.2520583 4.3760713 4.5033321 4.4226689 4.5188494
143 4.0865633 4.6850827 3.7907783 3.8236109 3.9837169 3.9534795 4.1048752
144 5.1097945 5.7810034 4.8764741 4.9446941 5.1166395 5.0378567 5.1400389
145 4.9769469 5.6462377 4.7497368 4.8321838 5.0079936 4.9112117 5.0219518
146 4.5110974 5.1788030 4.2591079 4.3669211 4.5011110 4.4294469 4.5387223
147 4.1689327 4.7947888 3.8639358 3.9446166 4.0484565 4.0385641 4.1653331
148 4.3243497 4.9849774 4.0681691 4.1448764 4.2953463 4.2343831 4.3439613
149 4.4855323 5.1351728 4.2602817 4.3150898 4.5221676 4.4147480 4.5420260
150 4.0062451 4.6281746 3.7389838 3.7643060 3.9522146 3.8961519 4.0323690
           29        30        31        32        33        34        35
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30  0.5744563                                                            
31  0.5385165 0.1414214                                                  
32  0.3000000 0.7615773 0.7071068                                        
33  0.7141428 1.0392305 1.0862780 0.7874008                              
34  0.8544004 1.2961481 1.3190906 0.8366600 0.3464102                    
35  0.4358899 0.2449490 0.1414214 0.6164414 1.0488088 1.2569805          
36  0.3464102 0.5000000 0.4582576 0.5744563 0.9746794 1.1357817 0.3316625
37  0.3316625 0.9055385 0.8602325 0.3162278 0.7071068 0.7071068 0.7483315
38  0.3741657 0.5000000 0.5567764 0.6244998 0.5916080 0.8544004 0.5196152
39  0.9000000 0.4690416 0.5099020 1.1135529 1.3784049 1.6309506 0.5477226
40  0.1414214 0.4582576 0.4358899 0.3605551 0.7141428 0.9000000 0.3605551
41  0.2645751 0.5291503 0.5477226 0.4690416 0.6928203 0.8717798 0.4690416
42  1.3114877 0.9746794 0.9110434 1.4387495 1.9519221 2.1517435 0.9219544
43  0.8306624 0.4242641 0.5099020 1.0583005 1.2247449 1.4899664 0.5477226
44  0.5000000 0.5830952 0.6000000 0.4690416 0.8124038 0.9695360 0.5830952
45  0.6782330 0.8062258 0.8426150 0.6403124 0.5916080 0.7810250 0.8544004
46  0.5744563 0.3162278 0.2449490 0.7348469 1.1916375 1.3928388 0.2000000
47  0.4582576 0.7211103 0.7615773 0.5477226 0.3464102 0.6000000 0.7348469
48  0.6324555 0.2236068 0.3000000 0.8544004 1.0908712 1.3453624 0.3316625
49  0.3316625 0.7874008 0.7874008 0.3741657 0.4242641 0.5477226 0.7211103
50  0.2236068 0.3741657 0.3464102 0.4690416 0.8366600 1.0295630 0.2449490
51  3.9509493 4.0422766 3.9874804 3.7202150 3.9974992 3.9471509 4.0124805
52  3.5749126 3.6041643 3.5594943 3.3541020 3.6345564 3.6207734 3.5986108
53  4.1133928 4.1749251 4.1218928 3.8871583 4.1725292 4.1364236 4.1533119
54  3.0446675 2.9017236 2.8460499 2.8774989 3.3196385 3.4029399 2.9086079
55  3.7389838 3.7536649 3.6972963 3.5199432 3.8665230 3.8587563 3.7349699
56  3.3808283 3.2832910 3.2434549 3.2031235 3.5185224 3.5805028 3.3075671
57  3.7509999 3.7603191 3.7229021 3.5355339 3.7868192 3.7815341 3.7682887
58  2.3108440 2.0518285 2.0074860 2.2022716 2.6514147 2.8017851 2.0904545
59  3.6959437 3.7296112 3.6728735 3.4799425 3.8013156 3.7881394 3.7080992
60  2.8600699 2.6888659 2.6551836 2.7000000 3.0675723 3.1670175 2.7294688
61  2.6551836 2.4041631 2.3452079 2.5455844 3.0430248 3.1843367 2.4207437
62  3.1906112 3.1511903 3.1096624 2.9849623 3.3090784 3.3361655 3.1606961
63  3.0789609 3.0149627 2.9410882 2.9000000 3.3630343 3.4132096 2.9849623
64  3.6592349 3.6193922 3.5749126 3.4612137 3.7656341 3.7920970 3.6276714
65  2.5416530 2.4718414 2.4269322 2.3473389 2.7294688 2.7838822 2.4799194
66  3.5749126 3.6455452 3.5902646 3.3451457 3.6537652 3.6180105 3.6180105
67  3.4088121 3.3090784 3.2787193 3.2264532 3.5114100 3.5707142 3.3451457
68  2.9631065 2.8896367 2.8372522 2.7874720 3.1448370 3.2046841 2.8930952
69  3.7067506 3.6537652 3.5874782 3.5057096 3.9458839 3.9736633 3.6318040
70  2.8337255 2.7202941 2.6645825 2.6645825 3.0789609 3.1559468 2.7239677
71  3.8275318 3.7735925 3.7443290 3.6249138 3.8832976 3.9089641 3.8026307
72  3.0232433 3.0149627 2.9580399 2.8124722 3.1921779 3.2078030 2.9983329
73  3.9949969 3.9534795 3.8974351 3.7934153 4.1581246 4.1797129 3.9458839
74  3.6138622 3.5679126 3.5199432 3.4249088 3.7349699 3.7696154 3.5735137
75  3.3630343 3.3882149 3.3316662 3.1464265 3.4871192 3.4813790 3.3674916
76  3.5440090 3.5958309 3.5397740 3.3181320 3.6428011 3.6180105 3.5707142
77  3.9899875 4.0311289 3.9711459 3.7696154 4.1024383 4.0804412 4.0037482
78  4.1976184 4.2249260 4.1749251 3.9736633 4.2743421 4.2532341 4.2130749
79  3.4914181 3.4467376 3.4029399 3.2893768 3.6110940 3.6386811 3.4554305
80  2.4372115 2.3685439 2.3043437 2.2561028 2.7037012 2.7658633 2.3515952
81  2.7676705 2.6324893 2.5748786 2.6057628 3.0446675 3.1320920 2.6362853
82  2.6495283 2.5159491 2.4556058 2.4919872 2.9376862 3.0282008 2.5159491
83  2.8460499 2.7838822 2.7294688 2.6551836 3.0479501 3.0967725 2.7802878
84  4.0963398 4.0187063 3.9761791 3.9051248 4.2201896 4.2602817 4.0360872
85  3.3911650 3.2603681 3.2357379 3.2202484 3.4942810 3.5707142 3.3090784
86  3.4942810 3.4785054 3.4496377 3.2863353 3.5185224 3.5298725 3.5014283
87  3.8626416 3.9127995 3.8613469 3.6373067 3.9306488 3.9025633 3.8948684
88  3.5538711 3.5242020 3.4554305 3.3526109 3.7815341 3.8026307 3.4957117
89  2.9698485 2.8827071 2.8478062 2.7874720 3.0935417 3.1543621 2.9103264
90  2.9782545 2.8460499 2.7964263 2.8071338 3.2155870 3.2954514 2.8600699
91  3.2756679 3.1368774 3.0951575 3.1144823 3.4583233 3.5440090 3.1654384
92  3.5566838 3.5270384 3.4842503 3.3555923 3.6496575 3.6715120 3.5355339
93  2.9597297 2.8861739 2.8301943 2.7730849 3.1733263 3.2264532 2.8827071
94  2.3452079 2.1047565 2.0518285 2.2293497 2.7073973 2.8478062 2.1283797
95  3.1144823 3.0049958 2.9614186 2.9376862 3.2939338 3.3630343 3.0248967
96  3.0413813 2.9664794 2.9291637 2.8600699 3.1559468 3.2124757 2.9899833
97  3.0903074 3.0099834 2.9698485 2.9051678 3.2280025 3.2832910 3.0298515
98  3.2969683 3.2924155 3.2403703 3.0886890 3.4234486 3.4351128 3.2832910
99  2.0469489 1.8493242 1.7944358 1.9078784 2.4124676 2.5337719 1.8601075
100 3.0182777 2.9359837 2.8913665 2.8319605 3.1843367 3.2403703 2.9495762
101 5.2602281 5.2172790 5.1903757 5.0477718 5.2782573 5.2820451 5.2478567
102 4.1749251 4.0743098 4.0373258 3.9824616 4.3034870 4.3497126 4.1012193
103 5.2564246 5.2820451 5.2345009 5.0299105 5.3084838 5.2782573 5.2744668
104 4.6540305 4.6054316 4.5661800 4.4530888 4.7275787 4.7465777 4.6227697
105 5.0209561 4.9919936 4.9537864 4.8062459 5.0793700 5.0793700 5.0059964
106 6.0473135 6.0876925 6.0382117 5.8223707 6.0811183 6.0415230 6.0761830
107 3.5721142 3.3451457 3.3211444 3.4278273 3.7696154 3.8871583 3.4073450
108 5.5883808 5.6124861 5.5623736 5.3721504 5.6373753 5.6124861 5.6035703
109 4.9979996 4.9689033 4.9162994 4.7906158 5.1176166 5.1234754 4.9648766
110 5.6053546 5.6524331 5.6169387 5.3712196 5.5830099 5.5344376 5.6559703
111 4.3197222 4.3278170 4.2883563 4.0951190 4.3669211 4.3508620 4.3324358
112 4.4754888 4.4407207 4.3931765 4.2638011 4.5912961 4.6000000 4.4429720
113 4.8104054 4.8238988 4.7780749 4.5836667 4.8754487 4.8528342 4.8197510
114 4.1545156 4.0360872 3.9962482 3.9635842 4.3208795 4.3737855 4.0607881
115 4.3874822 4.2965102 4.2638011 4.1809090 4.5055521 4.5365185 4.3243497
116 4.5934736 4.5814845 4.5464272 4.3692105 4.6400431 4.6292548 4.5945620
117 4.6065171 4.5880279 4.5464272 4.3965896 4.6679760 4.6701178 4.5967380
118 6.2048368 6.2745518 6.2377881 5.9774577 6.1473572 6.0901560 6.2745518
119 6.4459289 6.4699304 6.4156060 6.2209324 6.5192024 6.4853681 6.4544558
120 4.0902323 3.9924930 3.9370039 3.9064050 4.2965102 4.3474130 3.9949969
121 5.0823223 5.1048996 5.0635956 4.8518038 5.1166395 5.0852729 5.1048996
122 4.0012498 3.8858718 3.8548671 3.8105118 4.1255303 4.1785165 3.9217343
123 6.1595454 6.1975802 6.1441029 5.9371710 6.2120850 6.1749494 6.1814238
124 4.0632499 4.0323690 3.9824616 3.8496753 4.1976184 4.2071368 4.0298883
125 4.9355851 4.9426713 4.9061186 4.7148701 4.9527770 4.9345719 4.9527770
126 5.2687759 5.3075418 5.2621288 5.0487622 5.2867760 5.2545219 5.3018865
127 3.9344631 3.9000000 3.8535698 3.7215588 4.0583248 4.0706265 3.9025633
128 3.9724048 3.9306488 3.8923001 3.7643060 4.0583248 4.0755368 3.9458839
129 4.8010416 4.7602521 4.7180504 4.5891176 4.8928519 4.9010203 4.7707442
130 5.0477718 5.0882217 5.0368641 4.8301139 5.0941143 5.0645829 5.0744458
131 5.4936327 5.5308227 5.4763126 5.2697249 5.5614746 5.5272054 5.5127126
132 5.9741108 6.0728906 6.0315835 5.7428216 5.9160798 5.8446557 6.0613530
133 4.8414874 4.8010416 4.7592016 4.6270941 4.9345719 4.9406477 4.8114447
134 4.1170378 4.0816663 4.0348482 3.9166312 4.2213742 4.2402830 4.0865633
135 4.5310043 4.4452222 4.4022721 4.3520110 4.6432747 4.6904158 4.4654227
136 5.7367238 5.8051701 5.7515215 5.4972721 5.7844619 5.7253821 5.7810034
137 4.8672374 4.8414874 4.8145612 4.6497312 4.8785244 4.8744230 4.8682646
138 4.5716518 4.5464272 4.5088801 4.3646306 4.6184413 4.6249324 4.5617979
139 3.8626416 3.8118237 3.7749172 3.6565011 3.9534795 3.9761791 3.8301436
140 4.7528939 4.7853944 4.7391982 4.5210618 4.8062459 4.7728398 4.7770284
141 4.9819675 4.9849774 4.9446941 4.7528939 5.0348784 5.0129831 4.9889879
142 4.5912961 4.6378875 4.5902070 4.3485630 4.6572524 4.6119410 4.6227697
143 4.1749251 4.0743098 4.0373258 3.9824616 4.3034870 4.3497126 4.1012193
144 5.2211110 5.2258971 5.1874849 4.9969991 5.2507142 5.2297227 5.2335456
145 5.1029403 5.1097945 5.0744458 4.8733972 5.1273775 5.1019604 5.1195703
146 4.6108568 4.6270941 4.5814845 4.3760713 4.6893496 4.6615448 4.6206060
147 4.2272923 4.1833001 4.1303753 4.0149720 4.3886217 4.4022721 4.1785165
148 4.4192760 4.4136153 4.3703547 4.1976184 4.4944410 4.4855323 4.4158804
149 4.6270941 4.5978256 4.5716518 4.4113490 4.6411206 4.6411206 4.6260134
150 4.1109610 4.0360872 4.0037482 3.9153544 4.1892720 4.2249260 4.0657103
           36        37        38        39        40        41        42
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37  0.5916080                                                            
38  0.4690416 0.6244998                                                  
39  0.6403124 1.2083046 0.7937254                                        
40  0.3741657 0.4582576 0.3162278 0.8306624                              
41  0.3316625 0.5099020 0.2645751 0.7874008 0.2645751                    
 [ ... output truncated for readability: the remaining lower-triangular
   column blocks (columns 42-84 shown here) of the 150-observation
   pairwise distance matrix print-out are omitted ... ]
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37                                                                       
38                                                                       
39                                                                       
40                                                                       
41                                                                       
42                                                                       
43                                                                       
44                                                                       
45                                                                       
46                                                                       
47                                                                       
48                                                                       
49                                                                       
50                                                                       
51                                                                       
52                                                                       
53                                                                       
54                                                                       
55                                                                       
56                                                                       
57                                                                       
58                                                                       
59                                                                       
60                                                                       
61                                                                       
62                                                                       
63                                                                       
64                                                                       
65                                                                       
66                                                                       
67                                                                       
68                                                                       
69                                                                       
70                                                                       
71                                                                       
72                                                                       
73                                                                       
74                                                                       
75                                                                       
76                                                                       
77                                                                       
78                                                                       
79  0.8888194                                                            
80  1.9748418 1.1958261                                                  
81  1.8973666 1.0723805 0.4242641                                        
82  1.9949937 1.1789826 0.3464102 0.1414214                              
83  1.5362291 0.7280110 0.4690416 0.4472136 0.5099020                    
84  0.7745967 0.6403124 1.7378147 1.5099669 1.6309506 1.2806248          
85  1.4071247 0.6082763 1.2247449 1.0099505 1.1224972 0.8366600 0.9055385
86  0.9539392 0.5099020 1.4456832 1.4106736 1.5000000 1.0246951 0.9219544
87  0.3741657 0.7549834 1.7146428 1.7029386 1.7832555 1.3038405 0.9055385
88  1.0816654 0.7071068 1.1618950 1.0246951 1.1090537 0.8185353 0.9110434
89  1.4764823 0.6082763 0.7874008 0.7071068 0.7874008 0.4242641 1.1575837
90  1.6881943 0.8366600 0.6244998 0.3000000 0.4358899 0.3872983 1.2609520
91  1.4866069 0.6633250 0.9433981 0.6403124 0.7549834 0.5916080 0.9539392
92  0.7810250 0.2000000 1.3000000 1.2041595 1.3000000 0.8426150 0.6244998
93  1.4899664 0.6855655 0.5477226 0.4242641 0.5099020 0.1414214 1.1916375
94  2.6000000 1.7464249 0.7874008 0.7211103 0.6480741 1.0954451 2.1817424
95  1.4491377 0.5744563 0.7745967 0.5477226 0.6633250 0.3741657 1.0295630
96  1.3747727 0.5291503 0.8306624 0.7549834 0.8306624 0.4358899 1.0723805
97  1.3453624 0.4690416 0.8185353 0.7000000 0.7937254 0.3872983 1.0148892
98  0.9539392 0.3464102 1.0344080 1.0148892 1.0908712 0.6082763 0.9000000
99  2.6776856 1.8384776 0.7937254 0.9000000 0.8185353 1.1618950 2.3473389
100 1.4177447 0.5477226 0.7000000 0.5744563 0.6708204 0.2645751 1.0908712
101 1.3747727 1.8708287 3.0577770 2.8722813 2.9983329 2.5903668 1.4387495
102 0.9746794 0.7745967 1.8411953 1.5842980 1.7175564 1.3892444 0.3605551
103 1.0630146 1.8814888 3.0149627 2.8861739 2.9949958 2.5670995 1.4798649
104 0.7348469 1.1789826 2.3452079 2.1494185 2.2671568 1.8814888 0.6480741
105 0.9643651 1.5620499 2.7440845 2.5632011 2.6851443 2.2781571 1.0908712
106 1.8788294 2.7092434 3.8196859 3.6891733 3.7934153 3.3808283 2.2693611
107 1.9339080 1.1874342 1.4628739 1.1045361 1.2247449 1.2083046 1.2727922
108 1.4387495 2.2405357 3.3361655 3.1984371 3.3000000 2.9000000 1.7916473
109 0.9486833 1.5588457 2.6343880 2.4372115 2.5495098 2.1954498 1.0295630
110 1.5684387 2.3430749 3.5014283 3.4029399 3.5128336 3.0495901 2.0149442
111 0.4242641 0.9746794 2.1354157 2.0346990 2.1447611 1.6792856 0.8124038
112 0.5567764 1.0000000 2.1330729 1.9467922 2.0663978 1.6763055 0.5385165
113 0.6480741 1.4177447 2.5651511 2.4372115 2.5495098 2.1118712 1.0677078
114 1.1575837 0.8660254 1.8055470 1.5165751 1.6552945 1.3784049 0.5477226
115 1.1618950 1.1045361 2.1377558 1.9052559 2.0420578 1.7000000 0.8306624
116 0.7615773 1.2369317 2.4041631 2.2671568 2.3874673 1.9442222 0.9695360
117 0.5477226 1.1618950 2.3323808 2.1771541 2.2891046 1.8708287 0.7348469
118 2.1863211 3.0049958 4.1376322 4.0521599 4.1521079 3.6959437 2.6495283
119 2.2649503 3.0626786 4.1533119 3.9912404 4.1000000 3.7188708 2.5748786
120 1.0816654 0.8602325 1.6583124 1.3747727 1.4933185 1.2609520 0.5196152
121 0.9643651 1.7262677 2.8861739 2.7658633 2.8792360 2.4310492 1.3820275
122 1.1618950 0.7615773 1.7349352 1.4798649 1.6155494 1.3000000 0.6082763
123 2.0049938 2.8266588 3.9089641 3.7709415 3.8729833 3.4785054 2.3706539
124 0.5196152 0.6164414 1.7233688 1.5588457 1.6763055 1.2688578 0.4123106
125 0.8602325 1.5652476 2.7459060 2.6191602 2.7313001 2.2847319 1.2083046
126 1.1401754 1.9672316 3.0822070 2.9765752 3.0757113 2.6419690 1.5937377
127 0.5830952 0.4795832 1.6186414 1.4628739 1.5811388 1.1575837 0.4242641
128 0.6164414 0.5196152 1.7088007 1.5556349 1.6733201 1.2409674 0.4242641
129 0.8062258 1.3190906 2.4799194 2.2825424 2.4062419 2.0174241 0.8185353
130 0.9486833 1.7748239 2.8390139 2.7386128 2.8319605 2.4124676 1.4212670
131 1.3341664 2.1656408 3.2403703 3.1144823 3.2155870 2.8106939 1.7492856
132 2.0322401 2.8774989 3.9610605 3.9102430 4.0012498 3.5369478 2.5826343
133 0.8602325 1.3674794 2.5258662 2.3280893 2.4535688 2.0639767 0.8831761
134 0.5000000 0.6782330 1.7916473 1.6278821 1.7349352 1.3379088 0.3316625
135 0.9848858 1.1489125 2.1748563 1.9313208 2.0420578 1.7406895 0.5567764
136 1.6031220 2.4698178 3.5510562 3.4539832 3.5566838 3.1224990 2.1142375
137 1.0816654 1.5362291 2.7147744 2.5632011 2.6851443 2.2516660 1.2124356
138 0.6000000 1.1357817 2.3194827 2.1633308 2.2759613 1.8547237 0.7211103
139 0.7348469 0.4358899 1.6062378 1.4491377 1.5684387 1.1401754 0.4690416
140 0.6082763 1.4212670 2.5514702 2.4515301 2.5592968 2.1047565 1.1445523
141 0.9273618 1.5968719 2.7604347 2.6191602 2.7386128 2.3021729 1.2409674
142 0.6480741 1.3601471 2.4372115 2.3622024 2.4698178 2.0049938 1.2083046
143 0.9746794 0.7745967 1.8411953 1.5842980 1.7175564 1.3892444 0.3605551
144 1.1045361 1.8248288 3.0033315 2.8600699 2.9765752 2.5416530 1.4212670
145 1.1045361 1.7578396 2.9291637 2.7964263 2.9154759 2.4698178 1.4212670
146 0.6324555 1.2767145 2.3958297 2.2803509 2.3958297 1.9493589 1.0392305
147 0.6708204 0.8124038 1.8520259 1.6522712 1.7748239 1.4106736 0.4795832
148 0.4123106 1.0000000 2.1656408 2.0322401 2.1470911 1.7058722 0.7141428
149 0.9643651 1.3190906 2.4879711 2.3430749 2.4637370 2.0273135 1.0535654
150 0.8124038 0.6855655 1.8439089 1.6431677 1.7663522 1.3784049 0.3741657
           85        86        87        88        89        90        91
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37                                                                       
38                                                                       
39                                                                       
40                                                                       
41                                                                       
42                                                                       
43                                                                       
44                                                                       
45                                                                       
46                                                                       
47                                                                       
48                                                                       
49                                                                       
50                                                                       
51                                                                       
52                                                                       
53                                                                       
54                                                                       
55                                                                       
56                                                                       
57                                                                       
58                                                                       
59                                                                       
60                                                                       
61                                                                       
62                                                                       
63                                                                       
64                                                                       
65                                                                       
66                                                                       
67                                                                       
68                                                                       
69                                                                       
70                                                                       
71                                                                       
72                                                                       
73                                                                       
74                                                                       
75                                                                       
76                                                                       
77                                                                       
78                                                                       
79                                                                       
80                                                                       
81                                                                       
82                                                                       
83                                                                       
84                                                                       
85                                                                       
86  0.7280110                                                            
87  1.3190906 0.7937254                                                  
88  1.1618950 1.1832160 0.9643651                                        
89  0.4898979 0.7549834 1.2727922 1.0344080                              
90  0.7416198 1.1832160 1.5264338 0.9165151 0.5196152                    
91  0.5196152 1.0295630 1.3674794 0.8602325 0.5196152 0.4242641          
92  0.7141428 0.4690416 0.6244998 0.7615773 0.7141428 0.9899495 0.7745967
93  0.8124038 1.0440307 1.2806248 0.7141428 0.4690416 0.3316625 0.5000000
94  1.5297059 2.0024984 2.3958297 1.7291616 1.2569805 0.9327379 1.2609520
95  0.5099020 0.9110434 1.2884099 0.8306624 0.3162278 0.3000000 0.2645751
96  0.5196152 0.7071068 1.1618950 0.9486833 0.1732051 0.5830952 0.4898979
97  0.4795832 0.7211103 1.1532563 0.8717798 0.1732051 0.4898979 0.4242641
98  0.8544004 0.6480741 0.7000000 0.6164414 0.6403124 0.8602325 0.7745967
99  1.6583124 2.0297783 2.4433583 1.8654758 1.3228757 1.0954451 1.4628739
100 0.5744563 0.8366600 1.2206556 0.8366600 0.2236068 0.3741657 0.4242641
101 2.0371549 1.7776389 1.7000000 2.2360680 2.3727621 2.5922963 2.3194827
102 0.8774964 0.9899495 1.1357817 1.1224972 1.2206556 1.3038405 1.0392305
103 2.2825424 1.8920888 1.4035669 2.0049938 2.4758837 2.6570661 2.4041631
104 1.4560220 1.2609520 1.0488088 1.4317821 1.7320508 1.9000000 1.5905974
105 1.8411953 1.5684387 1.3228757 1.8165902 2.1236761 2.3021729 2.0297783
106 3.1000000 2.7166155 2.1886069 2.7676705 3.3000000 3.4727511 3.1968735
107 0.7348469 1.4247807 1.9183326 1.4730920 1.0295630 0.8774964 0.7937254
108 2.6362853 2.2847319 1.7464249 2.2847319 2.8266588 2.9899833 2.7018512
109 1.9287302 1.7406895 1.2884099 1.5524175 2.1447611 2.2203603 1.9416488
110 2.6758176 2.2022716 1.8601075 2.6134269 2.8913665 3.1543621 2.9103264
111 1.3638182 0.9000000 0.6782330 1.3527749 1.5297059 1.7860571 1.5779734
112 1.3747727 1.1747340 0.8774964 1.1575837 1.5905974 1.7029386 1.4560220
113 1.8220867 1.4317821 1.0099505 1.6093477 2.0099751 2.1977261 1.9672316
114 0.9165151 1.1445523 1.3038405 1.1180340 1.2489996 1.2369317 1.0246951
115 1.1704700 1.1832160 1.3674794 1.4832397 1.5132746 1.6124515 1.4352700
116 1.5231546 1.1532563 1.0488088 1.6217275 1.7663522 1.9974984 1.7860571
117 1.5165751 1.2041595 0.8831761 1.4106736 1.7378147 1.9364917 1.6522712
118 3.3555923 2.8722813 2.4454039 3.2109189 3.5524639 3.8249183 3.5454196
119 3.4423829 3.1272992 2.5942244 3.0495901 3.6619667 3.7762415 3.5071356
120 1.1180340 1.3038405 1.1789826 0.7071068 1.2845233 1.1747340 0.9273618
121 2.0904545 1.6673332 1.3000000 1.9646883 2.3000000 2.5179357 2.2847319
122 0.7000000 0.9165151 1.2609520 1.2165525 1.0816654 1.1832160 0.9695360
123 3.2280025 2.8722813 2.3108440 2.8266588 3.4205263 3.5651087 3.2878564
124 1.0723805 0.8831761 0.6708204 0.8124038 1.2124356 1.3190906 1.1224972
125 1.8920888 1.4798649 1.1832160 1.8681542 2.1213203 2.3685439 2.1047565
126 2.3706539 1.9416488 1.4282857 2.1047565 2.5416530 2.7622455 2.4839485
127 0.9273618 0.7280110 0.6633250 0.8185353 1.0677078 1.2124356 1.0246951
128 0.8602325 0.6082763 0.7071068 1.0148892 1.0677078 1.2922848 1.0630146
129 1.6155494 1.4071247 1.1618950 1.5297059 1.8894444 2.0248457 1.7606817
130 2.2226111 1.8138357 1.2165525 1.8303005 2.3537205 2.5436195 2.2737634
131 2.6000000 2.2293497 1.6431677 2.1702534 2.7640550 2.9103264 2.6514147
132 3.2787193 2.7459060 2.2516660 3.0495901 3.4219877 3.7013511 3.4409301
133 1.6552945 1.4456832 1.2165525 1.5842980 1.9339080 2.0663978 1.8138357
134 1.1000000 0.9055385 0.6403124 0.8831761 1.2529964 1.4071247 1.1224972
135 1.3674794 1.3784049 1.1958261 1.2569805 1.6340135 1.7146428 1.3564660
136 2.9137605 2.4698178 1.9000000 2.5179357 3.0675723 3.2403703 3.0166206
137 1.7291616 1.3928388 1.3674794 1.9646883 2.0273135 2.2847319 2.0396078
138 1.4491377 1.1357817 0.9055385 1.4525839 1.6911535 1.9157244 1.6217275
139 0.7348469 0.5385165 0.7745967 0.9949874 0.9486833 1.1789826 0.9643651
140 1.8520259 1.4000000 0.9433981 1.6248077 2.0074860 2.2181073 2.0049938
141 1.9287302 1.5588457 1.2727922 1.8574176 2.1633308 2.3600847 2.1377558
142 1.8055470 1.3228757 0.9165151 1.5779734 1.9235384 2.1283797 1.9773720
143 0.8774964 0.9899495 1.1357817 1.1224972 1.2206556 1.3038405 1.0392305
144 2.1447611 1.7691806 1.4491377 2.0760539 2.3916521 2.6057628 2.3473389
145 2.0542639 1.6583124 1.4282857 2.0712315 2.3021729 2.5317978 2.3043437
146 1.6792856 1.2767145 0.9486833 1.5132746 1.8493242 2.0322401 1.8574176
147 1.2124356 1.1135529 0.8774964 0.8717798 1.3820275 1.4142136 1.2247449
148 1.3964240 1.0295630 0.7416198 1.2884099 1.5842980 1.7832555 1.5620499
149 1.5000000 1.1575837 1.2124356 1.7944358 1.7916473 2.0639767 1.8275667
150 0.8366600 0.7549834 0.9486833 1.1789826 1.1575837 1.3674794 1.0816654
           92        93        94        95        96        97        98
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37                                                                       
38                                                                       
39                                                                       
40                                                                       
41                                                                       
42                                                                       
43                                                                       
44                                                                       
45                                                                       
46                                                                       
47                                                                       
48                                                                       
49                                                                       
50                                                                       
51                                                                       
52                                                                       
53                                                                       
54                                                                       
55                                                                       
56                                                                       
57                                                                       
58                                                                       
59                                                                       
60                                                                       
61                                                                       
62                                                                       
63                                                                       
64                                                                       
65                                                                       
66                                                                       
67                                                                       
68                                                                       
69                                                                       
70                                                                       
71                                                                       
72                                                                       
73                                                                       
74                                                                       
75                                                                       
76                                                                       
77                                                                       
78                                                                       
79                                                                       
80                                                                       
81                                                                       
82                                                                       
83                                                                       
84                                                                       
85                                                                       
86                                                                       
87                                                                       
88                                                                       
89                                                                       
90                                                                       
91                                                                       
92                                                                       
93  0.8062258                                                            
94  1.8841444 1.1224972                                                  
95  0.7141428 0.3162278 1.1916375                                        
96  0.6000000 0.4582576 1.3527749 0.3316625                              
97  0.5830952 0.3872983 1.3228757 0.2236068 0.1414214                    
98  0.3464102 0.5916080 1.7000000 0.6403124 0.5291503 0.5099020          
99  1.9748418 1.2288206 0.3872983 1.3304135 1.4352700 1.4142136 1.7606817
100 0.6782330 0.2645751 1.2124356 0.1732051 0.2449490 0.1414214 0.5477226
101 1.8165902 2.5357445 3.4971417 2.3515952 2.3194827 2.2803509 2.1213203
102 0.8246211 1.3076697 2.2022716 1.1000000 1.1832160 1.1045361 1.0954451
103 1.7832555 2.5039968 3.5874782 2.4228083 2.3790755 2.3452079 2.0049938
104 1.1000000 1.8055470 2.8248894 1.6552945 1.6401219 1.6031220 1.3964240
105 1.4966630 2.2113344 3.2295511 2.0663978 2.0493902 2.0049938 1.7776389
106 2.5961510 3.3120990 4.3988635 3.2388269 3.1906112 3.1654384 2.8106939
107 1.3379088 1.1489125 1.4071247 0.8831761 1.1090537 1.0246951 1.4317821
108 2.1213203 2.8266588 3.9102430 2.7549955 2.7092434 2.6870058 2.3366643
109 1.4866069 2.1023796 3.1336879 2.0149442 2.0420578 1.9924859 1.7058722
110 2.2427661 3.0099834 4.0767634 2.9017236 2.8124722 2.7910571 2.4839485
111 0.9000000 1.6431677 2.7018512 1.5362291 1.4594520 1.4247807 1.1445523
112 0.9591663 1.5968719 2.6324893 1.4866069 1.5099669 1.4491377 1.2000000
113 1.3379088 2.0542639 3.1272992 1.9646883 1.9261360 1.8841444 1.5652476
114 0.9643651 1.2884099 2.1023796 1.0862780 1.2369317 1.1357817 1.1789826
115 1.1747340 1.6401219 2.4677925 1.4387495 1.5165751 1.4282857 1.4212670
116 1.1958261 1.9026298 2.9086079 1.7606817 1.7175564 1.6703293 1.4594520
117 1.0630146 1.8055470 2.8670542 1.6852300 1.6401219 1.6093477 1.3379088
118 2.8722813 3.6523965 4.7476310 3.5608988 3.4481879 3.4452866 3.1032241
119 2.9698485 3.6373067 4.6936127 3.5651087 3.5580894 3.5185224 3.1780497
120 0.9055385 1.1357817 2.0371549 1.0440307 1.2083046 1.1224972 1.0295630
121 1.6431677 2.3811762 3.4452866 2.2781571 2.2226111 2.1863211 1.8814888
122 0.8602325 1.2369317 2.0420578 0.9949874 1.0862780 1.0000000 1.1045361
123 2.7147744 3.4029399 4.4833024 3.3406586 3.3060551 3.2787193 2.9171904
124 0.6164414 1.1958261 2.2472205 1.1090537 1.1401754 1.0677078 0.8124038
125 1.4662878 2.2360680 3.2954514 2.1118712 2.0371549 2.0124612 1.7349352
126 1.8357560 2.5845696 3.6851052 2.5099801 2.4269322 2.4145393 2.0566964
127 0.5000000 1.0954451 2.1400935 0.9899495 1.0049876 0.9327379 0.7141428
128 0.5000000 1.1916375 2.2135944 1.0392305 1.0049876 0.9539392 0.7937254
129 1.2727922 1.9416488 2.9512709 1.8027756 1.8165902 1.7606817 1.5427249
130 1.6401219 2.3494680 3.4554305 2.3021729 2.2293497 2.2158520 1.8303005
131 2.0566964 2.7386128 3.8288379 2.6870058 2.6514147 2.6210685 2.2472205
132 2.7349589 3.5000000 4.6119410 3.4394767 3.3105891 3.3136083 2.9325757
133 1.3304135 1.9899749 2.9899833 1.8493242 1.8681542 1.8083141 1.5968719
134 0.5830952 1.2609520 2.3302360 1.1618950 1.1401754 1.1045361 0.8366600
135 1.0770330 1.6401219 2.5980762 1.4933185 1.5231546 1.4899664 1.3416408
136 2.3706539 3.0643107 4.1605288 3.0182777 2.9698485 2.9359837 2.5495098
137 1.4832397 2.2113344 3.1859065 2.0371549 1.9798990 1.9442222 1.7776389
138 1.0344080 1.7944358 2.8425341 1.6552945 1.5968719 1.5716234 1.3304135
139 0.4582576 1.0954451 2.0928450 0.9273618 0.9000000 0.8426150 0.7416198
140 1.3341664 2.0566964 3.1416556 1.9824228 1.9235384 1.8867962 1.5427249
141 1.5394804 2.2494444 3.2832910 2.1307276 2.1000000 2.0518285 1.7860571
142 1.3076697 1.9697716 3.0298515 1.9131126 1.8627936 1.8138357 1.4730920
143 0.8246211 1.3076697 2.2022716 1.1000000 1.1832160 1.1045361 1.0954451
144 1.7406895 2.4859606 3.5355339 2.3622024 2.3130067 2.2781571 2.0024984
145 1.6941074 2.4248711 3.4496377 2.2934690 2.2427661 2.2022716 1.9519221
146 1.2369317 1.9026298 2.9461840 1.8165902 1.7916473 1.7349352 1.4387495
147 0.8366600 1.3228757 2.3302360 1.2369317 1.3190906 1.2328828 1.0099505
148 0.9380832 1.6522712 2.7110883 1.5459625 1.5099669 1.4628739 1.1832160
149 1.2727922 1.9924859 2.9580399 1.8138357 1.7492856 1.7146428 1.5684387
150 0.6708204 1.3190906 2.2759613 1.1135529 1.1000000 1.0535654 0.9949874
           99       100       101       102       103       104       105
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37                                                                       
38                                                                       
39                                                                       
40                                                                       
41                                                                       
42                                                                       
43                                                                       
44                                                                       
45                                                                       
46                                                                       
47                                                                       
48                                                                       
49                                                                       
50                                                                       
51                                                                       
52                                                                       
53                                                                       
54                                                                       
55                                                                       
56                                                                       
57                                                                       
58                                                                       
59                                                                       
60                                                                       
61                                                                       
62                                                                       
63                                                                       
64                                                                       
65                                                                       
66                                                                       
67                                                                       
68                                                                       
69                                                                       
70                                                                       
71                                                                       
72                                                                       
73                                                                       
74                                                                       
75                                                                       
76                                                                       
77                                                                       
78                                                                       
79                                                                       
80                                                                       
81                                                                       
82                                                                       
83                                                                       
84                                                                       
85                                                                       
86                                                                       
87                                                                       
88                                                                       
89                                                                       
90                                                                       
91                                                                       
92                                                                       
93                                                                       
94                                                                       
95                                                                       
96                                                                       
97                                                                       
98                                                                       
99                                                                       
100 1.3038405                                                            
101 3.6110940 2.3790755                                                  
102 2.3622024 1.1747340 1.3341664                                        
103 3.6959437 2.4248711 0.9486833 1.5684387                              
104 2.9748950 1.6941074 0.9000000 0.7416198 0.9110434                    
105 3.3555923 2.0928450 0.5099020 1.0770330 0.6164414 0.5000000          
106 4.5232732 3.2465366 1.5165751 2.3706539 0.8602325 1.6703293 1.3638182
107 1.6278821 1.0246951 2.3430749 1.1180340 2.6851443 1.8275667 2.1794495
108 4.0472213 2.7676705 1.3190906 1.9339080 0.5477226 1.2206556 1.0295630
109 3.3000000 2.0566964 1.1532563 1.1618950 0.7141428 0.6000000 0.6708204
110 4.1460825 2.8861739 0.9539392 2.0322401 0.7549834 1.4282857 1.0148892
111 2.7694765 1.5132746 1.0535654 0.8660254 1.0246951 0.6480741 0.7549834
112 2.7676705 1.5165751 1.1045361 0.6324555 0.9899495 0.3872983 0.6633250
113 3.2233523 1.9621417 0.8660254 1.1357817 0.5000000 0.6000000 0.4358899
114 2.2737634 1.1789826 1.5000000 0.2645751 1.7406895 0.9591663 1.2529964
115 2.5845696 1.4899664 1.1489125 0.5099020 1.5684387 0.9327379 1.0295630
116 2.9849623 1.7578396 0.7416198 0.9000000 0.9643651 0.6633250 0.5567764
117 2.9916551 1.7000000 0.9327379 0.8660254 0.7810250 0.2449490 0.5000000
118 4.8321838 3.5454196 1.6703293 2.7331301 1.2845233 2.0346990 1.7000000
119 4.8394215 3.5888717 1.8165902 2.6495283 1.2489996 1.9974984 1.6792856
120 2.2494444 1.1401754 1.8165902 0.6782330 1.7378147 1.0148892 1.4212670
121 3.5298725 2.2715633 0.7071068 1.4071247 0.4000000 0.8426150 0.4690416
122 2.1817424 1.0677078 1.4832397 0.3162278 1.8165902 1.0148892 1.3038405
123 4.6206060 3.3541020 1.7175564 2.4879711 1.0246951 1.7944358 1.5264338
124 2.3622024 1.1224972 1.4352700 0.5477226 1.3490738 0.7280110 1.0488088
125 3.3896903 2.1095023 0.6403124 1.2529964 0.5385165 0.6480741 0.3872983
126 3.7934153 2.5039968 1.1445523 1.7406895 0.3872983 1.0295630 0.8544004
127 2.2427661 0.9949874 1.4798649 0.5196152 1.4662878 0.8124038 1.1357817
128 2.3130067 1.0440307 1.3527749 0.4795832 1.4456832 0.7348469 1.0630146
129 3.0886890 1.8384776 0.7615773 0.8124038 0.7874008 0.3316625 0.3162278
130 3.5707142 2.2956481 1.3228757 1.6217275 0.5196152 0.9486833 0.9219544
131 3.9534795 2.6925824 1.3527749 1.8894444 0.4582576 1.2165525 1.0148892
132 4.6797436 3.4088121 1.7944358 2.7055499 1.2409674 2.0124612 1.7320508
133 3.1224990 1.8841444 0.7141428 0.8426150 0.7937254 0.4242641 0.3000000
134 2.4698178 1.1832160 1.4352700 0.6480741 1.2961481 0.5916080 1.0295630
135 2.8035692 1.5684387 1.3784049 0.7745967 1.3190906 0.5385165 1.0000000
136 4.2497059 3.0066593 1.4491377 2.2045408 0.6633250 1.5716234 1.2409674
137 3.2710854 2.0445048 0.4242641 1.1135529 0.9899495 0.7810250 0.5291503
138 2.9647934 1.6703293 0.8888194 0.8306624 0.8660254 0.2449490 0.5196152
139 2.1886069 0.9327379 1.4525839 0.4795832 1.5842980 0.8602325 1.1874342
140 3.2186954 1.9646883 0.9591663 1.2247449 0.5477226 0.7280110 0.5830952
141 3.3719431 2.1330729 0.6082763 1.2124356 0.5916080 0.7483315 0.3605551
142 3.0740852 1.8788294 1.1180340 1.2369317 0.8544004 0.9486833 0.8185353
143 2.3622024 1.1747340 1.3341664 0.0000000 1.5684387 0.7416198 1.0770330
144 3.6373067 2.3685439 0.5567764 1.4317821 0.4123106 0.8246211 0.3872983
145 3.5284558 2.2912878 0.5000000 1.3747727 0.6708204 0.9055385 0.4795832
146 3.0149627 1.8027756 0.9643651 1.0344080 0.8306624 0.7615773 0.6403124
147 2.4657656 1.2727922 1.4142136 0.5477226 1.3190906 0.7280110 1.0099505
148 2.8035692 1.5427249 1.0099505 0.7745967 0.9273618 0.5000000 0.6324555
149 3.0364453 1.8165902 0.6480741 0.9486833 1.1224972 0.7416198 0.6480741
150 2.4062419 1.1532563 1.2449900 0.3316625 1.4730920 0.6480741 1.0049876
          106       107       108       109       110       111       112
2                                                                        
3                                                                        
4                                                                        
5                                                                        
6                                                                        
7                                                                        
8                                                                        
9                                                                        
10                                                                       
11                                                                       
12                                                                       
13                                                                       
14                                                                       
15                                                                       
16                                                                       
17                                                                       
18                                                                       
19                                                                       
20                                                                       
21                                                                       
22                                                                       
23                                                                       
24                                                                       
25                                                                       
26                                                                       
27                                                                       
28                                                                       
29                                                                       
30                                                                       
31                                                                       
32                                                                       
33                                                                       
34                                                                       
35                                                                       
36                                                                       
37                                                                       
38                                                                       
39                                                                       
40                                                                       
41                                                                       
42                                                                       
43                                                                       
44                                                                       
45                                                                       
46                                                                       
47                                                                       
48                                                                       
49                                                                       
50                                                                       
51                                                                       
52                                                                       
53                                                                       
54                                                                       
55                                                                       
56                                                                       
57                                                                       
58                                                                       
59                                                                       
60                                                                       
61                                                                       
62                                                                       
63                                                                       
64                                                                       
65                                                                       
66                                                                       
67                                                                       
68                                                                       
69                                                                       
70                                                                       
71                                                                       
72                                                                       
73                                                                       
74                                                                       
75                                                                       
76                                                                       
77                                                                       
78                                                                       
79                                                                       
(Output truncated: the printed dist object is the lower triangle of pairwise distances between all 150 observations, which runs to thousands of lines and is not readable in full.)
140                    
141                    
142                    
143                    
144                    
145                    
146                    
147                    
148                    
149 0.6164414          
150 0.6403124 0.7681146

The Euclidean distance is used to calculate the distance between and among groups.
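
For two points this is just Pythagoras; a tiny worked sketch using base R's dist() (the points here are my own illustration):

m <- rbind(a = c(0, 0), b = c(3, 4))
dist(m)   # sqrt(3^2 + 4^2) = 5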

Non-Euclidean grouping, using non-metric multidimensional scaling (NMDS):

nmds$points

This did not seem to work when rendering!
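
As a fallback, here is a minimal sketch of how an NMDS object can be rebuilt and its points extracted, assuming the vegan package; the metaMDS() call and the mtcars stand-in data are my own illustration, not the original session's objects:

library(vegan)

d <- dist(scale(mtcars[, 1:7]))   # stand-in Euclidean distance matrix
set.seed(42)                      # metaMDS uses random starts
nmds <- metaMDS(d, k = 2, trace = FALSE)
head(nmds$points)                 # ordination coordinates, one row per observation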

Thursday session
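
The examples below lean on a few packages; a minimal setup sketch, assuming factoextra (which ships the decathlon2 data), FactoMineR (the PCA() function used further down) and corrplot are installed:

library(factoextra)   # decathlon2 data and fviz_* plot helpers
library(FactoMineR)   # PCA()
library(corrplot)     # corrplot() correlation plots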

data("decathlon2")
str(decathlon2)
'data.frame':   27 obs. of  13 variables:
 $ X100m       : num  11 10.8 11 11.3 11.1 ...
 $ Long.jump   : num  7.58 7.4 7.23 7.09 7.3 7.31 6.81 7.56 6.97 7.27 ...
 $ Shot.put    : num  14.8 14.3 14.2 15.2 13.5 ...
 $ High.jump   : num  2.07 1.86 1.92 2.1 2.01 2.13 1.95 1.86 1.95 1.98 ...
 $ X400m       : num  49.8 49.4 48.9 50.4 48.6 ...
 $ X110m.hurdle: num  14.7 14.1 15 15.3 14.2 ...
 $ Discus      : num  43.8 50.7 40.9 46.3 45.7 ...
 $ Pole.vault  : num  5.02 4.92 5.32 4.72 4.42 4.42 4.92 4.82 4.72 4.62 ...
 $ Javeline    : num  63.2 60.1 62.8 63.4 55.4 ...
 $ X1500m      : num  292 302 280 276 268 ...
 $ Rank        : int  1 2 4 5 7 8 9 10 11 12 ...
 $ Points      : int  8217 8122 8067 8036 8004 7995 7802 7733 7708 7651 ...
 $ Competition : Factor w/ 2 levels "Decastar","OlympicG": 1 1 1 1 1 1 1 1 1 1 ...

We want to group the athletes together by the events they are going to be best at.

decathlon2.active <- decathlon2[1:23, 1:10]
head(decathlon2.active[, 1:6])
          X100m Long.jump Shot.put High.jump X400m X110m.hurdle
SEBRLE    11.04      7.58    14.83      2.07 49.81        14.69
CLAY      10.76      7.40    14.26      1.86 49.37        14.05
BERNARD   11.02      7.23    14.25      1.92 48.93        14.99
YURKOV    11.34      7.09    15.19      2.10 50.42        15.31
ZSIVOCZKY 11.13      7.30    13.48      2.01 48.62        14.17
McMULLEN  10.83      7.31    13.76      2.13 49.91        14.38
decathlon2.active_stats <- data.frame(
  Min = apply(decathlon2.active, 2, min), # minimum
  Q1 = apply(decathlon2.active, 2, quantile, 1/4), # First quartile
  Med = apply(decathlon2.active, 2, median), # median
  Mean = apply(decathlon2.active, 2, mean), # mean
  Q3 = apply(decathlon2.active, 2, quantile, 3/4), # Third quartile
  Max = apply(decathlon2.active, 2, max) # Maximum
  )
decathlon2.active_stats <- round(decathlon2.active_stats, 1)
head(decathlon2.active_stats)
              Min   Q1  Med Mean   Q3  Max
X100m        10.4 10.8 11.0 11.0 11.2 11.6
Long.jump     6.8  7.2  7.3  7.3  7.5  8.0
Shot.put     12.7 14.2 14.7 14.6 15.1 16.4
High.jump     1.9  1.9  2.0  2.0  2.1  2.1
X400m        46.8 49.0 49.4 49.4 50.0 51.2
X110m.hurdle 14.0 14.2 14.4 14.5 14.9 15.7
cor.mat <- round(cor(decathlon2[,1:10]), 2)
head(cor.mat)
             X100m Long.jump Shot.put High.jump X400m X110m.hurdle Discus
X100m         1.00     -0.74    -0.37     -0.31  0.57         0.67  -0.39
Long.jump    -0.74      1.00     0.37      0.27 -0.50        -0.55   0.33
Shot.put     -0.37      0.37     1.00      0.57 -0.21        -0.27   0.72
High.jump    -0.31      0.27     0.57      1.00 -0.26        -0.20   0.42
X400m         0.57     -0.50    -0.21     -0.26  1.00         0.60  -0.25
X110m.hurdle  0.67     -0.55    -0.27     -0.20  0.60         1.00  -0.42
             Pole.vault Javeline X1500m
X100m              0.01    -0.27  -0.18
Long.jump          0.08     0.29   0.17
Shot.put          -0.07     0.48   0.01
High.jump         -0.55     0.21  -0.16
X400m              0.11     0.02   0.18
X110m.hurdle       0.12     0.10  -0.10

The table above shows the correlations between events. Remember that values closest to -1 or 1 show strong correlations, while those closer to 0 show weak to no correlation. X100m and X400m have a positive correlation of 0.57, showing that athletes with a slower 100 m generally also ran a slower 400 m.
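
Individual pairwise values can be pulled straight from the matrix by name; a quick sketch using the cor.mat object built above:

cor.mat["X100m", "X400m"]       # 0.57: slower sprinters were also slower over 400 m
cor.mat["X100m", "Long.jump"]   # -0.74: faster sprinters jumped further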

corrplot(cor.mat, type="upper", order="hclust", 
         tl.col="black", tl.srt=45)

This is another way to look at the correlations.

Now to look at the PCA

The princomp() function can also be used to run a PCA.

pc1 <- princomp(decathlon2[,1:10])
summary(pc1)
Importance of components:
                           Comp.1    Comp.2     Comp.3      Comp.4      Comp.5
Standard deviation     10.0416571 5.2796539 3.15640452 0.948715557 0.503217004
Proportion of Variance  0.7199694 0.1990280 0.07113586 0.006426512 0.001808064
Cumulative Proportion   0.7199694 0.9189974 0.99013328 0.996559790 0.998367855
                           Comp.6       Comp.7       Comp.8       Comp.9
Standard deviation     0.34429323 0.2202867681 0.2009115461 0.1353626628
Proportion of Variance 0.00084637 0.0003464815 0.0002882127 0.0001308281
Cumulative Proportion  0.99921422 0.9995607062 0.9998489188 0.9999797469
                            Comp.10
Standard deviation     5.325908e-02
Proportion of Variance 2.025306e-05
Cumulative Proportion  1.000000e+00

The ,1:10 index selects the columns used in the PCA; it should always run up to the maximum number of variables you have.

The table above shows the percentage of variance each component explains: Comp.1 gives 72%, Comp.2 gives 20%, and by the third component we have 99% of the variance.

pc1$loadings

Loadings:
             Comp.1 Comp.2 Comp.3 Comp.4 Comp.5 Comp.6 Comp.7 Comp.8 Comp.9
X100m                              0.179         0.317  0.119  0.363  0.847
Long.jump                         -0.173        -0.289        -0.797  0.494
Shot.put                   -0.142        -0.974                            
High.jump                                              -0.244              
X400m                       0.108  0.917        -0.368                     
X110m.hurdle                       0.281         0.821        -0.456 -0.159
Discus              -0.279 -0.935  0.128  0.156                            
Pole.vault                                              0.954 -0.102 -0.113
Javeline            -0.956  0.287                                          
X1500m       -0.997                                                        
             Comp.10
X100m               
Long.jump           
Shot.put            
High.jump     0.968 
X400m               
X110m.hurdle        
Discus              
Pole.vault    0.232 
Javeline            
X1500m              

               Comp.1 Comp.2 Comp.3 Comp.4 Comp.5 Comp.6 Comp.7 Comp.8 Comp.9
SS loadings       1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
Proportion Var    0.1    0.1    0.1    0.1    0.1    0.1    0.1    0.1    0.1
Cumulative Var    0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9
               Comp.10
SS loadings        1.0
Proportion Var     0.1
Cumulative Var     1.0

This is misleading because the variables are not scaled together, so the loadings are unbalanced: the times for the 1500m are huge compared with the other events and dominate the first component.

To fix this:

pc2 <- princomp(decathlon2[,1:10], cor = TRUE)
summary(pc2)
Importance of components:
                          Comp.1    Comp.2    Comp.3    Comp.4     Comp.5
Standard deviation     1.9364846 1.3210481 1.2320016 1.0159725 0.78602715
Proportion of Variance 0.3749973 0.1745168 0.1517828 0.1032200 0.06178387
Cumulative Proportion  0.3749973 0.5495141 0.7012969 0.8045169 0.86630075
                           Comp.6     Comp.7     Comp.8     Comp.9    Comp.10
Standard deviation     0.65443931 0.57088553 0.52856662 0.43716446 0.33510587
Proportion of Variance 0.04282908 0.03259103 0.02793827 0.01911128 0.01122959
Cumulative Proportion  0.90912983 0.94172086 0.96965913 0.98877041 1.00000000

Now this is properly scaled, using cor = TRUE to base the PCA on the correlation matrix.

Now our first component explains only 37%, and we only reach 99% by Comp.9.
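
What cor = TRUE is doing can be checked by standardising the data by hand; a small sketch (my own check, not from the session): princomp() divides by n rather than n - 1, so the standard deviations differ by a constant factor, but the loadings agree up to sign.

pc2b <- princomp(scale(decathlon2[, 1:10]))
round(abs(pc2b$loadings[, 1]) - abs(pc2$loadings[, 1]), 3)   # all zeros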

pc2$loadings

Loadings:
             Comp.1 Comp.2 Comp.3 Comp.4 Comp.5 Comp.6 Comp.7 Comp.8 Comp.9
X100m         0.423  0.259                0.280  0.160         0.353  0.712
Long.jump    -0.392 -0.289        -0.183 -0.336        -0.249  0.730  0.128
Shot.put     -0.369  0.214  0.385         0.354  0.322 -0.231              
High.jump    -0.314  0.463               -0.382  0.527        -0.250  0.146
X400m         0.332  0.112  0.419  0.266 -0.253 -0.239 -0.690        -0.137
X110m.hurdle  0.370  0.225  0.338 -0.157 -0.205  0.262  0.428  0.364 -0.496
Discus       -0.370  0.155  0.219  0.391  0.432 -0.282  0.184  0.269 -0.186
Pole.vault    0.114 -0.558  0.327 -0.248  0.334  0.436 -0.127 -0.161       
Javeline     -0.183         0.564 -0.478 -0.170 -0.424  0.233 -0.199  0.333
X1500m              -0.430  0.286  0.642 -0.323  0.109  0.344         0.199
             Comp.10
X100m               
Long.jump           
Shot.put     -0.617 
High.jump     0.415 
X400m         0.120 
X110m.hurdle        
Discus        0.480 
Pole.vault    0.403 
Javeline            
X1500m       -0.190 

               Comp.1 Comp.2 Comp.3 Comp.4 Comp.5 Comp.6 Comp.7 Comp.8 Comp.9
SS loadings       1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
Proportion Var    0.1    0.1    0.1    0.1    0.1    0.1    0.1    0.1    0.1
Cumulative Var    0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9
               Comp.10
SS loadings        1.0
Proportion Var     0.1
Cumulative Var     1.0

To read this table, look at the largest loadings for each event to establish which events make up each component.

To find out how many components to use, look at the standard deviations of the components: we keep every component with a standard deviation higher than 1 and drop the rest; a quick sketch of applying this follows.
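
A scree plot makes the cut-off easy to judge by eye; a minimal sketch using the base screeplot() function and the standard deviations stored in the princomp object:

screeplot(pc2, type = "lines")   # look for where the line flattens out
pc2$sdev[pc2$sdev > 1]           # the components retained under the SD > 1 rule
sum(pc2$sdev > 1)                # how many components to keep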

This is enough to interpret princomp outputs.

PCA

This uses the PCA() function from the FactoMineR package.

pc3 <- PCA(decathlon2[,1:10])

pc3
**Results for the Principal Component Analysis (PCA)**
The analysis was performed on 27 individuals, described by 10 variables
*The results are available in the following objects:

   name               description                          
1  "$eig"             "eigenvalues"                        
2  "$var"             "results for the variables"          
3  "$var$coord"       "coord. for the variables"           
4  "$var$cor"         "correlations variables - dimensions"
5  "$var$cos2"        "cos2 for the variables"             
6  "$var$contrib"     "contributions of the variables"     
7  "$ind"             "results for the individuals"        
8  "$ind$coord"       "coord. for the individuals"         
9  "$ind$cos2"        "cos2 for the individuals"           
10 "$ind$contrib"     "contributions of the individuals"   
11 "$call"            "summary statistics"                 
12 "$call$centre"     "mean of the variables"              
13 "$call$ecart.type" "standard error of the variables"    
14 "$call$row.w"      "weights for the individuals"        
15 "$call$col.w"      "weights for the variables"          
pc3$eig
        eigenvalue percentage of variance cumulative percentage of variance
comp 1   3.7499727              37.499727                          37.49973
comp 2   1.7451681              17.451681                          54.95141
comp 3   1.5178280              15.178280                          70.12969
comp 4   1.0322001              10.322001                          80.45169
comp 5   0.6178387               6.178387                          86.63008
comp 6   0.4282908               4.282908                          90.91298
comp 7   0.3259103               3.259103                          94.17209
comp 8   0.2793827               2.793827                          96.96591
comp 9   0.1911128               1.911128                          98.87704
comp 10  0.1122959               1.122959                         100.00000
pc3$var
$coord
                   Dim.1      Dim.2        Dim.3       Dim.4      Dim.5
X100m        -0.81895206  0.3427787  0.100864539  0.10134200 -0.2198061
Long.jump     0.75889854 -0.3814931 -0.006261254 -0.18542415  0.2637141
Shot.put      0.71507829  0.2821167  0.473854591  0.03610404 -0.2786370
High.jump     0.60849326  0.6113542  0.004605966  0.07124353  0.3005866
X400m        -0.64384815  0.1484225  0.515759382  0.26978529  0.1992387
X110m.hurdle -0.71642027  0.2975519  0.416451017 -0.15978086  0.1610208
Discus        0.71688812  0.2043979  0.270322202  0.39762306 -0.3394923
Pole.vault   -0.22141731 -0.7375479  0.403083623 -0.25154946 -0.2625927
Javeline      0.35517566  0.0985309  0.695433666 -0.48555900  0.1334223
X1500m        0.06971223 -0.5681197  0.352757755  0.65246136  0.2536784

$cor
                   Dim.1      Dim.2        Dim.3       Dim.4      Dim.5
X100m        -0.81895206  0.3427787  0.100864539  0.10134200 -0.2198061
Long.jump     0.75889854 -0.3814931 -0.006261254 -0.18542415  0.2637141
Shot.put      0.71507829  0.2821167  0.473854591  0.03610404 -0.2786370
High.jump     0.60849326  0.6113542  0.004605966  0.07124353  0.3005866
X400m        -0.64384815  0.1484225  0.515759382  0.26978529  0.1992387
X110m.hurdle -0.71642027  0.2975519  0.416451017 -0.15978086  0.1610208
Discus        0.71688812  0.2043979  0.270322202  0.39762306 -0.3394923
Pole.vault   -0.22141731 -0.7375479  0.403083623 -0.25154946 -0.2625927
Javeline      0.35517566  0.0985309  0.695433666 -0.48555900  0.1334223
X1500m        0.06971223 -0.5681197  0.352757755  0.65246136  0.2536784

$cos2
                   Dim.1       Dim.2        Dim.3       Dim.4      Dim.5
X100m        0.670682478 0.117497253 1.017366e-02 0.010270201 0.04831474
Long.jump    0.575926998 0.145537006 3.920330e-05 0.034382115 0.06954513
Shot.put     0.511336968 0.079589834 2.245382e-01 0.001303502 0.07763857
High.jump    0.370264042 0.373753996 2.121493e-05 0.005075641 0.09035233
X400m        0.414540443 0.022029236 2.660077e-01 0.072784101 0.03969604
X110m.hurdle 0.513258007 0.088537114 1.734314e-01 0.025529923 0.02592771
Discus       0.513928570 0.041778514 7.307409e-02 0.158104095 0.11525503
Pole.vault   0.049025625 0.543976848 1.624764e-01 0.063277132 0.06895491
Javeline     0.126149749 0.009708339 4.836280e-01 0.235767546 0.01780151
X1500m       0.004859795 0.322759992 1.244380e-01 0.425705821 0.06435272

$contrib
                  Dim.1      Dim.2        Dim.3      Dim.4     Dim.5
X100m        17.8849964  6.7327182  0.670277238  0.9949816  7.819960
Long.jump    15.3581652  8.3394260  0.002582856  3.3309545 11.256195
Shot.put     13.6357518  4.5605826 14.793387670  0.1262838 12.566155
High.jump     9.8737797 21.4165036  0.001397716  0.4917303 14.623936
X400m        11.0544924  1.2622988 17.525552828  7.0513559  6.424985
X110m.hurdle 13.6869799  5.0732713 11.426291715  2.4733503  4.196517
Discus       13.7048617  2.3939535  4.814385760 15.3171947 18.654551
Pole.vault    1.3073595 31.1704550 10.704533859  6.1303166 11.160666
Javeline      3.3640178  0.5562982 31.863162259 22.8412641  2.881255
X1500m        0.1295955 18.4944926  8.198428099 41.2425682 10.415780
pc3$ind
$coord
                  Dim.1       Dim.2       Dim.3       Dim.4       Dim.5
SEBRLE       0.27795816 -0.53643446  1.58523905  0.10582254  1.07462350
CLAY         0.90485358 -2.09428033  0.84068483  1.85071783 -0.40864484
BERNARD     -1.37226593 -1.34811551  0.96193172 -1.49307179 -0.18266695
YURKOV      -0.92820476  2.28174439  1.94268810  0.09682298  0.19092728
ZSIVOCZKY   -0.10381712  1.08982206 -2.09890758  0.07190646 -0.03293753
McMULLEN     0.23985797  0.93909229 -0.81813616  1.20189259  1.83019935
MARTINEAU   -2.53729150  1.80109355  0.05197499  0.37430608 -2.28541065
HERNU       -1.90284276 -0.33027691  1.28868243  0.76650496  0.23946548
BARRAS      -1.80562476  0.30259042 -0.59280994  0.65652606 -0.24403890
NOOL        -2.88173660  0.86385407 -1.40244800 -1.49119549  1.35872620
BOURGUIGNON -4.50552974 -0.48542232  1.20270407  0.95136343  0.50886452
Sebrle       3.56775576  0.06800653  1.91121552 -1.04236336 -0.30059583
Clay         3.47217746 -0.70559895  1.60702918 -0.69610798  0.74181520
Karpov       4.32876094  0.16078863 -1.15252940  0.40768939 -0.77248484
Macey        1.94447458  2.52394759 -0.26030390 -0.07980903 -0.02502407
Warners      1.55208189 -1.48863373 -1.41419607 -0.54966495  0.10219038
Zsivoczky    0.47515255  1.97176276  0.90018250 -0.72528762  0.17169629
Hernu        0.28084133  0.82269611 -0.90579446 -0.78238890 -0.77138935
Bernard      1.53327964  1.08583185 -1.24571714  0.53472225  1.04287871
Schwarzl    -0.67797434 -1.13425675 -0.42218044 -0.60985100 -0.10068641
Pogorelov   -0.07787933 -0.33365807  0.60795140  1.44699856  0.20462272
Schoenbeck  -0.48740500 -0.86068770  0.86671186 -0.17322579 -0.43298531
Barras      -0.41308108  1.36689345  0.22729553 -0.75332386 -0.86177769
KARPOV       0.96774776 -0.99559913 -0.47601937  2.50106898 -0.66134899
WARNERS     -0.28004348 -0.91215821 -1.41698173 -0.14716129  0.03616162
Nool        -0.53538892 -2.13563762  0.61605320 -1.80807906 -0.31995379
Drews       -1.03585630 -1.91736400 -2.40432024 -0.61481200 -0.10222610

$cos2
                  Dim.1       Dim.2       Dim.3        Dim.4        Dim.5
SEBRLE      0.015446507 0.057531377 0.502413099 0.0022388646 2.308788e-01
CLAY        0.065574226 0.351274143 0.056603460 0.2743196867 1.337422e-02
BERNARD     0.232236641 0.224134336 0.114114984 0.2749258382 4.115041e-03
YURKOV      0.084812169 0.512512631 0.371515341 0.0009228422 3.588447e-03
ZSIVOCZKY   0.001508695 0.166254972 0.616666115 0.0007237678 1.518607e-04
McMULLEN    0.008268165 0.126741081 0.096194895 0.2076023015 4.813906e-01
MARTINEAU   0.404809538 0.203977634 0.000169863 0.0088097536 3.284267e-01
HERNU       0.362608741 0.010924181 0.166312267 0.0588386106 5.742729e-03
BARRAS      0.617959073 0.017354617 0.066609421 0.0816974783 1.128815e-02
NOOL        0.514439600 0.046228159 0.121842657 0.1377510721 1.143641e-01
BOURGUIGNON 0.841449586 0.009767330 0.059958939 0.0375171024 1.073348e-02
Sebrle      0.674281017 0.000244992 0.193495125 0.0575557845 4.786484e-03
Clay        0.686401077 0.028345883 0.147035315 0.0275884533 3.133037e-02
Karpov      0.826740626 0.001140651 0.058606543 0.0073333243 2.632825e-02
Macey       0.349765258 0.589295050 0.006268065 0.0005892182 5.792793e-05
Warners     0.336540414 0.309587684 0.279400521 0.0422089014 1.458908e-03
Zsivoczky   0.035468777 0.610786464 0.127303756 0.0826419605 4.631289e-03
Hernu       0.024732878 0.212242088 0.257283505 0.1919543626 1.865950e-01
Bernard     0.336842041 0.168930748 0.222342471 0.0409675572 1.558300e-01
Schwarzl    0.131904148 0.369194047 0.051147940 0.1067282592 2.909200e-03
Pogorelov   0.001032843 0.018958028 0.062940129 0.3565546476 7.130132e-03
Schoenbeck  0.064365572 0.200708083 0.203527523 0.0081301543 5.079489e-02
Barras      0.027236841 0.298232827 0.008246467 0.0905835986 1.185432e-01
KARPOV      0.086877302 0.091949844 0.021019928 0.5802742721 4.057358e-02
WARNERS     0.016684285 0.177009676 0.427154672 0.0046072715 2.781970e-04
Nool        0.031247723 0.497204143 0.041372909 0.3563809236 1.115974e-02
Drews       0.096356670 0.330135249 0.519119550 0.0339443449 9.384397e-04

$contrib
                   Dim.1        Dim.2        Dim.3       Dim.4        Dim.5
SEBRLE       0.076307460  0.610706158  6.132015051  0.04018174  6.922672774
CLAY         0.808657739  9.308261762  1.724567163 12.29002506  1.001044040
BERNARD      1.859879013  3.857031402  2.257886894  7.99896372  0.200023538
YURKOV       0.850933674 11.049253694  9.209156448  0.03363793  0.218522972
ZSIVOCZKY    0.010645010  2.520636092 10.749798438  0.01855274  0.006503440
McMULLEN     0.056821989  1.871610682  1.633295867  5.18326795 20.079732810
MARTINEAU    6.358414865  6.884485726  0.006591778  0.50271993 31.310473305
HERNU        3.576135265  0.231502336  4.052336553  2.10815381  0.343753384
BARRAS       3.220053866  0.194316330  0.857520750  1.54659390  0.357009066
NOOL         8.201942074  1.583724842  4.799403054  7.97887222 11.066875879
BOURGUIGNON 20.049329548  0.500078792  3.529646797  3.24762038  1.552263652
Sebrle      12.571826101  0.009815223  8.913186933  3.89861748  0.541660326
Clay        11.907263355  1.056610245  6.301750650  1.73870367  3.298774196
Karpov      18.506970658  0.054866799  3.241288722  0.59639112  3.577182263
Macey        3.734329836 13.519468797  0.165338893  0.02285474  0.003753851
Warners      2.379235302  4.702996710  4.880147385  1.08409773  0.062600980
Zsivoczky    0.222984290  8.251032192  1.977310276  1.88752571  0.176718890
Hernu        0.077898692  1.436408108  2.002041197  2.19643175  3.567043567
Bernard      2.321939345  2.502212070  3.786627335  1.02595631  6.519716994
Schwarzl     0.453977830  2.730371293  0.434920519  1.33450385  0.060771903
Pogorelov    0.005990355  0.236266539  0.901885531  7.51291627  0.250997175
Schoenbeck   0.234632460  1.572136160  1.833002969  0.10767068  1.123847719
Barras       0.168530592  3.965229098  0.126065281  2.03627203  4.451963832
KARPOV       0.924980290  2.103623348  0.552920806 22.44521074  2.621938688
WARNERS      0.077456712  1.765791033  4.899391999  0.07770688  0.007838932
Nool         0.283104577  9.679525885  0.926083463 11.73024769  0.613671066
Drews        1.059759100  7.802038684 14.105819248  1.35630393  0.062644759

$dist
     SEBRLE        CLAY     BERNARD      YURKOV   ZSIVOCZKY    McMULLEN 
   2.236476    3.533554    2.847560    3.187240    2.672811    2.637847 
  MARTINEAU       HERNU      BARRAS        NOOL BOURGUIGNON      Sebrle 
   3.987907    3.159976    2.296929    4.017789    4.911700    4.344849 
       Clay      Karpov       Macey     Warners   Zsivoczky       Hernu 
   4.190954    4.760789    3.287865    2.675445    2.522958    1.785762 
    Bernard    Schwarzl   Pogorelov  Schoenbeck      Barras      KARPOV 
   2.641850    1.866741    2.423288    1.921158    2.502977    3.283288 
    WARNERS        Nool       Drews 
   2.168062    3.028727    3.337019 

I believe these are showing the correlations of each variable in each dimension; the circle graph shows which events load on the same dimension, and the length of the arrow shows the contribution!

These can be plotted together with the athletes in each dimension as well, to see which athletes are better at which events; see the sketch below.
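
These plots can be reproduced with the factoextra helpers; a minimal sketch, assuming factoextra is loaded as above:

fviz_pca_var(pc3, col.var = "contrib")   # variable circle: arrow length reflects contribution
fviz_pca_ind(pc3)                        # the athletes in the first two dimensions
fviz_pca_biplot(pc3, repel = TRUE)       # variables and athletes together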

Post Session