Import data

The Cassidy dataset comes from the book Multilevel Modeling Using R by Finch, Bolin, & Kelley (2014). Download the dataset from http://www.mlminr.com/data-sets and select “Cassidy”.

Cassidy <- read.csv("Cassidy.csv", header = TRUE, sep = ",")
Cassidy
str(Cassidy)
'data.frame':   486 obs. of  30 variables:
 $ Gender        : int  2 1 2 2 2 2 2 2 2 2 ...
 $ Male          : int  1 0 1 1 1 1 1 1 1 1 ...
 $ Minority      : int  1 1 1 1 1 1 0 1 1 1 ...
 $ Age           : int  21 20 21 22 21 21 21 21 21 20 ...
 $ GPA           : num  2.5 2.2 2.8 2.25 3.1 ...
 $ Verbal        : int  NA NA NA NA NA 600 NA 525 NA NA ...
 $ Math          : int  NA NA NA NA NA 600 NA 600 NA NA ...
 $ TotSAT        : int  NA NA NA NA NA 1200 NA 1125 NA NA ...
 $ SSH.total     : int  20 20 18 18 19 21 22 32 20 31 ...
 $ BStotal       : int  14 20 23 30 17 NA 15 34 19 10 ...
 $ CTA.tot       : int  42 NA 40 49 36 45 49 61 36 17 ...
 $ PTTtotal      : int  NA 44 47 47 46 57 50 59 40 35 ...
 $ PTT.factor1   : int  25 27 25 26 22 34 31 34 20 23 ...
 $ PTT.factor2   : int  NA 13 16 13 17 17 13 5 13 7 ...
 $ PTT.factor3   : int  11 4 6 8 7 6 6 20 7 5 ...
 $ Perf.CM       : int  25 31 27 22 33 28 25 42 20 9 ...
 $ Perf.D        : int  10 12 14 12 16 10 16 19 12 4 ...
 $ Perf.PE       : int  17 18 18 13 21 19 14 22 18 7 ...
 $ Perf.PC       : int  10 13 10 10 16 9 12 15 7 4 ...
 $ Perf.PS       : int  26 28 25 21 26 25 27 31 21 22 ...
 $ Perf.O        : int  24 30 24 18 25 22 28 30 24 23 ...
 $ harvey.negproj: int  32 39 37 32 46 36 37 56 30 12 ...
 $ harvey.achexp : int  29 32 29 23 29 27 31 36 23 23 ...
 $ harvey.parinf : int  27 31 28 23 37 28 26 37 25 11 ...
 $ harvey.org    : int  24 30 24 18 25 22 28 30 24 23 ...
 $ stoeberORG    : int  24 30 24 18 25 22 28 30 24 23 ...
 $ stoeberCMD    : int  32 39 37 32 46 36 37 56 30 12 ...
 $ stoeberPEC    : int  27 31 28 23 37 28 26 37 25 11 ...
 $ stoeberPS     : int  26 29 25 20 25 24 27 32 19 18 ...
 $ Harvey4f      : int  1 4 1 1 4 1 3 4 1 2 ...
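
str() shows that several variables contain many NA values (e.g., Verbal, Math, TotSAT). Since lm() silently drops incomplete cases, it can be useful to count the missing values in the variables used below; a minimal sketch:

# Count NA values in the variables used in the models that follow
colSums(is.na(Cassidy[, c("GPA", "CTA.tot", "BStotal", "Male", "Minority")]))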

Fitting the Linear Regression Model

The regression model is specified as follows. Following the example presented in Finch, Bolin, & Kelley (2014, p. 28), GPA in the Cassidy dataset is predicted from a measure of physical anxiety (BStotal) and Cognitive Test Anxiety (CTA.tot).

Model1.1 <- lm(GPA ~ CTA.tot + BStotal, Cassidy)
Model1.1

Call:
lm(formula = GPA ~ CTA.tot + BStotal, data = Cassidy)

Coefficients:
(Intercept)      CTA.tot      BStotal  
    3.61892     -0.02007      0.01347  
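
Reading off the coefficients, the fitted prediction equation is

predicted GPA = 3.619 − 0.020 × CTA.tot + 0.013 × BStotal

so each additional point of cognitive test anxiety is associated with a GPA about 0.02 lower, holding BStotal constant.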

Model Summary

summary(Model1.1)

Call:
lm(formula = GPA ~ CTA.tot + BStotal, data = Cassidy)

Residuals:
     Min       1Q   Median       3Q      Max 
-2.99239 -0.29138  0.01516  0.36849  0.93941 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept)  3.618924   0.079305  45.633  < 2e-16 ***
CTA.tot     -0.020068   0.003065  -6.547 1.69e-10 ***
BStotal      0.013469   0.005077   2.653  0.00828 ** 
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.4852 on 426 degrees of freedom
  (57 observations deleted due to missingness)
Multiple R-squared:  0.1066,    Adjusted R-squared:  0.1024 
F-statistic: 25.43 on 2 and 426 DF,  p-value: 3.706e-11
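
The t-tests above can be complemented with confidence intervals for the coefficients, which are often easier to interpret; confint() gives 95% intervals by default:

# 95% confidence intervals for the regression coefficients
confint(Model1.1)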

ANOVA

anova(Model1.1)
Analysis of Variance Table

Response: GPA
           Df  Sum Sq Mean Sq F value    Pr(>F)    
CTA.tot     1  10.316 10.3159 43.8125 1.089e-10 ***
BStotal     1   1.657  1.6570  7.0376   0.00828 ** 
Residuals 426 100.304  0.2355                      
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
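
Note that anova() reports sequential (Type I) sums of squares, so each predictor is tested in the order it appears in the formula. If order-independent tests are preferred, Anova() from the car package (loaded later for residualPlots) gives Type II sums of squares; a minimal sketch:

# Type II sums of squares: each predictor tested after the other
library(car)
Anova(Model1.1, type = "II")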

Output Available from the lm Object

attributes(Model1.1)
$names
 [1] "coefficients"  "residuals"     "effects"       "rank"          "fitted.values" "assign"        "qr"            "df.residual"  
 [9] "na.action"     "xlevels"       "call"          "terms"         "model"        

$class
[1] "lm"

Predicted GPA Values

Model1.1$fitted.values
       1        3        4        5        8        9       10       11       12       13       14       15       16       17 
2.964641 3.125996 3.039668 3.125454 2.852730 3.152391 3.412460 3.011917 2.611103 3.158448 3.298923 3.312121 2.959938 3.205183 
      19       23       25       26       27       28       29       30       31       34       35       37       38       39 
2.945928 2.904979 3.226064 3.245318 2.944573 3.171646 2.917635 3.198584 3.206267 3.073204 3.258787 3.118584 2.972594 2.870630 
      41       42       43       44       45       46       48       50       51       52       53       54       55       56 
3.144980 3.285454 3.386064 2.871713 2.911849 3.166131 3.051511 3.251917 3.080345 3.131782 3.292053 3.138923 3.372324 3.372324 
      57       58       59       60       61       62       63       65       66       67       68       69       70       71 
3.065521 3.212324 3.325590 3.093543 3.172459 2.918177 3.104844 2.985250 3.225251 3.245318 3.151849 2.937703 2.978109 3.058651 
      72       73       74       75       76       77       79       80       81       83       84       86       87       88 
2.852188 3.011375 3.225251 3.253272 3.118313 3.292053 2.664979 2.998448 2.878041 2.737025 3.185928 3.232121 3.225793 3.106199 
      89       90       91       92       93       94       95       96       97       99      100      101      102      103 
2.885995 2.978380 3.231850 3.111443 3.292053 2.924505 3.266741 3.192256 3.012730 3.053407 3.119126 3.053949 3.044912 2.997906 
     104      106      107      108      109      111      112      113      114      117      118      119      120      121 
3.327757 3.058651 3.098245 2.985792 3.486674 2.898109 3.211782 3.185115 3.225793 3.199939 2.891781 3.191714 3.298923 3.118313 
     122      123      124      125      126      127      128      129      130      131      132      133      134      135 
2.885453 3.091375 2.898651 3.118313 2.818922 3.019058 3.032256 2.937703 3.246673 3.084776 2.924234 2.797499 3.078448 2.891781 
     136      137      138      139      140      141      142      143      144      145      147      148      149      150 
2.784843 2.925047 3.232121 3.158448 2.697160 2.978109 2.991578 2.965995 2.712255 2.924505 2.985521 3.139194 3.104844 3.111443 
     151      152      153      155      156      158      161      162      163      164      165      166      167      168 
3.011104 2.817567 3.286538 3.366267 3.139194 3.091375 2.783759 3.271985 3.171917 3.265657 2.925047 2.985250 2.965453 3.104844 
     169      171      172      174      175      176      177      178      179      180      181      182      183      185 
3.392392 3.131511 3.078448 3.205996 2.851646 3.073204 2.851375 2.964641 3.231850 3.292053 3.332460 2.980277 3.167215 2.737838 
     186      187      188      189      190      191      192      193      195      196      197      198      199      200 
3.185115 3.118855 2.831307 3.004776 3.225793 3.231850 3.266199 3.111443 3.138652 3.040210 2.897838 3.144980 3.059464 3.332189 
     202      203      204      205      207      209      210      213      214      215      216      217      218      219 
3.219194 2.892865 3.231850 3.058110 3.232392 3.205725 2.777973 3.132866 3.178516 3.238720 2.851104 3.171646 3.004505 3.251917 
     220      221      222      223      224      225      226      227      228      229      230      231      232      233 
3.018787 2.925318 3.265386 2.991307 3.098787 3.126267 2.992391 3.151849 3.031172 3.025386 3.252188 3.158448 2.890968 3.426470 
     234      235      236      237      238      239      240      241      242      243      244      245      246      247 
3.165047 2.925589 3.251917 3.225251 3.126267 3.191714 3.118313 2.957771 2.906063 3.258787 3.051511 2.972323 3.225251 3.151579 
     248      250      251      252      253      254      255      257      258      259      260      261      262      263 
3.024573 3.365725 3.132053 2.911849 2.884369 3.252188 3.158448 2.818651 3.005318 3.112527 3.131782 3.225251 3.151579 3.205183 
     264      265      266      267      268      269      270      271      272      273      274      275      276      277 
3.144980 3.125454 3.118584 3.238720 3.360210 2.984708 3.178787 2.811239 3.151579 3.205183 3.118313 3.365725 3.219465 3.278855 
     279      280      281      282      283      286      287      289      290      291      292      293      294      295 
2.930833 3.004505 2.972052 3.111443 3.118313 3.238720 2.924234 3.312121 3.292053 3.038042 3.118313 2.803827 3.265657 3.192527 
     296      297      298      299      300      301      302      303      304      305      306      307      308      309 
3.392392 2.777973 3.145792 3.105386 3.158448 3.086131 3.105386 3.271985 2.958854 2.931917 3.185928 3.158719 3.071307 3.011375 
     310      311      312      313      315      316      317      318      319      320      321      323      324      325 
3.238991 3.238720 3.353069 2.944302 3.085047 3.392392 3.245589 3.171917 2.792526 3.125725 3.278855 3.259058 2.951172 2.817296 
     326      327      328      329      330      331      332      333      334      335      336      337      338      340 
3.251917 3.111443 3.091917 3.234017 3.292053 3.285454 3.119126 3.231850 2.811239 3.305522 3.205183 3.212324 2.892323 3.165860 
     341      342      343      344      345      346      347      348      349      350      351      352      353      354 
2.804640 3.386064 3.099329 3.399533 3.252188 3.091375 3.359126 2.838990 3.399533 3.392392 3.285454 3.392392 3.385793 3.098245 
     355      356      357      358      360      362      363      364      365      367      369      370      371      372 
3.279397 3.018245 3.078448 3.372324 3.120210 3.251917 3.285725 3.111714 3.392663 3.339058 3.078448 3.191985 3.104844 3.231850 
     373      374      375      376      377      378      379      380      382      384      385      386      387      388 
3.245318 2.730426 3.205725 3.245589 3.225251 3.025115 3.285725 3.064979 2.978109 2.977838 3.219465 3.138381 3.059193 2.992933 
     389      390      391      392      394      395      396      397      398      399      400      401      402      403 
3.078177 2.958042 3.211782 2.998448 2.830494 3.131511 3.500414 3.379194 3.385793 3.172188 3.138652 3.386064 3.085589 3.005589 
     404      405      406      407      408      409      410      411      412      413      414      415      416      417 
3.259600 3.352256 3.118584 3.372324 3.085047 3.191985 3.285454 3.312392 3.231850 3.159803 3.191714 3.258787 2.944573 3.124912 
     418      419      420      421      422      423      424      425      426      427      428      429      430      431 
3.105115 3.211782 3.278855 3.399262 3.051240 3.025115 2.959396 3.199668 3.258787 3.266199 3.238720 3.392392 3.305522 3.178516 
     432      433      434      435      436      437      438      439      440      441      442      445      446      447 
3.312121 3.238720 3.198584 3.272256 3.292053 3.206267 3.179329 3.365725 3.292053 3.372324 3.345657 3.158990 3.131511 3.071307 
     448      449      450      451      452      453      454      455      456      457      458      459      460      461 
3.305793 3.171646 3.312121 3.071307 3.065521 3.339058 3.113069 3.426200 2.925860 3.231850 3.251917 2.984437 3.312392 2.931375 
     462      463      464      465      466      467      468      469      470      471      472      473      475      476 
3.392392 3.379194 3.447080 3.198855 3.392392 3.379465 3.205454 3.412460 3.225251 3.305522 3.091375 3.392392 3.024573 3.318991 
     477      478      479      480      481      482      483      484      485 
3.392392 3.425929 3.386064 3.459736 3.199397 3.312121 3.192256 2.864031 3.245318 
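
These fitted values are predictions for the observed cases only. Predictions for new cases can be obtained with predict(); a sketch with hypothetical scores (CTA.tot = 40 and BStotal = 20 are made-up values for illustration):

# Predict GPA for a hypothetical student
predict(Model1.1, newdata = data.frame(CTA.tot = 40, BStotal = 20))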

Residuals

Model1.1$residuals
            1             3             4             5             8             9            10            11            12 
-0.4646405061 -0.3259956916 -0.7896675749 -0.0254537419  0.4492704297 -0.0283914353 -0.1124596847 -0.5119169570  0.0888967457 
           13            14            15            16            17            19            23            25            26 
-0.6584484215 -0.7989228998 -0.4221207716 -0.5799383942 -0.3051829226 -0.1459275978 -0.8649791080  0.0989363702 -0.2453184879 
           27            28            29            30            31            34            35            37            38 
-0.4445727235  0.7783537067 -0.8176350301  0.1014160133  0.3937331779 -0.1232042042  0.3412126654  0.4814161689  0.9394056837 
           39            41            42            43            44            45            46            48            50 
-0.6706295541 -0.5449795748 -0.4194540531 -0.4960639410 -0.0717134535 -0.4118490187  0.4338687432  0.7484894275  0.4480825762 
           51            52            53            54            55            56            57            58            59 
 0.1496549101 -0.3317817030 -0.0920529890  0.1910774114  0.4276758805  0.6276758805  0.5244786311  0.0156761917  0.3744103817 
           60            61            62            63            65            66            67            68            69 
 0.7564570383  0.2275407821  0.1818230202  0.1951559904 -0.0142502384 -0.2252507053 -0.3453184879  0.0481505144 -0.1377028127 
           70            71            72            73            74            75            76            77            79 
-0.7281093528 -0.5586514581  0.3478123794 -0.1113750073  0.5747492947  0.1467277019  0.2116871437  0.1579470110 -0.0849786411 
           80            81            83            84            86            87            88            89            90 
 0.6015518897 -0.0780414146  0.2209750135 -0.8859280646  0.3778793840  0.6642073450 -0.6061988838 -0.0859952248 -0.0783803277 
           91            92            93            94            95            96            97            99           100 
-0.4318496412 -0.6114429455 -0.1220529890 -0.9245049409  0.1932588552 -0.1922560257 -0.1127298815 -0.1534073965  0.6808742192 
          101           102           103           104           106           107           108           109           111 
-0.0539493462 -0.2449116366  0.1020938394 -0.6277574172  0.3413485419  0.4017549264  0.0142078119 -0.4866738290  0.0018908028 
          112           113           114           117           118           119           120           121           122 
-0.5017818586  0.5148848600 -0.3257926550 -0.4999388610 -0.6417812361  0.4682859240  0.2010771002 -0.2183128563  0.1145467249 
          123           124           125           126           127           128           129           130           131 
 0.8086248371 -0.9986511469  0.1816871437 -0.3189219661 -0.0190578426 -0.5322557144 -0.7877028127  0.0533266379 -0.7507762269 
          132           133           134           135           136           137           138           139           140 
-0.4742339660  0.4025006907 -0.0784482659 -0.1917812361  0.0151566129  0.3749531094 -0.1321206160  0.6415515785  0.0028396038 
          141           142           143           144           145           147           148           149           150 
 0.2218906472  0.8304218005  0.0340046196  0.7877449080 -0.9245049409  0.5904787867 -0.2391935634 -0.5048440096  0.0885570545 
          151           152           153           155           156           158           161           162           163 
-0.5111040324  0.2824329081 -0.2865379525  0.1297328667  0.2608064366 -0.2913751629  0.9212405123  0.2280147936 -0.3719172682 
          164           165           166           167           168           169           171           172           174 
 0.3343427547  0.2749531094 -0.3852502384 -0.5654534307  0.6951559904 -0.8923919021  0.0684892719 -1.0784482659 -0.6409958472 
          175           176           177           178           179           180           181           182           183 
-0.1516456709  0.4267957958  0.1486253040  0.0353594939 -0.4318496412 -0.0920529890 -0.0324595291  0.6197228484 -0.1672151562 
          185           186           187           188           189           190           191           192           193 
-0.0378379111 -0.1851151400  0.3811451940  0.5016930866  0.6452239287  0.5342073450 -0.0318496412  0.0338008049 -0.4114429455 
          195           196           197           198           199           200           202           203           204 
 0.5313483863 -1.0402095246 -0.3978382223 -0.4449795748 -0.0374643827 -0.1321885543 -1.3191937190 -0.1778651355 -0.5318496412 
          205           207           209           210           213           214           215           216           217 
-0.0781095084 -0.9323915909  0.4942751276 -0.0779734763 -0.1328656024 -0.8785162041  0.3612804480 -0.4511037212  0.4283537067 
          218           219           220           221           222           223           224           225           226 
-0.0045050965  0.1580825762  0.1812131323 -0.0423178654  0.3346137295  0.1086927754 -0.0987870234 -0.6262666665 -2.9923911241 
          227           228           229           230           231           232           233           234           235 
-0.6168494856  0.2688281850  0.4746141963  0.3478116013  0.1415515785 -0.5909683116 -0.7264704811 -0.0650473574  0.0644111597 
          236           237           238           239           240           241           242           243           244 
-1.1519174238 -0.9252507053 -0.9262666665 -0.1917140760 -0.0183128563  0.0422294047  0.2939369926 -0.2587873346 -0.0715105725 
          245           246           247           248           250           251           252           253           254 
-0.1723233414  0.0747492947 -0.2515785107  0.0754271209 -0.4657251836  0.1679473222 -0.2118490187  0.2156306244  0.2178116013 
          255           257           258           259           260           261           262           263           264 
 0.2415515785  0.5813490087  0.1946819790  0.3764731551  0.3682182970  0.1747492947 -0.3785785107 -0.1051829226  0.7210204252 
          265           266           267           268           269           270           271           272           273 
-0.6254537419 -0.9185838311 -0.0387195520 -0.1602101471  0.5152917113  0.0912128210 -0.5112391308  0.4484214893 -0.9051829226 
          274           275           276           277           279           280           281           282           283 
-0.6183128563 -0.9657251836 -0.6194646939  0.5211448828  0.0691670981 -0.2045050965  0.9279476334  0.3485570545 -0.0183128563 
          286           287           289           290           291           292           293           294           295 
 0.2112804480 -0.4242339660 -0.6121207716 -1.4920529890 -0.2380417258 -0.1183128563  0.1761727297 -0.7656572453 -1.1925270005 
          296           297           298           299           300           301           302           303           304 
 0.0486080979 -0.7779734763  0.3542075006 -0.6353859593  0.5915515785  0.2438688988 -0.0053859593 -0.1719852064  0.3411455053 
          305           306           307           308           309           310           311           312           313 
 0.0680831986 -0.6859280646  0.2412806037 -0.5713073803 -0.1493750073 -0.0389905268 -0.1387195520  0.2969307386 -0.1443017486 
          315           316           317           318           319           320           321           323           324 
 0.2149527982 -0.3923919021  0.0544105373  0.4980827318  0.4074737775 -0.0257247167 -0.3988551172  0.0409416906  0.0358283406 
          325           326           327           328           329           330           331           332           333 
 0.5827038830  0.0480825762  0.7885570545  0.7080828874  0.2099825600  0.4679470110 -0.2854540531  0.4608742192 -0.3418496412 
          334           335           336           337           338           340           341           342           343 
 0.2887608692  0.1944781643 -0.4551829226 -0.2123238083  0.1076768142 -0.1858602820 -0.6366401949  0.3959360590  0.0006710269 
          344           345           346           347           348           349           350           351           352 
 0.3334672123  0.6478116013  0.5786248371  0.4408737523 -0.0389897488 -0.1995327877 -0.4923919021  0.2145459469  0.1076080979 
          353           354           355           356           357           358           360           362           363 
 0.2372070338 -0.4982450736  0.6806029331  0.4717550820  0.5275517341 -0.2723241195  0.5797903198  0.6680825762 -0.2857250280 
          364           365           367           369           370           371           372           373           374 
-0.1117139203 -0.1926628770  0.0809415350  0.2215517341 -0.4419850508  0.5951559904 -0.0238496412  0.2546815121  0.4095739494 
          375           376           377           378           379           380           382           384           385 
-0.5057248724  0.3544105373 -0.3252507053 -0.3251148288 -1.0857250280 -0.1649794192 -0.0881093528 -0.0278383779  0.2805353061 
          386           387           388           389           390           391           392           394           395 
 0.7716193611  0.7408065922  0.3070669262 -0.0781772910 -0.1580415702 -0.4117818586  0.5675518897  0.1695060111  0.3684892719 
          396           397           398           399           400           401           402           403           404 
-0.0004136505  0.2208059697  0.2142070338  0.4078117570  0.5613483863  0.5139360590  0.0504108485  0.6944110041 -0.1496002591 
          405           406           407           408           409           410           411           412           413 
-0.0522563369 -0.5185838311 -0.0723241195  0.7149527982 -0.1919850508  0.7145459469 -0.1123917465  0.1681503588 -0.0598032958 
          414           415           416           417           418           419           420           421           422 
-0.4917140760  0.0412126654  0.2554272765 -0.5249117922 -0.2051149844  0.2832181414  0.3211448828 -0.0992618129  0.5487604024 
          423           424           425           426           427           428           429           430           431 
 0.2748851712  0.6376035555  0.7003321139 -0.7587873346  0.1338008049  0.5612804480  0.0066080979 -0.0055218357  0.5214837959 
          432           433           434           435           436           437           438           439           440 
 0.1878792284  0.2612804480 -0.2085839867  0.4747438187  0.4969470110 -0.9062668221  0.5206708713 -0.2257251836 -0.1820529890 
          441           442           445           446           447           448           449           450           451 
 0.0276758805  0.5543425990  0.6310096288  0.0014892719  0.1286926197  0.4442071894 -0.1716462933 -0.5121207716 -0.3713073803 
          452           453           454           455           456           457           458           459           460 
 0.1534786311  0.2609415350  0.6369312054  0.5338004937  0.5741401849 -0.0318496412  0.2480825762 -0.2844373139  0.4876082535 
          461           462           463           464           465           466           467           468           469 
 0.7686251484  0.4076080979  0.4208059697 -0.4470802135  0.8011450384  0.4076080979 -0.5484650051  0.6945461025  0.4505403153 
          470           471           472           473           475           476           477           478           479 
 0.7047492947  0.6804781643  0.4856248371  0.5776080979  0.8434271209 -0.0189906824 -0.0923919021  0.5740714686  0.4139360590 
          480           481           482           483           484           485 
 0.5202638644 -0.2993969113  0.6018792284  0.8077439743  0.0359693818 -0.7453184879 

Regression Model with an Interaction (Moderation Analysis)

Model1.2 <- lm(GPA ~ CTA.tot + BStotal + CTA.tot*BStotal, 
               Cassidy)
summary(Model1.2)

Call:
lm(formula = GPA ~ CTA.tot + BStotal + CTA.tot * BStotal, data = Cassidy)

Residuals:
     Min       1Q   Median       3Q      Max 
-2.98711 -0.29737  0.01801  0.36340  0.95016 

Coefficients:
                  Estimate Std. Error t value Pr(>|t|)    
(Intercept)      3.8977792  0.2307491  16.892  < 2e-16 ***
CTA.tot         -0.0267935  0.0060581  -4.423 1.24e-05 ***
BStotal         -0.0057595  0.0157812  -0.365    0.715    
CTA.tot:BStotal  0.0004328  0.0003364   1.287    0.199    
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.4849 on 425 degrees of freedom
  (57 observations deleted due to missingness)
Multiple R-squared:  0.1101,    Adjusted R-squared:  0.1038 
F-statistic: 17.53 on 3 and 425 DF,  p-value: 9.558e-11
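
Because Model1.1 and Model1.2 are fitted on the same 429 complete cases, the interaction can also be assessed with a nested-model F-test, which for a single added term is equivalent to the t-test on the interaction:

# F-test comparing the models without and with the interaction
anova(Model1.1, Model1.2)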

Regression Model with a Categorical Predictor

Here we include the variable Male (0 = female, 1 = male).

Model1.4 <- lm(GPA ~ CTA.tot + Male, Cassidy)
summary(Model1.4)

Call:
lm(formula = GPA ~ CTA.tot + Male, data = Cassidy)

Residuals:
     Min       1Q   Median       3Q      Max 
-3.01149 -0.29005  0.03038  0.35374  0.96294 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept)  3.740318   0.080940  46.211  < 2e-16 ***
CTA.tot     -0.015184   0.002117  -7.173 3.16e-12 ***
Male        -0.222594   0.047152  -4.721 3.17e-06 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.4775 on 437 degrees of freedom
  (46 observations deleted due to missingness)
Multiple R-squared:  0.1364,    Adjusted R-squared:  0.1324 
F-statistic: 34.51 on 2 and 437 DF,  p-value: 1.215e-14
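
Male is already a 0/1 dummy, so it can enter lm() directly. For categorical variables with more than two levels, or to get readable labels in the output, it is safer to convert to a factor first; a minimal sketch (the variable name Sex is illustrative):

# Equivalent model with an explicitly labeled factor
Cassidy$Sex <- factor(Cassidy$Male, levels = c(0, 1), labels = c("Female", "Male"))
summary(lm(GPA ~ CTA.tot + Sex, Cassidy))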

GPAmodel1.5 <- lm(GPA ~ CTA.tot + Minority, Cassidy)
summary(GPAmodel1.5)

Call:
lm(formula = GPA ~ CTA.tot + Minority, data = Cassidy)

Residuals:
     Min       1Q   Median       3Q      Max 
-2.95593 -0.27885  0.01024  0.37506  1.11245 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept)  3.331752   0.118509  28.114  < 2e-16 ***
CTA.tot     -0.014555   0.002137  -6.812 3.21e-11 ***
Minority     0.322835   0.096014   3.362 0.000841 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.4833 on 437 degrees of freedom
  (46 observations deleted due to missingness)
Multiple R-squared:  0.1152,    Adjusted R-squared:  0.1112 
F-statistic: 28.46 on 2 and 437 DF,  p-value: 2.403e-12

Checking the Linear Regression Assumptions

The residualPlots function performs a lack-of-fit test in which the square of each predictor is added to the model and assessed with a t-test; residuals (y-axis) are plotted against each predictor with a fitted curve to check for nonlinear patterns in the data.
Tukey's test for non-additivity is also computed by residualPlots against the fitted values, adding information about the adequacy of the model fit alongside the lack-of-fit test for each predictor.
The Tukey statistic is obtained by adding the square of the fitted values to the original regression model; it tests the null hypothesis that the model is additive and there is no interaction among the predictors (Tukey, 1949).
A nonsignificant result (H0 is not rejected) indicates no evidence of interaction among the predictors in the model.
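
The idea behind the Tukey statistic can be approximated by hand: refit the model on the complete cases used in the fit, with the squared fitted values added as an extra term, and read off the t-test for that term; a sketch:

# Approximate Tukey's non-additivity test manually
d <- model.frame(Model1.1)    # the 429 complete cases used by Model1.1
d$fit2 <- fitted(Model1.1)^2  # squared fitted values
summary(lm(GPA ~ CTA.tot + BStotal + fit2, data = d))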

library(car)
residualPlots(Model1.1)
           Test stat Pr(>|Test stat|)
CTA.tot       1.1085           0.2683
BStotal       0.5973           0.5506
Tukey test    0.4988           0.6179

The residualPlots output also lets us check the regression assumptions. First, homogeneity of variance via the fitted-values plot: if the assumption holds, the plot should show a patternless cloud of points with roughly equal spread across all values of x.
Next, linearity: if the fitted line for a predictor is flat, as in our case here, we can conclude that BStotal and CTA.tot are each linearly related to GPA.
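
As a formal complement to the visual check of constant variance, the car package also provides a score test for non-constant error variance:

# Score test with H0: constant error variance
ncvTest(Model1.1)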

Normality of the Residuals

QQ plots (quantile–quantile plots) are used to evaluate the normality of the residuals: the closer the observed points lie to the reference line (the normal distribution), the better the normality assumption is satisfied.

qqPlot(Model1.1)
[1] 226 290
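
The printed indices (226 and 290) are the observations qqPlot flags as having the most extreme residuals. As a formal complement to the visual check, a Shapiro-Wilk test can be run on the residuals:

# Shapiro-Wilk normality test on the model residuals
shapiro.test(resid(Model1.1))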
