Introduction:

In this homework, you will apply logistic regression to a real-world dataset: the Pima Indians Diabetes Database. This dataset contains medical records from 768 women of Pima Indian heritage, aged 21 or older, and is used to predict the onset of diabetes (binary outcome: 0 = no diabetes, 1 = diabetes) based on physiological measurements.

The data is publicly available from the UCI Machine Learning Repository and can be imported directly.

Dataset URL: https://raw.githubusercontent.com/jbrownlee/Datasets/master/pima-indians-diabetes.data.csv

Columns (no header in the CSV, so we need to assign them manually):

  1. Pregnancies: Number of times pregnant
  2. Glucose: Plasma glucose concentration (2-hour test)
  3. BloodPressure: Diastolic blood pressure (mm Hg)
  4. SkinThickness: Triceps skin fold thickness (mm)
  5. Insulin: 2-hour serum insulin (mu U/ml)
  6. BMI: Body mass index (weight in kg/(height in m)^2)
  7. DiabetesPedigreeFunction: Diabetes pedigree function (a function scoring genetic risk)
  8. Age: Age in years
  9. Outcome: Class variable (0 = no diabetes, 1 = diabetes)

Task Overview: You will load the data, build a logistic regression model to predict diabetes onset using a subset of predictors (Glucose, BMI, Age), interpret the model, evaluate it with a confusion matrix and metrics, and analyze the ROC curve and AUC.

Cleaning the dataset

Don't change the following code.

library(tidyverse)
## ── Attaching core tidyverse packages ──────────────────────── tidyverse 2.0.0 ──
## ✔ dplyr     1.1.4     ✔ readr     2.1.5
## ✔ forcats   1.0.1     ✔ stringr   1.5.1
## ✔ ggplot2   4.0.0     ✔ tibble    3.3.0
## ✔ lubridate 1.9.4     ✔ tidyr     1.3.1
## ✔ purrr     1.1.0     
## ── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
## ✖ dplyr::filter() masks stats::filter()
## ✖ dplyr::lag()    masks stats::lag()
## ℹ Use the conflicted package (<http://conflicted.r-lib.org/>) to force all conflicts to become errors
url <- "https://raw.githubusercontent.com/jbrownlee/Datasets/master/pima-indians-diabetes.data.csv"

data <- read.csv(url, header = FALSE)

colnames(data) <- c("Pregnancies", "Glucose", "BloodPressure", "SkinThickness", "Insulin", "BMI", "DiabetesPedigreeFunction", "Age", "Outcome")

data$Outcome <- as.factor(data$Outcome)

# Handle missing values (replace 0s with NA because 0 makes no sense here)
data$Glucose[data$Glucose == 0] <- NA
data$BloodPressure[data$BloodPressure == 0] <- NA
data$BMI[data$BMI == 0] <- NA


colSums(is.na(data))
##              Pregnancies                  Glucose            BloodPressure 
##                        0                        5                       35 
##            SkinThickness                  Insulin                      BMI 
##                        0                        0                       11 
## DiabetesPedigreeFunction                      Age                  Outcome 
##                        0                        0                        0
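Note: glm() uses na.action = na.omit by default, so rows with an NA in any model variable are silently dropped before fitting. As a quick sanity check (a minimal sketch using the data object above), we can count how many rows the Question 1 model will actually lose:

# Rows with an NA in at least one of the three predictors used in Question 1
sum(!complete.cases(data[, c("Glucose", "BMI", "Age")]))
# Expect 16 (5 missing Glucose + 11 missing BMI), matching the
# "(16 observations deleted due to missingness)" note in the model output below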

Question 1: Create and Interpret a Logistic Regression Model - Fit a logistic regression model to predict Outcome using Glucose, BMI, and Age.

## Enter your code here
# glm(hd ~ sex, data=data, family="binomial")

logisticRegressionModel <- glm(Outcome ~ Glucose + BMI + Age, data = data, family = "binomial")
logisticRegressionModel
## 
## Call:  glm(formula = Outcome ~ Glucose + BMI + Age, family = "binomial", 
##     data = data)
## 
## Coefficients:
## (Intercept)      Glucose          BMI          Age  
##    -9.03238      0.03555      0.08975      0.02870  
## 
## Degrees of Freedom: 751 Total (i.e. Null);  748 Residual
##   (16 observations deleted due to missingness)
## Null Deviance:       974.7 
## Residual Deviance: 725   AIC: 733

What does the intercept represent (log-odds of diabetes when predictors are zero)? The intercept of -9.03238 is the log-odds of diabetes when Glucose, BMI, and Age are all exactly zero. Since none of these measurements can realistically be zero, the intercept serves as a mathematical baseline for the model rather than a clinically meaningful prediction.
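As a quick sanity check (a minimal sketch using base R's plogis()), the intercept's log-odds can be converted to a probability:

# p = 1 / (1 + exp(-x)); the logistic transform of the intercept
plogis(coef(logisticRegressionModel)["(Intercept)"])
# ~0.00012, i.e., essentially zero predicted risk at Glucose = BMI = Age = 0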

For each predictor (Glucose, BMI, Age), does a one-unit increase raise or lower the odds of diabetes? Are they significant (p-value < 0.05)? All three coefficients (Glucose = 0.036, BMI = 0.090, Age = 0.029) are positive, so a one-unit increase in any of them raises the odds of diabetes. The printed model object does not show p-values; running summary(logisticRegressionModel) reports them, and all three predictors are statistically significant (p < 0.05).
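To make the coefficients easier to interpret, they can be exponentiated into odds ratios (a short sketch reusing the fitted model above):

# exp(beta) is the multiplicative change in the odds per one-unit increase
exp(coef(logisticRegressionModel))
# e.g., exp(0.03555) ~ 1.036: each additional unit of Glucose multiplies the
# odds of diabetes by about 1.04, holding BMI and Age fixed

# summary() reports the z-statistics and p-values for each coefficient
summary(logisticRegressionModel)$coefficients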

Question 2: Confusion Matrix and Important Metrics

Calculate and report the metrics:

  - Accuracy: (TP + TN) / Total
  - Sensitivity (Recall): TP / (TP + FN)
  - Specificity: TN / (TN + FP)
  - Precision: TP / (TP + FP)

Use the following starter code

# Keep only rows with no missing values in Glucose, BMI, or Age
data_subset <- data[complete.cases(data[, c("Glucose", "BMI", "Age")]), ]

# Create a numeric version of the outcome (0 = no diabetes, 1 = diabetes).
# This is required for calculating confusion matrices.
data_subset$Outcome_num <- ifelse(data_subset$Outcome == "1", 1, 0)


# Predicted probabilities
predicted_data <- data.frame(
  probability.of.Outcome = logisticRegressionModel$fitted.values,
  Outcome = data_subset$Outcome)

# Preview the first few predictions (printing all 752 rows is not informative)
head(predicted_data)
##   probability.of.Outcome Outcome
## 1             0.66360006       1
## 2             0.06101402       0
## 3             0.61834186       1
## 4             0.06043396       0
## 5             0.65771328       1
## 6             0.14802668       0
# Predicted classes at the default 0.5 probability threshold
predicted_probs <- logisticRegressionModel$fitted.values
predicted_classes <- ifelse(predicted_probs > 0.5, 1, 0)


# Confusion matrix
confusion <- table(
  Predicted = factor(predicted_classes, levels = c(0, 1)),
  Actual = factor(data_subset$Outcome_num, levels = c(0, 1))
)

confusion
##          Actual
## Predicted   0   1
##         0 429 114
##         1  59 150
# Extract values (rows = Predicted, columns = Actual)
TN <- 429  # predicted 0, actual 0
FN <- 114  # predicted 0, actual 1 (missed diabetes cases)
FP <- 59   # predicted 1, actual 0
TP <- 150  # predicted 1, actual 1

#Metrics    
accuracy <- (TP + TN) / (TP + TN + FP + FN)
sensitivity <- TP / (TP + FN)
specificity <- TN / (TN + FP)
precision <- TP / (TP + FP)

cat("Accuracy:", round(accuracy, 3), "\nSensitivity:", round(sensitivity, 3), "\nSpecificity:", round(specificity, 3), "\nPrecision:", round(precision, 3))
## Accuracy: 0.77 
## Sensitivity: 0.568 
## Specificity: 0.879 
## Precision: 0.718
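As an optional cross-check (assuming the caret package is installed; it is not part of the starter code), caret::confusionMatrix() computes these same metrics directly:

library(caret)  # assumed available; not loaded in the original starter code

# positive = "1" tells caret that diabetes is the event of interest
confusionMatrix(
  data = factor(predicted_classes, levels = c(0, 1)),
  reference = factor(data_subset$Outcome_num, levels = c(0, 1)),
  positive = "1"
)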

Interpret: How well does the model perform? Is it better at detecting diabetes (sensitivity) or non-diabetes (specificity)? Why might this matter for medical diagnosis?

The model performs reasonably well overall, with 77% accuracy, but it is clearly better at detecting non-diabetes (specificity = 0.879) than diabetes (sensitivity = 0.568): it misses 114 of the 264 actual diabetes cases. This matters for medical diagnosis because a false negative leaves a diabetic patient undiagnosed and untreated, which is typically more harmful than a false positive that only triggers follow-up testing.
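One way to shift this balance is to lower the classification threshold, trading specificity for sensitivity. A minimal sketch reusing the objects above (the 0.3 cutoff is illustrative, not prescribed by the assignment):

# Recompute the confusion matrix with a lower cutoff to catch more true cases
predicted_classes_03 <- ifelse(predicted_probs > 0.3, 1, 0)

table(
  Predicted = factor(predicted_classes_03, levels = c(0, 1)),
  Actual = factor(data_subset$Outcome_num, levels = c(0, 1))
)
# Sensitivity rises (fewer missed diabetes cases) while specificity falls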

Question 3: ROC Curve, AUC, and Interpretation

#Enter your code here
library(pROC)
## Warning: package 'pROC' was built under R version 4.5.2
## Type 'citation("pROC")' for a citation.
## 
## Attaching package: 'pROC'
## The following objects are masked from 'package:stats':
## 
##     cov, smooth, var
# ROC curve & AUC on full data
roc_obj <- roc(response = data_subset$Outcome_num,
               predictor = logisticRegressionModel$fitted.values,
               levels = c(0, 1),
               direction = "<")  # smaller probabilities -> class 0 (no diabetes)

# Print AUC value
auc_val <- auc(roc_obj); auc_val
## Area under the curve: 0.828
# Plot ROC with AUC displayed
plot.roc(roc_obj, print.auc = TRUE, legacy.axes = TRUE,
         xlab = "False Positive Rate (1 - Specificity)",
         ylab = "True Positive Rate (Sensitivity)")

What does AUC indicate (0.5 = random, 1.0 = perfect)? The model has an AUC of 0.828, well above the 0.5 random-guessing baseline and reasonably close to 1.0, which means it is good at ranking individuals with diabetes above individuals without diabetes across all possible classification thresholds.

For diabetes diagnosis, prioritize sensitivity (catching cases) or specificity (avoiding false positives)? Suggest a threshold and explain. Diabetes diagnosis should prioritize sensitivity: a false negative (a diabetic patient classified as healthy) is more dangerous than a false positive, because an undiagnosed patient goes untreated while a false positive only leads to confirmatory testing. Lowering the threshold from 0.5 to roughly 0.3 would raise sensitivity at the cost of some specificity, a reasonable trade-off for a screening model.
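To choose a threshold from the data rather than by eye, pROC's coords() can report the cutoff maximizing Youden's J (sensitivity + specificity - 1); a sketch assuming the roc_obj created above:

# Threshold that maximizes sensitivity + specificity - 1 (Youden's J)
coords(roc_obj, x = "best", best.method = "youden",
       ret = c("threshold", "sensitivity", "specificity"))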