Question 2.1: Describe a situation or problem from your job, everyday life, current events, etc., for which a classification model would be appropriate. List some (up to 5) predictors that you might use.

Answer: Working as a pharmacy technician, I encounter many situations in which a classification model could be used. The most common is deciding whether a customer can receive his or her medication that same day. A few predictors we might use are: whether the prescription is free of errors (dosage, prescribing doctor, etc.), whether we have the medication in stock in the necessary quantity, and whether the customer has decided on a method of payment.
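Purely as an illustration (hypothetical rows and column names of my own), those predictors could be encoded as binary features, with same-day pickup as the response:

#hypothetical encoding of the predictors described above
rx <- data.frame(
  error_free     = c(1, 1, 0),        #prescription clear of errors?
  in_stock       = c(1, 0, 1),        #medication available in the needed quantity?
  payment_chosen = c(1, 1, 1),        #customer decided on a payment method?
  same_day       = factor(c(1, 0, 0)) #response: received the same day?
)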

Question 2.2: Using the support vector machine function ksvm contained in the R package kernlab, find a good classifier for this data. Show the equation of your classifier, and how well it classifies the data points in the full data set. (Don’t worry about test/validation data yet; we’ll cover that topic soon.)

library(kernlab) #used for ksvm
library(kknn) #used for knn
#load the data 
cc_data <- read.table("credit_card_data.txt", header = FALSE, stringsAsFactors = FALSE)
#see the first few rows
head(cc_data)
#set up model
cc_model <- ksvm(as.matrix(cc_data[,1:10]), as.factor(cc_data[,11]), C = 100,
                 scaled = TRUE, kernel = "vanilladot", type = "C-svc")
 Setting default kernel parameters  
#take a look at it
cc_model
Support Vector Machine object of class "ksvm" 

SV type: C-svc  (classification) 
 parameter : cost C = 100 

Linear (vanilla) kernel function. 

Number of Support Vectors : 189 

Objective Function Value : -17887.92 
Training error : 0.136086 

After trying multiple values of C, I saw no significant difference in the training error, so I kept C = 100.
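For example, a quick sweep like the following (the particular C values are an illustrative assumption, not the exact ones I tried) makes the comparison explicit:

#sketch: compare training error across several orders of magnitude of C
for (C_val in c(0.01, 1, 100, 10000)) {
  m <- ksvm(as.matrix(cc_data[,1:10]), as.factor(cc_data[,11]), C = C_val,
            scaled = TRUE, kernel = "vanilladot", type = "C-svc")
  cat("C =", C_val, "-> training error:", m@error, "\n")
}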

#calculating the coefficients a1..am: the weight vector of a linear SVM is the
#sum of each support vector weighted by its coefficient (alpha_i * y_i)
a <- colSums(cc_model@xmatrix[[1]] * cc_model@coef[[1]])
#print out a
a
           V1            V2            V3            V4            V5            V6 
-0.0010065348 -0.0011729048 -0.0016261967  0.0030064203  1.0049405641 -0.0028259432 
           V7            V8            V9           V10 
 0.0002600295 -0.0005349551 -0.0012283758  0.1063633995 
#calculate a0: kernlab's decision function subtracts the offset b, so the intercept is -b
a0 <- -cc_model@b
#print a0
a0
[1] 0.08158492

The classifier’s equation, in terms of the scaled predictors (ksvm standardized the data because scaled = TRUE), is then: -0.0010065v1 - 0.0011729v2 - 0.0016262v3 + 0.0030064v4 + 1.0049406v5 - 0.0028259v6 + 0.0002600v7 - 0.0005350v8 - 0.0012284v9 + 0.1063634v10 + 0.0815849 = 0
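As a sanity check, the equation can be applied by hand. This is a sketch that assumes ksvm's internal standardization matches R's scale() and that class 1 lies on the positive side of the hyperplane; if kernlab happened to assign the opposite sign for this factor ordering, the inequality would flip:

#apply the equation to the standardized predictors
scaled_X <- scale(as.matrix(cc_data[,1:10]))
manual_pred <- as.integer(scaled_X %*% a + a0 > 0)
#fraction of points where the hand-applied equation matches the true labels
sum(manual_pred == cc_data[,11]) / nrow(cc_data)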

#let's see what the model predicts on the full data set
pred <- predict(cc_model, cc_data[,1:10])
pred
  [1] 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 [44] 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1
 [87] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[130] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1
[173] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[216] 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
[259] 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
[302] 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
[345] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[388] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[431] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1
[474] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[517] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1
[560] 1 1 1 1 1 0 1 1 1 1 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[603] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[646] 0 0 0 0 0 0 0 0 0
Levels: 0 1
#let's observe the percentage of the model's correct predictions
sum(pred == cc_data[,11]) / nrow(cc_data) * 100
[1] 86.39144

The model correctly classifies 86.39% of the points in the full data set.
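A confusion matrix gives a slightly fuller picture than a single accuracy number (table() is base R, so nothing beyond the objects already computed is assumed):

#rows = predicted class, columns = actual class
table(predicted = pred, actual = cc_data[,11])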

Question 2.3: Using the k-nearest-neighbors classification function kknn contained in the R kknn package, suggest a good value of k, and show how well it classifies the data points in the full data set. Don’t forget to scale the data (scale=TRUE in kknn).

#leave-one-out accuracy for a given k: for each row i, fit kknn on all the
#other rows and predict row i, so a point never counts itself as a neighbor
acc_chk <- function(Z){
  pred <- rep(0, nrow(cc_data))
  for (i in 1:nrow(cc_data)){
    #train on every row except i, predict row i
    knn_model <- kknn(V11~V1+V2+V3+V4+V5+V6+V7+V8+V9+V10,
                      cc_data[-i,], cc_data[i,], k = Z, scale = TRUE)
    #fitted() returns the (weighted) average of the neighbors' 0/1 labels;
    #adding 0.5 and truncating rounds it to the nearer class
    pred[i] <- as.integer(fitted(knn_model) + 0.5)
  }
  acc <- sum(pred == cc_data[,11]) / nrow(cc_data)
  return(acc)
}
test_vec <- rep(0, 30) #vector of 30 zeroes to hold the accuracy for each k from 1 to 30
for (Z in 1:30){
  test_vec[Z] = acc_chk(Z) 
}
knn_accuracy <- as.matrix(test_vec * 100) #convert accuracy to a percentage
knn_accuracy #print the accuracy (%) for each k (row number = k)
          [,1]
 [1,] 81.49847
 [2,] 81.49847
 [3,] 81.49847
 [4,] 81.49847
 [5,] 85.16820
 [6,] 84.55657
 [7,] 84.70948
 [8,] 84.86239
 [9,] 84.70948
[10,] 85.01529
[11,] 85.16820
[12,] 85.32110
[13,] 85.16820
[14,] 85.16820
[15,] 85.32110
[16,] 85.16820
[17,] 85.16820
[18,] 85.16820
[19,] 85.01529
[20,] 85.01529
[21,] 84.86239
[22,] 84.70948
[23,] 84.40367
[24,] 84.55657
[25,] 84.55657
[26,] 84.40367
[27,] 84.09786
[28,] 83.79205
[29,] 83.94495
[30,] 84.09786
knn_value <- 1:30 #the k values tested
plot(knn_value, knn_accuracy) #observe accuracy per value of k

max(knn_accuracy)
[1] 85.3211
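
Since which.max returns the first index at which the maximum occurs, it identifies the smallest best k directly:

which.max(knn_accuracy) #first row (= k) achieving the maximum accuracy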

The value of k that best classifies the data points is 12, with an accuracy of 85.3211% (k = 15 achieves the same accuracy; I report the smaller value).
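As a cross-check (a sketch, not a replacement for the loop above): the kknn package's train.kknn function performs the same leave-one-out procedure internally. I'm assuming its scale argument and "optimal" default kernel here; since that kernel weights neighbors by distance, its suggested k may differ somewhat from the unweighted rounding used above.

#train.kknn does leave-one-out cross-validation over k = 1..kmax internally
cc_factor <- cc_data
cc_factor$V11 <- as.factor(cc_factor$V11) #factor response => classification
loocv <- train.kknn(V11~., data = cc_factor, kmax = 30, scale = TRUE)
loocv$best.parameters #the kernel and k that minimized leave-one-out error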
