Samuel I Kigamba
May 20, 2020
Load all the necessary Libraries
library(gridExtra)
library(RColorBrewer)
library(Matrix)
library(scales)
library(corrplot)
library(MASS)
library(psych)
library(ggplot2)
Presentation breakdown:
Probability
Random number generation
Probability calculations
Statistics and Calculus
Descriptive and Inferential Statistics
Linear Algebra and Correlation
Calculus-Based Probability & Statistics
Modeling
Random number generation
Probability
Using R, generate a random variable X that has 10,000 random uniform numbers from 1 to N, where N can be any number of your choosing greater than or equal to 6. Then generate a random variable Y that has 10,000 random normal numbers with a mean and standard deviation of:
\[\mu=\sigma=(N+1)/2\]
set.seed(100)
N = 10
n = 10000
# Random Uniform number generation
X = runif(n, 1, N)
# Random normal number generation
Y = rnorm(n, mean = (N+1)/2, sd = (N+1)/2)
dfXY = data.frame(cbind(X, Y))
# View the first 5 rows of the generated data
head(dfXY, 5)
## X Y
## 1 3.769895 7.849050
## 2 3.319053 4.227292
## 3 5.970902 7.721962
## 4 1.507448 2.475432
## 5 5.216944 1.687728
Assume the small letter “x” is estimated as the median of the X variable, and the small letter “y” is estimated as the 1st quartile of the Y variable.
# x is the median of X; y is the 1st quartile of Y
x = median(X)
x
## [1] 5.472943
y = quantile(Y, 0.25)
y
## 25%
## 1.769673
Calculate as a minimum the below probabilities a through c. Interpret the meaning of all probabilities.
# Number of observations where X is greater than y, the 1st quartile of Y
nXy = nrow(subset(dfXY, X > y))
nXy
## [1] 9223
# a. P(X > x | X > y): probability that X exceeds its median x, given that X exceeds y, the 1st quartile of Y
pa = nrow(subset(dfXY, X > x & X > y))/nXy
pa
## [1] 0.542123
The probability is 0.5421, or about 54%: given that X already exceeds the 1st quartile of Y, there is a 54% chance that it also exceeds its own median.
# b. P(X > x, Y > y): probability that X exceeds its median x and Y exceeds y, the 1st quartile of Y
pb = nrow(subset(dfXY, X > x & Y > y))/n
pb
## [1] 0.3755
The probability is 0.3755, or 37.55%: in about three-eighths of the draws, X exceeds its median while Y simultaneously exceeds its 1st quartile.
# c. P(X < x | X > y): probability that X is less than its median x, given that X is greater than y, the 1st quartile of Y
pc = nrow(subset(dfXY, X < x & X > y))/nXy
pc
## [1] 0.457877
The probability is 0.4579, or 45.79%: given that X exceeds the 1st quartile of Y, there is a 45.79% chance that X still falls below its own median.
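Because the median of X (about 5.47) is greater than y (about 1.77), events (a) and (c) partition the conditioning event X > y, so the two conditional probabilities should sum to exactly 1. A quick check confirms this:
# (a) and (c) partition the event X > y, so their probabilities sum to 1
pa + pc
## [1] 1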
Investigate whether P(X>x and Y>y) = P(X > x) * P(Y > y) by building a table and evaluating the marginal and joint probabilities.
A1 <- c(sum(X <= x & Y <= y), sum(X > x & Y <= y))
B1 <- c(sum(X <= x & Y > y), sum(X > x & Y > y))
ct_matrix <- matrix(c(A1, B1), nrow = 2)
ct_matrix <- rbind(ct_matrix, apply(ct_matrix, 2, sum))
ct_matrix <- cbind(ct_matrix, apply(ct_matrix, 1, sum))
xy <- c("<=1st quartile", ">1st quartile", "Total")
countDF <- data.frame(xy, ct_matrix)
colnames(countDF) <- c("x/y", "<=1st quartile", ">1st quartile", "Total")
print(countDF)
## x/y <=1st quartile >1st quartile Total
## 1 <=1st quartile 1255 3745 5000
## 2 >1st quartile 1245 3755 5000
## 3 Total 2500 7500 10000
A <- countDF[2, 4]
B <- countDF[3, 3]
A_B <- countDF[2, 3]
tot <- countDF[3, 4]
Prob_A <- A/tot
Prob_B <- B/tot
prob_A_B <- A_B/tot
print(prob_A_B)
## [1] 0.3755
So P(X > x and Y > y) = 0.3755.
# Compare with the product of the marginal probabilities
print(Prob_A * Prob_B)
## [1] 0.375
P(X > x) * P(Y > y) = 0.5 * 0.75 = 0.375, which is approximately equal to the joint probability of 0.3755. So P(X>x and Y>y) = P(X>x)P(Y>y) holds to a very close approximation, consistent with X and Y being independent.
Check to see if independence holds by using Fisher’s Exact Test and the Chi Square Test.
# Generate the matrix
Matrix <- matrix(c(A1, B1), nrow = 2)
# Perform the Chi square test
chisq.test(Matrix)
##
## Pearson's Chi-squared test with Yates' continuity correction
##
## data: Matrix
## X-squared = 0.0432, df = 1, p-value = 0.8353
##
# Perform Fisher's Exact Test
fisher.test(Matrix)
##
## Fisher's Exact Test for Count Data
##
## data: Matrix
## p-value = 0.8354
## alternative hypothesis: true odds ratio is not equal to 1
## 95 percent confidence interval:
## 0.9222661 1.1076494
## sample estimates:
## odds ratio
## 1.010724
What is the difference between the two?
The Chi-square test relies on a large-sample approximation, while Fisher's Exact Test computes an exact p-value and is designed for small samples (or tables with small expected counts).
Which is most appropriate?
With n = 10,000 the sample is large, so the Chi-square test is appropriate here, and the two tests give nearly identical p-values (0.8353 vs 0.8354). Because both p-values are large, we fail to reject the null hypothesis and conclude that independence holds.
Descriptive and Inferential Statistics
Linear Algebra and Correlation
Calculus-Based Probability & Statistics
Load the train data and select variables
You are to register for Kaggle.com (free) and compete in the House Prices: Advanced Regression Techniques competition. https://www.kaggle.com/c/house-prices-advanced-regression-techniques .
I want you to do the following.
#Training set
train = read.csv("https://raw.githubusercontent.com/igukusamuel/DATA-605-Final-Project/master/train.csv")
dim(train)
## [1] 1460 81
Provide univariate descriptive statistics and appropriate plots for the training data set.
Descriptive statistics on LotArea
# Assign LotArea to X and compute summary statistics
X = train$LotArea
summary(X)
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## 1300 7554 9478 10517 11602 215245
describe(X)
## vars n mean sd median trimmed mad min max range
## X1 1 1460 10516.83 9981.26 9478.5 9563.28 2962.23 1300 215245 213945
## skew kurtosis se
## X1 12.18 202.26 261.22
There are 1,460 observations, with LotArea ranging from 1,300 SF to 215,245 SF. The average LotArea is 10,517 SF.
# BoxPlot and Histograms for the LotArea
par(mfrow=c(1,2))
boxplot(X, main="LotArea BoxPlot")
hist(X, breaks = 20, main = "LotArea Histogram")
The distribution of the LotArea is right-skewed with a large number of outliers.
Descriptive statistics on SalePrice
# Assign SalePrice to Y and compute summary statistics
Y = train$SalePrice
summary(Y)
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## 34900 129975 163000 180921 214000 755000
describe(Y)
## vars n mean sd median trimmed mad min max range
## X1 1 1460 180921.2 79442.5 163000 170783.3 56338.8 34900 755000 720100
## skew kurtosis se
## X1 1.88 6.5 2079.11
There are 1,460 observations, with SalePrice ranging from $34,900 to $755,000. The average SalePrice is $180,921.
# BoxPlot and Histograms for the SalePrice
par(mfrow=c(1,2))
boxplot(Y, main="SalePrice BoxPlot")
hist(Y, breaks = 30, main = "SalePrice Histogram")
The distribution of the SalePrice is right-skewed with several high-price outliers.
Descriptive statistics on GarageArea
# Assign GarageArea to Z and compute summary statistics
Z = train$GarageArea
summary(Z)
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## 0.0 334.5 480.0 473.0 576.0 1418.0
describe(Z)
## vars n mean sd median trimmed mad min max range skew
## X1 1 1460 472.98 213.8 480 469.81 177.91 0 1418 1418 0.18
## kurtosis se
## X1 0.9 5.6
There are 1,460 observations, with GarageArea ranging from 0 SF to 1,418 SF. The average GarageArea is 473 SF.
# BoxPlot and Histograms for the GarageArea
par(mfrow=c(1,2))
boxplot(Z, main="GarageArea BoxPlot")
hist(Z, breaks = 30, main = "GarageArea Histogram")
The distribution of the GarageArea is right skewed with a few observable outliers.
Provide a scatterplot matrix for at least two of the independent variables and the dependent variable.
# Scatterplot of LotArea and GarageArea vs SalePrice
par(mfrow=c(1,2))
plot(X, Y, xlab="LotArea", ylab="SalePrice", main="LotArea vs SalePrice")
plot(Z, Y, xlab="GarageArea", ylab="SalePrice", main="GarageArea vs SalePrice")
From the scatterplots there appears to be no correlation between LotArea and SalePrice, but there does appear to be some correlation between GarageArea and SalePrice. This is not surprising, since a large garage is most likely attached to a large house, which in turn directly affects the sale price.
Derive a correlation matrix for any three quantitative variables in the dataset.
corDF = train[c("LotArea", "GarageArea", "SalePrice")]
corMatrix = cor(corDF, use = "complete.obs")
print(corMatrix)
## LotArea GarageArea SalePrice
## LotArea 1.0000000 0.1804028 0.2638434
## GarageArea 0.1804028 1.0000000 0.6234314
## SalePrice 0.2638434 0.6234314 1.0000000
From the correlation matrix we can see correlations ranging from weak to moderately strong: SalePrice has a fairly strong correlation with GarageArea (0.62) and a weak correlation with LotArea (0.26), while LotArea and GarageArea are only weakly correlated with each other (0.18).
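Since the corrplot package is already loaded above, the matrix can also be visualized; this plot is a small addition that was not part of the original output.
# Visualize the correlation matrix (added for illustration)
corrplot(corMatrix, method = "circle")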
Test the hypotheses that the correlations between each pairwise set of variables is 0 and provide an 80% confidence interval.
# Pairwise correlation between LotArea and SalePrice
cor.test(corDF$LotArea, corDF$SalePrice, method = 'pearson', conf.level = 0.80)
##
## Pearson's product-moment correlation
##
## data: corDF$LotArea and corDF$SalePrice
## t = 10.445, df = 1458, p-value < 2.2e-16
## alternative hypothesis: true correlation is not equal to 0
## 80 percent confidence interval:
## 0.2323391 0.2947946
## sample estimates:
## cor
## 0.2638434
# Pairwise correlation between GarageArea and SalePrice
cor.test(corDF$GarageArea, corDF$SalePrice, method = 'pearson', conf.level = 0.80)
##
## Pearson's product-moment correlation
##
## data: corDF$GarageArea and corDF$SalePrice
## t = 30.446, df = 1458, p-value < 2.2e-16
## alternative hypothesis: true correlation is not equal to 0
## 80 percent confidence interval:
## 0.6024756 0.6435283
## sample estimates:
## cor
## 0.6234314
# Pairwise correlation between LotArea and GarageArea
cor.test(corDF$LotArea, corDF$GarageArea, method = 'pearson', conf.level = 0.80)
##
## Pearson's product-moment correlation
##
## data: corDF$LotArea and corDF$GarageArea
## t = 7.0034, df = 1458, p-value = 3.803e-12
## alternative hypothesis: true correlation is not equal to 0
## 80 percent confidence interval:
## 0.1477356 0.2126767
## sample estimates:
## cor
## 0.1804028
Discuss the meaning of your analysis.
The pairwise Pearson correlation tests estimate the association between each pair of variables and test the null hypothesis that the true correlation is zero. All p-values are far below the significance level of alpha = 0.20 implied by the 80% confidence intervals (indeed, far below 0.05 as well), so we reject the null hypothesis for each pair and conclude that the variables are correlated, with the correlation coefficients shown above.
Would you be worried about familywise error? Why or why not?
Yes, I would be worried. With multiple pairwise tests, the probability of rejecting at least one true null hypothesis (the familywise error rate) grows with the number of tests, and the train dataset contains many variables that could be tested against each other, so spurious correlations can appear unless the significance level is adjusted for the number of comparisons.
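As a rough sketch, assuming the three tests were independent, the familywise error rate implied by the per-test level alpha = 0.20 (the complement of the 80% confidence intervals), along with a Bonferroni-adjusted per-test level, can be computed as follows:
# Familywise error rate across m = 3 independent tests at per-test alpha = 0.20
alpha = 0.20
m = 3
1 - (1 - alpha)^m
## [1] 0.488
# Bonferroni-adjusted per-test level that caps the familywise rate at alpha
alpha / m
## [1] 0.06666667
So even with only three tests, the chance of at least one false rejection is nearly 49% at this alpha, which is why a correction such as Bonferroni is worth considering.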
Invert your correlation matrix from above. This is known as the precision matrix and contains variance inflation factors on the diagonal.
# Print the correlation matrix computed above
print(corMatrix)
## LotArea GarageArea SalePrice
## LotArea 1.0000000 0.1804028 0.2638434
## GarageArea 0.1804028 1.0000000 0.6234314
## SalePrice 0.2638434 0.6234314 1.0000000
# To invert the correlation matrix, use the solve() function
precision_matrix = solve(corMatrix)
print(precision_matrix)
## LotArea GarageArea SalePrice
## LotArea 1.07530074 -0.02799273 -0.2662594
## GarageArea -0.02799273 1.63649778 -1.0128585
## SalePrice -0.26625940 -1.01285847 1.7016986
Multiply the correlation matrix by the precision matrix, and then multiply the precision matrix by the correlation matrix.
# To multiply the correlation matrix by the precision matrix
round((corMatrix %*% precision_matrix), 4)
## LotArea GarageArea SalePrice
## LotArea 1 0 0
## GarageArea 0 1 0
## SalePrice 0 0 1
# To multiply the precision matrix by the correlation matrix
round((precision_matrix %*% corMatrix), 4)
## LotArea GarageArea SalePrice
## LotArea 1 0 0
## GarageArea 0 1 0
## SalePrice 0 0 1
Both of the above operations produce an identity matrix.
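A quick programmatic check (a small addition, not in the original output) confirms that both products equal the 3 x 3 identity within floating-point tolerance:
# Both products should equal the identity matrix up to floating-point error
all.equal(corMatrix %*% precision_matrix, diag(3), check.attributes = FALSE)
## [1] TRUE
all.equal(precision_matrix %*% corMatrix, diag(3), check.attributes = FALSE)
## [1] TRUE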
Conduct LU decomposition on the matrix.
# LU decomposition of the correlation matrix
lud_cor = lu(corMatrix)
elu_cor = expand(lud_cor)
#Lower triangular matrix
cor_L = elu_cor$L
print(cor_L)
## 3 x 3 Matrix of class "dtrMatrix" (unitriangular)
## [,1] [,2] [,3]
## [1,] 1.0000000 . .
## [2,] 0.1804028 1.0000000 .
## [3,] 0.2638434 0.5952044 1.0000000
# Upper triangular matrix
cor_U = elu_cor$U
print(cor_U)
## 3 x 3 Matrix of class "dtrMatrix"
## [,1] [,2] [,3]
## [1,] 1.0000000 0.1804028 0.2638434
## [2,] . 0.9674548 0.5758334
## [3,] . . 0.5876481
# LU decomposition of the precision matrix
lud_prec = lu(precision_matrix)
elu_prec = expand(lud_prec)
#Lower triangular matrix
prec_L = elu_prec$L
print(prec_L)
## 3 x 3 Matrix of class "dtrMatrix" (unitriangular)
## [,1] [,2] [,3]
## [1,] 1.00000000 . .
## [2,] -0.02603247 1.00000000 .
## [3,] -0.24761389 -0.62343144 1.00000000
# Upper triangular matrix
prec_U = elu_prec$U
print(prec_U)
## 3 x 3 Matrix of class "dtrMatrix"
## [,1] [,2] [,3]
## [1,] 1.07530074 -0.02799273 -0.26625940
## [2,] . 1.63576906 -1.01978986
## [3,] . . 1.00000000
Multiply the lower triangular matrix by the upper triangular matrix
# L %*% U should reproduce the original correlation matrix
cor_L %*% cor_U
## 3 x 3 Matrix of class "dgeMatrix"
## [,1] [,2] [,3]
## [1,] 1.0000000 0.1804028 0.2638434
## [2,] 0.1804028 1.0000000 0.6234314
## [3,] 0.2638434 0.6234314 1.0000000
# L %*% U should reproduce the original precision matrix
prec_L %*% prec_U
## 3 x 3 Matrix of class "dgeMatrix"
## [,1] [,2] [,3]
## [1,] 1.07530074 -0.02799273 -0.2662594
## [2,] -0.02799273 1.63649778 -1.0128585
## [3,] -0.26625940 -1.01285847 1.7016986
Multiplying L by U returns the respective original matrices, as expected.
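As an extra check (not in the original run), the largest absolute discrepancy between each reconstruction and its original matrix can be computed; both should be at machine-precision level (on the order of 1e-16).
# Maximum absolute reconstruction error for both decompositions
max(abs(as.matrix(cor_L %*% cor_U) - corMatrix))
max(abs(as.matrix(prec_L %*% prec_U) - precision_matrix))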
Many times, it makes sense to fit a closed form distribution to data. Select a variable in the Kaggle.com training dataset that is skewed to the right, shift it so that the minimum value is absolutely above zero if necessary.
# We will use the GarageArea data. Check the minimum to see if shifting is necessary.
z = corDF$GarageArea
min(z)
## [1] 0
The minimum is 0, so strictly a small positive shift (e.g., z + 1) would be needed to make the minimum absolutely above zero; the exponential fit below proceeds on the raw values, which fitdistr accepts.
Then load the MASS package and run fitdistr to fit an exponential probability density function.
(See https://stat.ethz.ch/R-manual/R-devel/library/MASS/html/fitdistr.html ).
# Fit an exponential probability density function
fit_exp = fitdistr(z, densfun = "exponential")
print(fit_exp)
## rate
## 2.114254e-03
## (5.533255e-05)
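As a sanity check (a small addition), the maximum-likelihood estimate of the exponential rate is simply the reciprocal of the sample mean, which matches the fitted value above:
# For the exponential distribution, the MLE of the rate is 1/mean
1/mean(z)
## [1] 0.002114254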
Find the optimal value of lambda for this distribution, and then take 1000 samples from this exponential distribution using this value (e.g., rexp(1000, lambda)).
# Extract the optimal rate (lambda) and draw 1000 samples from the fitted distribution
lambda = fit_exp$estimate
print(lambda)
## rate
## 0.002114254
samp = rexp(1000, lambda)
Plot a histogram and compare it with a histogram of your original variable.
# Plot a histogram of the observed and the simulated data
par(mfrow=c(1,2))
hist(z, breaks = 100, xlab = "Observed_GarageArea", main = "Observed")
hist(samp, breaks = 100, xlab = "Simulated_GarageArea", main = "Simulated")
Visually, the simulated data are more heavily skewed to the right, while the observed data are more concentrated around the centre.
Using the exponential pdf, find the 5th and 95th percentiles using the cumulative distribution function (CDF).
# Find the 5th and 95th percentile of the simulated sample data
quantile(samp, probs = c(0.05, 0.95))
## 5% 95%
## 23.98491 1369.76609
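The sampled quantiles above approximate the theoretical percentiles of the fitted distribution; as a small addition, the closed-form values can be read directly off the exponential CDF with qexp:
# Theoretical 5th and 95th percentiles from the fitted exponential CDF
round(qexp(c(0.05, 0.95), rate = lambda), 1)
## [1]   24.3 1416.9
The sampled 95th percentile (1369.8) falls somewhat below the theoretical 1416.9, which is expected sampling variability in a heavy right tail.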
Also generate a 95% confidence interval from the empirical data, assuming normality.
# Calculating the 95% confidence interval on the observed data
mean_z = mean(z)
n_z = length(z)
sd_z = sd(z)
se = qnorm(0.975) * sd_z/sqrt(n_z)
left_int = mean_z - se
right_int = mean_z + se
print(left_int)
## [1] 462.0131
print(right_int)
## [1] 483.9472
The 95% confidence interval for the mean GarageArea lies between approximately 462.01 and 483.95.
# We plot a histogram to show the assumed normality
assume_normality = rnorm(length(z), mean_z, sd_z)
hist(assume_normality)
Finally, provide the empirical 5th percentile and 95th percentile of the data. Discuss.
# Empirical 5th and 95th percentiles of the observed GarageArea
quantile(z, probs = c(0.05, 0.95))
## 5% 95%
## 0.0 850.1
The empirical 5th percentile is 0 (houses with no garage at all) and the 95th percentile is 850.1 SF. Compared with the exponential fit, whose corresponding percentiles were roughly 24 and 1370, the exponential distribution is a poor match for this variable: it cannot capture the point mass at zero and it substantially overstates the right tail.
Build some type of multiple regression model and submit your model to the competition board.
Provide your complete model summary and results with analysis.
# Load the training data set into a data frame and review the summary statistics for missing and/or limited features.
hd_tr = read.csv("https://raw.githubusercontent.com/igukusamuel/DATA-605-Final-Project/master/train.csv")
summary(hd_tr)
## Id MSSubClass MSZoning LotFrontage
## Min. : 1.0 Min. : 20.0 C (all): 10 Min. : 21.00
## 1st Qu.: 365.8 1st Qu.: 20.0 FV : 65 1st Qu.: 59.00
## Median : 730.5 Median : 50.0 RH : 16 Median : 69.00
## Mean : 730.5 Mean : 56.9 RL :1151 Mean : 70.05
## 3rd Qu.:1095.2 3rd Qu.: 70.0 RM : 218 3rd Qu.: 80.00
## Max. :1460.0 Max. :190.0 Max. :313.00
## NA's :259
## LotArea Street Alley LotShape LandContour
## Min. : 1300 Grvl: 6 Grvl: 50 IR1:484 Bnk: 63
## 1st Qu.: 7554 Pave:1454 Pave: 41 IR2: 41 HLS: 50
## Median : 9478 NA's:1369 IR3: 10 Low: 36
## Mean : 10517 Reg:925 Lvl:1311
## 3rd Qu.: 11602
## Max. :215245
##
## Utilities LotConfig LandSlope Neighborhood Condition1
## AllPub:1459 Corner : 263 Gtl:1382 NAmes :225 Norm :1260
## NoSeWa: 1 CulDSac: 94 Mod: 65 CollgCr:150 Feedr : 81
## FR2 : 47 Sev: 13 OldTown:113 Artery : 48
## FR3 : 4 Edwards:100 RRAn : 26
## Inside :1052 Somerst: 86 PosN : 19
## Gilbert: 79 RRAe : 11
## (Other):707 (Other): 15
## Condition2 BldgType HouseStyle OverallQual
## Norm :1445 1Fam :1220 1Story :726 Min. : 1.000
## Feedr : 6 2fmCon: 31 2Story :445 1st Qu.: 5.000
## Artery : 2 Duplex: 52 1.5Fin :154 Median : 6.000
## PosN : 2 Twnhs : 43 SLvl : 65 Mean : 6.099
## RRNn : 2 TwnhsE: 114 SFoyer : 37 3rd Qu.: 7.000
## PosA : 1 1.5Unf : 14 Max. :10.000
## (Other): 2 (Other): 19
## OverallCond YearBuilt YearRemodAdd RoofStyle
## Min. :1.000 Min. :1872 Min. :1950 Flat : 13
## 1st Qu.:5.000 1st Qu.:1954 1st Qu.:1967 Gable :1141
## Median :5.000 Median :1973 Median :1994 Gambrel: 11
## Mean :5.575 Mean :1971 Mean :1985 Hip : 286
## 3rd Qu.:6.000 3rd Qu.:2000 3rd Qu.:2004 Mansard: 7
## Max. :9.000 Max. :2010 Max. :2010 Shed : 2
##
## RoofMatl Exterior1st Exterior2nd MasVnrType MasVnrArea
## CompShg:1434 VinylSd:515 VinylSd:504 BrkCmn : 15 Min. : 0.0
## Tar&Grv: 11 HdBoard:222 MetalSd:214 BrkFace:445 1st Qu.: 0.0
## WdShngl: 6 MetalSd:220 HdBoard:207 None :864 Median : 0.0
## WdShake: 5 Wd Sdng:206 Wd Sdng:197 Stone :128 Mean : 103.7
## ClyTile: 1 Plywood:108 Plywood:142 NA's : 8 3rd Qu.: 166.0
## Membran: 1 CemntBd: 61 CmentBd: 60 Max. :1600.0
## (Other): 2 (Other):128 (Other):136 NA's :8
## ExterQual ExterCond Foundation BsmtQual BsmtCond BsmtExposure
## Ex: 52 Ex: 3 BrkTil:146 Ex :121 Fa : 45 Av :221
## Fa: 14 Fa: 28 CBlock:634 Fa : 35 Gd : 65 Gd :134
## Gd:488 Gd: 146 PConc :647 Gd :618 Po : 2 Mn :114
## TA:906 Po: 1 Slab : 24 TA :649 TA :1311 No :953
## TA:1282 Stone : 6 NA's: 37 NA's: 37 NA's: 38
## Wood : 3
##
## BsmtFinType1 BsmtFinSF1 BsmtFinType2 BsmtFinSF2
## ALQ :220 Min. : 0.0 ALQ : 19 Min. : 0.00
## BLQ :148 1st Qu.: 0.0 BLQ : 33 1st Qu.: 0.00
## GLQ :418 Median : 383.5 GLQ : 14 Median : 0.00
## LwQ : 74 Mean : 443.6 LwQ : 46 Mean : 46.55
## Rec :133 3rd Qu.: 712.2 Rec : 54 3rd Qu.: 0.00
## Unf :430 Max. :5644.0 Unf :1256 Max. :1474.00
## NA's: 37 NA's: 38
## BsmtUnfSF TotalBsmtSF Heating HeatingQC CentralAir
## Min. : 0.0 Min. : 0.0 Floor: 1 Ex:741 N: 95
## 1st Qu.: 223.0 1st Qu.: 795.8 GasA :1428 Fa: 49 Y:1365
## Median : 477.5 Median : 991.5 GasW : 18 Gd:241
## Mean : 567.2 Mean :1057.4 Grav : 7 Po: 1
## 3rd Qu.: 808.0 3rd Qu.:1298.2 OthW : 2 TA:428
## Max. :2336.0 Max. :6110.0 Wall : 4
##
## Electrical X1stFlrSF X2ndFlrSF LowQualFinSF
## FuseA: 94 Min. : 334 Min. : 0 Min. : 0.000
## FuseF: 27 1st Qu.: 882 1st Qu.: 0 1st Qu.: 0.000
## FuseP: 3 Median :1087 Median : 0 Median : 0.000
## Mix : 1 Mean :1163 Mean : 347 Mean : 5.845
## SBrkr:1334 3rd Qu.:1391 3rd Qu.: 728 3rd Qu.: 0.000
## NA's : 1 Max. :4692 Max. :2065 Max. :572.000
##
## GrLivArea BsmtFullBath BsmtHalfBath FullBath
## Min. : 334 Min. :0.0000 Min. :0.00000 Min. :0.000
## 1st Qu.:1130 1st Qu.:0.0000 1st Qu.:0.00000 1st Qu.:1.000
## Median :1464 Median :0.0000 Median :0.00000 Median :2.000
## Mean :1515 Mean :0.4253 Mean :0.05753 Mean :1.565
## 3rd Qu.:1777 3rd Qu.:1.0000 3rd Qu.:0.00000 3rd Qu.:2.000
## Max. :5642 Max. :3.0000 Max. :2.00000 Max. :3.000
##
## HalfBath BedroomAbvGr KitchenAbvGr KitchenQual
## Min. :0.0000 Min. :0.000 Min. :0.000 Ex:100
## 1st Qu.:0.0000 1st Qu.:2.000 1st Qu.:1.000 Fa: 39
## Median :0.0000 Median :3.000 Median :1.000 Gd:586
## Mean :0.3829 Mean :2.866 Mean :1.047 TA:735
## 3rd Qu.:1.0000 3rd Qu.:3.000 3rd Qu.:1.000
## Max. :2.0000 Max. :8.000 Max. :3.000
##
## TotRmsAbvGrd Functional Fireplaces FireplaceQu GarageType
## Min. : 2.000 Maj1: 14 Min. :0.000 Ex : 24 2Types : 6
## 1st Qu.: 5.000 Maj2: 5 1st Qu.:0.000 Fa : 33 Attchd :870
## Median : 6.000 Min1: 31 Median :1.000 Gd :380 Basment: 19
## Mean : 6.518 Min2: 34 Mean :0.613 Po : 20 BuiltIn: 88
## 3rd Qu.: 7.000 Mod : 15 3rd Qu.:1.000 TA :313 CarPort: 9
## Max. :14.000 Sev : 1 Max. :3.000 NA's:690 Detchd :387
## Typ :1360 NA's : 81
## GarageYrBlt GarageFinish GarageCars GarageArea GarageQual
## Min. :1900 Fin :352 Min. :0.000 Min. : 0.0 Ex : 3
## 1st Qu.:1961 RFn :422 1st Qu.:1.000 1st Qu.: 334.5 Fa : 48
## Median :1980 Unf :605 Median :2.000 Median : 480.0 Gd : 14
## Mean :1979 NA's: 81 Mean :1.767 Mean : 473.0 Po : 3
## 3rd Qu.:2002 3rd Qu.:2.000 3rd Qu.: 576.0 TA :1311
## Max. :2010 Max. :4.000 Max. :1418.0 NA's: 81
## NA's :81
## GarageCond PavedDrive WoodDeckSF OpenPorchSF EnclosedPorch
## Ex : 2 N: 90 Min. : 0.00 Min. : 0.00 Min. : 0.00
## Fa : 35 P: 30 1st Qu.: 0.00 1st Qu.: 0.00 1st Qu.: 0.00
## Gd : 9 Y:1340 Median : 0.00 Median : 25.00 Median : 0.00
## Po : 7 Mean : 94.24 Mean : 46.66 Mean : 21.95
## TA :1326 3rd Qu.:168.00 3rd Qu.: 68.00 3rd Qu.: 0.00
## NA's: 81 Max. :857.00 Max. :547.00 Max. :552.00
##
## X3SsnPorch ScreenPorch PoolArea PoolQC
## Min. : 0.00 Min. : 0.00 Min. : 0.000 Ex : 2
## 1st Qu.: 0.00 1st Qu.: 0.00 1st Qu.: 0.000 Fa : 2
## Median : 0.00 Median : 0.00 Median : 0.000 Gd : 3
## Mean : 3.41 Mean : 15.06 Mean : 2.759 NA's:1453
## 3rd Qu.: 0.00 3rd Qu.: 0.00 3rd Qu.: 0.000
## Max. :508.00 Max. :480.00 Max. :738.000
##
## Fence MiscFeature MiscVal MoSold
## GdPrv: 59 Gar2: 2 Min. : 0.00 Min. : 1.000
## GdWo : 54 Othr: 2 1st Qu.: 0.00 1st Qu.: 5.000
## MnPrv: 157 Shed: 49 Median : 0.00 Median : 6.000
## MnWw : 11 TenC: 1 Mean : 43.49 Mean : 6.322
## NA's :1179 NA's:1406 3rd Qu.: 0.00 3rd Qu.: 8.000
## Max. :15500.00 Max. :12.000
##
## YrSold SaleType SaleCondition SalePrice
## Min. :2006 WD :1267 Abnorml: 101 Min. : 34900
## 1st Qu.:2007 New : 122 AdjLand: 4 1st Qu.:129975
## Median :2008 COD : 43 Alloca : 12 Median :163000
## Mean :2008 ConLD : 9 Family : 20 Mean :180921
## 3rd Qu.:2009 ConLI : 5 Normal :1198 3rd Qu.:214000
## Max. :2010 ConLw : 5 Partial: 125 Max. :755000
## (Other): 9
Load the selected variables into a dataframe and perform data cleanup operations.
# Remove features with missing or limited data as evaluated from the summary statistics above.
hd_train <- hd_tr[, c("MSSubClass", "MSZoning", "LotArea", "LotShape", "LotConfig", "Neighborhood", "Condition1",
"Street", "BldgType", "HouseStyle", "OverallQual", "OverallCond", "YearBuilt", "YearRemodAdd",
"RoofStyle", "Exterior2nd", "MasVnrType", "ExterQual", "BsmtQual", "BsmtCond", "BsmtExposure",
"BsmtFinType1", "BsmtFinSF1", "BsmtFinType2", "TotalBsmtSF", "HeatingQC", "Electrical",
"X1stFlrSF", "GrLivArea", "BsmtFullBath", "BsmtHalfBath", "FullBath", "HalfBath", "BedroomAbvGr",
"KitchenAbvGr", "KitchenQual", "TotRmsAbvGrd", "Functional", "GarageArea", "PavedDrive",
"WoodDeckSF", "OpenPorchSF", "YrSold", "SaleType", "SaleCondition", "SalePrice")]
# Remove all NA's from the training data set
hd_train = na.omit(hd_train)
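# Quick check (added): confirm the dimensions after removing NAs.
# 1412 rows remain across the 46 selected variables, consistent with the
# model summary below (1258 residual df + 154 estimated coefficients).
dim(hd_train)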
# Generate an initial regression model
Model_1 = lm(SalePrice ~ ., data = hd_train)
# Generate summary statistics
summary(Model_1)
##
## Call:
## lm(formula = SalePrice ~ ., data = hd_train)
##
## Residuals:
## Min 1Q Median 3Q Max
## -344189 -11532 0 10881 211337
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 4.434e+05 1.283e+06 0.346 0.729647
## MSSubClass -1.102e+02 1.010e+02 -1.091 0.275645
## MSZoningFV 2.985e+04 1.468e+04 2.033 0.042284 *
## MSZoningRH 1.971e+04 1.461e+04 1.349 0.177597
## MSZoningRL 2.334e+04 1.241e+04 1.881 0.060260 .
## MSZoningRM 2.360e+04 1.161e+04 2.033 0.042245 *
## LotArea 4.435e-01 1.036e-01 4.282 1.99e-05 ***
## LotShapeIR2 8.458e+03 5.179e+03 1.633 0.102682
## LotShapeIR3 -3.270e+04 1.050e+04 -3.114 0.001888 **
## LotShapeReg 1.593e+03 1.975e+03 0.806 0.420183
## LotConfigCulDSac 1.134e+04 3.955e+03 2.867 0.004217 **
## LotConfigFR2 -9.935e+03 5.001e+03 -1.987 0.047184 *
## LotConfigFR3 -2.174e+04 1.546e+04 -1.407 0.159797
## LotConfigInside -1.358e+03 2.158e+03 -0.629 0.529412
## NeighborhoodBlueste -1.309e+02 2.353e+04 -0.006 0.995561
## NeighborhoodBrDale 5.567e+03 1.323e+04 0.421 0.674005
## NeighborhoodBrkSide -3.007e+03 1.140e+04 -0.264 0.792074
## NeighborhoodClearCr -6.810e+03 1.121e+04 -0.608 0.543542
## NeighborhoodCollgCr -6.176e+03 8.749e+03 -0.706 0.480382
## NeighborhoodCrawfor 2.075e+04 1.029e+04 2.017 0.043902 *
## NeighborhoodEdwards -2.309e+04 9.730e+03 -2.373 0.017784 *
## NeighborhoodGilbert -4.705e+03 9.461e+03 -0.497 0.619084
## NeighborhoodIDOTRR -1.306e+04 1.294e+04 -1.009 0.313128
## NeighborhoodMeadowV -5.162e+03 1.355e+04 -0.381 0.703306
## NeighborhoodMitchel -1.729e+04 9.970e+03 -1.734 0.083119 .
## NeighborhoodNAmes -1.207e+04 9.448e+03 -1.277 0.201842
## NeighborhoodNoRidge 4.720e+04 1.004e+04 4.702 2.86e-06 ***
## NeighborhoodNPkVill 1.483e+04 1.602e+04 0.925 0.354910
## NeighborhoodNridgHt 3.011e+04 8.908e+03 3.380 0.000748 ***
## NeighborhoodNWAmes -1.001e+04 9.689e+03 -1.033 0.301954
## NeighborhoodOldTown -1.838e+04 1.169e+04 -1.573 0.116086
## NeighborhoodSawyer -8.708e+03 9.942e+03 -0.876 0.381251
## NeighborhoodSawyerW 8.269e+02 9.545e+03 0.087 0.930975
## NeighborhoodSomerst 8.598e+03 1.096e+04 0.784 0.432908
## NeighborhoodStoneBr 4.906e+04 9.975e+03 4.918 9.91e-07 ***
## NeighborhoodSWISU -1.419e+04 1.166e+04 -1.217 0.223713
## NeighborhoodTimber -4.617e+03 9.837e+03 -0.469 0.638922
## NeighborhoodVeenker 1.228e+04 1.283e+04 0.957 0.338585
## Condition1Feedr -8.740e+03 6.004e+03 -1.456 0.145740
## Condition1Norm 4.573e+03 4.919e+03 0.930 0.352700
## Condition1PosA -1.313e+03 1.204e+04 -0.109 0.913180
## Condition1PosN -1.349e+04 8.558e+03 -1.576 0.115237
## Condition1RRAe -1.920e+04 1.166e+04 -1.647 0.099780 .
## Condition1RRAn 3.161e+03 7.894e+03 0.400 0.688856
## Condition1RRNe -1.199e+04 2.171e+04 -0.552 0.580848
## Condition1RRNn -2.010e+03 1.538e+04 -0.131 0.896057
## StreetPave 2.758e+04 1.452e+04 1.899 0.057794 .
## BldgType2fmCon 4.228e+03 1.504e+04 0.281 0.778738
## BldgTypeDuplex -8.633e+03 8.797e+03 -0.981 0.326619
## BldgTypeTwnhs -1.839e+04 1.215e+04 -1.514 0.130227
## BldgTypeTwnhsE -1.318e+04 1.093e+04 -1.205 0.228245
## HouseStyle1.5Unf 1.512e+04 9.181e+03 1.647 0.099903 .
## HouseStyle1Story 1.716e+04 5.157e+03 3.327 0.000903 ***
## HouseStyle2.5Fin -2.542e+04 1.252e+04 -2.030 0.042584 *
## HouseStyle2.5Unf -5.520e+03 1.028e+04 -0.537 0.591251
## HouseStyle2Story -8.261e+03 4.120e+03 -2.005 0.045136 *
## HouseStyleSFoyer 1.243e+04 7.850e+03 1.584 0.113460
## HouseStyleSLvl 7.320e+03 6.618e+03 1.106 0.268915
## OverallQual 8.975e+03 1.211e+03 7.413 2.26e-13 ***
## OverallCond 4.671e+03 1.023e+03 4.567 5.43e-06 ***
## YearBuilt 1.457e+02 8.310e+01 1.753 0.079892 .
## YearRemodAdd 4.454e+01 6.853e+01 0.650 0.515908
## RoofStyleGable 4.008e+03 1.057e+04 0.379 0.704592
## RoofStyleGambrel 1.078e+04 1.417e+04 0.761 0.446926
## RoofStyleHip 7.845e+03 1.074e+04 0.731 0.465204
## RoofStyleMansard 1.618e+04 1.600e+04 1.011 0.312346
## RoofStyleShed 8.839e+03 2.368e+04 0.373 0.709040
## Exterior2ndAsphShn -9.814e+03 2.380e+04 -0.412 0.680121
## Exterior2ndBrk Cmn -7.949e+03 1.842e+04 -0.431 0.666219
## Exterior2ndBrkFace 1.606e+04 9.929e+03 1.618 0.106002
## Exterior2ndCBlock -1.176e+04 3.314e+04 -0.355 0.722813
## Exterior2ndCmentBd 4.709e+02 9.132e+03 0.052 0.958886
## Exterior2ndHdBoard -3.003e+03 7.815e+03 -0.384 0.700867
## Exterior2ndImStucc 1.963e+04 1.233e+04 1.592 0.111633
## Exterior2ndMetalSd -2.496e+01 7.594e+03 -0.003 0.997378
## Exterior2ndOther -9.095e+03 3.090e+04 -0.294 0.768539
## Exterior2ndPlywood -2.116e+03 7.997e+03 -0.265 0.791332
## Exterior2ndStone -1.927e+04 2.289e+04 -0.842 0.400023
## Exterior2ndStucco -1.686e+04 9.753e+03 -1.729 0.084093 .
## Exterior2ndVinylSd 1.837e+03 7.695e+03 0.239 0.811392
## Exterior2ndWd Sdng -6.857e+02 7.603e+03 -0.090 0.928155
## Exterior2ndWd Shng -5.614e+03 8.849e+03 -0.634 0.525895
## MasVnrTypeBrkFace 8.857e+03 8.464e+03 1.046 0.295596
## MasVnrTypeNone 8.718e+03 8.338e+03 1.046 0.295967
## MasVnrTypeStone 1.242e+04 8.903e+03 1.395 0.163256
## ExterQualFa -7.445e+03 1.374e+04 -0.542 0.587907
## ExterQualGd -1.112e+04 5.806e+03 -1.916 0.055628 .
## ExterQualTA -1.396e+04 6.448e+03 -2.165 0.030546 *
## BsmtQualFa -2.391e+04 7.751e+03 -3.085 0.002078 **
## BsmtQualGd -2.517e+04 4.044e+03 -6.224 6.60e-10 ***
## BsmtQualTA -2.430e+04 4.947e+03 -4.913 1.02e-06 ***
## BsmtCondGd 3.466e+01 6.437e+03 0.005 0.995705
## BsmtCondPo 1.329e+04 3.412e+04 0.390 0.696971
## BsmtCondTA 6.426e+03 5.087e+03 1.263 0.206753
## BsmtExposureGd 1.952e+04 3.625e+03 5.385 8.63e-08 ***
## BsmtExposureMn -2.863e+03 3.699e+03 -0.774 0.439004
## BsmtExposureNo -7.077e+03 2.694e+03 -2.627 0.008729 **
## BsmtFinType1BLQ -7.593e+02 3.369e+03 -0.225 0.821721
## BsmtFinType1GLQ 2.217e+03 3.072e+03 0.722 0.470684
## BsmtFinType1LwQ -6.067e+03 4.469e+03 -1.358 0.174792
## BsmtFinType1Rec -2.432e+03 3.625e+03 -0.671 0.502332
## BsmtFinType1Unf -9.477e+03 3.503e+03 -2.705 0.006918 **
## BsmtFinSF1 -1.271e+00 3.565e+00 -0.356 0.721592
## BsmtFinType2BLQ -1.137e+04 8.978e+03 -1.267 0.205469
## BsmtFinType2GLQ -3.076e+03 1.112e+04 -0.277 0.782183
## BsmtFinType2LwQ -8.399e+03 8.746e+03 -0.960 0.337064
## BsmtFinType2Rec -6.727e+03 8.479e+03 -0.793 0.427704
## BsmtFinType2Unf -6.146e+03 7.602e+03 -0.808 0.418961
## TotalBsmtSF -6.353e-01 5.685e+00 -0.112 0.911039
## HeatingQCFa -4.060e+02 5.213e+03 -0.078 0.937937
## HeatingQCGd -3.087e+03 2.556e+03 -1.208 0.227300
## HeatingQCPo -7.300e+03 3.303e+04 -0.221 0.825093
## HeatingQCTA -3.077e+03 2.498e+03 -1.232 0.218211
## ElectricalFuseF 1.090e+01 7.643e+03 0.001 0.998862
## ElectricalFuseP 7.639e+03 2.285e+04 0.334 0.738238
## ElectricalMix -6.889e+03 4.743e+04 -0.145 0.884541
## ElectricalSBrkr -1.506e+03 3.631e+03 -0.415 0.678396
## X1stFlrSF -1.703e+01 8.330e+00 -2.044 0.041118 *
## GrLivArea 6.399e+01 6.190e+00 10.339 < 2e-16 ***
## BsmtFullBath 6.960e+03 2.365e+03 2.943 0.003309 **
## BsmtHalfBath 3.008e+03 3.669e+03 0.820 0.412479
## FullBath 9.400e+03 2.659e+03 3.535 0.000423 ***
## HalfBath 5.074e+03 2.540e+03 1.998 0.045948 *
## BedroomAbvGr -2.775e+03 1.676e+03 -1.656 0.098044 .
## KitchenAbvGr -1.292e+04 6.836e+03 -1.890 0.058980 .
## KitchenQualFa -3.008e+04 7.436e+03 -4.045 5.55e-05 ***
## KitchenQualGd -2.811e+04 4.213e+03 -6.672 3.76e-11 ***
## KitchenQualTA -2.695e+04 4.773e+03 -5.645 2.03e-08 ***
## TotRmsAbvGrd 2.653e+03 1.160e+03 2.286 0.022392 *
## FunctionalMaj2 -9.594e+03 1.760e+04 -0.545 0.585848
## FunctionalMin1 1.389e+02 1.078e+04 0.013 0.989721
## FunctionalMin2 3.133e+03 1.071e+04 0.292 0.770035
## FunctionalMod 8.591e+03 1.305e+04 0.658 0.510368
## FunctionalSev -3.481e+04 3.386e+04 -1.028 0.304040
## FunctionalTyp 1.352e+04 9.290e+03 1.455 0.145797
## GarageArea 2.321e+01 5.409e+00 4.290 1.92e-05 ***
## PavedDriveP -2.064e+03 6.683e+03 -0.309 0.757486
## PavedDriveY 2.992e+03 4.264e+03 0.702 0.483100
## WoodDeckSF 1.658e+01 7.018e+00 2.362 0.018312 *
## OpenPorchSF -7.524e+00 1.380e+01 -0.545 0.585688
## YrSold -4.207e+02 6.317e+02 -0.666 0.505588
## SaleTypeCon 2.352e+04 2.203e+04 1.068 0.285887
## SaleTypeConLD 1.938e+04 1.246e+04 1.556 0.119895
## SaleTypeConLI 1.386e+04 1.430e+04 0.969 0.332694
## SaleTypeConLw 2.019e+03 1.485e+04 0.136 0.891830
## SaleTypeCWD 1.157e+04 1.594e+04 0.726 0.468074
## SaleTypeNew 3.558e+04 1.886e+04 1.886 0.059520 .
## SaleTypeOth 1.567e+04 1.798e+04 0.871 0.383784
## SaleTypeWD 1.542e+03 5.169e+03 0.298 0.765576
## SaleConditionAdjLand 1.398e+04 1.921e+04 0.728 0.467035
## SaleConditionAlloca 2.442e+04 1.195e+04 2.043 0.041233 *
## SaleConditionFamily -7.976e+02 7.540e+03 -0.106 0.915771
## SaleConditionNormal 6.515e+03 3.528e+03 1.847 0.065016 .
## SaleConditionPartial -1.804e+04 1.818e+04 -0.992 0.321186
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 28670 on 1258 degrees of freedom
## Multiple R-squared: 0.8833, Adjusted R-squared: 0.8692
## F-statistic: 62.26 on 153 and 1258 DF, p-value: < 2.2e-16
The model produces a Multiple R-squared of 0.8833, which is quite good: 88.33% of the variance in SalePrice can be explained by the predictor variables in the current model. The F-statistic is 62.26 on 153 and 1258 degrees of freedom, and the p-value is below 2.2e-16, so the model as a whole is highly significant.
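A standard follow-up, sketched here as an addition (these plots are not part of the original output), is to inspect the residual diagnostics of the fitted model:
# Residual diagnostics for Model_1: residuals vs fitted, normal Q-Q,
# scale-location, and residuals vs leverage
par(mfrow = c(2, 2))
plot(Model_1)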
Let us test how good our model performs on the test data set.
# Load Testing set
hd_test = read.csv("https://raw.githubusercontent.com/igukusamuel/DATA-605-Final-Project/master/test.csv")
# Predict the SalePrice for the testing data set
PredictedDataModel = hd_test
PredictedDataModel$SalePrice = predict(Model_1, hd_test)
# Extract the Id and SalePrice columns to be loaded into the Kaggle submission data frame
Id = hd_test$Id
SalePrice = PredictedDataModel$SalePrice
# Combine the ID and the model predicted SalePrice values
DF = data.frame(cbind(Id, SalePrice))
# Replace all missing values with 0 in the final data frame
DF[is.na(DF)] = 0
#View the first 5 rows of the data frame
print(head(DF, 5))
## Id SalePrice
## 1 1461 112727.4
## 2 1462 162637.4
## 3 1463 170295.3
## 4 1464 188793.8
## 5 1465 213258.2
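To produce the actual submission file, the data frame can be written to CSV; a minimal sketch follows (the filename "kaggle_submission.csv" is an assumed example, not taken from the original project):
# Write the Id/SalePrice predictions to a CSV file for upload to Kaggle
write.csv(DF, "kaggle_submission.csv", row.names = FALSE)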
Report your Kaggle.com user name and score.
Kaggle Username: Samuel Kigamba
Kaggle Score: 2.50090 (I later used python machine learning and obtained a score of 0.31772)
See submission documents : https://github.com/igukusamuel/DATA-605-Final-Project