This machine learning project centers on improving the accuracy of online ride fare pricing. Its core mission is to build a robust predictive model that estimates ride fares precisely across a wide range of situations and conditions. By achieving more accurate fare predictions, the model aims to optimize pricing strategies, give users transparent and reliable fare information, and support better decision-making in ride-hailing services. Through a more refined understanding of the factors that drive ride pricing, the model aspires to deliver fair and accurate fare estimates, enhancing user satisfaction and bolstering trust in the platform’s pricing mechanisms.

The results of this project offer several valuable business advantages:

  • Enhanced User Trust: Accurate pricing builds trust, boosting user satisfaction and loyalty.
  • Competitive Edge: Reliable pricing sets you apart, making your service more attractive.
  • Optimized Revenue: Precise fares maximize revenue by aligning with demand.
  • Customer Retention: Satisfied users stay, reducing churn rates.
  • Effective Marketing: Tailored campaigns attract users valuing transparency.
  • Operational Efficiency: Accurate fares support effective fleet utilization.
  • Informed Decisions: Fare insights guide expansion and pricing strategies.
  • Reduced Complaints: Accurate pricing minimizes disputes and issues.
  • Regulatory Compliance: Accurate pricing ensures legal adherence.
  • Innovation Potential: Accurate models enable dynamic pricing and personalized offers.

Data Pre-processing

Import used libraries

library(tidyverse)
library(Ardian) # my personal package, providing the helper functions used below
library(inspectdf)
library(randomForest)
library(caret)
library(tensorflow)
library(keras)
library(class)
library(FNN)

Read the data

The dataset is taken from the Uber trip records of the NYC Taxi and Limousine Commission (TLC), which can be accessed at https://www.nyc.gov/site/tlc/about/tlc-trip-record-data.page

uber <- read.csv("uber.csv")

Inspect the data

Top 6 rows

uber %>% head()

Bottom 6 rows

uber %>% tail()

Check duplicated rows

uber %>% duplicated() %>% any()
## [1] FALSE

Alhamdulillah, there are no duplicated rows

Check missing values

uber %>% anyNA()
## [1] TRUE

There are missing values. Let’s inspect!

Inspect missing values

uber %>% is.na() %>% colSums()
##                 X               key       fare_amount   pickup_datetime 
##                 0                 0                 0                 0 
##  pickup_longitude   pickup_latitude dropoff_longitude  dropoff_latitude 
##                 0                 0                 1                 1 
##   passenger_count 
##                 0

That’s an insignificant number of missing values (just one each in dropoff_longitude and dropoff_latitude). Let’s just remove those rows!

Handle missing values

Remove rows that have missing values

uber <- uber %>% filter(complete.cases(.))

uber %>% is.na() %>% colSums()
##                 X               key       fare_amount   pickup_datetime 
##                 0                 0                 0                 0 
##  pickup_longitude   pickup_latitude dropoff_longitude  dropoff_latitude 
##                 0                 0                 0                 0 
##   passenger_count 
##                 0

Cool, we’re free from missing values now

Inspect data structure

uber %>% glimpse()
## Rows: 199,999
## Columns: 9
## $ X                 <int> 24238194, 27835199, 44984355, 25894730, 17610152, 44…
## $ key               <chr> "2015-05-07 19:52:06.0000003", "2009-07-17 20:04:56.…
## $ fare_amount       <dbl> 7.5, 7.7, 12.9, 5.3, 16.0, 4.9, 24.5, 2.5, 9.7, 12.5…
## $ pickup_datetime   <chr> "2015-05-07 19:52:06 UTC", "2009-07-17 20:04:56 UTC"…
## $ pickup_longitude  <dbl> -73.99982, -73.99435, -74.00504, -73.97612, -73.9250…
## $ pickup_latitude   <dbl> 40.73835, 40.72823, 40.74077, 40.79084, 40.74408, 40…
## $ dropoff_longitude <dbl> -73.99951, -73.99471, -73.96256, -73.96532, -73.9730…
## $ dropoff_latitude  <dbl> 40.72322, 40.75032, 40.77265, 40.80335, 40.76125, 40…
## $ passenger_count   <int> 1, 1, 1, 3, 5, 1, 5, 1, 1, 1, 1, 1, 5, 1, 1, 2, 1, 2…

We need to remove some useless columns and then do some feature engineering to derive more informative predictors from the existing columns

Remove unneeded columns

uber <- uber %>% 
  select(-X,
         -key)

uber %>% glimpse()
## Rows: 199,999
## Columns: 7
## $ fare_amount       <dbl> 7.5, 7.7, 12.9, 5.3, 16.0, 4.9, 24.5, 2.5, 9.7, 12.5…
## $ pickup_datetime   <chr> "2015-05-07 19:52:06 UTC", "2009-07-17 20:04:56 UTC"…
## $ pickup_longitude  <dbl> -73.99982, -73.99435, -74.00504, -73.97612, -73.9250…
## $ pickup_latitude   <dbl> 40.73835, 40.72823, 40.74077, 40.79084, 40.74408, 40…
## $ dropoff_longitude <dbl> -73.99951, -73.99471, -73.96256, -73.96532, -73.9730…
## $ dropoff_latitude  <dbl> 40.72322, 40.75032, 40.77265, 40.80335, 40.76125, 40…
## $ passenger_count   <int> 1, 1, 1, 3, 5, 1, 5, 1, 1, 1, 1, 1, 5, 1, 1, 2, 1, 2…

Feature Engineering & Selection

Extract distance variable

uber <- uber %>%
  # getHaversine() is from my personal package; a sketch is shown after the output below
  mutate(distance = getHaversine(pickup_latitude, pickup_longitude, dropoff_latitude, dropoff_longitude)) %>% 
  select(-pickup_latitude, 
         -pickup_longitude, 
         -dropoff_latitude, 
         -dropoff_longitude)

uber %>% glimpse()
## Rows: 199,999
## Columns: 4
## $ fare_amount     <dbl> 7.5, 7.7, 12.9, 5.3, 16.0, 4.9, 24.5, 2.5, 9.7, 12.5, …
## $ pickup_datetime <chr> "2015-05-07 19:52:06 UTC", "2009-07-17 20:04:56 UTC", …
## $ passenger_count <int> 1, 1, 1, 3, 5, 1, 5, 1, 1, 1, 1, 1, 5, 1, 1, 2, 1, 2, …
## $ distance        <dbl> 1.6833228, 2.4575899, 5.0363772, 1.6616835, 4.4754500,…
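
Since getHaversine() lives in my personal package, here is a minimal sketch of what it might look like, assuming the standard Haversine formula with distances returned in kilometers:

# A hypothetical stand-in for getHaversine() (the real one is in my personal
# package): vectorized great-circle distance in kilometers
getHaversine <- function(lat1, lon1, lat2, lon2, radius_km = 6371) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 +
    cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  2 * radius_km * asin(pmin(1, sqrt(a))) # pmin() guards against rounding errors above 1
}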

Extract hour and is_weekend variables

uber <- uber %>% 
  mutate(pickup_datetime = ymd_hms(pickup_datetime),
         hour = as.factor(hour(pickup_datetime)),
         # note: with lubridate's default week start (Sunday = 1), wday() > 5
         # flags Friday (6) and Saturday (7); use wday(.) %in% c(1, 7) if the
         # intent is Saturday and Sunday
         is_weekend = as.factor(ifelse(wday(pickup_datetime) > 5, 1, 0))) %>%
  select(-pickup_datetime)

uber %>% glimpse()
## Rows: 199,999
## Columns: 5
## $ fare_amount     <dbl> 7.5, 7.7, 12.9, 5.3, 16.0, 4.9, 24.5, 2.5, 9.7, 12.5, …
## $ passenger_count <int> 1, 1, 1, 3, 5, 1, 5, 1, 1, 1, 1, 1, 5, 1, 1, 2, 1, 2, …
## $ distance        <dbl> 1.6833228, 2.4575899, 5.0363772, 1.6616835, 4.4754500,…
## $ hour            <fct> 19, 20, 21, 8, 17, 2, 7, 13, 9, 19, 17, 22, 14, 11, 22…
## $ is_weekend      <fct> 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, …

Exploratory Data Analysis

Inspect target variable summary

By looking at the data summary, we can detect weird records, or as people may call them, anomalies!

uber$fare_amount %>% summary()
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  -52.00    6.00    8.50   11.36   12.50  499.00

There are 2 anomalies:

  1. fare_amount < 0. Did the passenger steal the driver’s money?
  2. fare_amount == 499. Did the driver steal the passenger’s money?

Handle target variable anomalies

uber <- uber %>%
  filter(fare_amount >= 1) %>% 
  # killOutliers() is from my personal package; a sketch is shown below
  killOutliers(fare_amount)

uber$fare_amount %>% summary()
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   2.500   5.700   8.000   8.941  11.000  22.200

Cool, looking beautiful.
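
killOutliers() also comes from my personal package. A minimal sketch, assuming it implements a standard 1.5 × IQR filter, which is consistent with the fare maximum dropping from 499 to 22.2 (≈ Q3 + 1.5 × IQR of the pre-filter distribution):

# A hypothetical stand-in for killOutliers() (the real one is in my personal
# package): drops rows whose column value falls outside the 1.5 * IQR fences
killOutliers <- function(data, column) {
  x <- dplyr::pull(data, {{ column }})
  q <- quantile(x, c(0.25, 0.75))
  fence <- 1.5 * (q[2] - q[1])
  dplyr::filter(data,
                {{ column }} >= q[1] - fence,
                {{ column }} <= q[2] + fence)
}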

Inspect features summary

uber %>% select(-fare_amount) %>% summary()
##  passenger_count      distance              hour        is_weekend
##  Min.   :  0.000   Min.   :    0.000   19     : 11843   0:126507  
##  1st Qu.:  1.000   1st Qu.:    1.163   18     : 11189   1: 56312  
##  Median :  1.000   Median :    1.961   20     : 10933             
##  Mean   :  1.681   Mean   :   18.142   21     : 10605             
##  3rd Qu.:  2.000   3rd Qu.:    3.283   22     : 10145             
##  Max.   :208.000   Max.   :16409.239   13     :  9127             
##                                        (Other):118977

There are 4 anomalies:

  1. passenger_count == 0. Did the driver order a ride for himself?
  2. distance == 0. Who the hell orders a ride to stay in the same place?
  3. passenger_count == 208. Nuh uh.
  4. distance == 16409.23911. Did they travel around the earth?

Handle features anomalies

uber <- uber %>%
  filter(passenger_count >= 1,
         distance >= 0.1) %>% 
  killOutliers(passenger_count) %>% 
  killOutliers(distance)

uber %>% select(-fare_amount) %>% summary()
##  passenger_count    distance           hour       is_weekend
##  Min.   :1.000   Min.   :0.1002   19     : 9730   0:103439  
##  1st Qu.:1.000   1st Qu.:1.2008   18     : 9253   1: 45663  
##  Median :1.000   Median :1.9419   20     : 8877             
##  Mean   :1.266   Mean   :2.3046   21     : 8532             
##  3rd Qu.:1.000   3rd Qu.:3.0925   22     : 8137             
##  Max.   :3.000   Max.   :6.5157   13     : 7567             
##                                   (Other):97006

Cool, looking beautiful now. Let’s visualize the distributions to witness the beauty of my data cleansing!

Visualize numerical columns distributions

plotNumericalDistribution(uber) # This function is from my personal package
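
A minimal sketch of what plotNumericalDistribution() might do (the real one is in my personal package), assuming faceted histograms of every numeric column:

# A hypothetical stand-in: histogram of each numeric column, faceted with free scales
plotNumericalDistribution <- function(data) {
  data %>%
    select(where(is.numeric)) %>%
    pivot_longer(everything()) %>%
    ggplot(aes(x = value)) +
    geom_histogram(bins = 30) +
    facet_wrap(~name, scales = "free")
}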

Nice!

Visualize categorical columns distributions

uber %>% inspect_cat() %>% show_plot()

Nice!

Train-Test Split

Set training indices

set.seed(1)

indices <- sample(nrow(uber), nrow(uber) * 0.8)

Split train & test

train_data <- uber[indices, ]
test_data <- uber[-indices, ]

X_train <- train_data %>% select(-fare_amount)
X_test <- test_data %>% select(-fare_amount)

y_train <- train_data$fare_amount
y_test <- test_data$fare_amount

Model Fitting

Linear Regression

model_lr <- lm(fare_amount ~ ., train_data)

model_lr %>% summary()
## 
## Call:
## lm(formula = fare_amount ~ ., data = train_data)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -15.0657  -1.3139  -0.3719   0.8955  18.8539 
## 
## Coefficients:
##                  Estimate Std. Error t value Pr(>|t|)    
## (Intercept)      3.153981   0.038298  82.353  < 2e-16 ***
## passenger_count  0.061779   0.011690   5.285 1.26e-07 ***
## distance         2.116939   0.004463 474.303  < 2e-16 ***
## hour1           -0.127819   0.050046  -2.554   0.0107 *  
## hour2           -0.242355   0.054503  -4.447 8.73e-06 ***
## hour3           -0.294481   0.061621  -4.779 1.76e-06 ***
## hour4           -0.281940   0.072295  -3.900 9.63e-05 ***
## hour5           -0.658629   0.078714  -8.367  < 2e-16 ***
## hour6           -0.693105   0.055864 -12.407  < 2e-16 ***
## hour7           -0.057806   0.046320  -1.248   0.2120    
## hour8            0.618164   0.043949  14.065  < 2e-16 ***
## hour9            0.839517   0.043529  19.286  < 2e-16 ***
## hour10           0.726293   0.044096  16.471  < 2e-16 ***
## hour11           0.858327   0.043636  19.670  < 2e-16 ***
## hour12           0.959312   0.043051  22.283  < 2e-16 ***
## hour13           0.855423   0.043090  19.852  < 2e-16 ***
## hour14           0.858773   0.043438  19.770  < 2e-16 ***
## hour15           0.852011   0.043665  19.512  < 2e-16 ***
## hour16           0.548972   0.045063  12.182  < 2e-16 ***
## hour17           0.706747   0.043287  16.327  < 2e-16 ***
## hour18           0.696278   0.041367  16.832  < 2e-16 ***
## hour19           0.498594   0.040991  12.164  < 2e-16 ***
## hour20           0.180626   0.041658   4.336 1.45e-05 ***
## hour21           0.064839   0.041969   1.545   0.1224    
## hour22           0.078493   0.042351   1.853   0.0638 .  
## hour23           0.048521   0.043671   1.111   0.2666    
## is_weekend1      0.077500   0.013763   5.631 1.80e-08 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 2.181 on 119254 degrees of freedom
## Multiple R-squared:  0.6546, Adjusted R-squared:  0.6546 
## F-statistic:  8694 on 26 and 119254 DF,  p-value: < 2.2e-16

Linear Regression Model Evaluation

pred_lr <- predict(model_lr, X_test)

getRegressionErrors(pred_lr, y_test)
##  Target Min  Max   MAE   MAPE   MSE  RMSE
##         2.5 22.1 1.579 19.217 4.881 2.209

It performs decently: the MAE of 1.579 means the predictions miss by about $1.58 on average, and the MAPE of 19.2% means the predictions deviate from the actual fare by about 19% on average
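
getRegressionErrors() is another helper from my personal package. A minimal sketch, under the assumption that it computes the standard error metrics plus the target’s observed range for context:

# A hypothetical stand-in for getRegressionErrors() (the real one is in my
# personal package): standard regression error metrics, with the target's
# observed range included for scale
getRegressionErrors <- function(pred, actual) {
  err <- actual - pred
  data.frame("Target Min" = min(actual),
             Max  = max(actual),
             MAE  = mean(abs(err)),
             MAPE = mean(abs(err / actual)) * 100,
             MSE  = mean(err^2),
             RMSE = sqrt(mean(err^2)),
             check.names = FALSE)
}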

Deep Learning

One Hot Encode

factor_columns <- c("hour", "is_weekend")
encoded_data <- oneHotEncode(uber, factor_columns) %>%
  as.matrix()

X_train <- encoded_data[indices, -which(colnames(encoded_data) == 'fare_amount')]
y_train <- encoded_data[indices, "fare_amount"]

X_test <- encoded_data[-indices, -which(colnames(encoded_data) == 'fare_amount')]
y_test <- encoded_data[-indices, "fare_amount"]
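
oneHotEncode() is also from my personal package. A minimal sketch, assuming it expands each listed factor column into 0/1 dummy columns (caret's dummyVars() is one way to do this):

# A hypothetical stand-in for oneHotEncode() (the real one is in my personal
# package): expands the given factor columns into dummy columns and keeps the rest
oneHotEncode <- function(data, columns) {
  dummies <- caret::dummyVars(~ ., data = data[, columns, drop = FALSE])
  encoded <- predict(dummies, newdata = data)
  cbind(data[, setdiff(names(data), columns), drop = FALSE], encoded)
}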

Feed Forward Neural Network Architecture

set_random_seed(1)

model_fnn <- keras_model_sequential() %>% 
  layer_dense(units = 16,
              activation = "relu",
              input_shape = ncol(X_train),
              name = "hidden_1") %>%
  layer_dense(units = 8,
              activation = "relu",
              name = "hidden_2") %>%
  layer_dense(units = 4,
              activation = "relu",
              name = "hidden_3") %>%
  layer_dense(units = 1,
              activation = "linear",
              name = "output")

model_fnn %>% compile(
  loss = "mean_squared_error",
  optimizer = optimizer_adam(learning_rate = 0.001),
  metrics = "mean_absolute_error")

history <- model_fnn %>% fit(
  X_train, y_train,
  epochs = 25,
  batch_size = 128,
  validation_split = 0.2,
  verbose = 1)
## Epoch 1/25
## 746/746 - 1s - loss: 15.0596 - mean_absolute_error: 2.5773 - val_loss: 4.7507 - val_mean_absolute_error: 1.5337
## Epoch 2/25
## 746/746 - 0s - loss: 4.7445 - mean_absolute_error: 1.5568 - val_loss: 4.7532 - val_mean_absolute_error: 1.5788
## Epoch 3/25
## 746/746 - 0s - loss: 4.7321 - mean_absolute_error: 1.5573 - val_loss: 4.7179 - val_mean_absolute_error: 1.5368
## Epoch 4/25
## 746/746 - 0s - loss: 4.7283 - mean_absolute_error: 1.5566 - val_loss: 4.7309 - val_mean_absolute_error: 1.5596
## Epoch 5/25
## 746/746 - 0s - loss: 4.7258 - mean_absolute_error: 1.5568 - val_loss: 4.7121 - val_mean_absolute_error: 1.5481
## Epoch 6/25
## 746/746 - 0s - loss: 4.7180 - mean_absolute_error: 1.5550 - val_loss: 4.7101 - val_mean_absolute_error: 1.5323
## Epoch 7/25
## 746/746 - 1s - loss: 4.7163 - mean_absolute_error: 1.5547 - val_loss: 4.7129 - val_mean_absolute_error: 1.5564
## Epoch 8/25
## 746/746 - 1s - loss: 4.7116 - mean_absolute_error: 1.5545 - val_loss: 4.7266 - val_mean_absolute_error: 1.5223
## Epoch 9/25
## 746/746 - 0s - loss: 4.7085 - mean_absolute_error: 1.5543 - val_loss: 4.6958 - val_mean_absolute_error: 1.5481
## Epoch 10/25
## 746/746 - 0s - loss: 4.7061 - mean_absolute_error: 1.5541 - val_loss: 4.7043 - val_mean_absolute_error: 1.5278
## Epoch 11/25
## 746/746 - 0s - loss: 4.7047 - mean_absolute_error: 1.5539 - val_loss: 4.7002 - val_mean_absolute_error: 1.5370
## Epoch 12/25
## 746/746 - 0s - loss: 4.6980 - mean_absolute_error: 1.5533 - val_loss: 4.6939 - val_mean_absolute_error: 1.5428
## Epoch 13/25
## 746/746 - 0s - loss: 4.6954 - mean_absolute_error: 1.5519 - val_loss: 4.7274 - val_mean_absolute_error: 1.5780
## Epoch 14/25
## 746/746 - 0s - loss: 4.6918 - mean_absolute_error: 1.5515 - val_loss: 4.7177 - val_mean_absolute_error: 1.5230
## Epoch 15/25
## 746/746 - 0s - loss: 4.6907 - mean_absolute_error: 1.5518 - val_loss: 4.7037 - val_mean_absolute_error: 1.5609
## Epoch 16/25
## 746/746 - 0s - loss: 4.6911 - mean_absolute_error: 1.5522 - val_loss: 4.7037 - val_mean_absolute_error: 1.5630
## Epoch 17/25
## 746/746 - 0s - loss: 4.6882 - mean_absolute_error: 1.5511 - val_loss: 4.7203 - val_mean_absolute_error: 1.5702
## Epoch 18/25
## 746/746 - 0s - loss: 4.6846 - mean_absolute_error: 1.5500 - val_loss: 4.7068 - val_mean_absolute_error: 1.5675
## Epoch 19/25
## 746/746 - 0s - loss: 4.6883 - mean_absolute_error: 1.5520 - val_loss: 4.6908 - val_mean_absolute_error: 1.5334
## Epoch 20/25
## 746/746 - 0s - loss: 4.6847 - mean_absolute_error: 1.5505 - val_loss: 4.6901 - val_mean_absolute_error: 1.5450
## Epoch 21/25
## 746/746 - 0s - loss: 4.6839 - mean_absolute_error: 1.5507 - val_loss: 4.6881 - val_mean_absolute_error: 1.5388
## Epoch 22/25
## 746/746 - 0s - loss: 4.6864 - mean_absolute_error: 1.5510 - val_loss: 4.6949 - val_mean_absolute_error: 1.5559
## Epoch 23/25
## 746/746 - 0s - loss: 4.6818 - mean_absolute_error: 1.5503 - val_loss: 4.6883 - val_mean_absolute_error: 1.5409
## Epoch 24/25
## 746/746 - 0s - loss: 4.6853 - mean_absolute_error: 1.5503 - val_loss: 4.6875 - val_mean_absolute_error: 1.5341
## Epoch 25/25
## 746/746 - 0s - loss: 4.6854 - mean_absolute_error: 1.5506 - val_loss: 4.6918 - val_mean_absolute_error: 1.5262
model_fnn
## Model: "sequential"
## ________________________________________________________________________________
##  Layer (type)                       Output Shape                    Param #     
## ================================================================================
##  hidden_1 (Dense)                   (None, 16)                      448         
##  hidden_2 (Dense)                   (None, 8)                       136         
##  hidden_3 (Dense)                   (None, 4)                       36          
##  output (Dense)                     (None, 1)                       5           
## ================================================================================
## Total params: 625
## Trainable params: 625
## Non-trainable params: 0
## ________________________________________________________________________________
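
The parameter counts line up with the architecture: hidden_1’s 448 parameters correspond to (27 input columns + 1 bias) × 16 units, hidden_2 has (16 + 1) × 8 = 136, hidden_3 has (8 + 1) × 4 = 36, and the output layer has (4 + 1) × 1 = 5, for 625 trainable parameters in total.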

Neural Network Model Evaluation

pred_fnn <- predict(model_fnn, X_test)
## 932/932 - 0s - 243ms/epoch - 261us/step
getRegressionErrors(pred_fnn, y_test)
##  Target Min  Max   MAE   MAPE   MSE  RMSE
##         2.5 22.1 1.553 18.565 4.834 2.199

It’s a little bit better than the linear regression model. Let’s try K-Nearest Neighbor Regression!

Further Model Fitting

K-Nearest Neighbor

# note: scale() is applied to the train and test sets separately here;
# strictly, the test set should be scaled with the training set's centers
# and standard deviations to keep the two on the same footing
pred_knn <- knn.reg(train = X_train %>% scale(),
                    test = X_test %>% scale(),
                    y = y_train,
                    k = round(sqrt(nrow(X_train))))$pred # common sqrt(n) heuristic for k

getRegressionErrors(pred_knn, y_test)
##  Target Min  Max   MAE   MAPE   MSE  RMSE
##         2.5 22.1 1.658 20.305 5.386 2.321

It’s not better than the other two models, but not bad!

Model Selection

When choosing among our three trained models for fare pricing, we decide to go with the Linear Regression model, even though the FNN model shows slightly better metrics (MAE, RMSE, and MAPE). The marginal performance gain of the FNN is not worth sacrificing the practicality and interpretability that Linear Regression offers. Linear Regression provides a straightforward way to understand the impact of each feature on fare predictions through its coefficients, making it highly explainable to both technical and non-technical audiences. Additionally, leveraging our domain knowledge of Uber fare pricing, we can identify meaningful linear relationships between input features and fares that Linear Regression can effectively capture. Given our dataset’s size and the desire to balance performance gains with resource efficiency, the practicality of the Linear Regression model takes precedence over the minor performance boost achieved by the FNN model.
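
To make the interpretability argument concrete, the coefficients read directly in fare dollars. For instance, using the distance coefficient from the summary below (and assuming distance is in kilometers, per the Haversine computation):

# each additional kilometer adds about $2.12 to the predicted fare,
# holding the other features constant
coef(model_lr)["distance"]
## distance 
## 2.116939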

So the winning model is:

model_lr %>% summary()
## 
## Call:
## lm(formula = fare_amount ~ ., data = train_data)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -15.0657  -1.3139  -0.3719   0.8955  18.8539 
## 
## Coefficients:
##                  Estimate Std. Error t value Pr(>|t|)    
## (Intercept)      3.153981   0.038298  82.353  < 2e-16 ***
## passenger_count  0.061779   0.011690   5.285 1.26e-07 ***
## distance         2.116939   0.004463 474.303  < 2e-16 ***
## hour1           -0.127819   0.050046  -2.554   0.0107 *  
## hour2           -0.242355   0.054503  -4.447 8.73e-06 ***
## hour3           -0.294481   0.061621  -4.779 1.76e-06 ***
## hour4           -0.281940   0.072295  -3.900 9.63e-05 ***
## hour5           -0.658629   0.078714  -8.367  < 2e-16 ***
## hour6           -0.693105   0.055864 -12.407  < 2e-16 ***
## hour7           -0.057806   0.046320  -1.248   0.2120    
## hour8            0.618164   0.043949  14.065  < 2e-16 ***
## hour9            0.839517   0.043529  19.286  < 2e-16 ***
## hour10           0.726293   0.044096  16.471  < 2e-16 ***
## hour11           0.858327   0.043636  19.670  < 2e-16 ***
## hour12           0.959312   0.043051  22.283  < 2e-16 ***
## hour13           0.855423   0.043090  19.852  < 2e-16 ***
## hour14           0.858773   0.043438  19.770  < 2e-16 ***
## hour15           0.852011   0.043665  19.512  < 2e-16 ***
## hour16           0.548972   0.045063  12.182  < 2e-16 ***
## hour17           0.706747   0.043287  16.327  < 2e-16 ***
## hour18           0.696278   0.041367  16.832  < 2e-16 ***
## hour19           0.498594   0.040991  12.164  < 2e-16 ***
## hour20           0.180626   0.041658   4.336 1.45e-05 ***
## hour21           0.064839   0.041969   1.545   0.1224    
## hour22           0.078493   0.042351   1.853   0.0638 .  
## hour23           0.048521   0.043671   1.111   0.2666    
## is_weekend1      0.077500   0.013763   5.631 1.80e-08 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 2.181 on 119254 degrees of freedom
## Multiple R-squared:  0.6546, Adjusted R-squared:  0.6546 
## F-statistic:  8694 on 26 and 119254 DF,  p-value: < 2.2e-16

Conclusion

In conclusion, our final choice of the Linear Regression model for Uber fare pricing was driven by a combination of practicality, interpretability, and domain knowledge. Despite the FNN model displaying slightly better metrics such as MAE, RMSE, and MAPE, we recognized that these marginal improvements were not worth sacrificing the clear interpretability that the Linear Regression model provides. Its straightforward coefficient-based insights allow us to easily explain the impact of each input feature on fare predictions to various audiences. As we consider the potential for further enhancement, introducing additional variables like weather conditions could offer a promising avenue. By incorporating factors such as weather, traffic, or demand patterns, we could enhance the model’s predictive accuracy and its ability to adapt to real-world variations. The current model, with its respectable metrics, serves as a strong foundation, and expanding our feature set could take our fare predictions to even greater heights of accuracy and practicality.

Thank you