Execution time for Berlin dataset.

We used the dataset built with SUMO on the Berlin road network; it contains around 80K trajectories. For the speedup experiments, a single timeslot with 80,000 points was used. Four nodes were available, with 7 cores each (28 cores in total).
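To make the speedup measurement explicit, a minimal sketch follows; it assumes a hypothetical run_timeslot(points, nodes) driver that submits the 80,000-point timeslot to the cluster and blocks until it finishes, and it reports speedup as the single-node time divided by the time on n nodes. The driver name and timing scheme are assumptions, not part of the original setup.

    import time

    def measure_speedup(run_timeslot, points, node_counts=(1, 2, 3, 4)):
        # Run the same timeslot on an increasing number of nodes and
        # report speedup relative to the single-node execution time.
        # run_timeslot(points, nodes) is a placeholder for the actual job.
        times = {}
        for nodes in node_counts:
            start = time.perf_counter()
            run_timeslot(points, nodes)
            times[nodes] = time.perf_counter() - start
        base = times[node_counts[0]]              # single-node reference time
        return {n: base / t for n, t in times.items()}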

Execution time on a Single Node.

Scaleup for Berlin dataset.

The same dataset was used for the scaleup analysis. The 80K points were divided into four samples: the first contained 1/4 of the data, the second 1/2, the third 3/4, and the last the full dataset. The smallest sample ran on a single node (7 cores), the second sample on 2 nodes (14 cores), the third on 3 nodes (21 cores), and, finally, the full dataset ran on all four available nodes (28 cores).
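The pairing of sample size and node count, and the usual scaleup metric, can be summarized with the short sketch below; build_scaleup_runs and scaleup are illustrative names, and the actual job submission is left abstract.

    def build_scaleup_runs(points, max_nodes=4):
        # Pair each fraction of the data (1/4, 1/2, 3/4, full) with a
        # proportional number of nodes, as described above.
        runs = []
        for nodes in range(1, max_nodes + 1):
            sample_size = len(points) * nodes // max_nodes
            runs.append((points[:sample_size], nodes))   # (sample, node count)
        return runs

    def scaleup(times):
        # Ideal scaleup keeps execution time constant, so the metric is the
        # time of the smallest run divided by the time of each larger run.
        base = times[1]                            # 1 node, 1/4 of the data
        return {n: base / t for n, t in times.items()}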