Code and data are available on GitHub here
We use the RStudio IDE. Create a .R file (ours is called webinar1_2020_07_29.R) and write the following lines of code. Press Ctrl-Return to run the line the cursor is on, or to run all highlighted lines; alternatively, use the Run button in the top-right part of the editor.
To add functionality to your .R script you must load the specific libraries. REMEMBER: make sure they are installed on your system. If not, you can install them through RStudio (menu => Tools => Install Packages) or directly from the R console with the command install.packages("NAME OF PACKAGE")
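For example, a minimal sketch (not part of the original script) that installs whichever of the three packages used below are still missing:
# install only the required packages that are not yet present on the system
pkgs <- c("sf", "mapview", "raster")
missing <- pkgs[!pkgs %in% rownames(installed.packages())]
if (length(missing) > 0) install.packages(missing)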
Below we load three libraries that provide the functionality we need
library(sf)
library(mapview)
library(raster)
My custom Coordinate Reference System (CRS): a Lambert Conformal Conic (LCC) projection, NOT secant but tangent at lat=45.827 and lon=11.625 - for more info:
myproj <- "+proj=lcc +lat_1=45.827 +lat_2=45.827
+lat_0=45.827 +lon_0=11.625
+x_0=4000000
+y_0=2800000 +ellps=GRS80
+towgs84=0,0,0,0,0,0,0 +units=m +no_defs"
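As an optional check (not in the original workflow), you can inspect how sf/PROJ parses this definition:
# show how sf interprets the custom projection string
st_crs(myproj)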
My points (in lat/long) lie on a regular grid/lattice; the points are 666.67 m apart. You can find the data points at the GitHub link at the top of this page.
grid.points <- st_read( "data/webinar1_2020_07_29/points.shp" )
## Reading layer `points' from data source `/Users/ivanlizarazo/GEE/fpirotti/webinar_series_R_GEE-master/data/webinar1_2020_07_29/points.shp' using driver `ESRI Shapefile'
## Simple feature collection with 1315 features and 1 field
## geometry type: POINT
## dimension: XY
## bbox: xmin: 11.83267 ymin: 46.33037 xmax: 12.14322 ymax: 46.55611
## CRS: 4326
Convert my points from a geographic CRS (latitude and longitude in degrees) to my custom projected CRS
grid.points.myproj <- grid.points %>% st_transform(myproj)
Run this line to view the points
mapview( grid.points.myproj, legend = F )
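As a quick optional check (a sketch, not in the original workflow) that the points really are about 666.67 m apart, measure the distance from the first point to its nearest neighbour in the projected CRS:
# distance from the first point to its closest neighbour; should be ~666.67 m
d <- st_distance(grid.points.myproj[1, ], grid.points.myproj[-1, ])
min(d)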
Points to square polygon areas. How? It is conceptually simple: the fastest way was to create a rhomboid around each point using a buffer whose "radius" is half the distance between points (333.33 m). The minimum bounding box of a buffer with that radius is a square whose sides are 2x the buffer radius.
NB: nQuadSegs is 1, which means one chord (segment) per quadrant, i.e. per 1/4 of a circumference; this gives a rhomboid, which is a "very, very rough circle". It saves a lot of memory compared to drawing with the default value of nQuadSegs=30, which makes a difference when processing a large number of points.
We buffer each feature (point) and view results
nodes.buffered<-st_buffer(grid.points.myproj, 333.33, nQuadSegs = 1)
mapview( nodes.buffered, legend = F )
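To make the memory argument concrete, here is a small optional sketch (not in the original script) comparing the vertex count of an nQuadSegs = 1 buffer with an nQuadSegs = 30 buffer, and checking that the buffer's bounding box spans about 2 x 333.33 m:
# buffer a single point with the smoother (and heavier) nQuadSegs = 30
b30 <- st_buffer(grid.points.myproj[1, ], 333.33, nQuadSegs = 30)
# vertex counts: a handful for the rhomboid vs ~120 for the near-circle
nrow(st_coordinates(nodes.buffered[1, ]))
nrow(st_coordinates(b30))
# width of the rhomboid's bounding box: ~666.66 m, i.e. twice the buffer radius
bb <- st_bbox(nodes.buffered[1, ])
bb["xmax"] - bb["xmin"]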
Then we apply a function to each "rhomboid" to create the square. It is a bit more complex, as the function includes another function inside.
st_bbox_by_feature = function(geom) {
## make sure that object "geom" becomes a geometry object;
## geom is your data set with all the single geometries/polygons
geom2 = st_geometry(geom)
#' Function to:
#' (a) take a single geometry,
#' (b) create a bounding box with st_bbox and
#' (c) convert the bbox object to a new geometry (a square)
f <- function(single.geom) {
st_as_sfc( st_bbox(single.geom), crs = myproj )
}
#' This line calls the function above LOOPING over each single geometry (lapply)
#' in the geom2 object
do.call("c", lapply(geom2, f))
}
This line calls the above function over our data (rhomboids):
tiles <- st_bbox_by_feature( nodes.buffered )
Assign the CRS to the new dataset, as it got lost. To check the CRS of our data we can use st_crs(tiles).
tiles <- tiles %>% st_set_crs(myproj)
We view all the results over the same map (a small subset of the data: the first five features).
PS: you might notice a misalignment of the geometries; this is because the web map uses the Web Mercator projection, which introduces small deformations when representing data that is stored in another CRS.
For more reading see Wikipedia.
mapview( grid.points.myproj[1:5,], legend = F ) + mapview( nodes.buffered[1:5,], legend = F ) + mapview( tiles[1:5,], legend = F )
Convert to latitude and longitude and save the result to a shapefile for later upload to Google Earth Engine
tiles.latlng <- tiles %>% st_transform("+init=epsg:4326")
st_write(tiles.latlng, "data/tiles.shp" )
Import the shapefile files "tiles.XXX" into GEE: in the left panel open "Assets", click "NEW" and choose Shapefile from the drop-down menu.
DO THINGS in GEE
Code shared: [click HERE to open your GEE code editor], but it is also reported below
// IMPORTS
// these are in the "import" top part of the script editor in GEE,
// but can also be added to the script itself.
var srtm = ee.Image("CGIAR/SRTM90_V4");
var srtm_viz = {"opacity":1,"bands":["elevation"],
"min":1439.9060309998727,"max":2892.700269813135,
"palette":["005aff","ff0000"]};
var tiles = ee.FeatureCollection("users/2020_Kanan/summer_webinar_series/Webinar1_tiles");
///
////////////////////////////////////////////////////////////////////////////////////////
///////////////// PART OF SUMMER WEBINAR SERIES 2020 ///////////////////////////////////
////////////////////////////////////////////////////////////////////////////////////////
///// Francesco Pirotti - CIRGEO / TESAF Department University of Padova //////////////
////////////////////////////////////////////////////////////////////////////////////////
///// https://www.cirgeo.unipd.it/shared/R/webinars/webinar1_2020_07_29.html /////////
////////////////////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////////////////////
// "run" this script and then go to "tasks" button in the right panel to
// run also the export of the data.
// The last function in this script, "Export...",
// will create a "task" that you have to launch manually
// to put the final data to your google drive
// this prints info on the "srtm" dataset that we imported
print(srtm)
// PS double-click the link "SRTM Digital Elevation Data Version 4" above in the
// "Imports" to see what data we are reading and aggregating over the tiles
///////////////////////////////////////////////////////////////////////////////////////
// this prints info on the "tiles" dataset that we imported from the
// shapefile created with R
print(tiles);
///////////////////////////////////////////////////////////////////////////////////////
// the call below applies the "reduceRegions" function to the "srtm" dataset (image)
// over each feature (square polygon) in our "tiles" dataset ...
// NB the "reducer" computes 5 percentiles (10, 25, 50, 75, 90), giving a good
// representation of the distribution of height values inside each square.
// We could also use the mean and standard deviation as reducers.
var height = srtm.reduceRegions({
reducer: ee.Reducer.percentile([10,25,50,75,90]),
//reducer: ee.Reducer.mean(),
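// (optional sketch, not in the original script) the mean and standard deviation
// mentioned above could be computed in a single pass by combining reducers:
// reducer: ee.Reducer.mean().combine({reducer2: ee.Reducer.stdDev(), sharedInputs: true}),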
collection: tiles
});
// this prints result
print(height);
// here we zoom to our tiles and add the layers
Map.centerObject(tiles, 12)
Map.addLayer(srtm, srtm_viz, "SRTM");
Map.addLayer(tiles, {}, "TILES");
// this exports the data to our Google drive
Export.table.toDrive({
collection: height,
description:'tiles_withData',
fileFormat: 'SHP'
});
EXPORT THE RESULTS TO A SHAPEFILE CALLED tiles_withData.shp
Read the file exported from GEE
grid.points.gee <- st_read( "data/webinar1_2020_07_29/tiles_withData.shp" )
## Reading layer `tiles_withData' from data source `/Users/ivanlizarazo/GEE/fpirotti/webinar_series_R_GEE-master/data/webinar1_2020_07_29/tiles_withData.shp' using driver `ESRI Shapefile'
## Simple feature collection with 1315 features and 6 fields
## geometry type: POLYGON
## dimension: XY
## bbox: xmin: 11.82833 ymin: 46.32735 xmax: 12.14759 ymax: 46.55912
## CRS: 4326
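To see which attribute columns GEE added (an optional check), list the field names:
names(grid.points.gee)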
Create a new attribute column with the interquartile range (the difference between p75 and p25, i.e. the 75th and 25th percentiles)
grid.points.gee$iqr <- grid.points.gee$p75 - grid.points.gee$p25
pal <- mapviewPalette("mapviewSpectralColors")
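An optional quick look at the distribution of the new column:
summary(grid.points.gee$iqr)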
mapview( grid.points.gee, col.regions = pal(100), zcol = "p50",
         color = NULL, alpha.regions = 0.8 )

mapview( grid.points.gee, col.regions = pal(100), zcol = "p10",
         color = NULL, alpha.regions = 0.8 )

mapview( grid.points.gee, col.regions = pal(100), zcol = "iqr",
         color = NULL, alpha.regions = 0.8 )
sessionInfo()
## R version 3.6.3 (2020-02-29)
## Platform: x86_64-apple-darwin15.6.0 (64-bit)
## Running under: macOS Mojave 10.14.6
##
## Matrix products: default
## BLAS: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRblas.0.dylib
## LAPACK: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib
##
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
##
## attached base packages:
## [1] stats graphics grDevices utils datasets methods base
##
## other attached packages:
## [1] raster_3.3-13 sp_1.4-2 mapview_2.7.8 sf_0.9-5
##
## loaded via a namespace (and not attached):
## [1] tidyselect_1.1.0 xfun_0.17 purrr_0.3.4
## [4] lattice_0.20-41 colorspace_1.4-1 vctrs_0.3.4
## [7] generics_0.0.2 leaflet.providers_1.9.0 htmltools_0.5.0
## [10] stats4_3.6.3 viridisLite_0.3.0 yaml_2.2.1
## [13] base64enc_0.1-3 rlang_0.4.7 leafpop_0.0.5
## [16] e1071_1.7-3 pillar_1.4.6 glue_1.4.2
## [19] DBI_1.1.0 gdtools_0.2.2 uuid_0.1-4
## [22] lifecycle_0.2.0 stringr_1.4.0 munsell_0.5.0
## [25] htmlwidgets_1.5.1 codetools_0.2-16 evaluate_0.14
## [28] knitr_1.29 crosstalk_1.1.0.1 class_7.3-17
## [31] leafem_0.1.1 Rcpp_1.0.5 KernSmooth_2.23-17
## [34] scales_1.1.1 classInt_0.4-3 satellite_1.0.2
## [37] jsonlite_1.7.1 leaflet_2.0.3.9000 webshot_0.5.2
## [40] farver_2.0.3 systemfonts_0.2.1 brew_1.0-6
## [43] png_0.1-7 digest_0.6.25 stringi_1.5.3
## [46] dplyr_1.0.0 grid_3.6.3 tools_3.6.3
## [49] magrittr_1.5 tibble_3.0.3 crayon_1.3.4
## [52] pkgconfig_2.0.3 ellipsis_0.3.1 rmarkdown_2.3.5
## [55] svglite_1.2.3 R6_2.4.1 units_0.6-6
## [58] compiler_3.6.3