How to import “Global Forest Watch” rasters per year

This exercise analyses historical trends in tree cover loss and gain between 2000 and 2019. The model we are going to build uses global criteria, which means that several kinds of forest are included, but it does not imply that any of them is primary forest. For that reason, we will add an extra forest raster at the end, produced by IDEAM.

Note: have you already created the polygon of your conservation area? Please be careful: you will not be able to go any further without it.
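
If you have not built it yet, one common way to do so (a sketch only, with hypothetical file and column names) is to subset your conservation area from a larger protected-areas shapefile with sf and save it as “parque_select.shp”, the file read later in this tutorial:

# A minimal sketch (hypothetical file and attribute names): pick one protected
# area out of a national protected-areas shapefile and save it for later use.
library(sf)
areas <- st_read("areas_protegidas.shp")               # hypothetical source layer
park  <- areas[areas$nombre == "Mi Parque Natural", ]  # hypothetical name column and value
st_write(park, "parque_select.shp", delete_layer = TRUE)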

Before starting: workspace creation

As with our previous R code, we must install and load the required R packages. They are the following:

#install.packages(c("rgdal","sp", "raster", "gdalUtils","parallel","sf", "mapview", "gfcanalysis"))

Note: please run the installation only once per computer, not every time!

After that, we can load the packages whenever we need them.

library(rgdal)
## Loading required package: sp
## Please note that rgdal will be retired by the end of 2023,
## plan transition to sf/stars/terra functions using GDAL and PROJ
## at your earliest convenience.
## 
## rgdal: version: 1.5-28, (SVN revision 1158)
## Geospatial Data Abstraction Library extensions to R successfully loaded
## Loaded GDAL runtime: GDAL 3.2.1, released 2020/12/29
## Path to GDAL shared files: D:/1. Mis Documentos/Documents/R/win-library/4.1/rgdal/gdal
## GDAL binary built with GEOS: TRUE 
## Loaded PROJ runtime: Rel. 7.2.1, January 1st, 2021, [PJ_VERSION: 721]
## Path to PROJ shared files: D:/1. Mis Documentos/Documents/R/win-library/4.1/rgdal/proj
## PROJ CDN enabled: FALSE
## Linking to sp version:1.4-6
## To mute warnings of possible GDAL/OSR exportToProj4() degradation,
## use options("rgdal_show_exportToProj4_warnings"="none") before loading sp or rgdal.
## Overwritten PROJ_LIB was D:/1. Mis Documentos/Documents/R/win-library/4.1/rgdal/proj
library(sp)
library(raster)
library(gdalUtils)
library(parallel)
library(sf)
## Linking to GEOS 3.9.1, GDAL 3.2.1, PROJ 7.2.1; sf_use_s2() is TRUE
## 
## Attaching package: 'sf'
## The following object is masked from 'package:gdalUtils':
## 
##     gdal_rasterize
library(mapview)
library(gfcanalysis)

Selecting a workspace (a folder to work in)

Although we plan to work in a single folder, at least once we will have to use another one, so be careful. For now we use the folder below; remember to type the full folder path and to separate directories with the “/” character.

setwd("F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural")
getwd()
## [1] "F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural"

Selecting our polygon from the workspace

Remember that you need your own polygon! Let's import the polygon of our conservation area.

park <- st_read(
  "parque_select.shp") # st_read() opens vector layers such as shapefiles
## Reading layer `parque_select' from data source 
##   `F:\biato\wikiaves\Protocolo_distribucion_mapas\parque_natural\parque_select.shp' 
##   using driver `ESRI Shapefile'
## Simple feature collection with 1 feature and 15 fields
## Geometry type: POLYGON
## Dimension:     XY
## Bounding box:  xmin: -75.13328 ymin: 5.431891 xmax: -75.00439 ymax: 5.542324
## Geodetic CRS:  WGS 84
st <- "+proj=longlat +datum=WGS84 +no_defs"

park <- st_transform(park, crs = st)

Note: from here on the object is called park, so keep that in mind.

mapview(park, layer = "Admin. level 0", color = "#000000")

We also need to define an output folder for the next steps of the code.

output_folder <- "F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural" # the folder where you want to download all the Hansen layers
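
If that folder might not exist yet (for example on another computer), it is worth making sure it is there before downloading anything; a short sketch:

if (!dir.exists(output_folder)) dir.create(output_folder, recursive = TRUE) # create the output folder only if it is missing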

Importing data from “Global Forest Watch” to the Conservation Area

The globe is far too large to serve as a single file, so “Global Forest Watch” publishes the forest data as a mosaic of raster tiles that together cover the whole world. We therefore start by finding out how many tiles we need for our area.

Note: at our scale only one tile is needed, although in some cases there may be two.

tiles <- calc_gfc_tiles(park) # this function tells us which GFC tiles (and how many) cover our area
## Warning in showSRID(uprojargs, format = "PROJ", multiline = "NO", prefer_proj = prefer_proj): Discarded datum WGS_1984 in Proj4 definition,
##  but +towgs84= values preserved
## Warning in RGEOSBinPredFunc(spgeom1, spgeom2, byid, func): spgeom1 and spgeom2
## have different proj4 strings
## Warning in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches"): spgeom1
## and spgeom2 have different proj4 strings
tiles
## class       : SpatialPolygons 
## features    : 1 
## extent      : -80, -70, 0, 10  (xmin, xmax, ymin, ymax)
## crs         : +proj=longlat +ellps=WGS84 +towgs84=0,0,0,0,0,0,0 +no_defs

Note: take a quick look at the features field: do we need 1 or 2 tiles?

Let's plot the result to see what we have:

plot(tiles, col = "#EF2929", lwd = 1, main = "Plot Area + Tiles", xlab = "Long", ylab = "Lat") ## please input 1 or 2 depends on the features result - it is the number of tiles needed
plot(park,  col = "#555753", add = TRUE)
## Warning in plot.sf(park, col = "#555753", add = TRUE): ignoring all but the
## first attribute
box()

Importing rasters

We are now ready to download and extract all the Hansen layers.
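
One hedged caveat: in the usual gfcanalysis workflow, extract_gfc() reads tiles that are already on disk, so if they have not been downloaded yet you would fetch them first with download_tiles(). A minimal sketch, using the tiles object and output_folder defined above:

download_tiles(tiles, output_folder, dataset = "GFC-2019-v1.7") # download the required Hansen tiles into output_folder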

gfc_extract <- extract_gfc(park, output_folder, dataset = "GFC-2019-v1.7", overwrite=TRUE, to_UTM= F, filename="Park_extract.tif")
## Warning in RGEOSBinPredFunc(spgeom1, spgeom2, byid, func): spgeom1 and spgeom2
## have different proj4 strings
## Warning in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches"): spgeom1
## and spgeom2 have different proj4 strings
## Warning in proj4string(tiles): CRS object has comment, which is lost in output; in tests, see
## https://cran.r-project.org/web/packages/sp/vignettes/CRS_warnings.html
## Warning in showSRID(uprojargs, format = "PROJ", multiline = "NO", prefer_proj
## = prefer_proj): Discarded datum Unknown based on WGS84 ellipsoid in Proj4
## definition

Moving on

We now build a forest layer for our study and conservation area. Depending on your interest, you can threshold the raster at a different canopy density; for our purposes, 75% is a good choice.

forest_threshold <- 75 # the canopy-density threshold (%) I have used
# you can change it if you prefer another value, it is up to you

gfc75 <- threshold_gfc(gfc_extract, forest_threshold=forest_threshold, filename="Park_ex_extract_2.tif", overwrite=TRUE)

Note: the file “Park_ex_extract_2.tif” is our thresholded raster; it holds the year-by-year information on forest loss and gain.

In a nutshell, we now have a raster with year-by-year information on lost, gained, and standing forest. Let's build separate layers for each one and extract two data frames with the corresponding areas.

gfc75_stats <- gfc_stats(park, gfc75, dataset = "GFC-2019-v1.7")
## Data appears to be in latitude/longitude. Calculating cell areas on a sphere.
#######
write.csv(gfc75_stats$loss_table, file="park_GFC75_losstable.csv", row.names=FALSE)
write.csv(gfc75_stats$gain_table, file="park_GFC75_gaintable.csv", row.names=FALSE)
#########

Note: “park_GFC75_losstable.csv” and “park_GFC75_gaintable.csv” are now saved in your folder and contain some interesting information, so go and have a look.
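
To take that look without leaving R, you can read one of the tables back in (a sketch; the exact column names come from gfc_stats(), so check them with head()):

loss_tab <- read.csv("park_GFC75_losstable.csv") # read the exported loss table back into R
head(loss_tab)                                   # inspect the yearly figures and the column names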

Video showing the “devastation”

We would rather watch the whole timeline than a single snapshot of woodland replacement. To build it, we extract the tree cover for every year of the series.

gfc75_annual_stack <- annual_stack(gfc75, dataset = "GFC-2019-v1.7")
writeRaster(gfc75_annual_stack, filename="park_GFC75_annual.tif", overwrite=TRUE, bylayer=TRUE)

Then we can build a short animation from those layers with the following script:

animate_annual(park, gfc75_annual_stack, out_dir = output_folder, site_name = 'MNA_GFC75', type = "html")
## HTML file created at: gfc_animation.html

Export a raster for each factor

To make the exercise ahead easier, we are going to create and export a separate layer for each factor (forest, loss, and gain). This keeps the later work in QGIS simple.

Forest <- raster::raster("Park_ex_extract_2.tif", band = 1)
Lost <- raster::raster("Park_ex_extract_2.tif", band = 2)
gained <- raster::raster("Park_ex_extract_2.tif", band = 3)
fondo_1 <- raster::raster("Park_ex_extract_2.tif", band = 4)
fondo_2 <- raster::raster("Park_ex_extract_2.tif", band = 5)

writeRaster(Forest,
            "F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural/bosque1.tiff",
            format="GTiff", overwrite=T)

writeRaster(Lost,
            "F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural/perdida.tiff",
            format="GTiff", overwrite=T)

writeRaster(gained,
            "F:/biato/wikiaves/Protocolo_distribucion_mapas/parque_natural/ganado.tiff",
            format="GTiff", overwrite=T)


mapview(Forest, layer = "Admin. level 0")  # I encourage you to view every raster created above, to get an idea of how each one looks
## Warning in showSRID(uprojargs, format = "PROJ", multiline = "NO", prefer_proj =
## prefer_proj): Discarded ellps WGS 84 in Proj4 definition: +proj=merc +a=6378137
## +b=6378137 +lat_ts=0 +lon_0=0 +x_0=0 +y_0=0 +k=1 +units=m +nadgrids=@null
## +wktext +no_defs +type=crs
## Warning in showSRID(uprojargs, format = "PROJ", multiline = "NO", prefer_proj =
## prefer_proj): Discarded datum World Geodetic System 1984 in Proj4 definition

Note: this step just extracts each layer from the general raster and writes the files “bosque1.tiff”, “perdida.tiff”, and “ganado.tiff”, which are forest, loss, and gain, respectively.

Using and analysing “intact” and “primary” forest

As I said, these are good data, but for landscape analysis we need a more accurate forest raster. IDEAM provides a more precise primary forest layer, produced with higher resolution and stricter criteria. That does not mean the previous layer is wrong; they are simply different ways of reaching a goal.

Running Fragstats analyses and metrics in R

People working in GIS and conservation usually run these analyses in the “Fragstats” program, the reference tool for landscape analysis, even for biogeographic approaches. Two R packages, “landscapemetrics” and “landscapetools”, provide the same analyses and procedures, so let's install them.

#install.packages("landscapemetrics", "landscapetools", "ggplot2")

Note: again, please run the installation only once per computer!

and load them…

library(landscapemetrics)
library(landscapetools)
library(ggplot2) # only needed for one example below, so it is optional

At this point we can import the data produced by IDEAM. It sits in a folder called “Bosque_primario” inside our main folder.

primario <- raster("F:/biato/wikiaves/Protocolo_distribucion_mapas/Bosque_primario/SBQ_SMBYC_BQNBQ_V5_2010_WGS84.tif")

Note: Remember the object is called “primario”.

plot(primario)

As you can see, it is quite heavy and could cause problems later on, so we crop it to our working extent (the “park” object) for simplicity.

primario_Park <- crop(x = primario, y = park) # crop the raster to the extent of the park
plot(primario_Park)

Note: it now has the same extent as our conservation area (“park”). Another way to take a look:

mapview(primario_Park)
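
Keep in mind that crop() only clips the raster to the rectangular bounding box of park. If you also want the pixels outside the polygon boundary set to NA, here is a short sketch with raster::mask() (the object name primario_Park_mask is just an example):

primario_Park_mask <- mask(primario_Park, as(park, "Spatial")) # set cells outside the park polygon itself to NA
mapview(primario_Park_mask)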

It can even be plotted with ggplot:

df <- as.data.frame(primario_Park, xy = TRUE)

ggplot() +
  geom_tile(data = df, aes(x = x, y = y, fill = SBQ_SMBYC_BQNBQ_V5_2010_WGS84)) +
  scale_fill_gradientn(colors = terrain.colors(12)) +
  theme_bw()

Moving the analysis forward

We are going to extract some connectivity data and landscape metrics. However, as you know, there are two general kinds of coordinate reference systems, geographic and projected (metric) ones, and the packages we will use only accept metric coordinates.

We therefore need to check this, and the package has a function for exactly that:

check_landscape(primario_Park)
## Warning: Caution: Coordinate reference system not metric - Units of results
## based on cellsizes and/or distances may be incorrect.
##   layer        crs   units   class n_classes OK
## 1     1 geographic degrees integer         3  x

The OK column tells us whether the layer is ready to be processed. In our case it is not, because the units are degrees (a geographic CRS), so let's reproject it:

sr <- "+proj=utm +zone=18 +datum=WGS84 +units=m +no_defs" 
  primario_Park2 <- projectRaster(primario_Park, crs = sr, method ="ngb")
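
Zone 18 is hard-coded here because it covers this park. If your area lies somewhere else, here is a quick sketch of deriving the zone from the park centroid with the standard six-degree formula (the names cen and zone are just examples; add +south to the string for southern-hemisphere sites):

cen  <- st_coordinates(st_centroid(st_geometry(park))) # park centroid in lon/lat
zone <- as.integer(floor((cen[1, "X"] + 180) / 6) + 1)  # UTM zone from the centroid longitude
sr   <- sprintf("+proj=utm +zone=%d +datum=WGS84 +units=m +no_defs", zone)
sr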

And we can check it again:

check_landscape(primario_Park2)
##   layer       crs units   class n_classes OK
## 1     1 projected     m integer         3  v

Note: it now passes the check, as you can see in the “OK” column.

Extract some metrics

We will not go deeply into the package here, but we can see how to extract the general information.
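
If you first want to browse which metrics are available before choosing what to pass to calculate_lsm() below, landscapemetrics also ships a list_lsm() helper; a quick sketch:

list_lsm(level = "class", type = "aggregation metric") # list the class-level aggregation metrics by abbreviation and full name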

metrics_ <- calculate_lsm(primario_Park2, # a raster object
                          what = c("lsm_c_ai",        # aggregation index (class level)
                                   "lsm_c_frac_mn",   # mean fractal dimension index (class level)
                                   "lsm_l_division",  # division index (landscape level)
                                   "lsm_l_pd"))       # patch density (landscape level)

head(metrics_)
## # A tibble: 6 x 6
##   layer level class    id metric  value
##   <int> <chr> <int> <int> <chr>   <dbl>
## 1     1 class     1    NA ai      94.0 
## 2     1 class     2    NA ai      88.7 
## 3     1 class     3    NA ai      85.3 
## 4     1 class     1    NA frac_mn  1.10
## 5     1 class     2    NA frac_mn  1.12
## 6     1 class     3    NA frac_mn  1.11
  metrics_names<-lsm_abbreviations_names #store the names
  tail(metrics_names) #show the last 6 observations
## # A tibble: 6 x 5
##   metric name                      type                 level     function_name
##   <chr>  <chr>                     <chr>                <chr>     <chr>        
## 1 sidi   simpson's diversity index diversity metric     landscape lsm_l_sidi   
## 2 siei   simspon's evenness index  diversity metric     landscape lsm_l_siei   
## 3 split  splitting index           aggregation metric   landscape lsm_l_split  
## 4 ta     total area                area and edge metric landscape lsm_l_ta     
## 5 tca    total core area           core area metric     landscape lsm_l_tca    
## 6 te     total edge                area and edge metric landscape lsm_l_te
class_metrics <- calculate_lsm(list(primario_Park2),        # a list allows passing several maps at once
                               level = c("class"),           # other options: "patch", "landscape"
                               type = "aggregation metric")  # all metrics of that type at the given level


writeRaster(primario_Park2,
            "F:/biato/wikiaves/Protocolo_distribucion_mapas/Bosque_primario/conservado.tiff",
            format="GTiff", overwrite=T)

Visualizing some data based on Fragstats criteria

Now we make some visualizations that will be useful for your task. To save them, just use the Export button in the bottom-right plotting panel of RStudio (or see the sketch after these plots).

show_patches(primario_Park2, labels = FALSE, class = 1) # shows the patches of class 1
## $layer_1

 show_lsm(primario_Park2, labels = F, class = 1, what = "lsm_p_area") # colours the class 1 patches by area, handy to spot where the largest patch is located
## $layer_1

  show_cores(primario_Park2, labels = F, class = 1) # shows the core areas of class 1 patches
## $layer_1
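
As mentioned above, you can also save these figures from code instead of using the Export button. The show_* helpers appear to return ggplot objects, one per layer (hence the “$layer_1” printed above), so here is a hedged sketch with ggsave() from ggplot2 (the file name is an example):

p <- show_patches(primario_Park2, labels = FALSE, class = 1)[[1]] # take the ggplot for the first (only) layer
ggsave("parches_clase1.png", plot = p, width = 7, height = 5, dpi = 150)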

Exporting just the necessary metrics for our procedure

We will use the “lsm_p_area” metric (patch area) in the exercise ahead, so let's compute and export it.

Patch_area_data <- lsm_p_area(primario_Park2)

write.table(Patch_area_data, file = "F:/biato/wikiaves/Protocolo_distribucion_mapas/Bosque_primario/TamaParche.csv",
            sep = ",")

Patch_area <- spatialize_lsm(primario_Park2,
                             level = "class",
                             what = 'lsm_p_area') # creates a nested list of rasters (per layer, then per metric)

writeRaster(Patch_area[[1]]$lsm_p_area,
            "F:/biato/wikiaves/Protocolo_distribucion_mapas/Bosque_primario/TamaParche.tiff",
            format="GTiff", overwrite=T)

plot(Patch_area[[1]]$lsm_p_area)

Note: the plot does not look great because no styling (breaks, colours) is specified for the band. That is fiddly to fix here, and it is easier to adjust the display in QGIS.
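
That said, if you want a slightly more readable map without leaving R, here is a small sketch using a different colour palette (hcl.colors() needs R >= 3.6):

plot(Patch_area[[1]]$lsm_p_area,
     col  = hcl.colors(20, "viridis"), # a perceptually uniform palette makes patch sizes easier to compare
     main = "Patch area (ha)")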