4  Geospatial artificial intelligence (GeoAI) for remote sensing data

4.1 GeoAI for spatial prediction of future bushfires

4.1.1 Future climate data collection

Aim

We collect CMIP6 temperature and precipitation datasets separately for 2024 and 2030. The 2024 data are used for model training; the trained model is then used to predict bushfire burned areas for 2030.

Description of steps
  1. Load NEX-GDDP-CMIP6 data from Google Earth Engine.
  2. Aggregate and export the temperature and precipitation data for 2024.
  3. Aggregate and export the temperature and precipitation data for 2030.

Code
// Load the CMIP6 dataset
var dataset = ee.ImageCollection("NASA/GDDP-CMIP6")
  .filter(ee.Filter.eq('model', 'ACCESS-CM2'))  // Select climate model
  .filter(ee.Filter.eq('scenario', 'ssp585'))  // Select SSP scenario
  .filterBounds(roi);  // Restrict to the region of interest (ROI)

// Extract variables
var temperature = dataset.select('tas');  // Daily mean temperature (unit: K)
var precipitation = dataset.select('pr');  // Daily precipitation (unit: kg m-2 s-1)

// Aggregate data for 2024 and 2030
var tem24 = temperature.filter(ee.Filter.calendarRange(2024, 2024, 'year')).mean().clip(roi).subtract(273.15);  // Annual mean temperature for 2024 (°C)
var tem30 = temperature.filter(ee.Filter.calendarRange(2030, 2030, 'year')).mean().clip(roi).subtract(273.15);  // Annual mean temperature for 2030 (°C)
// Convert precipitation: mean daily rate (kg m-2 s-1) x 86400 s/day x 365 days = annual total (mm)
var pre24 = precipitation.filter(ee.Filter.calendarRange(2024, 2024, 'year')).mean().clip(roi).multiply(86400).multiply(365);  // Annual total precipitation for 2024 (mm)
var pre30 = precipitation.filter(ee.Filter.calendarRange(2030, 2030, 'year')).mean().clip(roi).multiply(86400).multiply(365);  // Annual total precipitation for 2030 (mm)

// Export aggregated data to Google Drive
Export.image.toDrive({
  image: tem24,
  description: 'tem24',
  scale: 27830,
  region: roi,
  fileFormat: 'GeoTIFF'
});
Export.image.toDrive({
  image: tem30,
  description: 'tem30',
  scale: 27830,
  region: roi,
  fileFormat: 'GeoTIFF'
});
Export.image.toDrive({
  image: pre24,
  description: 'pre24',
  scale: 27830,
  region: roi,
  fileFormat: 'GeoTIFF'
});
Export.image.toDrive({
  image: pre30,
  description: 'pre30',
  scale: 27830,
  region: roi,
  fileFormat: 'GeoTIFF'
});
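The unit conversions applied above can be sanity-checked outside Earth Engine. A minimal Python sketch with illustrative values (not taken from the dataset):

```python
# Sanity-check the CMIP6 unit conversions (illustrative values only).

# 'tas' is in kelvin; subtracting 273.15 yields degrees Celsius.
tas_kelvin = 301.85
tas_celsius = tas_kelvin - 273.15
print(round(tas_celsius, 2))  # 28.7

# 'pr' is a flux in kg m-2 s-1, numerically equal to mm s-1 of liquid water:
# mean daily rate * 86400 s/day * 365 days ~= annual total in mm.
mean_rate = 5.0e-5
annual_mm = mean_rate * 86400 * 365
print(round(annual_mm, 1))  # 1576.8
```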

4.1.2 GeoAI modelling

Aim

This step fits a gpboost model to the 2024 temperature, precipitation, and burned-area data; the fitted model is then used for spatial prediction in the next step.

Description of steps
  1. Load necessary libraries and data.
  2. Read the data containing burned area and climate data.
  3. Process the data into the format required by the gpboost model.
  4. Build a gpboost model.

See the gpboost documentation for more details on applying the model to spatio-temporal data.
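Conceptually, gpboost combines boosted trees for the fixed effects (here pre and tem) with a Gaussian process over the cell coordinates for spatially correlated random effects. One common parameterisation of the exponential covariance selected below can be sketched as follows (a minimal Python illustration; sigma2 and rho are placeholder hyperparameters, not fitted values):

```python
import math

# Exponential covariance: k(d) = sigma2 * exp(-d / rho), where d is the
# distance between two cells. sigma2 (variance) and rho (range) are
# illustrative placeholders, not values estimated from the data.
def exponential_cov(d, sigma2=1.0, rho=1.0):
    return sigma2 * math.exp(-d / rho)

# Covariance is highest at distance 0 and decays with distance, so nearby
# grid cells share more of the spatial random effect than distant ones.
for d in [0.0, 0.5, 1.0, 2.0]:
    print(d, round(exponential_cov(d), 3))
```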

Code
library(terra)
# Read the 2024 climate rasters and align the burned-area raster to the
# coarser climate grid by averaging.
pre24 = terra::rast('./data/4. GeoAI Modeling/pre24.tif')
tem24 = terra::rast('./data/4. GeoAI Modeling/tem24.tif')
burnedarea = terra::rast('./data/1. Data collection/BurnedArea/BurnedArea_2024.tif') |> 
  terra::resample(tem24, method = 'average')
# Stack and name the layers, convert cells to polygons, and drop cells
# with missing values in any of the three attributes.
d24 = c(pre24, tem24, burnedarea)
names(d24) = c("pre", "tem", "burned")
d24 = d24 |> 
  terra::as.polygons(aggregate = FALSE) |> 
  sf::st_as_sf() |> 
  dplyr::filter(dplyr::if_all(1:3, \(.x) !is.na(.x)))
d24
# Simple feature collection with 35 features and 3 fields
# Geometry type: POLYGON
# Dimension:     XY
# Bounding box:  xmin: 129 ymin: -15 xmax: 131 ymax: -13.3
# Geodetic CRS:  WGS 84
# First 10 features:
#       pre  tem burned                       geometry
# 1  630364 28.7    141 POLYGON ((130 -13.5, 130 -1...
# 2  644814 28.6    173 POLYGON ((130 -13.5, 130 -1...
# 3  638137 28.2    178 POLYGON ((131 -13.5, 131 -1...
# 4  597248 28.7    154 POLYGON ((130 -13.8, 130 -1...
# 5  606609 28.5    176 POLYGON ((130 -13.8, 130 -1...
# 6  607450 28.4    182 POLYGON ((130 -13.8, 130 -1...
# 7  600348 28.3    193 POLYGON ((131 -13.8, 131 -1...
# 8  568667 28.7    204 POLYGON ((130 -14, 130 -13....
# 9  580674 28.4    164 POLYGON ((130 -14, 130 -13....
# 10 575810 28.0    171 POLYGON ((130 -14, 130 -13....
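An aside on the resampling step above: terra::resample(..., method = 'average') assigns each coarse climate cell the mean of the fine burned-area cells it covers. For exactly nested grids this reduces to plain block averaging, sketched here in Python with numpy (real rasters rarely nest this cleanly):

```python
import numpy as np

# A 4x4 fine-resolution grid aggregated to a 2x2 coarse grid (factor 2),
# mimicking method = 'average' for the nested-grid case.
fine = np.array([[1, 3, 5, 7],
                 [1, 3, 5, 7],
                 [2, 4, 6, 8],
                 [2, 4, 6, 8]], dtype=float)

factor = 2
# Split into 2x2 blocks and average within each block.
coarse = fine.reshape(2, factor, 2, factor).mean(axis=(1, 3))
print(coarse)  # [[2. 6.]
               #  [3. 7.]]
```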

library(gpboost)
# A Gaussian process over the cell coordinates models the spatial random effects.
gp_model = GPModel(gp_coords = sdsfun::sf_coordinates(d24), 
                   cov_function = "exponential")
# Training: boosted trees on the covariates (pre, tem) are fitted jointly
# with the GP, with burned area as the response.
bst = gpboost(data = as.matrix(sf::st_drop_geometry(d24)[,1:2]), 
              label = as.matrix(sf::st_drop_geometry(d24)[,3,drop = FALSE]), 
              gp_model = gp_model, objective = "regression_l2", verbose = 0)
bst
# <gpb.Booster>
#   Public:
#     add_valid: function (data, name, valid_set_gp = NULL, use_gp_model_for_validation = TRUE) 
#     best_iter: -1
#     best_score: NA
#     current_iter: function () 
#     dump_model: function (num_iteration = NULL, feature_importance_type = 0L) 
#     eval: function (data, name, feval = NULL) 
#     eval_train: function (feval = NULL) 
#     eval_valid: function (feval = NULL) 
#     finalize: function () 
#     initialize: function (params = list(), train_set = NULL, modelfile = NULL, 
#     lower_bound: function () 
#     params: list
#     predict: function (data, start_iteration = NULL, num_iteration = NULL, 
#     raw: NA
#     record_evals: list
#     reset_parameter: function (params, ...) 
#     rollback_one_iter: function () 
#     save: function () 
#     save_model: function (filename, start_iteration = NULL, num_iteration = NULL, 
#     save_model_to_string: function (start_iteration = NULL, num_iteration = NULL, feature_importance_type = 0L, 
#     set_train_data_name: function (name) 
#     to_predictor: function () 
#     update: function (train_set = NULL, fobj = NULL) 
#     upper_bound: function () 
#   Private:
#     eval_names: NULL
#     fixed_effect_train_loaded_from_file: NULL
#     get_eval_info: function () 
#     gp_model: GPModel, R6
#     gp_model_prediction_data_loaded_from_file: FALSE
#     handle: gpb.Booster.handle
#     has_gp_model: TRUE
#     higher_better_inner_eval: NULL
#     init_predictor: NULL
#     inner_eval: function (data_name, data_idx, feval = NULL) 
#     inner_predict: function (idx) 
#     is_predicted_cur_iter: list
#     label_loaded_from_file: NULL
#     name_train_set: training
#     name_valid_sets: list
#     num_class: 1
#     num_dataset: 1
#     predict_buffer: list
#     residual_loaded_from_file: NULL
#     set_objective_to_none: FALSE
#     train_set: gpb.Dataset, R6
#     train_set_version: 1
#     use_gp_model_for_validation: TRUE
#     valid_sets: list
#     valid_sets_gp: list

4.1.3 Spatial prediction

Aim

In this step, the gpboost model constructed in the previous step is used to predict the burned area of 2030.

Description of steps
  1. Read the 2030 future climate data.
  2. Process the data into the format required by the gpboost model.
  3. Perform spatial prediction with the gpboost model.

Code
# Read the 2030 climate rasters and prepare them in the same format as the
# 2024 training data.
pre30 = terra::rast('./data/4. GeoAI Modeling/pre30.tif')
tem30 = terra::rast('./data/4. GeoAI Modeling/tem30.tif')
d30 = c(pre30, tem30)
names(d30) = c("pre", "tem")
d30 = d30 |> 
  terra::as.polygons(aggregate = FALSE) |> 
  sf::st_as_sf() |> 
  dplyr::filter(dplyr::if_all(1:2, \(.x) !is.na(.x)))  # only two attribute columns here
d30
# Simple feature collection with 37 features and 2 fields
# Geometry type: POLYGON
# Dimension:     XY
# Bounding box:  xmin: 129 ymin: -15 xmax: 131 ymax: -13.3
# Geodetic CRS:  WGS 84
# First 10 features:
#       pre  tem                       geometry
# 1  652488 28.6 POLYGON ((130 -13.5, 130 -1...
# 2  695938 28.6 POLYGON ((130 -13.5, 130 -1...
# 3  712191 28.4 POLYGON ((130 -13.5, 130 -1...
# 4  701452 28.1 POLYGON ((131 -13.5, 131 -1...
# 5  682812 28.5 POLYGON ((130 -13.8, 130 -1...
# 6  696118 28.4 POLYGON ((130 -13.8, 130 -1...
# 7  697721 28.3 POLYGON ((130 -13.8, 130 -1...
# 8  684002 28.2 POLYGON ((131 -13.8, 131 -1...
# 9  671438 28.5 POLYGON ((130 -14, 130 -13....
# 10 689152 28.3 POLYGON ((130 -14, 130 -13....

# Predict with the fitted model: supply the 2030 covariates and the
# coordinates of the prediction cells, and request predictive variances.
pred = predict(bst, data = as.matrix(sf::st_drop_geometry(d30)[,1:2]), 
               gp_coords_pred = sdsfun::sf_coordinates(d30), 
               predict_var = TRUE, pred_latent = FALSE)

pred
# $fixed_effect
# NULL
# 
# $random_effect_mean
# [1] NA
# 
# $random_effect_cov
# [1] NA
# 
# $response_mean
#  [1] 155 149 172 179 158 171 181 191 193 169 170 174 195 168 180 182
# [17] 161 153 156 189 176 169 161 141 140 144 158 150 144 125 122 134
# [33] 129 133 132 129 128
# 
# $response_var
#  [1] 320 133 130 135 131 127 127 130 131 127 127 127 130 133 127 127
# [17] 127 127 130 130 127 127 127 127 130 133 127 127 127 127 130 329
# [33] 133 130 130 130 135
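Because predict() returns both a predictive mean and a predictive variance per cell, approximate 95% prediction intervals can be formed under a Gaussian assumption. A Python sketch using the first printed (rounded) values above:

```python
import math

# Approximate 95% prediction intervals from the predictive mean and variance,
# assuming a roughly Gaussian predictive distribution.
response_mean = [155, 149, 172]  # first entries of pred$response_mean (rounded)
response_var = [320, 133, 130]   # first entries of pred$response_var (rounded)

intervals = [(m - 1.96 * math.sqrt(v), m + 1.96 * math.sqrt(v))
             for m, v in zip(response_mean, response_var)]
for lo, hi in intervals:
    print(f"{lo:.1f} .. {hi:.1f}")
```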

d30$burned = pred$response_mean  # attach the predicted burned area to d30
d30
# Simple feature collection with 37 features and 3 fields
# Geometry type: POLYGON
# Dimension:     XY
# Bounding box:  xmin: 129 ymin: -15 xmax: 131 ymax: -13.3
# Geodetic CRS:  WGS 84
# First 10 features:
#       pre  tem                       geometry burned
# 1  652488 28.6 POLYGON ((130 -13.5, 130 -1...    155
# 2  695938 28.6 POLYGON ((130 -13.5, 130 -1...    149
# 3  712191 28.4 POLYGON ((130 -13.5, 130 -1...    172
# 4  701452 28.1 POLYGON ((131 -13.5, 131 -1...    179
# 5  682812 28.5 POLYGON ((130 -13.8, 130 -1...    158
# 6  696118 28.4 POLYGON ((130 -13.8, 130 -1...    171
# 7  697721 28.3 POLYGON ((130 -13.8, 130 -1...    181
# 8  684002 28.2 POLYGON ((131 -13.8, 131 -1...    191
# 9  671438 28.5 POLYGON ((130 -14, 130 -13....    193
# 10 689152 28.3 POLYGON ((130 -14, 130 -13....    169

4.2 Analysing future bushfire patterns

Aim

In this step, the predicted burned area for 2030 is used to analyse future bushfire patterns.

Description of steps
  1. Visualise the predicted burned area of 2030.
  2. Analyse the future bushfire patterns.
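A simple starting point for step 2 is a cell-wise comparison of observed 2024 and predicted 2030 burned areas. A Python sketch using values from the first printed rows above (illustrative only; in practice the cells must be matched spatially before differencing):

```python
# Compare burned area per cell: 2024 observation vs. 2030 prediction.
# Values are taken from the first printed rows of d24 and d30 above and
# are illustrative; real inputs are d24$burned and d30$burned.
burned_2024 = [141, 173, 178, 154]
burned_2030 = [155, 149, 172, 179]

diffs = [b30 - b24 for b24, b30 in zip(burned_2024, burned_2030)]
mean_change = sum(diffs) / len(diffs)
increasing = sum(d > 0 for d in diffs)

print(diffs)          # per-cell change in burned area
print(mean_change)    # average change across cells
print(increasing)     # number of cells predicted to burn more
```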