
2AiBAIT/SatFire2

 
 


READ ME PLEASE:

These are the first instructions you must follow before executing any code:

1. Install:

NOTE: All code in this repository was built under Python 3.9.2 (Anaconda) on Windows, using PyCharm

Process:

1. Clone the repository to your machine

2. Install Anaconda with Python 3.9.2

3. Open your Anaconda prompt

4. Create a virtual environment with the env.yml file located in the repository, typing in the console:

   conda env create -n your_new_envname --file path/to/env.yml

   (replace path/to/env.yml with the path to the env.yml file on your machine)

5. Open PyCharm and create a new project with the correct virtual environment and Python interpreter

2. Workflow:

NOTE: The steps do not have to be performed in exactly this order; this is simply the workflow we followed in the project

1. Choose the fires that will be used in the neural network

- This step will generate images of the fires from the original shapefile.
- Images are generated from the center of each fire with height and width of 2560 meters.
- The objective of this step is to select the fires to be used in the neural network, eliminating small fires and fires whose height or width exceeds the defined limits.

2. Validation of the shapes

- Valid fires are those in which burned pixels make up more than 10% (0.1) of the total number of pixels in the image.
- In addition, only fires whose burned shape does not exceed the proposed height/width limit (2560 m) are kept.
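The validation rule above can be sketched as a small function. This is a minimal illustration, assuming the two thresholds stated in the text (10% burned pixels, 2560 m maximum extent); the function name and signature are hypothetical, not taken from the repository.

```python
# Sketch of the validation step (assumed thresholds from the text above).
MAX_EXTENT_M = 2560     # height/width limit defined in step 1
MIN_BURNED_RATIO = 0.1  # burned pixels must exceed 10% of the image

def is_valid_fire(burned_pixels, total_pixels, width_m, height_m):
    """Return True if a fire passes both validation criteria."""
    if total_pixels == 0:
        return False
    ratio_ok = burned_pixels / total_pixels > MIN_BURNED_RATIO
    size_ok = width_m <= MAX_EXTENT_M and height_m <= MAX_EXTENT_M
    return ratio_ok and size_ok
```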

3. Generate shapefile with only the fires to be used

- Creation of a new shapefile from the original removing the shapes that will not be used.

4. Creation of txt files

- The txt file of each valid fire will have some data necessary for communication with the Scihub Api (https://scihub.copernicus.eu/dhus) and meteorology api (https://www.weatherbit.io/api).

5. Augmented bbox calculation

The new augmented bbox defines an area of interest extending:
- 2560 meters from the left end of the fire polygon to the left edge of the box;
- 2560 meters from the right end of the fire polygon to the right edge of the box;
- 2560 meters from the lower end of the fire polygon to the lower edge of the box;
- 2560 meters from the upper end of the fire polygon to the upper edge of the box.
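In projected coordinates (metres), adding the same margin on every side reduces to a one-line expansion. A minimal sketch, assuming the 2560 m margin stated above; the function name is illustrative only.

```python
MARGIN_M = 2560  # margin added on every side, per the step above

def augment_bbox(xmin, ymin, xmax, ymax, margin=MARGIN_M):
    """Expand a fire bbox (projected coordinates, metres) by `margin` on each side."""
    return (xmin - margin, ymin - margin, xmax + margin, ymax + margin)
```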

6. Creation of geojson files

- The geojson files will be in CRS:4326 (lat/lon) to use in the Scihub API request.
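The geojson payload for a bbox is just a closed polygon ring. A minimal sketch of building such a Feature, assuming coordinates have already been reprojected to EPSG:4326 (GeoJSON uses lon/lat order); the helper name is hypothetical.

```python
def bbox_to_geojson(lon_min, lat_min, lon_max, lat_max):
    """Build a GeoJSON Feature for a bbox already in EPSG:4326 (lon/lat order)."""
    ring = [
        [lon_min, lat_min], [lon_max, lat_min],
        [lon_max, lat_max], [lon_min, lat_max],
        [lon_min, lat_min],  # GeoJSON rings must be closed
    ]
    return {
        "type": "Feature",
        "properties": {},
        "geometry": {"type": "Polygon", "coordinates": [ring]},
    }
```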

7. Requests to SciHub API

The request will use the dates read from the txt file and coordinates of the geojson created for each fire.
The script created to manage the requests will:
- Check whether each product is online and add it to the corresponding list.
- Search in ascending order of time before the fire (14, 30 and 60 days before the fire).
- Check whether the fire polygon is contained in the polygon obtained from the footprint returned by the API.
- Of the available and valid products, download those with the lowest percentage of global cloud cover.
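The selection logic above can be sketched without any API calls. This is a hedged illustration using plain bounding boxes in place of real footprint polygons; the dict keys ('online', 'footprint', 'clouds', 'days_before') and function names are assumptions for the sketch, not the repository's actual data model.

```python
SEARCH_WINDOWS = (14, 30, 60)  # days before the fire, searched in ascending order

def contains(outer, inner):
    """True if bbox `outer` fully contains bbox `inner` (xmin, ymin, xmax, ymax)."""
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and outer[2] >= inner[2] and outer[3] >= inner[3])

def pick_product(products, fire_bbox):
    """Among online products whose footprint contains the fire, searched window
    by window, return the one with the lowest global cloud percentage."""
    for window in SEARCH_WINDOWS:
        candidates = [p for p in products
                      if p["online"]
                      and p["days_before"] <= window
                      and contains(p["footprint"], fire_bbox)]
        if candidates:
            return min(candidates, key=lambda p: p["clouds"])
    return None  # nothing found: queue for later download from Onda-Dias
```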

8. Automated download of products unavailable on the Scihub Api

- Most products are not available for download right away.
- Therefore, we store the required products in a list for later download from the Onda-Dias website (https://catalogue.onda-dias.eu/catalogue/) using an automated download script.

9. Creation of individual shapefiles

- Reproject the shapefiles to the EPSG:32629 coordinate system.
- Create individual shapefiles on those coordinates for each fire.

10. Generate GeoTIFF images of the fires

- The GeoTIFF images are created using the individual shapefiles mentioned in the previous step.
- Two images are generated per fire (one with the dimensions of the fire bbox and the other with the dimensions of the augmented bbox).
- The resolution of the images is 10 meters per pixel.
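At a fixed ground resolution, the pixel dimensions of each output image follow directly from the bbox extent. A minimal sketch, assuming the 10 m per pixel resolution stated above; the function name is illustrative.

```python
RESOLUTION_M = 10  # metres per pixel, as stated above

def raster_size(xmin, ymin, xmax, ymax, res=RESOLUTION_M):
    """Pixel width/height of a raster covering a projected bbox at `res` m/px."""
    width = int(round((xmax - xmin) / res))
    height = int(round((ymax - ymin) / res))
    return width, height
```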

11. Create a new cos18 shapefile

- This new shapefile is created from the original COS2018_V1 shapefile, adding a new “Class” field derived from the “COS2018_n4” field of the original file.
- The resulting shapefile is later converted to the coordinate system (EPSG:32629).

12. Generate the cos18 images

- In this step we generate the cos18 images (10 m per pixel) for each fire in the defined area of interest (enlarged bbox).

13. Process for obtaining Copernicus DEMs

- The DEMs were obtained from Amazon's AWS cloud service.
- Here are some instructions on how to use the service:

            Install awscli.
            Execution is done in terminal/command line.

            Access the bucket:
            aws s3 ls s3://copernicus-dem-30m/ --no-sign-request

            Tiles are named after coordinates:
            Example: Copernicus_DSM_COG_10_N42_00_W007_00_DEM.tif

            Identify the coordinates of the tiles needed to generate the Portugal tif.

            Copy a specific tile:
            aws s3 cp s3://<bucket>/<folder> <local_folder> --no-sign-request
            This command copies from s3://<bucket>/<folder> to a local folder <local_folder>
            Example:
            aws s3 cp s3://copernicus-dem-30m/Copernicus_DSM_COG_10_N42_00_W007_00_DEM/Copernicus_DSM_COG_10_N42_00_W007_00_DEM.tif N42W007.tif --no-sign-request

            Knowing all the tiles we need, we can download them in bulk.
            Bulk download:
            aws s3 cp s3://path/to/bucket/ . --recursive --exclude "*" --include "files wanted"
            Example:
            --exclude "*" --include "*.txt"
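Since the tiles are named after their southwest corner coordinates, the tile covering a given point can be derived programmatically. A minimal sketch of that naming convention, based on the example tile name above; the function name is illustrative.

```python
import math

def dem_tile_name(lat, lon):
    """Name of the 1x1 degree Copernicus 30 m DEM tile containing (lat, lon)."""
    lat_f, lon_f = math.floor(lat), math.floor(lon)
    ns = "N" if lat_f >= 0 else "S"
    ew = "E" if lon_f >= 0 else "W"
    return (f"Copernicus_DSM_COG_10_{ns}{abs(lat_f):02d}_00_"
            f"{ew}{abs(lon_f):03d}_00_DEM")
```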

14. Merge height tiles for Portugal

- Merge the downloaded tiles in their original coordinates (EPSG:4326), then convert the final Portugal raster with gdal.Warp() to EPSG:32629 at a resolution of 30 m per pixel.
- This step will generate a height map for Portugal.
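A hedged sketch of the reprojection step: with GDAL installed, the actual call would be roughly osgeo.gdal.Warp(dst_path, src_path, **warp_options()). Building the keyword arguments as a plain dict keeps this example runnable without GDAL; the helper name is an assumption for illustration.

```python
def warp_options(dst_srs="EPSG:32629", res_m=30):
    """Keyword arguments for gdal.Warp: reproject the merged DEM and resample
    it to `res_m` metres per pixel."""
    return {"dstSRS": dst_srs, "xRes": res_m, "yRes": res_m}
```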

15. Generate the height images for each fire

- Crop the Portugal height map to the area georeferenced by the augmented bbox, using the defined output size.
- This results in images with a resolution of 10 meters per pixel.

16. Crop satellite imagery in the defined area of the fire

- A process identical to the previous one, using the gdal warp command to crop the region of interest at a resolution of 10 meters per pixel.
- The bands are cropped in ascending order of their original resolution (10, 20 and 60 meters).

17. API request to obtain meteorological data

- Request weather data for the fire region and duration from the api at https://www.weatherbit.io/api.
- In this step, a JSON file is generated for each fire with hourly weather data for the entire duration of that fire.
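A hedged sketch of assembling the query for an hourly weather history request. The exact endpoint and parameter names should be checked against https://www.weatherbit.io/api; the function name, parameter names, and date format shown here are assumptions for illustration.

```python
def weatherbit_params(lat, lon, start_date, end_date, api_key):
    """Query parameters for an hourly weather history request covering the
    fire's location and duration (assumed parameter names)."""
    return {
        "lat": lat,
        "lon": lon,
        "start_date": start_date,  # e.g. "2017-06-17"
        "end_date": end_date,
        "key": api_key,            # placeholder: your Weatherbit API key
    }
```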

18. Normalization process

- Normalization and preparation of the dataset for input to the neural network.
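The repository does not spell out the normalization scheme here; as one common choice, min-max scaling maps each band's values to [0, 1]. A minimal sketch under that assumption, in plain Python for clarity.

```python
def minmax_normalize(values):
    """Scale a band's values to [0, 1]; a constant band maps to all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]
```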

19. Unet

- Training of the neural network and subsequent validation of the network output against the ground-truth data.

3. How to recreate datasets:

Portuguese fires of 2017, 2018 and 2019 were studied in this project.
We created different folders for each year.
The central script is CreateDatasets.py or CreateDatasetsUI.py.
The difference between these two is just the way of interaction.
For simplicity, we will now address only CreateDatasets.py.
This script will ask you to introduce one of the above mentioned years and what individual part of the dataset you want to generate.
You can also generate the complete dataset for a selected year, but the satellite image products must be available first.
The identification of the Sentinel product associated with each fire, and its subsequent download, was done in a separate script because running it inside the central script would make that script very complex and time-consuming.

You must perform the steps in this order:

1. Run the CreateDatasets.py file and select the option Ficheiros txt (txt files) to create a txt file with the data needed to communicate with the Scihub Api.

2. Run the CreateDatasets.py file and select the option Geojson to create a file with geographic information of the fires.

3. Run the api_sentinel_2.py file to search which Sentinel product is associated with each fire and download it if the product is available.

4. Run the OndaDownload.py file to download the missing products.

5. Finally run the central script again and choose the parts of the dataset you want to generate or alternatively generate it completely in one step.

4. Download links:

Burned areas in Portugal - http://www2.icnf.pt/portal/florestas/dfci/inc/cartografia/areas-ardidas

COS2018 - http://mapas.dgterritorio.pt/DGT-ATOM-download/COS_Final/COS2018_v1/COS2018_v1.zip

https://mega.nz/folder/KJ5USagA#yqGznaBCM9IH7nXRoEz0pg
