Author: Dimo Dimov

Preprocessing of Sentinel-1 SAR data via Snappy Python module

This chapter demonstrates the Snappy Python module for the automation of the ESA SNAP toolbox. Code examples are shown for an automated processing chain for the preprocessing of Sentinel-1 SAR data, including calibration, subsetting and terrain correction of GRD (Ground Range Detected) data. A detailed installation tutorial for snappy can be found here:

First, import the needed Python modules:

```python
import os, gc

import snappy
from snappy import ProductIO
from snappy import GPF

GPF.getDefaultInstance().getOperatorSpiRegistry().loadOperatorSpis()
HashMap = snappy.jpy.get_type('java.util.HashMap')
```

Now loop through all Sentinel-1 data sub-folders that are located within a super folder (of course, make sure that the data is already unzipped):

```python
path = "D:\\SENTINEL\\"
for folder in os.listdir(path):
    gc.enable()

    output = path + folder + "\\"
    timestamp = folder.split("_")[4]
    date = timestamp[:8]
```

Then, read in the Sentinel-1 data product:

```python
    sentinel_1 = ProductIO.readProduct(output)
    print(sentinel_1)
```

If polarization bands are available, split up your code to process the VH and VV intensity data separately. The first step is the calibration procedure, which transforms the DN values to Sigma Naught. You can specify the parameters to output the image in decibels as well.

```python
    pols = ['VH', 'VV']
    for p in pols:
        polarization = p

        ### CALIBRATION
        parameters...
```
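The `folder.split("_")[4]` step in the loop above relies on the Sentinel-1 product naming convention, in which the fifth underscore-separated field is the acquisition start timestamp. A minimal, snappy-free sketch of that step (the folder name below is a made-up example):

```python
def acquisition_date(folder_name):
    """Extract the YYYYMMDD acquisition date from a Sentinel-1 product folder name.

    Sentinel-1 GRD products are named like
    S1A_IW_GRDH_1SDV_<start time>_<stop time>_..., so the fifth
    underscore-separated field (index 4) is the start timestamp.
    """
    timestamp = folder_name.split("_")[4]  # e.g. "20160101T050000"
    return timestamp[:8]                   # keep only the date part

# hypothetical folder name for illustration
print(acquisition_date(
    "S1A_IW_GRDH_1SDV_20160101T050000_20160101T050025_009283_00D5B0_35C1"))
# → 20160101
```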

Read More

Image classification with Python

We are going to classify a multitemporal image stack of a MODIS NDVI time series (MOD13Q1). The stack consists of 23 bands (16-day composites) with a spatial resolution of 231 m in sinusoidal projection. We want to classify the different land use types and, especially, to discriminate different crop types.

Install Python and the required image processing and scientific programming packages:

```shell
sudo apt-get install python-numpy scikit-learn scikit-image
```

For beginners, the basic Python IDLE is sufficient for scripting. However, another IDE, LiClipse, is highly recommended:

We assume that you have already created a set of training samples in 8-bit TIFF format with distinct class labels (1, 2, 3, 4, etc.). It is necessary that these single pixels are snapped to the pixel size of the original dataset and have the same dimensions and extent. Now write your training script:

```python
# import all required Python packages:
from skimage import io
import numpy as np
import os, shutil
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier, GradientBoostingClassifier, ExtraTreesClassifier
from sklearn.externals import joblib

# set up your directories with the MODIS data
rootdir = "C:\\Data\\Raster\\MODIS\\"
# path to your training data
path_pix = "C:\\Data\\Samples\\"
# path to your model
path_model = "C:\\Data\\Models\\"
# path to your classification results
path_class = "C:\\Data\\Class\\"

# declare a new function
def training():
    # path to your MODIS TIFF
    raster = rootdir + "modis_stack_ndvi.tif"
    # path to your corresponding pixel...
```
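Before any scikit-learn classifier can be trained, the image stack and the sample raster have to be flattened into a samples-by-features matrix. A minimal numpy sketch of that reshaping step (the array shapes and the tiny synthetic scene are assumptions for illustration, not the post's data):

```python
import numpy as np

def build_training_matrix(stack, samples):
    """Turn a (bands, rows, cols) image stack and a (rows, cols) sample raster
    into an (n_samples, n_bands) feature matrix X and a label vector y.
    Pixels with label 0 are treated as unlabelled and dropped."""
    bands, rows, cols = stack.shape
    X_all = stack.reshape(bands, rows * cols).T  # one row per pixel
    y_all = samples.ravel()
    labelled = y_all > 0
    return X_all[labelled], y_all[labelled]

# tiny synthetic example: 23 NDVI "bands" of a 4x4 scene, 3 labelled pixels
rng = np.random.default_rng(0)
stack = rng.random((23, 4, 4))
samples = np.zeros((4, 4), dtype=np.uint8)
samples[0, 0], samples[1, 2], samples[3, 3] = 1, 2, 1

X, y = build_training_matrix(stack, samples)
print(X.shape, y.tolist())  # → (3, 23) [1, 2, 1]
```

The resulting `X` and `y` can be passed straight to the `fit()` method of any of the ensemble classifiers imported above.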

Read More

Extracting the central strip from LANDSAT 7 imagery

Here is a simple Python script to extract the central strip from Landsat 7 (SLC-off) imagery, i.e. the part that is not affected by the SLC failure. The algorithm shrinks the striped zones with a morphological filter (erosion) and creates a new AOI shapefile that extracts the desired raster extent without striping effects. The code is based on Python for ArcGIS (arcpy), so you need an ArcGIS license.

General steps:

- Loop through all Landsat 7 data folders
- Stack the bands for each image
- Create a mask
- Erode the mask by 20 pixels
- Convert the mask to a polygon
- Create a minimum bounding box
- Clip the original raster with the bbox

```python
import arcpy
from arcpy.sa import *
import sys, os

# Environment settings (activate Spatial Analyst, allow overwriting outputs, use LZW TIFF compression)
arcpy.CheckOutExtension("spatial")
arcpy.env.overwriteOutput = True
arcpy.env.compression = 'LZW'

# this is your main directory with all unzipped Landsat datasets
rootdir = "D:\\DATA\\Landsat7\\"
# create scratch folder "temp"
temp = "D:\\DATA\\temp\\"

# loop through directory with all unzipped Landsat 7 folders
for files in os.listdir(rootdir):
    files = os.path.join(rootdir, files)
    # for each loop the subdir "files" is now the current workspace
    # (e.g. LE71520322015157-SC20160224113319) that contains the Landsat bands
    arcpy.env.workspace = files
    rasters = arcpy.ListRasters("*", "TIF")
    # create empty list
    stack_liste = []
...
```
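The "erode the mask by 20 pixels" step does not strictly require arcpy. A minimal numpy sketch of binary erosion with a 3x3 cross-shaped structuring element (one iteration shrinks the mask edge by one pixel, so 20 iterations correspond roughly to the 20-pixel buffer; the toy 6x6 mask is an assumption for illustration):

```python
import numpy as np

def erode(mask, iterations=1):
    """Binary erosion with a 3x3 cross-shaped structuring element:
    a pixel stays True only if it and its 4 direct neighbours are True."""
    m = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(m, 1, constant_values=False)  # pad so borders erode too
        m = (p[1:-1, 1:-1] & p[:-2, 1:-1] & p[2:, 1:-1]
             & p[1:-1, :-2] & p[1:-1, 2:])
    return m

# 6x6 mask of all-valid pixels; two erosions leave only the 2x2 core
mask = np.ones((6, 6), dtype=bool)
print(erode(mask, iterations=2).sum())  # → 4
```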

Read More

Managing raster data with PostGIS and Python

PostGIS is the spatial extension of the open-source database management system PostgreSQL. It helps you manage your data (vector and raster) within a coherent geodatabase through a variety of spatial functions. With a spatial database, the times of data clutter and messiness are over, especially when you are dealing with big data. Initially, PostGIS was created for the handling of vector data only; in recent years, however, more and more raster handling functionality has been introduced. For a complete overview of spatial raster operators, have a look at:

Download and install PostgreSQL and PostGIS

Download PostgreSQL from here: The installer for PostgreSQL includes the PostgreSQL server; pgAdmin III, a GUI for managing and developing your databases; and StackBuilder, a package manager that can be used to download and install additional PostgreSQL applications and drivers. From the StackBuilder GUI, select Spatial Extensions and install the respective PostGIS 2.2 bundle for PostgreSQL.

Create a new spatial database

In pgAdmin, create a new database (right-click: New Database) and add the spatial extension postgis (right-click on Extensions: New Extension). This will create a first table within your DB, spatial_ref_sys, which contains the coordinate systems, map projections and spatial indices.

Set up Python

Python provides awesome functionality for automated raster processing within PostGIS. Automation is necessary especially when you deal with a lot of data and iterative processes. Python scripting is also needed as the...
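Rasters are typically loaded into PostGIS with the raster2pgsql command-line loader, and from Python one usually just assembles and runs that command. A sketch that builds such a command as an argument list (the file and table names are made-up examples; `-s` sets the SRID, `-I` builds a GiST index, `-C` applies raster constraints, `-t` tiles the raster):

```python
def raster2pgsql_command(tif_path, table, srid=4326, tile="100x100"):
    """Assemble a raster2pgsql import command as an argument list,
    ready to be handed to subprocess and piped into psql.
    Flags: -s SRID, -I spatial index, -C constraints, -t tile size."""
    return ["raster2pgsql", "-s", str(srid), "-I", "-C", "-t", tile,
            tif_path, table]

# hypothetical raster and target table for illustration
cmd = raster2pgsql_command("C:\\Data\\modis_ndvi.tif", "public.modis_ndvi")
print(" ".join(cmd))
# → raster2pgsql -s 4326 -I -C -t 100x100 C:\Data\modis_ndvi.tif public.modis_ndvi
```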

Read More

News blog by the Remote Sensing Department
at the University of Würzburg
Institute of Geography and Geology
Oswald-Külpe-Weg 86
97074 Würzburg

University Webpage
