
Spatiotemporal Analysis: Red is Fled, Blue is New!

Zachary Norman

A great way to extract additional information from imagery is to add changes over time to your analysis or workflow, and that is the focus of this blog. Spatiotemporal analysis has many useful applications: one example is determining when it is time to harvest a crop; another is detecting where objects have appeared in, or disappeared from, a series of images. In this post, I'll outline the workflow I created to detect where airplanes appeared or disappeared at an airport.

The data I had available was five WorldView-2 images over an airport in Rio de Janeiro. Here is a context map showing where the images are located:

 

Below is an animation showing each image in the data series. Note that the buildings on the left side of the image appear to move because the orientation of the satellite changes between acquisitions. This introduces some false positives in the change detection workflow, which can be seen in the results.
To perform the change detection on these images, I used a pixel-based approach that is very similar to the Image Change Workflow, but written with the ENVI API and IDL. I used the API because the analysis involves many steps, and it is much easier to script a workflow in IDL than to run each of the separate tools in the ENVI Workbench on many images. Here is the approach I took to perform the analysis.

 

1) Open the Time 1 and Time 2 images for preprocessing using the following tasks:

RadiometricCalibration (for Top-of-Atmosphere Reflectance)
NNDiffusePanSharpening
RPCOrthorectification
SubsetRaster (with an ROI)
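In the ENVI API, each of these steps maps to an ENVITask call. Here is a minimal sketch of that pattern for the first two tasks; the file names are placeholders, and the parameter names are my reading of the task interfaces, so check them against the task documentation for your ENVI version:

```idl
; Launch ENVI (headless, since no display is needed) and open the
; multispectral and panchromatic rasters (file names are placeholders)
e = ENVI(/HEADLESS)
msRaster = e.OpenRaster('time1_multispectral.dat')
panRaster = e.OpenRaster('time1_pan.dat')

; Calibrate to top-of-atmosphere reflectance
calTask = ENVITask('RadiometricCalibration')
calTask.INPUT_RASTER = msRaster
calTask.CALIBRATION_METHOD = 'Top-of-Atmosphere Reflectance'
calTask.Execute

; Pan sharpen the calibrated image with the higher-resolution pan band
panTask = ENVITask('NNDiffusePanSharpening')
panTask.INPUT_LOW_RESOLUTION_RASTER = calTask.OUTPUT_RASTER
panTask.INPUT_HIGH_RESOLUTION_RASTER = panRaster
panTask.Execute
```

The RPCOrthorectification and SubsetRaster steps follow the same get-task, set-parameters, Execute pattern.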


2) Register the two images together with the tasks:

GenerateTiePointsByCrossCorrelation
FilterTiePointsByGlobalTransform
ImageToImageRegistration
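The registration chain follows the same task pattern: generate tie points between the two rasters, filter out the bad ones, then warp Time 2 to line up with Time 1. A sketch (the parameter names here are assumptions and should be verified against each task's documentation):

```idl
; Generate tie points between Time 1 (reference) and Time 2
tiePointTask = ENVITask('GenerateTiePointsByCrossCorrelation')
tiePointTask.INPUT_RASTER1 = time1Raster
tiePointTask.INPUT_RASTER2 = time2Raster
tiePointTask.Execute

; Discard tie points that disagree with a global transform
filterTask = ENVITask('FilterTiePointsByGlobalTransform')
filterTask.INPUT_TIEPOINTS = tiePointTask.OUTPUT_TIEPOINTS
filterTask.Execute

; Warp Time 2 so it lines up with Time 1
registerTask = ENVITask('ImageToImageRegistration')
registerTask.INPUT_TIEPOINTS = filterTask.OUTPUT_TIEPOINTS
registerTask.Execute
time2Registered = registerTask.OUTPUT_RASTER
```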


3) Find the intersection of the rasters

Steps 1 and 2 above can produce slight differences in the image dimensions for Time 1 and Time 2. Although the difference is small, the pixel-based change detection cannot run unless the images have exactly the same dimensions. To find the intersection of the two rasters and regrid each one to the same dimensions, I followed the example outlined here, which uses the Intersection method on ENVIGridDefinition objects.
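The idea in code is to build an ENVIGridDefinition for each raster, intersect them, and regrid both rasters to the common grid. A hedged sketch: the coordinate system, extents, and pixel sizes would come from each raster's metadata, and the RegridRaster task name and parameters are my assumptions based on the documented grid-definition example:

```idl
; Build a grid definition for each raster
; (coordSys, extent1/2, and pixelSize1/2 come from the rasters' metadata)
grid1 = ENVIGridDefinition(coordSys, EXTENT=extent1, PIXEL_SIZE=pixelSize1)
grid2 = ENVIGridDefinition(coordSys, EXTENT=extent2, PIXEL_SIZE=pixelSize2)

; Intersect the two grids to get the common overlapping grid
commonGrid = grid1.Intersection(grid2)

; Regrid each raster to the common grid so the dimensions match
regridTask = ENVITask('RegridRaster')
regridTask.INPUT_RASTER = time1Raster
regridTask.GRID_DEFINITION = commonGrid
regridTask.Execute
time1Common = regridTask.OUTPUT_RASTER
```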


4) Perform the pixel-based change detection with the following tasks (taken from the Image Change Workflow):

RadiometricNormalization (Time 2 normalized to Time 1)
ImageBandDifference
AutoChangeThresholdClassification (Kapur threshold method)
ClassificationSmoothing
ClassificationAggregation
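Chained together with the ENVI API, the change detection portion looks roughly like this. Again, this is a sketch: the parameter names are assumptions that should be checked against each task's documentation.

```idl
; Normalize Time 2 to Time 1 so radiometric differences between
; acquisitions are not mistaken for real change
normTask = ENVITask('RadiometricNormalization')
normTask.INPUT_RASTER1 = time1Common
normTask.INPUT_RASTER2 = time2Common
normTask.Execute

; Difference the bands of the two images
diffTask = ENVITask('ImageBandDifference')
diffTask.INPUT_RASTER1 = time1Common
diffTask.INPUT_RASTER2 = normTask.OUTPUT_RASTER
diffTask.Execute

; Threshold the difference image into decrease/increase classes
threshTask = ENVITask('AutoChangeThresholdClassification')
threshTask.INPUT_RASTER = diffTask.OUTPUT_RASTER
threshTask.THRESHOLD_METHOD = 'Kapur'
threshTask.Execute

; Clean up the classification with smoothing and aggregation
smoothTask = ENVITask('ClassificationSmoothing')
smoothTask.INPUT_RASTER = threshTask.OUTPUT_RASTER
smoothTask.Execute

aggTask = ENVITask('ClassificationAggregation')
aggTask.INPUT_RASTER = smoothTask.OUTPUT_RASTER
aggTask.Execute
changeRaster = aggTask.OUTPUT_RASTER
```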


After applying these steps to each pair of images, I produced four change detection images, and the results are shown below. The red pixels correspond to decreases in pixel value, and the blue pixels to increases. An easy way to remember this is "red is fled, blue is new." Note the false positives around the edges of the images due to the differences in the satellite's orientation. Apart from this, the change detection does a very good job of finding where planes have moved.