In this tutorial, you will use the Classification workflow to categorize pixels in an image into many classes. In the first part of the tutorial, you will perform an unsupervised classification. Unsupervised classification clusters pixels in a dataset based on statistics only and does not use defined training classes.

In the second part of the tutorial, you will create training data interactively in the dataset and use it to perform a supervised classification. Supervised classification clusters pixels in a dataset into classes based on training data that you define. Then you can select the classes that you want mapped in the output.

References

Mahalanobis, Maximum Likelihood, Minimum Distance:

Richards, J. A., 1999. Remote Sensing Digital Image Analysis, Springer-Verlag, Berlin, p. 240.

Spectral Angle Mapper:

Kruse, F. A., A. B. Lefkoff, J. W. Boardman, K. B. Heidebrecht, A. T. Shapiro, P. J. Barloon, and A. F. H. Goetz, 1993. "The Spectral Image Processing System (SIPS) - Interactive Visualization and Analysis of Imaging Spectrometer Data." Remote Sensing of Environment, v. 44, p. 145-163.

ISODATA:

Tou, J. T. and R. C. Gonzalez, 1974. Pattern Recognition Principles, Addison-Wesley Publishing Company, Reading, Massachusetts.

Files Used in this Tutorial


Tutorial files are available from our ENVI Tutorials web page. Click the Classification link to download the .zip file to your machine, then unzip the files. You will use this file in the tutorial:

File              Description
Phoenix_AZ.tif    QuickBird image over Phoenix, Arizona

Performing Unsupervised Classification


The ISODATA method for unsupervised classification starts by calculating class means evenly distributed in the data space, then iteratively clusters the remaining pixels using minimum distance techniques. Each iteration recalculates means and reclassifies pixels with respect to the new means. This process continues until the percentage of pixels that change classes during an iteration is less than the change threshold or the maximum number of iterations is reached.
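
The following Python/NumPy sketch shows the general shape of an ISODATA-style iteration: initial means spread across the data range, minimum-distance assignment, mean recalculation, and a change-threshold stopping test. It is an illustration of the idea only, not ENVI's implementation, and the function and parameter names are invented for this example.

    import numpy as np

    def isodata_like(pixels, n_classes=7, max_iterations=20, change_threshold=0.05):
        # pixels: (n_pixels, n_bands) array of band values.
        # Returns one integer class label per pixel.

        # Start with class means evenly distributed across the data range.
        lo, hi = pixels.min(axis=0), pixels.max(axis=0)
        means = np.linspace(lo, hi, n_classes)          # shape (n_classes, n_bands)

        labels = np.zeros(len(pixels), dtype=int)
        for _ in range(max_iterations):
            # Assign each pixel to the nearest mean (minimum Euclidean distance).
            dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
            new_labels = dists.argmin(axis=1)

            # Stop when the fraction of pixels changing class drops below the threshold.
            changed = np.mean(new_labels != labels)
            labels = new_labels
            if changed < change_threshold:
                break

            # Recalculate each class mean from its current members.
            for k in range(n_classes):
                members = pixels[labels == k]
                if len(members) > 0:
                    means[k] = members.mean(axis=0)

        return labels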

  1. Start ENVI.
  2. From the Toolbox, select Classification > Classification Workflow. The File Selection panel appears.
  3. Click Browse. The Data Selection dialog appears.
  4. Click Open File. The Open dialog appears.
  5. Navigate to classification, select Phoenix_AZ.tif, and click Open. This is a QuickBird true-color image.
  6. Click Next in the File Selection panel. The Classification Type panel appears.
  7. Select No Training Data, which will guide you through the unsupervised classification workflow steps.
  8. Click Next. The Unsupervised Classification panel appears.
  9. Enter 7 as the Requested Number of Classes to define. You do not need to change any settings on the Advanced tab, so click Next to begin classification.

    When classification is complete, the classified image loads in the view and the Cleanup panel appears.

    The following is a sample of the unsupervised classification results from part of the image. Your results may be slightly different. Notice the amount of speckling that occurs within the residential areas:

  10. Cleanup is an optional step, but you will use it in this exercise to determine if the classification output improves. The cleanup options are smoothing, which removes speckling, and aggregation, which removes small regions. In the Cleanup panel, keep the default settings.
  11. Enable the Preview option. A Preview Window opens, showing you what the classification cleanup will look like with the current settings. Click on the Preview Window using the Selection tool (the arrow icon in the main toolbar), and drag it around the image to see how areas will be affected by the cleanup step.

    The image below shows that the classification will benefit from using the Cleanup step. You can see that much of the speckling noise has been replaced with smoother regions.

  12. Click Next. The Export panel appears.
  13. Enable only the Export Classification Image check box. Use the default output image type of ENVI, and enter a path and filename for the classification image.
  14. Click Finish.

Next, you will perform supervised classification on the same image. To prepare, do the following:

  1. Select File > Data Manager. The Data Manager opens.
  2. Select the classification file that you just created, and click the Close button. Leave the Data Manager and the file Phoenix_AZ.TIF open.

Performing Supervised Classification


Supervised classification methods include Maximum Likelihood, Minimum Distance, Mahalanobis Distance, and Spectral Angle Mapper (SAM). In this tutorial, you will use SAM. The SAM method is a spectral classification technique that uses an n-D angle to match pixels to training data. It determines the spectral similarity between two spectra by calculating the angle between them, treating both as vectors in a space with dimensionality equal to the number of bands. Smaller angles represent closer matches to the reference spectrum, and each pixel is assigned to the class with the smallest angle. When used with calibrated reflectance data, the SAM method is relatively insensitive to illumination and albedo effects.
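
To make the angle calculation concrete, here is a minimal Python/NumPy sketch of a SAM-style assignment. It is illustrative only, not ENVI's implementation; the function names are invented, and the optional max_angle argument plays a role similar to the Maximum Spectral Angle setting you will see later in the workflow.

    import numpy as np

    def spectral_angle(spectrum, reference):
        # Angle (in radians) between two spectra treated as n-band vectors.
        cos_theta = np.dot(spectrum, reference) / (
            np.linalg.norm(spectrum) * np.linalg.norm(reference))
        return np.arccos(np.clip(cos_theta, -1.0, 1.0))

    def sam_classify(pixels, class_spectra, max_angle=None):
        # pixels: (n_pixels, n_bands); class_spectra: (n_classes, n_bands)
        # mean spectra derived from the training regions.
        angles = np.array([[spectral_angle(p, s) for s in class_spectra]
                           for p in pixels])
        labels = angles.argmin(axis=1)          # smallest angle = closest match
        if max_angle is not None:
            # Pixels whose best match is still too far away are left unclassified.
            labels[angles.min(axis=1) > max_angle] = -1
        return labels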

  1. In the Data Manager, click the Phoenix_AZ.TIF file. If the file is open in ENVI, you can drag its filename from the Layer Manager to Classification Workflow in the Toolbox, or you can double-click the Classification Workflow tool to start it. The File Selection panel appears, with Phoenix_AZ.TIF as the raster input file.
  2. Click Next in the File Selection panel to proceed. The Classification Type panel appears.
  3. Select Use Training Data, which will guide you through the supervised classification workflow steps.
  4. Click Next. The Supervised Classification panel appears.
  5. Select the Algorithm tab, then select Spectral Angle Mapper from the drop-down list. Keep the default Maximum Spectral Angle setting of None.
  6. You can define training data from an existing ROI file, but for this exercise you will interactively create your own ROIs.

Interactively Defining Training Data

You will define multiple classes, each with at least one training region. A minimum of two classes, with at least one region per class, is required to perform supervised classification.

  1. In the Supervised Classification panel, click the Properties tab and change the Class Name field from Class #1 to Undeveloped. Leave the Class Color as red.
  2. Locate different areas in the image that are undeveloped. They should not contain buildings or grass, and they should not be roads. Draw polygons inside three of these areas. To draw a polygon, click in an undeveloped area and hold down the mouse button while drawing, or click the mouse at various points to mark vertices. When you return to the starting point of the polygon, double-click to accept it. The ROI is added to the Undeveloped layer in the Layer Manager, under the Regions of Interest tree.

  3. The following is an example of one polygon.

  4. Click the Add Class button to create a second class.
  5. Change the Class Name from Class #2 to Vegetation. Leave the Class Color as green.
  6. Locate different areas in the image that display healthy vegetation such as golf courses, trees, lawns, etc. Draw polygons inside three of these areas. The following zoomed-in image shows an example.

  7. Click the Add Class button to create a third class.
  8. Change the Class Name from Class #3 to Buildings. Leave the Class Color as blue.
  9. Locate different areas in the image that have rooftops. Draw polygons inside three of these areas, preferably rooftops with different brightness levels. The following zoomed-in image shows an example.

  10. Next you will preview the classification results, based on the training data you provided.

Previewing the Classification

  1. Enable the Preview option to open a Preview Window that shows the classification result based on the training data you created. The following figure shows an example.

    The Preview Window shows that roads are being classified as buildings, so you will need to add a fourth class for roads.

  2. Disable the Preview option.
  3. Click the Add Class button.
  4. Change the Class Name from Class #4 to Roads. Leave the Class Color as yellow.
  5. Draw polygons within three different road types, including a freeway. You may need to use the Zoom tool in the main toolbar to zoom in enough to draw a polygon inside a road.
  6. Enable Preview again.
  7. The Roads training region seemed to do a good job of classifying the roads, but it also reclassified some rooftops that were a shade of gray similar to the highway. The following image shows an example.

    Next, you will delete the Roads class, rename the Buildings class to Developed, and add three road training regions to it.

  8. Right-click on the Roads class in the Training Data tree, and select Delete Class. The view in the Preview Window updates with the change.
  9. Select the Buildings class, and change its Class Name to Developed.
  10. Draw polygons within three road sections, being sure to mark at least one section of a highway.

The Preview Window should show that roads and buildings are part of the new Developed class.

Comparing Methods

With the Preview option enabled, try each of the classification methods under the Algorithm tab. For more detailed information on each method, see the references at the beginning of this tutorial. Here is a brief summary:

Maximum Likelihood assumes that the statistics for each class in each band are normally distributed and calculates the probability that a given pixel belongs to a specific class. Each pixel is assigned to the class that has the highest probability (that is, the maximum likelihood).

Minimum Distance uses the mean vectors for each class and calculates the Euclidean distance from each unknown pixel to the mean vector for each class. The pixels are classified to the nearest class.

Mahalanobis Distance is a direction-sensitive distance classifier that uses statistics for each class. It is similar to the maximum likelihood classification, but assumes all class covariances are equal, and therefore is a faster method. All pixels are classified to the closest training data.

Spectral Angle Mapper (SAM), described earlier in this tutorial, treats each spectrum as a vector and assigns each pixel to the class whose training spectrum is separated from it by the smallest spectral angle.
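
The sketch below, in Python/NumPy, shows the per-pixel decision rules these summaries describe, assuming that a mean vector and covariance matrix have been estimated for each class from its training regions. It illustrates the general rules only, not ENVI's implementation, and the names are invented for this example.

    import numpy as np

    def classify_pixel(x, class_stats, method):
        # x: one pixel spectrum, shape (n_bands,).
        # class_stats: list of (mean, covariance) pairs, one per training class.
        # Returns the index of the winning class; a smaller score is a better match.
        scores = []
        for mean, cov in class_stats:
            d = x - mean
            if method == "minimum_distance":
                # Euclidean distance to the class mean.
                scores.append(np.linalg.norm(d))
            elif method == "mahalanobis":
                # Covariance-weighted, direction-sensitive distance. In the
                # classifier described above, one covariance is assumed to
                # apply to all classes.
                scores.append(d @ np.linalg.inv(cov) @ d)
            elif method == "maximum_likelihood":
                # Negative Gaussian log-likelihood (up to a constant); the class
                # with the smallest value has the highest probability.
                scores.append(np.log(np.linalg.det(cov)) + d @ np.linalg.inv(cov) @ d)
        return int(np.argmin(scores))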

It appears that either Maximum Likelihood or Spectral Angle Mapper will provide the best classification results for this image. For this exercise, keep Spectral Angle Mapper as the algorithm and click Next.

Cleaning Up Supervised Classification Results

When supervised classification is complete, the classified image loads in the Image window, and the Cleanup panel appears. Cleanup is an optional step, but you will use it in this exercise to determine if the classification output improves. The cleanup options are smoothing, which removes speckling, and aggregation, which removes small regions.
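
To show what these two options do conceptually, here is a minimal Python sketch (using NumPy and SciPy) of a cleanup pass of this kind: a majority filter for smoothing and small-region reassignment for aggregation. The kernel size and minimum region size are made-up parameters, and ENVI's actual cleanup algorithms and defaults may differ.

    import numpy as np
    from scipy import ndimage

    def majority(values):
        # Most common class value in a set of pixels.
        vals, counts = np.unique(values, return_counts=True)
        return vals[counts.argmax()]

    def cleanup(class_image, smooth_size=3, min_region_size=9):
        # class_image: 2-D array of integer class labels.

        # Smoothing: a majority (mode) filter that removes isolated speckles.
        smoothed = ndimage.generic_filter(class_image, majority, size=smooth_size)

        # Aggregation: reassign connected regions smaller than min_region_size
        # pixels to the most common class along their border.
        cleaned = smoothed.copy()
        for k in np.unique(smoothed):
            regions, n = ndimage.label(smoothed == k)
            for region_id in range(1, n + 1):
                region = regions == region_id
                if region.sum() < min_region_size:
                    border = ndimage.binary_dilation(region) & ~region
                    if border.any():
                        cleaned[region] = majority(smoothed[border])
        return cleaned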

  1. In the Cleanup panel, disable the Enable Smoothing option. Keep Enable Aggregation selected, with its default setting.
  2. The Preview Window should still be open, showing you what the classification cleanup will look like with the current settings. Click on the Preview Window, and drag it around the image to see how areas will be affected by the cleanup step.

  3. Click Next. When the classification process is finished, the Export panel appears.

Exporting Classification Results


In the Export panel, you can save the classification results to an image, the class polygons to a shapefile, and statistics to a text file.

To export results:

  1. Under the Export Files tab, enable the Export Classification Image option and keep ENVI as the output image type. Enter a valid path and filename for the classification image.
  2. Enable the Export Classification Vectors option and keep Shapefile as the output vector file type. Enter a valid path and filename for the shapefile.
  3. Under the Additional Export tab, enable the Export Classification Statistics option. Enter a valid path and filename for the statistics text file.
  4. Click Finish. ENVI creates the output, opens the classification and vector layers in the Image window, and saves the files to the directory you specified. You can view the statistics by opening the file in a text editor.

  5. Select File > Exit to close ENVI.