Supplying a proper data scaling factor in ENVI Maximum Likelihood Classification
ENVI's implementation of the Maximum Likelihood Classification algorithm asks the user to supply a data scaling factor prior to performing the analysis. This article explains how to choose the correct value for your data and why supplying the factor is necessary.
The scale factor is a division factor used to convert integer scaled reflectance or radiance data into floating point values. For example, for reflectance data scaled into the range of zero to 10,000, set the scale factor to 10,000. For uncalibrated integer data, set the scale factor to the maximum value the instrument can measure (2^n - 1, where n is the bit depth of the instrument).
For example:
- 8-bit instruments (such as Landsat 4): set the scale factor to 255
- 10-bit instruments (such as NOAA 12 AVHRR): set the scale factor to 1023
- 11-bit instruments (such as IKONOS): set the scale factor to 2047
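The relationship between bit depth, scale factor, and the resulting 0.0 to 1.0 range can be illustrated with a short Python/NumPy sketch. This is not part of ENVI; the function and variable names here are purely illustrative.

```python
import numpy as np

def scale_factor_for_bit_depth(bit_depth):
    """Return the scale factor 2**n - 1 for an n-bit instrument."""
    return 2 ** bit_depth - 1

# Values matching the examples above
print(scale_factor_for_bit_depth(8))   # 255  (e.g., Landsat 4)
print(scale_factor_for_bit_depth(10))  # 1023 (e.g., NOAA 12 AVHRR)
print(scale_factor_for_bit_depth(11))  # 2047 (e.g., IKONOS)

# Dividing integer digital numbers by the scale factor yields floats in 0.0-1.0
dn = np.array([0, 128, 255], dtype=np.uint8)  # hypothetical 8-bit data
scaled = dn.astype(np.float32) / scale_factor_for_bit_depth(8)
print(scaled)  # [0.0, 0.50196, 1.0]
```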
If a value is not supplied by the user in the GUI, ENVI will automatically choose a reasonable value based on the selected input data.
Using the proper scaling factor ensures that all data values fall within the floating point range of 0.0 to 1.0, essentially normalizing them. This is necessary because the Maximum Likelihood Classification algorithm compares each pixel against the probability distributions derived from the training data for each possible class, and assigns the pixel to the class with the highest probability match (maximum likelihood). These probability distributions are defined over floating point values between 0.0 and 1.0, so the algorithm requires input data within a similar range.
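To make the idea concrete, the following is a minimal, hypothetical sketch of a Gaussian maximum likelihood classifier in Python/NumPy. It is not ENVI's implementation; the class names, statistics, and pixel values are invented for illustration, and the training statistics assume the data have already been scaled into 0.0 to 1.0 with the proper factor.

```python
import numpy as np

def gaussian_log_likelihood(x, mean, cov):
    """Log of the multivariate normal density of pixel vector x for one class."""
    diff = x - mean
    inv_cov = np.linalg.inv(cov)
    _, log_det = np.linalg.slogdet(cov)
    k = x.shape[0]
    return -0.5 * (k * np.log(2 * np.pi) + log_det + diff @ inv_cov @ diff)

def classify_pixel(x, class_stats):
    """Assign x to the class whose training statistics give the highest likelihood."""
    scores = {name: gaussian_log_likelihood(x, m, c)
              for name, (m, c) in class_stats.items()}
    return max(scores, key=scores.get)

# Hypothetical per-class training statistics (mean vector, covariance matrix)
# for two bands of reflectance data scaled into 0.0-1.0
class_stats = {
    "water":      (np.array([0.05, 0.10]), np.array([[0.0004, 0.0], [0.0, 0.0004]])),
    "vegetation": (np.array([0.10, 0.45]), np.array([[0.0009, 0.0], [0.0, 0.0025]])),
}

pixel = np.array([0.08, 0.40])             # a pixel scaled with the proper factor
print(classify_pixel(pixel, class_stats))  # -> "vegetation"
```

If the pixel were left in raw integer units (for example, digital numbers in the hundreds or thousands) while the class statistics were computed on 0.0 to 1.0 data, the likelihoods would be meaningless, which is why the scale factor matters.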
___________________________________________
Reviewed by BC on 09/17/2014