Hi All,
I have a question regarding the continuum removal procedure implemented in ENVI. From the references (in particular Clark et al. 1987, "Automatic continuum analysis of reflectance spectra"), it sounds like continuum removal is performed by first detecting a set of local maxima within the spectrum, and then defining the continuum as a set of straight-line segments connecting those maxima. However, for spectra that are mostly concave (for instance, a bowl-shaped spectrum whose most significant maxima sit at the first and last spectral bands), the procedure seems to simply define the continuum as a straight line between the first and last band positions, effectively treating the entire spectrum as one big "absorption region" (forgive the terminology, I'm lacking a better description here : ) ).
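For what it's worth, the behavior I'm seeing is exactly what you'd get if the continuum were taken as the upper convex hull of the spectrum rather than a tunable local-maxima search. Below is a minimal sketch of hull-based continuum removal (my own illustration, not ENVI's actual code; the wavelength values are made up) showing how a bowl-shaped spectrum collapses to an endpoint-to-endpoint continuum:

```python
import numpy as np

def _cross(o, a, b):
    """2-D cross product of (a - o) and (b - o); positive means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by the straight-line segments of its upper convex
    hull -- a common continuum model, not necessarily ENVI's exact algorithm."""
    pts = list(zip(wavelengths, reflectance))
    hull = [pts[0]]
    for p in pts[1:]:
        # Pop vertices that would make the boundary bend upward (left turn),
        # so only the upper boundary of the point set survives.
        while len(hull) >= 2 and _cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wavelengths, hx, hy)
    return reflectance / continuum

# A bowl-shaped spectrum with its maxima at the first and last bands
# (hypothetical wavelengths in micrometers):
wl = np.array([0.4, 0.5, 0.6, 0.7, 0.8])
refl = np.array([1.0, 0.5, 0.3, 0.5, 1.0])
cr = continuum_removed(wl, refl)
# The hull reduces to the two endpoints, so the continuum is a straight
# line between them and the whole spectrum becomes one "absorption region".
print(cr)
```

If ENVI does something like this, then there's no sensitivity knob to turn: any interior local maximum that lies below the hull line simply never becomes a continuum tie point.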
So, my question is: is there any way to adjust the sensitivity of the local maxima detection procedure in ENVI, or is some other algorithm being used that I'm not aware of?
Thanks in advance for the replies,
-Brian
p.s. See posted images for some examples.