bytscl function in band math

anon
01 Sep 2011 11:19 AM
Hi all, I'm trying to compare the depths of a particular absorption feature derived from a series of different hyperspectral images. I used the BYTSCL function and set as "minimum" the lowest value from all the images (although this value is a lot lower than the lowest in some images), and as "maximum" the highest value from all the images (although it is a lot higher than the highest in some). I also stretched the images using their own min and max values (derived through statistics computation). Visually, the two results looked the same, following the same pattern, but they differed in values. I am not entirely sure whether I can compare the absolute numbers between all the images (which is ultimately my goal). Could anyone please tell me a bit more about the function, and hopefully comment on which method is best for comparing pixel values across different images? Thanks in advance, Mila
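P.S. In IDL terms, what I did is roughly the following (the array name and the limit values here are only placeholders):

    ; common limits taken from the whole image series (placeholder numbers)
    series_min = 0.02
    series_max = 0.35

    ; BYTSCL clips anything below MIN to 0 and anything above MAX to 255,
    ; and maps the values in between linearly onto 0..255
    scaled = BYTSCL(band_depth_image, MIN=series_min, MAX=series_max)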

Deleted User
01 Sep 2011 03:09 PM
Hi Mila, I don't follow why you are using BYTSCL, exactly. It seems to me that if you have a series of hyperspectral images, and they have all been accurately atmospherically corrected, then you would not need to additionally scale the values in order to calculate the depth of the absorption feature in the different images. A simple way to do it would be to pick a nearby wavelength that sits at the top (shoulder) of the feature and the wavelength at the bottom of the feature, and subtract the value at the bottom from the value at the top, for each image. Is there something that you are trying to account for by doing the byte scaling? - Peg
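P.S. For instance, with b1 assigned to the band at the top (shoulder) of the feature and b2 to the band at its bottom, the Band Math expression could be as simple as:

    float(b1) - float(b2)

(the float() calls are just to avoid integer truncation in case the input data are stored as integers).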

Deleted User
01 Sep 2011 11:58 PM
Hi Peg, Thank you very much for your reply. Actually, calculating the feature depth is not the problem. I have done that, and created artificial images whose pixel values contain only the band depth. I applied BYTSCL in order to compare them on the same scale (intending a linear data normalization). What I am not entirely certain of is whether, by doing that, I can only compare pixels within the same image, or whether I can compare across all the images (by setting the same min and max for all). Thanks again, any thoughts are much appreciated! rgds, Mila
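P.S. To put it in code, I produced two versions of each band-depth image, roughly like this (the variable names are made up):

    ; version 1: one fixed mapping for the whole series
    fixed_a = BYTSCL(depth_a, MIN=series_min, MAX=series_max)
    fixed_b = BYTSCL(depth_b, MIN=series_min, MAX=series_max)

    ; version 2: each image stretched with its own statistics
    own_a = BYTSCL(depth_a, MIN=MIN(depth_a), MAX=MAX(depth_a))
    own_b = BYTSCL(depth_b, MIN=MIN(depth_b), MAX=MAX(depth_b))

My question is whether the pixel values of fixed_a and fixed_b can be compared directly between images, which the second version clearly does not allow.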

Deleted User
02 Sep 2011 10:56 AM
OK, so then I wonder how your data ended up on different scales for the feature depth. Have they been calibrated and corrected to reflectance? If so, then, theoretically, the depths you've calculated should be directly comparable. But various things can complicate this. So, are your spectra from different sensors? Are they measuring different samples of the same material, but not the exact same sample? Any additional information about why you don't think they should be directly compared would be helpful. - Peg