Hi all--
I have 250 MODIS scenes in the WGS84 geographic coordinate system.  Using IDL, I need to determine, in square kilometers, the area occupied by pixels that meet a given condition.  Accuracy is very important, so reprojection isn't an option: the scenes are about 10 degrees wide--much wider than a UTM zone.
I have an idea for a workaround.  Since all of the pixels are "square" in terms of their decimal-degree dimensions, I could write an ArcPython script to compute the exact area of every pixel in a single column at the center of a UTM zone.  Since the area occupied by a pixel is the same across a row (i.e., all pixels at the same latitude), I can extrapolate that column's values across each row.  I'll then export this as a raster with the same dimensions as my data, where each pixel's value is the area that pixel occupies.  Then I can do something like:
TotalArea = TOTAL(AreaRaster[WHERE(DataRaster EQ Conditions)])
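In case it's useful as a sanity check: on a sphere, the area of a lat/lon cell has a closed form that depends only on latitude, so you can build the area raster analytically without the ArcPython step and compare it against your UTM-column numbers. This is just a sketch under my own assumptions -- a spherical Earth of radius `R_KM`, a hypothetical 0.01-degree grid, and a made-up `data > 0.5` condition standing in for your real one:

```python
import numpy as np

# Assumption: spherical Earth (mean radius in km). For an ellipsoid the
# per-row areas differ slightly, but the row-constant property still holds.
R_KM = 6371.0088

def cell_areas_km2(lat_edges_deg, dlon_deg):
    """Area (km^2) of one grid cell in each latitude row.

    On a sphere, a cell spanning [lat1, lat2] x dlon has area
        A = R^2 * dlon_rad * |sin(lat2) - sin(lat1)|,
    independent of longitude -- which is why one value per row suffices.
    """
    lat = np.radians(np.asarray(lat_edges_deg))
    dlon = np.radians(dlon_deg)
    return R_KM**2 * dlon * np.abs(np.diff(np.sin(lat)))

# Hypothetical scene: 0.01-degree pixels from 50N down to 40N, 1000 columns.
edges = np.arange(50.0, 39.995, -0.01)       # row edges, north to south
row_area = cell_areas_km2(edges, 0.01)       # one area value per row
area_raster = np.tile(row_area[:, None], (1, 1000))  # replicate across columns

# The masked sum -- the numpy equivalent of TOTAL(...[WHERE(...)]):
data = np.random.rand(area_raster.shape[0], 1000)    # stand-in for DataRaster
total_area = area_raster[data > 0.5].sum()
```

The key design point is that the formula is exact on the sphere, so any disagreement with your UTM-derived column tells you how much the spherical approximation (or the UTM projection distortion) is costing you.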
Does this make sense?  Are there any flaws in this admittedly odd method?  Here's a diagram that explains my thinking:
[diagram attachment not preserved]