Last Post 23 Mar 2014 08:30 AM by  anon
Handling big data cubes
 2 Replies

anon (New Member) - 23 Mar 2014 08:30 AM
Hi, I have ~5000 1024x1024 image arrays I'd like to hold in a data cube. A good fraction of the pixels (maybe 30%?) are data I don't care about. I'm then going to extract various statistics that drill down the longest dimension of the cube (i.e. the temporal direction). What's the most memory-efficient way to handle this? Is there a trick with sparse arrays? More generally: how do other folks handle large time-series data cubes in IDL without simply throwing more RAM at the problem? (Or is that my only option?) Thanks!
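(Not the thread author's method, just one common pattern.) IDL has no native sparse 3-D arrays (the SPRSIN family is for 2-D linear-algebra matrices), so a usual trick is to never hold the whole cube in memory at all: map one frame at a time with ASSOC and accumulate running statistics. A minimal sketch, assuming the 5000 frames sit back-to-back as floats in a single flat binary file (the filename and type are assumptions):

```
; Accumulate per-pixel temporal mean and stdev without loading the full cube.
nx = 1024 & ny = 1024 & nframes = 5000L
openr, lun, 'cube.dat', /get_lun
frames = assoc(lun, fltarr(nx, ny))   ; maps one 4 MB frame per access

sum   = dblarr(nx, ny)
sumsq = dblarr(nx, ny)
for i = 0L, nframes - 1 do begin
  img = frames[i]                     ; reads only frame i from disk
  sum   += img
  sumsq += img * img
endfor
free_lun, lun

mean_img  = sum / nframes
stdev_img = sqrt(sumsq / nframes - mean_img^2)
```

Peak memory here is a few frames' worth rather than the ~20 GB full cube, at the cost of one pass over the file per set of statistics. The 30% of pixels you don't care about can be zeroed via a precomputed WHERE mask before accumulating.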

Deleted User (New Member) - 04 Apr 2014 07:35 AM
data = ENVI_GET_DATA(DIMS=dims, FID=fid, INTERP=0, POS=a)

Here the variable a is an array of, say, 30 bands, but the return value in data is only a single band. How do I read a data cube? Hoping for some input, since you were the only relevant person I could find on this. Thank you.
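For what it's worth, ENVI_GET_DATA returns one band per call, so a common workaround is to loop over the band indices and stack the results yourself. A sketch, assuming fid, dims, and the band array a are already defined as in the snippet above:

```
; Read a multi-band subset into a [samples, lines, bands] cube, one band per call.
envi_file_query, fid, data_type=dt        ; IDL type code of the file
ns = dims[2] - dims[1] + 1                ; samples in the spatial subset
nl = dims[4] - dims[3] + 1                ; lines in the spatial subset
nb = n_elements(a)
cube = make_array(ns, nl, nb, type=dt)
for i = 0, nb - 1 do $
  cube[*, *, i] = envi_get_data(fid=fid, dims=dims, pos=a[i])
```

Note this materializes the whole subset in memory; for very large files the tiling routines discussed below in this thread (or a spatial subset via dims) are the safer route.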

Deleted User (New Member) - 05 Apr 2014 07:59 AM
To handle large arrays like that, I use a tiling mechanism to read only a portion of the data at any one time, process it, and write the result back to disc. Storing the data in BIP interleave will make your temporal analyses run slightly faster than BSQ interleave, since each pixel's full time series is then contiguous on disc. If you have access to ENVI, use its tiling routines.

New ENVI: http://www.exelisvis.com/...atetileiterator.html
Classic ENVI: http://www.exelisvis.com/.../ENVI_INIT_TILE.html http://www.exelisvis.com/...s/ENVI_GET_TILE.html

Hope that helps,
Josh
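A minimal Classic ENVI sketch of that tiling pattern, assuming fid is an already-open BIP file; the statistic shown (a per-pixel temporal mean) is just a placeholder for whatever drill-down you need:

```
; Process a BIP file one tile (one line of spectra) at a time.
envi_file_query, fid, dims=dims, nb=nb
pos = lindgen(nb)                               ; all bands (time steps)
tile_id = envi_init_tile(fid, pos, interleave=2, num_tiles=num_tiles, $
                         xs=dims[1], xe=dims[2], ys=dims[3], ye=dims[4])

mean_img = dblarr(dims[2] - dims[1] + 1, dims[4] - dims[3] + 1)
for t = 0L, num_tiles - 1 do begin
  data = envi_get_tile(tile_id, t, ys=ys)       ; BIP tile: [nb, ns] for one line
  mean_img[*, ys - dims[3]] = total(data, 1) / nb
endfor
envi_tile_done, tile_id
```

Only one line's worth of spectra is ever resident, so memory use is independent of the number of lines in the cube.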