Bug report: writing to hdf-files
Posted by anon, 03 Mar 2011 06:25 AM
I recently tried to write some data to an hdf-file. I created an hdf-file and started adding a whole list of different arrays containing data. IDL crashed and gave a core dump. After some debugging it turns out that when the length of the filename (or path + filename) reaches 120 characters or more, writing several arrays to the hdf-file results in a crash; for 119 characters or less this does not occur. Note that a length of 120 characters is not that unusual if you have a long path for a data directory. It happens in IDL v7.1.1 as well as v8.0, but it was not a problem in versions before v7.1.1. The example below is about the simplest one I could make that reproduces the error.

    filename = '12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456.hdf'
    data = [1]
    hdfID = hdf_open(filename, /create)
    for index = 0, 10 do begin
        sdID = hdf_sd_start(filename, /RDWR)
        sdsID = hdf_sd_create(sdID, 'DATA_' + strcompress(index, /remove_all), $
                              [n_elements(data)], /dfnt_float32)
        hdf_sd_endaccess, sdsID
        hdf_sd_end, sdID
    endfor
    hdf_close, hdfID
    end

The error message looks like this:

    *** glibc detected *** //usr/local/appl/installed/idl71/bin/bin.linux.x86/idl: double free or corruption (!prev): 0x09ace130 ***
    ======= Backtrace: =========
    ...
    Abort (core dumped)
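A possible workaround until this is fixed, as an untested sketch: since the crash seems to depend only on the length of the name string handed to the HDF routines (not on where the file actually lives), CD into the data directory first and pass a short relative filename. The directory path and filename below are just placeholders.

    ; Untested workaround sketch: keep the string passed to the HDF
    ; routines well under 120 characters by changing into the data
    ; directory first. dataDir and filename are placeholders.
    dataDir  = '/some/very/long/path/to/the/data/directory'
    filename = 'short_name.hdf'
    data     = [1]

    ; sanity check: warn if the name is still in the buggy range
    if strlen(filename) ge 120 then $
        message, 'filename is 120+ characters, may crash IDL 7.1.1/8.0', /informational

    cd, dataDir, current=oldDir          ; remember where we came from
    hdfID = hdf_open(filename, /create)
    for index = 0, 10 do begin
        sdID = hdf_sd_start(filename, /RDWR)
        sdsID = hdf_sd_create(sdID, 'DATA_' + strcompress(index, /remove_all), $
                              [n_elements(data)], /dfnt_float32)
        hdf_sd_endaccess, sdsID
        hdf_sd_end, sdID
    endfor
    hdf_close, hdfID
    cd, oldDir                           ; restore the original directory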