Doug,
I am dealing with just one very large netcdf file - it has over 100 variables in it - so I only close it once at the very end. After I open the netcdf file and call ncdf_varid on 90 of the variables, I get:
heap memory used: 1512507, max: 140114814761, gets: 1511, frees: 640
Then I create 19 fltarr arrays of size [1165, 720, 1440] and I get:
heap memory used: 91799787243, max: 91799787243, gets: 1530, frees: 640
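That heap figure is almost entirely accounted for by those 19 arrays: each one holds 1165 x 720 x 1440 four-byte floats. A quick arithmetic check (done in Python here just for convenience; the numbers are the ones above):

```python
# Bytes in one 1165 x 720 x 1440 array of 4-byte floats
one_array = 1165 * 720 * 1440 * 4
print(one_array)                      # 4831488000  (~4.8 GB each)

# 19 such arrays
print(19 * one_array)                 # 91798272000 (~91.8 GB)

# Difference from the reported heap of 91799787243 bytes
print(91799787243 - 19 * one_array)   # 1515243 -- ~1.5 MB of other allocations
```

So the 91.8 GB after the fltarr step is expected; the question is what happens next.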
Then I ncdf_varget lon, lat, time, and two of the variables, sum those two into one of my newly created arrays, and get:
heap memory used: 101462790563, max: 106294280195, gets: 1542, frees: 6
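The growth in that step matches two more full-size arrays, which is what two ncdf_varget reads of the variables would allocate (arithmetic check in Python for convenience; the small remainder presumably covers lon, lat, time, and bookkeeping):

```python
one_array = 1165 * 720 * 1440 * 4        # bytes in one full-size float array
growth = 101462790563 - 91799787243      # heap growth across this step
print(growth)                            # 9663003320
print(growth - 2 * one_array)            # 27320 -- essentially two full arrays
```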
I then use delvar to delete the two arrays read in with ncdf_varget that were used to create the new array. Next I ncdf_varget 10 other variables, sum those 10 into another of the newly created arrays, then delvar all 10 of them, and get:
heap memory used: 149777672003, max: 154609161635, gets: 1572, frees: 6
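Those numbers look consistent with none of the delvar'd arrays actually being released: across that step the heap grows by almost exactly ten array-sized allocations, even though 12 arrays (the earlier two plus these ten) have been delvar'd by this point (again a plain arithmetic check, written in Python for convenience):

```python
one_array = 1165 * 720 * 1440 * 4        # bytes in one full-size float array
growth = 149777672003 - 101462790563     # heap growth across this step
print(growth)                            # 48314881440
print(growth - 10 * one_array)           # 1440 -- essentially ten full arrays
```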
Then I do ncdf_varget on three additional variables, which is when I get the "Unable to allocate memory: to make array." error.
It seems like my memory usage keeps growing, even after using delvar.
Ben