memory leak issue with ncl?

From: Douglas Lowe <Douglas.Lowe_at_nyahnyahspammersnyahnyah>
Date: Tue Mar 26 2013 - 06:18:03 MDT

Hi all,

I've written an NCL script which pulls columns of model data out of several (~200) wrfout* netCDF files, stores them in a single array, and then writes each variable out to its own text file so that I can create curtain plots later (see attached script). I'm currently testing it on a single wrfout* file, although I am still sizing the storage array for the ~200 output files I'll eventually be processing, just to check the script.

What I've found is that writing the data files starts off quickly (the first 3 files take less than a minute), but the time taken to write each file (or, at least, to organise the data for each file) increases markedly as the script loops through the variables: by the time I reach the 8th and 9th variables it has grown by a factor of 4-5, and the memory footprint of the ncl process increases steadily over the same period. I have added delete() statements to remove variables once they've been finished with, but this doesn't seem to help at all. So I can only assume that there is a problem in the way that NCL handles memory (de)allocation?

I've attached my script. Is there anything obvious in there that I'm missing which is causing this inefficiency in memory usage? If not, is there any way for me to write this script more efficiently? I don't really want to have to run the script individually for each variable of interest, as this would entail opening the 200+ netCDF4 files repeatedly, which I guess would end up being more inefficient than waiting for NCL to write everything out.

I'm running this in NCL v5.2.1, on a Snow Leopard OS X machine.

Thanks,
Doug
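[The attached script is not preserved in the archive. As a point of reference for the workflow described above, here is a minimal sketch of the extract-and-write loop with delete() applied to per-iteration temporaries; the file glob, variable names, and column indices are illustrative assumptions, not taken from Doug's script.]

```ncl
; Sketch: pull one column per wrfout file, collect into a storage array,
; write each variable to its own text file, and delete() work arrays so
; each iteration's memory can be reclaimed.
begin
  files = systemfunc("ls wrfout_d01_*")      ; ~200 wrfout netCDF files (assumed glob)
  vars  = (/"T", "P", "QVAPOR"/)             ; variables of interest (assumed)

  do iv = 0, dimsizes(vars)-1
    do it = 0, dimsizes(files)-1
      f   = addfile(files(it), "r")
      col = f->$vars(iv)$(0,:,10,10)         ; one column: time 0, all levels,
                                             ; grid point (10,10) (assumed indices)
      if (it .eq. 0) then
        ; allocate storage once we know the column length
        store = new((/dimsizes(files), dimsizes(col)/), typeof(col))
      end if
      store(it,:) = col
      delete(col)                            ; free the per-file temporary
    end do
    asciiwrite(vars(iv) + ".txt", ndtooned(store))
    delete(store)                            ; free before the next variable
  end do
end
```

One structural note implicit in the question: keeping the whole multi-variable storage array alive across the variable loop means delete() on small temporaries frees relatively little; deleting the storage array itself between variables, as above, keeps the peak footprint closer to one variable's worth of data.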

ncl-talk mailing list
List instructions, subscriber options, unsubscribe:

Received on Tue Mar 26 06:18:19 2013

This archive was generated by hypermail 2.1.8 : Tue Apr 02 2013 - 21:23:48 MDT