Hi James,
I meant to respond to this sooner.
I will see about getting this routine to do better clean-up
internally, but meanwhile, I think you can do this yourself at your
end. (If you are uncomfortable doing this, let me know and I'll send you
a modified version.)
Here's what you can do:
1. cp $NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl .
2. Edit WRFUserARW.ncl and search for the line:
if( (variable .eq. "uvmet") ) then
In this block of code, you will see some variables
being created, like:
v = nc_file->V(time,:,:,:)
u = nc_file->U(time,:,:,:)
You can add delete() calls for these variables right before the
"return(uvmet)" statements (there are two of them). You can do the
same for any other variables that get read off the file, like
"latitude" and "longitude".
3. Load this modified "WRFUserARW.ncl" file instead of the one
under $NCARG_ROOT/...
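As a rough sketch, the edited "uvmet" block might end up looking
something like this (the exact variable names and the code elided by
"..." depend on your version of WRFUserARW.ncl, so match whatever your
copy actually reads):

```ncl
  if( (variable .eq. "uvmet") ) then
    v = nc_file->V(time,:,:,:)
    u = nc_file->U(time,:,:,:)
    ...
    ; free the temporaries before returning so NCL releases
    ; their memory on every call, rather than holding onto it
    delete(u)
    delete(v)
    return(uvmet)
  end if
```

The idea is simply that any large arrays read off the file inside the
function should be explicitly deleted before each return, since they
otherwise persist and accumulate across repeated calls in a loop.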
Let me know if this helps, or if you don't want to modify the file
yourself.
--Mary
On Wed, 21 May 2008, James Correia wrote:
> All-
> My problem is that memory usage, while my NCL script is running, increases
> into the 3 GB range.
>
> The issue stems from this loop
> do it=0,97
>   uv = wrf_user_getvar(f,"uvmet",it)
>   temu(it) = uv(0,0,ij,ija)   ; 1 grid point for u
>   temv(it) = uv(1,0,ij,ija)   ; 1 grid point for v
>   delete(uv)
> end do
> I do this for 2 files with 3 more arrays.
>
> The memory keeps increasing even with the deletes. Is there any way to see
> what variables are being kept in the call to wrf_user_getvar so I may delete
> them and stop the memory from increasing? The loop is necessary since
> uvmet doesn't return time, and my goal is to plot a time series.
>
> --
> James Correia Jr.
> Post Doc
> Climate Physics Group, PNNL
>
_______________________________________________
ncl-talk mailing list
ncl-talk_at_ucar.edu
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Fri May 23 2008 - 09:11:45 MDT