Hi James,
Could you run dim_stddev inside a do loop over the lat or lon dimension?
That should allow you to stay below the memory limits of your machine.
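Something like the following sketch, looping over latitude so only one
time-by-lon slice is in memory at once (the file name "in.nc" and variable
name "x" are placeholders for your own):

```ncl
; Hedged sketch: compute the stddev over time one latitude row at a time.
f    = addfile("in.nc","r")
dims = getfilevardimsizes(f,"x")       ; (ntime, nlat, nlon)
nlat = dims(1)
nlon = dims(2)

xstd = new((/nlat,nlon/), float)
do i = 0, nlat-1
  xslice    = f->x(:,i,:)              ; only ntime x nlon held in memory
  xstd(i,:) = dim_stddev_n(xslice, 0)  ; stddev over the time dimension
  delete(xslice)
end do
```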
Generally, it's recommended that you avoid do loops. However, single and
even double do loops are usually fine on most modern machines with
decent memory.

Adam
Correia, James wrote:
> All-
> I have a netcdf file with 1 variable dimensioned: 23423 x 109 x 134
> [time,lat,lon].
>
> NCL runs out of memory on computing dim_avg wrt time (I have 2GB available
> and 2 GB swap space). I know I can do dim_avg in 2 chunks. But what about
> computing the standard deviation, dim_stddev? Does anyone have a workaround?
>
> I tried using nco to do this, but my machine came back with a malloc error,
> and another machine with 4GB of memory and 4GB of swap also returned a
> malloc error.
>
> Jimmyc
>
>
>
> James Correia Jr., PhD
> Climate Physics Group
> Post. Doc.
> Pacific Northwest National Lab
>
> "Wisdom. Strength. Courage. Generosity. Each of us are born with one of
> these. It is up to us to find the other three inside of us."
> -Into the West
>
>
> _______________________________________________
> ncl-talk mailing list
> List instructions, subscriber options, unsubscribe:
> http://mailman.ucar.edu/mailman/listinfo/ncl-talk
--
--------------------------------------------------------------
Adam Phillips                              asphilli_at_ucar.edu
National Center for Atmospheric Research   tel: (303) 497-1726
ESSL/CGD/CAS                               fax: (303) 497-1333
P.O. Box 3000
Boulder, CO 80307-3000
http://www.cgd.ucar.edu/cas/asphilli

Received on Wed Jan 21 2009 - 12:53:42 MST