Re: Memory size of data in NCL script

From: Dennis Shea <shea_at_nyahnyahspammersnyahnyah>
Date: Mon Jan 27 2014 - 20:16:18 MST

re: "...but the code is aborted."

[1] You should always include the error message.
     On a 64-bit system running a 64-bit build of NCL,
     the operating system should impose no practical memory limit.

[2] How much memory do you have available?

[3] Include output from

     %> uname -a
     %> ncl -V

Respond to ncl-talk only.

On 1/27/14, 5:40 PM, Dave Allured - NOAA Affiliate wrote:
> Compressing files will not reduce the amount of memory needed by NCL,
> because NCL must always uncompress data for internal use by your program.
> The one exception is scale/offset packed data, which is not a part of the
> current discussion.
>
> A good approach for very large data sets is to add a loop over one
> dimension, then read, compute, and output results one step of that
> dimension at a time. Very often in climate science, this is the time
> dimension. You will need to learn how to use array subscripting on file
> variables efficiently.
>
> http://www.ncl.ucar.edu/Document/Manuals/Ref_Manual/NclVariables.shtml#ReferencingFileVariables
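>
> As an illustration only, a minimal sketch of that approach (the file
> name, the variable name "T", a (time, lat, lon) dimension order, and a
> coordinate variable named "time" are all assumptions, not taken from the
> original post):
>
>      f    = addfile("big_file.nc", "r")   ; open the file; no variable data is read yet
>      ntim = dimsizes(f->time)             ; number of time steps
>      do nt = 0, ntim - 1
>        x = f->T(nt,:,:)                   ; read a single time step only
>        ; ... compute on x and write the result for this step ...
>        delete(x)                          ; release the slice before the next iteration
>      end do
>
> Each pass holds only one 2-D slice in memory, so peak memory use stays
> near the size of one time step rather than the whole 14 GB file.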
>
> By this method, you do NOT need to break up one large file into many small
> files. That would be unneeded effort.
>
> Many calculations are spread over one or more dimensions. For example,
> spatial averaging over X and Y would require that you NOT loop over these
> dimensions for processing one step at a time. In this case you would loop
> over a different dimension such as time or level. In contrast, if you were
> running a temporal filter on 3-D data, you would keep the time dimension
> intact, and loop over X or Y. HTH.
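>
> Purely as a hedged sketch of that second case (the variable name "T",
> the 1-2-1 filter weights, and a (time, lat, lon) dimension order are
> assumptions):
>
>      f    = addfile("big_file.nc", "r")
>      nlat = dimsizes(f->lat)                ; assumes a "lat" coordinate variable
>      wgt  = (/1., 2., 1./) / 4.             ; simple 1-2-1 running-mean weights
>      do j = 0, nlat - 1
>        x  = f->T(:,j,:)                     ; full time series at one latitude
>        xf = wgt_runave_n_Wrap(x, wgt, 0, 0) ; filter along time (dimension 0)
>        ; ... write or accumulate xf for this latitude ...
>        delete([/x, xf/])                    ; free both before the next latitude
>      end do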
>
> --Dave
>
>
> On Mon, Jan 27, 2014 at 4:51 PM, Jatin Kala <jatin.kala.jk@gmail.com> wrote:
>
>> I would be tempted to use NCO's "ncks" utility and make a bunch of
>> smaller files, then read these in with NCL.
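>> (As one assumed illustration, if the record dimension were named "time",
>> a command such as "ncks -d time,0,99 in_file.nc out_part1.nc" would copy
>> the first 100 time steps into a smaller file.)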
>> Also note, netCDF library versions 4 and above support compression:
>> "nccopy -d 9 in_file.nc out_file.nc".
>> I have found that in many cases this can more than halve the file size. Whether
>> NCL then needs half the memory, I have not tested, but would assume so.
>> Cheers,
>> Jatin.
>>
>>
>> On 28/01/2014, at 10:38 AM, Waqar Younas wrote:
>>
Hi everyone,
I am trying to read a 14 GB data file in NCL, but the
code aborts.
Is there any way to load it, or do I need to reduce the dimensions?
>> Thanks.
>>
>>
>
>
>
_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Mon Jan 27 20:16:24 2014
