Re: Memory

From: Dennis Shea <shea_at_nyahnyahspammersnyahnyah>
Date: Tue Jul 30 2013 - 07:55:47 MDT

There is no magic. There is no way to use an external drive as additional memory.

==

[1] Only process (say) one 'len' per run, or maybe two.

[2] Since these are monthly data, the adjacent 0.5-spaced grid points
     are likely highly correlated. Hence, you can thin the array
     upon input without losing any 'scientific' value. NCL array
     (stride) subscript syntax can be used:

       f   = addfile("..","r")
       prc = f->precip(:,::2,::2)    ; all times; every 2nd lat and lon point
       printVarSummary(prc)          ; check the thinned dimensions

     This will retain the full number of time steps, so you will
     still have the maximum number of values for the distribution
     at each grid point.

     It will reduce the required memory by 75%! (Keeping every second
     latitude and every second longitude leaves one quarter of the
     grid points.)

Maybe a combination of [1] and [2], along the lines of the sketch below.
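As a minimal sketch of that combination (the file name "precip.nc" and the
time indices are hypothetical; "precip" is the variable from the snippet
above, and a block of time steps stands in for one 'len' per run):

       ;--- thin the grid on input and process only one subset per run
       f      = addfile("precip.nc","r")          ; hypothetical file name
       ntStrt = 0                                 ; first time index for this run
       ntLast = 119                               ; last time index for this run
       prc    = f->precip(ntStrt:ntLast,::2,::2)  ; every 2nd lat/lon point
       printVarSummary(prc)
       ;--- compute the distribution for this subset, write it out, then
       ;--- change ntStrt/ntLast (or the 'len') for the next run

Each run then holds only a fraction of the full array in memory.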

Good luck

On 7/30/13 3:36 AM, Setareh Rahimi wrote:
> Dear all,
>
> I run a program which needs a lot of memory, say 15 GB, but my laptop does
> not have that much free memory. Does anybody know how I can use an external
> hard drive to run the program?
>
> Thanks in advance
>
_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Tue Jul 30 07:55:54 2013
