From: Hyacinth Nnamchi <hyacinth.1_at_nyahnyahspammersnyahnyah>

Date: Thu Jan 24 2013 - 07:27:29 MST

Hi Dennis + other users,

I have solved the memory problem by selecting my area of interest (instead of using the global data), after realizing that I can use the function center_finite_diff to calculate gradients on non-global data. This way, Dennis's previous suggestion works. Thanks a million!

Hyacinth
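The approach discussed in this thread — a thickness-weighted vertical average taken only over layers above a spatially varying base depth (the thermocline) — can be sketched with a NumPy analogue. All names and sizes below are illustrative, not from the original scripts, and the sketch simply excludes whole layers below the base (a fuller version would weight the partial layer at the base):

```python
import numpy as np

# Toy dimensions (time, depth, lat, lon), far smaller than the 624x29x193x256 data
nt, nz, ny, nx = 2, 5, 3, 4
q = np.random.rand(nt, nz, ny, nx)          # ocean variable, ncl(t,z,y,x) ordering
z = np.array([5., 15., 30., 60., 100.])     # layer mid-depths (m)
dz = np.array([10., 10., 20., 40., 40.])    # layer thicknesses (m)
thermocline = np.full((nt, ny, nx), 50.)    # base depth (t,y,x); constant here for the demo

# Mask layers below the local thermocline, then take the thickness-weighted mean over z
above = z[None, :, None, None] <= thermocline[:, None, :, :]   # (t,z,y,x) boolean
w = dz[None, :, None, None] * above                            # weights; zero below the base
q_avg = (q * w).sum(axis=1) / w.sum(axis=1)                    # result is (t,y,x)
```

Subsetting a region of interest first, as described above, simply shrinks `ny` and `nx` before this computation.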

> Date: Thu, 24 Jan 2013 07:18:26 -0700
> From: shea@ucar.edu
> To: hyacinth.1@hotmail.com
> CC: ncl-talk@ucar.edu
> Subject: Re: Vertical averaging using varying depths
>
> Variables > 2 GB are not an issue for NCL 6.0.0 and higher.
>
> If you are encountering a memory error, it is likely that
> you are exceeding your operating system's memory.
>
> On 1/24/13 4:22 AM, Hyacinth Nnamchi wrote:
*

> >
> > Hi Dennis,
> >
> > A more fundamental constraint is the size of my data: 624x29x193x256 and 624x193x256, which exceed 2 GB by far even before I create additional arrays. I'll therefore have to subset at some fixed levels and average directly. Thanks for your repeated hints; they may be useful for me elsewhere.
> >
> > Hyacinth

> >> Date: Wed, 23 Jan 2013 10:10:47 -0700
> >> From: shea@ucar.edu
> >> To: hyacinth.1@hotmail.com
> >> CC: ncl-talk@ucar.edu
> >> Subject: Re: Vertical averaging using varying depths
> >>
> >> The denominators should have a 2nd argument:
> >>
> >> dim_sum_n(dz) ==> dim_sum_n(dz, 0)
> >> dim_sum_n(dz4) ==> dim_sum_n(dz4, 1)
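The correction above — giving dim_sum_n an explicit dimension index — has a direct NumPy analogue, where the second argument corresponds to `axis` (names and sizes here are illustrative only):

```python
import numpy as np

# q and dz4 in ncl(t,z,y,x) ordering; the sum must run over the z dimension (axis 1)
q = np.arange(24.).reshape(2, 3, 2, 2)   # toy (t,z,y,x) field
dz4 = np.full_like(q, 10.)               # uniform 10 m layer thickness, conformed to q

z_avg = (q * dz4).sum(axis=1) / dz4.sum(axis=1)   # both sums over z

# Sanity check: with uniform thickness, the weighted mean reduces to a plain mean over z
assert np.allclose(z_avg, q.mean(axis=1))
```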

> >>
> >> On 01/23/2013 09:09 AM, Hyacinth Nnamchi wrote:
*

> >>> Thanks for your response, Dennis. I'll try out your suggestion.
> >>>
> >>> Hyacinth
> >>>
> >>> > Date: Mon, 21 Jan 2013 11:26:15 -0700
> >>> > From: shea@ucar.edu
> >>> > To: hyacinth.1@hotmail.com
> >>> > CC: ncl-talk@ucar.edu
> >>> > Subject: Re: Vertical averaging using varying depths
> >>> >
> >>> > There cannot be a function for everything.
> >>> > You can always write your own.
> >>> > Also, is (x,y,z,t) the fortran ordering?
> >>> > Is 'z' one-dimensional or four-dimensional?
> >>> >
> >>> > let: dz = layer_thickness,
> >>> > q = variable, fortran(x,y,z,t) => ncl(t,z,y,x) ; dims 0,1,2,3
> >>> >
> >>> > z_zavg = dim_sum_n(q*dz, 1)/dim_sum_n(dz, 1) ; q and dz both (t,z,y,x)
> >>> > --
> >>> > dz4 = conform(q, dz, 1) ; replicate 1D dz to q's shape along z
> >>> > z_zavg = dim_sum_n(q*dz4, 1)/dim_sum_n(dz4, 1) ; q(t,z,y,x), dz(z)
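NCL's conform replicates a lower-dimensional array across the remaining dimensions of another; the NumPy analogue is broadcasting. A sketch with made-up sizes (none of these names come from the original scripts):

```python
import numpy as np

nt, nz, ny, nx = 2, 4, 3, 5
q = np.zeros((nt, nz, ny, nx))           # 4D field, ncl(t,z,y,x) ordering
dz = np.array([10., 20., 30., 40.])      # 1D layer thickness along z

# Like dz4 = conform(q, dz, 1): place dz along dimension 1 and broadcast to q's shape
dz4 = np.broadcast_to(dz[None, :, None, None], q.shape)

assert dz4.shape == q.shape
assert np.allclose(dz4[0, :, 0, 0], dz)   # dz values repeated down every column
```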

> >>> >
> >>> > On 1/21/13 10:00 AM, Hyacinth Nnamchi wrote:
*

> >>> > >
> >>> > > Hi users,
> >>> > > I want to calculate the vertical average of a 4D (x,y,z,t) ocean
> >>> > > variable. The problem is that the depths have to vary: I want to
> >>> > > use the thermocline depth (already calculated, x,y,t) as the base at
> >>> > > each grid point for the vertical averaging. Is there a
> >>> > > way/function to do this in NCL?
> >>> > > Thanks in advance.
> >>> > > Hyacinth
> >>> > >
> >>> > > _______________________________________________
> >>> > > ncl-talk mailing list
> >>> > > List instructions, subscriber options, unsubscribe:
> >>> > > http://mailman.ucar.edu/mailman/listinfo/ncl-talk


_______________________________________________

ncl-talk mailing list

List instructions, subscriber options, unsubscribe:

http://mailman.ucar.edu/mailman/listinfo/ncl-talk

Received on Thu Jan 24 07:27:42 2013

This archive was generated by hypermail 2.1.8 : Tue Jan 29 2013 - 22:44:26 MST