Re: Segmentation Fault

From: Mary Haley <haley_at_nyahnyahspammersnyahnyah>
Date: Wed, 22 Apr 2009 14:48:22 -0600 (MDT)

Hi Joe,

It looks like you may simply be asking for too much memory.

A 900x800x31x109 float array is over 9 GB of memory. If it's a double
array, then you will be asking for over 18 GB of memory.
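
For reference, here is the arithmetic, which you can check in NCL itself:

   nvals = 900. * 800. * 31. * 109.   ; number of array elements
   print(nvals * 4. / 1.e9)           ; floats are 4 bytes each: ~9.7 GB
   print(nvals * 8. / 1.e9)           ; doubles are 8 bytes each: ~19.5 GB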

If you are on a 32-bit system, the most you can ask for is roughly
2 GB.

If you are on a 64-bit system, you can ask for more memory, but your
system will need to be configured to allow you to make those kinds of
requests. You will need to check with your system administrator on this.

Increasing the wsMaximumSize resource doesn't affect reading
in variables. It only comes into play when doing graphics.
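
For the graphics case, the resource goes in your ~/.hluresfile with the
value in bytes. For example, to allow a 400 MB graphics workspace (400 MB
is just an illustration, not a recommendation):

   *wsMaximumSize : 400000000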

I don't think NCL should be seg faulting, however, so we'll look into
this.

Meanwhile, you may need to read your variable in by chunks. You didn't
indicate what you plan to do with "T", but you could do something like
this:

   dims = getfilevardimsizes(wrf_input,"T")

   t = dims(0) ;Number of timesteps
   x = dims(3) ;West-east grid dimension
   y = dims(2) ;North-south grid dimension
   z = dims(1) ;Bottom-top grid dimension

   do i=0,t-1
     T = wrf_input->T(i,:,:,:) ;Read one timestep of T at a time

     ...process T as needed...
   end do
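
For example, if all you needed were a time average of T (just an
illustration, since I don't know what you're actually doing with it), you
could accumulate one timestep at a time and never hold the full array:

   Tsum = new((/z,y,x/),float)
   Tsum = 0.
   do i=0,t-1
     Tsum = Tsum + wrf_input->T(i,:,:,:)
   end do
   Tavg = Tsum / t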

--Mary

On Wed, 22 Apr 2009, Joseph Zambon wrote:

> Hello,
>
> I am running an NCL script which takes in WRF output and produces a simple
> MSLP output netcdf file.
>
> ****************************
> load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
> load "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl"
> load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
>
> wrf_input_file = [WRF INPUT FILE]
> ;must append .nc to end of filename, so just do it
> diag_out_file = [NETCDF OUTPUT FILE]
>
> wrf_input = addfile(wrf_input_file,"r")
> system("/bin/rm -f " + diag_out_file )
> diag_out = addfile(diag_out_file,"c")
>
> T = wrf_input->T(:,:,:,:) ;Load variable from which to extract dimensions
> dims = dimsizes(T) ;Get dimensions
> x = dims(3) ;West-east grid dimension
> y = dims(2) ;North-south grid dimension
> z = dims(1) ;Bottom-top grid dimension
> t = dims(0) ;Number of timesteps
>
> dim_names = (/ "bottom_top", "south_north", "west_east", "time" /)
> dim_sizes = (/ z, y, x, t /)
> dim_Unlim = (/ False, False, False, False /)
> filedimdef( diag_out, dim_names, dim_sizes, dim_Unlim )
> var_names_3D = (/ "slp" /)
> ;var_names_4D = (/ "pressure", "tc", "rh" /)
> var_types_3D = (/ "float" /)
> ;var_types_4D = (/ "float", "float", "float" /)
> dim_names_3D = (/ "time", "south_north", "west_east" /)
> ;dim_names_4D = (/ "time", "bottom_top", "south_north", "west_east" /)
>
> filevardef(diag_out, var_names_3D, var_types_3D, dim_names_3D )
> ;filevardef(diag_out, var_names_4D, var_types_4D, dim_names_4D )
> diag_out@title = "Isabel WRF Run SLP Diagnostic"
>
>
>
> slp_ar = new((/ t, y, x /), float)
>
> do time = 0,t-1,1
>   slp = wrf_user_getvar(wrf_input,"slp",time)
>   slp_ar(time,:,:) = slp
>   print(time)
> end do
> diag_out->slp = slp_ar
>
> ****************************
>
> When I run the script, I get a segmentation fault when it attempts to load
> the T variable. As you can see, I wrote it in such a way that I could run it
> for different domain sizes without having to change the dimensions by hand.
> It works great for smaller grids, but with the current grid,
> 900 (w_e) x 800 (s_n) x 31 (vert) x 109 (timesteps), I get the segmentation
> fault when it attempts to load T. I am able to run the script to completion
> when manually inputting values of x, y, z, and t. The fault appears to occur
> only when trying to load the 900x800x31x109 T variable (which does exist).
>
> Is there any way to allocate lots of memory? My .hluresfile has
> wsMaximumSize set to 2147483648 bytes (2 GB); should this be much larger? I
> also plan on importing much larger data in the future; how do I go about
> allocating *much* more memory to avoid this problem?
>
> wsMaximumSize
> This resource specifies the total size in bytes that is allowed to be
> allocated at any single time for all workspaces managed by the Workspace
> object during the execution of an HLU program. If the Workspace object is
> asked to allocate or reallocate a workspace such that this value would be
> exceeded, the Workspace refuses to attempt the allocation and returns a
> fatal error.
>
> I don't get an error specifically referring me to this resource, just
> "Segmentation Fault" as the program exits. NCL v5.1.0 was installed on our
> computing cluster by our techs and has been working perfectly up until this
> point. Please let me know if I missed any required info. Thanks!
>
> -Joe Zambon
> jbzambon@ncsu.edu
>
>
_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Wed Apr 22 2009 - 14:48:22 MDT