Re: Segmentation fault

From: Dennis Shea <shea_at_nyahnyahspammersnyahnyah>
Date: Thu Jul 21 2011 - 18:23:25 MDT

ncdump -h does not "read" the variables into memory.
It only prints the dimension and variable names and the attributes.

I speculate that

%> ncl_filedump

will work fine also.


You are reading into memory variables that are too large for the
memory of your machine.

Look at the ncdump -h output and see how much memory the variables you are
reading require. For example:

      float x(10,20,30,40)

      size_x_bytes = 4*(10*20*30*40)

The 4 is because a float is 4 bytes.

If the variables are type double, use 8*(10*20*30*40).

Do the same for any other variables that you are reading.
The sum is how much memory you need.
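The arithmetic above can be sketched in a few lines of Python. This is a minimal, hypothetical helper (not part of NCL); the dimension sizes and types are the ones assumed in the `float x(10,20,30,40)` example above.

```python
# Hypothetical sketch: estimate the memory a variable needs before
# asking NCL to read it. Sizes per type follow the email above:
# float = 4 bytes, double = 8 bytes.

BYTES_PER_TYPE = {"float": 4, "double": 8}

def variable_bytes(dims, dtype="float"):
    """Bytes needed to hold a variable with the given dimension sizes."""
    n = 1
    for d in dims:
        n *= d
    return n * BYTES_PER_TYPE[dtype]

# float x(10,20,30,40) -> 4 * (10*20*30*40) = 960000 bytes
print(variable_bytes((10, 20, 30, 40), "float"))

# Sum over every variable you read to get the total requirement,
# e.g. one float and one double variable of the same shape:
total = (variable_bytes((10, 20, 30, 40), "float")
         + variable_bytes((10, 20, 30, 40), "double"))
print(total)
```

Compare the total against the physical memory on your machine; if it is larger, NclMalloc will fail just as in the error below.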

On 7/21/11 6:14 PM, H.Dang wrote:
> Hi dear all,
> I got the following error when running an NCL script:
> Copyright (C) 1995-2011 - All Rights Reserved
> University Corporation for Atmospheric Research
> NCAR Command Language Version 6.0.0
> The use of this software is governed by a License Agreement.
> See for more details.
> fatal:NclMalloc Failed:[errno=12]
> Segmentation fault
> I have 3 netCDF files to open in the script, and one of them is
> really big. The 3 nc files can be read by "ncdump", so the data
> are okay. What is the most likely reason for my problem? Thank you.
> --
> Cordially,
> Hongyan(鸿雁)
> Tel: 1-519-8884567ext36667
> _______________________________________________
> ncl-talk mailing list
> List instructions, subscriber options, unsubscribe:
Received on Thu Jul 21 18:23:32 2011

This archive was generated by hypermail 2.1.8 : Fri Jul 29 2011 - 08:44:18 MDT