ncl_convert2nc chokes on large files

From: William Gallery <wgallery_at_nyahnyahspammersnyahnyah>
Date: Thu, 02 Aug 2007 10:44:53 -0400

I am trying to convert a 1.3 GB GRIB file to NetCDF using
ncl_convert2nc. The program aborted after several hours with a message
that it was unable to allocate memory. While ncl_convert2nc was
running, 'top' showed that ncl was using 89% of the memory but less
than 1% of the CPU, so I suspect it was spending all its time paging
until it ran out of virtual memory. When I ran ncl_convert2nc with the
-v option to extract only a small subset of the data, the program ran
to completion.
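For reference, the -v workaround looks like the command below. The variable
names shown are placeholders, not the actual fields in this ERA-40 file;
ncl_filedump can list the real ones:

```shell
# List the variable names in the GRIB file first:
#   ncl_filedump e4oper.fc.ml.19970116-19970120.grib | less
# Then convert only the variables needed instead of the whole 1.3 GB file
# (VAR1,VAR2 are placeholders for actual variable names):
ncl_convert2nc e4oper.fc.ml.19970116-19970120.grib -v VAR1,VAR2
```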

Is this a bug (memory leak?) or a design issue?

Setup:

Computer:
Dual-core 32-bit Intel processor, 2 GB memory, Linux, gfortran compiler

NCL version: 4.3.0

Input data file:
-rw-rw-r-- 1 wgallery green 1322235040 Aug 1 14:51
e4oper.fc.ml.19970116-19970120.grib
This is an NCAR era40 (ds121.1) file.

Command:
ncl_convert2nc e4oper.fc.ml.19970116-19970120.grib

Error message:
Original message lost, but it was from malloc saying it was unable to
allocate memory, followed by a segmentation fault.

Bill Gallery

-- 
William O. Gallery
Atmospheric and Environmental Research, Inc.
131 Hartwell Avenue
Lexington, MA 02421-3136
Tel: (781) 761-2288 ext. 284
Fax: (781) 761-2299 
_______________________________________________
ncl-talk mailing list
ncl-talk_at_ucar.edu
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Thu Aug 02 2007 - 08:44:53 MDT
