Re: problem accessing multiple files using fbinread

From: Helly John J. <hellyj_at_nyahnyahspammersnyahnyah>
Date: Fri, 13 Feb 2009 11:54:51 -0800

Hi Sam.

In Linux there is a file-max parameter in the kernel that could
conceivably impose this limit. I don't have any direct experience
with this limit in NCL, but then I'm only a novice NCL user. BTW,
there is an NCL workshop at SDSC on 24-26 Feb. You can talk to
Mary Tyree if that is of interest to you.
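The ~1020 figure is suggestive: the common per-process default of 1024 open file descriptors, minus stdin/stdout/stderr and a few files the script itself holds, lands right around there. A quick way to inspect the relevant limits (a sketch; the per-process `ulimit` is usually the binding constraint, not the kernel-wide file-max):

```shell
# Per-process soft limit on open file descriptors (commonly 1024)
ulimit -n

# System-wide limit mentioned above (Linux only)
cat /proc/sys/fs/file-max

# Raise the soft limit for the current shell, up to the hard limit,
# before launching the NCL script
ulimit -n 4096
```

Raising the soft limit is a workaround, not a fix; if the script really does leak one descriptor per file, a large enough run will hit any limit eventually.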

John Helly, University of California, San Diego
San Diego Supercomputer Center, Mail Code 0527
Scripps Institution of Oceanography, Climate, Atmospheric Science, and
Physical Oceanography, Mail Code 0224
9500 Gilman Dr., La Jolla CA 92093
+01 760 840 8660 mobile / stonesteps (Skype) / stonesteps7 (iChat) /

On Feb 13, 2009, at 11:11 AM, Sam Iacobellis wrote:


I'm trying to run a simple NCL script that reads data using the fbinread
function. I need to access quite a few individual files, around 1200 or
so, each file for a different day of data. After reading about 1020
files, the run stops and I get an error saying that fbinread could not
open the file.

The problem does not seem to be with any individual file, but rather
with whichever file is the 1020th one the script tries to access. I've
reordered the loop so that the files are read in a different order, and
the failure still occurs at the 1020th file.

My guess is that the files remain open after I read the data and there
is a limit on the number of open files. Is there a way to close a file
after I am finished reading its data?
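One way to check whether descriptors really are accumulating (an assumption about the cause, not something verified against NCL's internals) is to watch the process's open-descriptor count grow while the script runs; on Linux every open file appears under /proc/&lt;pid&gt;/fd:

```shell
# Count open file descriptors for a running process.
# PID here is the current shell ($$) for illustration; substitute
# the NCL process ID to watch the count climb as fbinread is called.
PID=$$
ls /proc/"$PID"/fd | wc -l
```

If the count rises by one per file read and never falls, that confirms the descriptors are not being released.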

Has anyone else come across this problem? I looked through the
but could not find any related information.

Thanks for any help.


ncl-talk mailing list
List instructions, subscriber options, unsubscribe:

Received on Fri Feb 13 2009 - 12:54:51 MST

This archive was generated by hypermail 2.2.0 : Thu Feb 19 2009 - 09:54:51 MST