Re: problem accessing multiple files using fbinread

From: David Brown <dbrown_at_nyahnyahspammersnyahnyah>
Date: Fri, 13 Feb 2009 14:41:44 -0700

Hi Sam,
Well, in fact it is a bug in the code: fbinread should close the file
before it returns, but currently it does not.
I am fixing it now.
But I am glad you were able to work around the problem.
  -dave

On Feb 13, 2009, at 2:20 PM, Sam Iacobellis wrote:

> Hi Dave,
>
> While the "SuppressClose" parameter in setfileoption does not work for
> fbinread, as you pointed out, your note gave me the idea to raise my
> open file descriptor limit to 2048, and the NCL code now works. It
> would still be nice to know how to close files opened by fbinread.
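>
> For the archives: what I raised is just the ordinary per-process
> descriptor limit. On a bash-like shell it was roughly the following,
> run in the same shell before starting ncl (the exact command depends
> on your shell and on the hard limit your system allows):
>
>   # raise the soft limit on open file descriptors for this shell
>   ulimit -n 2048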
>
> Thanks for your help.
>
> Sam
>
>
>
>> Hi Sam,
>>
>> Sorry I did not look closely enough at your message. The
>> instructions below only work for NetCDF files, not binary files.
>>
>> In fact, it does not seem right that fbinread does not close the file
>> after reading from it.
>> I will look into this.
>> -dave
>>
>>
>> On Feb 13, 2009, at 1:29 PM, David Brown wrote:
>>
>>> Hi Sam,
>>>
>>> You need to use the following incantation in your script prior to
>>> opening any files:
>>>
>>> setfileoption("nc","SuppressClose",False)
>>>
>>> See
>>> http://www.ncl.ucar.edu/Document/Functions/Built-in/setfileoption.shtml
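>>>
>>> For NetCDF files the idea is simply to set the option once, before
>>> any files are opened. A minimal sketch, with made-up file and
>>> variable names:
>>>
>>>   ; with SuppressClose set to False, NCL does not keep every
>>>   ; NetCDF file it has opened open for the rest of the run
>>>   setfileoption("nc","SuppressClose",False)
>>>
>>>   do i = 0, 1199
>>>     fname = "day_" + sprinti("%0.4i", i) + ".nc"  ; hypothetical names
>>>     f     = addfile(fname, "r")
>>>     x     = f->T                                  ; hypothetical variable
>>>     ; ... work with x ...
>>>   end do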
>>>
>>> Your system has a limit of 1024 open file descriptors. After
>>> subtracting the several descriptors that NCL opens internally for
>>> the life of its execution, you are left with around 1020 files that
>>> can be open simultaneously. SuppressClose defaults to True because
>>> it improves performance, particularly when writing files, but it
>>> does need to be changed when you have many files open at once.
>>> -dave
>>>
>>>
>>> On Feb 13, 2009, at 12:11 PM, Sam Iacobellis wrote:
>>>
>>>> Hi,
>>>>
>>>> I'm trying to run a simple NCL script that reads data using the
>>>> fbinread function. I need to access quite a few individual files,
>>>> around 1200 or so, each file for a different day of data. After
>>>> reading about 1020 files, the run stops and I get an error saying
>>>> that fbinread could not open the file.
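>>>>
>>>> Roughly, the read loop looks like this (the file names, dimensions,
>>>> and type below are placeholders, not my actual script):
>>>>
>>>>   nday = 1200
>>>>   do i = 0, nday - 1
>>>>     fname = "data_day" + sprinti("%0.4i", i) + ".bin"  ; placeholder
>>>>     x = fbinread(fname, (/72, 144/), "float")          ; placeholder dims/type
>>>>     ; ... process x ...
>>>>   end do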
>>>>
>>>> The problem does not seem to be with any individual file, but
>>>> rather with whichever file is the 1020th one the script tries to
>>>> open. I've reordered the loop so that the files are accessed in a
>>>> different order, and the same problem still happens at the 1020th
>>>> file.
>>>>
>>>> My guess is that the files remain open after I read the data and
>>>> that there is a limit on the number of open files. Is there a way
>>>> to close a file after I am finished reading its data?
>>>>
>>>> Has anyone else come across this problem? I looked through the
>>>> archives but could not find any related information.
>>>>
>>>> Thanks for any help.
>>>>
>>>> Sam.
>>>>
>>>>
>>>
>>
>>
>

_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk