bus error reading multiple files

From: Gary Bates <gary.bates_at_nyahnyahspammersnyahnyah>
Date: Tue Jul 13 2010 - 11:39:59 MDT

hi,
I have successfully written the first of the two large files that I
need. That file is about 10GB, written with these settings:
    setfileoption("nc","Format","NetCDF4Classic")
    setfileoption("nc","CompressionLevel",4)

However, when I try to write a second, similar file (reading a slightly
different dataset) I encounter a 'bus error'. The problem seems to be
in this line, which reads data across multiple files:
   sstper = short2flt(files3[:]->sst(ind3,:,:))

Initially I was using dimsizes(ind3)=308. I've tried reducing the
number of files I read in, as well as the size of the output file, but
I still get a 'bus error'. I've printed out ind3 and those indices look
OK to me.
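
One thing I may try, to isolate which read fails, is looping over the
files one at a time instead of using the file-list syntax (a sketch;
fnames3 stands in for my actual array of file names):

    do i = 0, dimsizes(fnames3)-1
       f   = addfile(fnames3(i), "r")   ; open one file at a time
       tmp = short2flt(f->sst)          ; same conversion, per file
       print("file "+i+" read OK")      ; last index printed flags the bad file
       delete(tmp)
       delete(f)
    end do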

My NCL script is attached. How can I diagnose the problem from a 'bus
error'?

-Gary

On 7/9/10 12:12 PM, David Brown wrote:
> Hi Gary,
>
> You are correct that in general there is not much benefit from using
> higher compression levels; they mostly just slow the whole process
> down substantially. The best performance/compression trade-off may
> actually be compression level 1.
> -dave
>
> On Jul 9, 2010, at 12:02 PM, Gary Bates wrote:
>
>>
>> Dennis,
>>
>> This is exactly what I needed. Thanks!
>>
>> netCDF4 compression works like a charm. The level of compression does
>> not seem to matter much, however. After converting to 3 decimal places
>> as you suggested, here are some numbers from compression tests on a
>> relatively small file:
>>
>> CompressionLevel   Time   Resulting Size
>>
>> 2                  1:12   282MB
>> 4                  1:19   276MB
>> 7                  1:36   274MB
>>
>> With no compression this file was 598MB.
>>
>> Gary
>>
>>
>>
>>
>> On 7/9/10 11:18 AM, Dennis Shea wrote:
>>> Hi Gary,
>>>
>>> A float is 32 bits regardless of the precision you need. If you want
>>> a physically smaller file, there are two approaches:
>>>
>>> [1]
>>> With classic netCDF3, the best approach is to convert the float data
>>> to type short [16 bits] and archive it with scale_factor and
>>> add_offset attributes. The following might help:
>>>
>>> http://www.ncl.ucar.edu/Document/Functions/Contributed/pack_values.shtml
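>>>
>>> As a sketch [untested; "sst" and "fout" are placeholder names],
>>> packing with pack_values looks roughly like this:
>>>
>>>    load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
>>>
>>>    sstShort  = pack_values(sst, "short", False) ; adds scale_factor, add_offset
>>>    fout->sst = sstShort                         ; stored as 16-bit shorts
>>>
>>> Readers that honor scale_factor/add_offset [or NCL's short2flt]
>>> recover the float values.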
>>>
>>> [2]
>>> Given that these are SST anomalies, compression [netCDF4] would be
>>> even better. 30% of the data are missing [land]. If you convert
>>>
>>> SST3 = toint( SST*1000 )*0.001 ; 3 decimal places
>>>
>>> and then apply compression to SST3,
>>>
>>> you would see an even more dramatic reduction in file size.
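>>>
>>> Putting the pieces together [a sketch; "fout" and the variable names
>>> are placeholders]:
>>>
>>>    setfileoption("nc","Format","NetCDF4Classic")
>>>    setfileoption("nc","CompressionLevel",1)
>>>    fout = addfile("sst_anom.nc","c")
>>>
>>>    SST3 = toint(SST*1000)*0.001  ; truncate to 3 decimal places
>>>    copy_VarMeta(SST, SST3)       ; keep coordinates/attributes [contributed.ncl]
>>>    fout->sst = SST3
>>>
>>> The truncation makes the low-order bits repetitive, so the deflate
>>> step compresses the truncated and missing-value data very well.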
>>>
>>> I must leave till about 1:30.
>>>
>>> I am cc'ing Mary and Dave Brown.
>>>
>>> ===
>>>
>>> If you want, send a small sample file to
>>>
>>> ftp ftp.cgd.ucar.edu
>>> anonymous
>>> email
>>> cd incoming
>>> put sample_file
>>> quit
>>>
>>> Then tell us the name of the file.
>>>
>>> D
>>>
>>> On 7/9/10 10:48 AM, Gary Bates wrote:
>>>> Dennis,
>>>>
>>>> I think I've figured out a way to write a large file w/o explicitly
>>>> declaring a large array in NCL (using filevardef, etc).
>>>>
>>>> This 365 x 231 x 180 x 360 array (float) is SST anomalies on a 1deg
>>>> grid. I only need 2-3 digit accuracy.
>>>>
>>>> Which leads to the question: Is it possible to make my output file
>>>> smaller by specifying the number of significant digits desired? If so,
>>>> do you have an example?
>>>>
>>>> Thanks,
>>>> Gary
>>>>
>>>>

_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk

Received on Tue Jul 13 11:40:05 2010

This archive was generated by hypermail 2.1.8 : Mon Jul 19 2010 - 09:39:01 MDT