Re: Reading IBTrACS netCDF

From: Dennis Shea <shea_at_nyahnyahspammersnyahnyah>
Date: Thu Jul 21 2011 - 11:06:32 MDT

NCL uses the netCDF software provided by Unidata. NCL's
overhead is allocating memory for the returned variable via a
standard C malloc. I speculate that other commercial or
public-domain tools would take about the same time.

If you remove the 'short2flt' function, the script runs in about
50-60% of the time.
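For reference, a minimal sketch of the call in question. The variable
name `wind_wmo` is illustrative only (the actual names are in the
attached script); `short2flt` unpacks packed shorts by applying the
variable's `scale_factor` and `add_offset` attributes, which is the
per-element work being timed here:

  f      = addfile("Allstorms.ibtracs_all.v03r03.nc", "r")
  vshort = f->wind_wmo            ; read the packed short data as-is
  vfloat = short2flt(vshort)      ; unpack: vshort*scale_factor + add_offset

Reading the shorts directly and deferring (or skipping) the unpacking
is what recovers the 40-50% of the run time noted above.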

NCL developers are working with others on a DOE grant to
parallelize aspects of NCL. One aspect is parallel netCDF (pNetCDF),
which would increase netCDF I/O efficiency.

D

On 07/21/2011 08:46 AM, Carl Schreck wrote:
> I'm working with tropical cyclone data from the IBTrACS archive, and it
> seems like it's taking NCL too long to read it. It takes ~10 s for
> a file of ~500 MB. I'm probably spoiled to think 10 s is slow,
> but it seems like it should be much faster.
>
> The data can be found at:
>
> ftp://eclipse.ncdc.noaa.gov/pub/ibtracs/v03r03/all/netcdf/Allstorms.ibtracs_all.v03r03.nc
>
> And a script to read it is attached. Is it slow because it's so many (9)
> variables? Is there something I or the dataset developers could do to
> speed this up?
>
> Thanks!
_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk
Received on Thu Jul 21 11:06:37 2011

This archive was generated by hypermail 2.1.8 : Fri Jul 29 2011 - 08:44:18 MDT