Large data processing

From: Correia, James <james.correia_at_nyahnyahspammersnyahnyah>
Date: Wed, 21 Jan 2009 08:58:59 -0800

All-
 I have a netCDF file with one variable dimensioned 23423 x 109 x 134
[time, lat, lon].

NCL runs out of memory when computing dim_avg over time (I have 2 GB of RAM
and 2 GB of swap). I know I can compute dim_avg in two chunks, but what about
computing the standard deviation, dim_stddev? Does anyone have a workaround?
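
For reference, here is a minimal sketch of one chunked approach I have in mind
(not a tested solution): accumulate the time sum and sum of squares chunk by
chunk, then form the mean and standard deviation at the end. It assumes an NCL
version that provides dim_sum_n (5.1.1 or later); the file name "in.nc",
variable name "T", and chunk size are placeholders.

; Chunked, one-pass mean and standard deviation over the time dimension.
; Only one chunk of the 23423 x 109 x 134 array is in memory at a time.
f    = addfile("in.nc", "r")              ; "in.nc" and "T" are placeholders
dims = getfilevardimsizes(f, "T")         ; (/ntim, nlat, nlon/)
ntim = dims(0)
nlat = dims(1)
nlon = dims(2)

sumx  = new((/nlat,nlon/), double)        ; running sum over time
sumx2 = new((/nlat,nlon/), double)        ; running sum of squares
sumx  = 0.0d
sumx2 = 0.0d

nchunk = 1000                             ; time steps read per iteration
do it = 0, ntim-1, nchunk
   ite   = min((/it+nchunk-1, ntim-1/))
   x     = f->T(it:ite,:,:)               ; read one chunk only
   sumx  = sumx  + dim_sum_n(x,   0)      ; accumulate sum over time
   sumx2 = sumx2 + dim_sum_n(x*x, 0)      ; accumulate sum of squares
   delete(x)
end do

xavg = sumx/ntim                          ; time mean at each grid point
; sample standard deviation (N-1 denominator, as dim_stddev uses)
xstd = sqrt((sumx2 - ntim*xavg*xavg)/(ntim-1))

One caveat: the one-pass sum-of-squares formula can lose precision when the
variance is small relative to the mean; a safer variant is two passes over the
same chunks, computing the mean first and then the squared deviations from it.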

I tried using NCO to do this, but my machine came back with a malloc error,
and another machine with 4 GB of memory and 4 GB of swap returned the same
malloc error.

Jimmyc

James Correia Jr., PhD
Climate Physics Group
Post. Doc.
Pacific Northwest National Lab

"Wisdom. Strength. Courage. Generosity. Each of us are born with one of
these. It is up to us to find the other three inside of us."
-Into the West

_______________________________________________
ncl-talk mailing list
List instructions, subscriber options, unsubscribe:
http://mailman.ucar.edu/mailman/listinfo/ncl-talk