[rrd-users] Good way to compute average over many data sources

Carsten Aulbert carsten at welcomes-you.com
Mon Feb 18 08:25:18 CET 2008


I've just started with rrdtool and would like to know if there is a more 
efficient way to solve my problem:

I've got an ensemble of compute racks that I query via SNMP, and I would 
like to record the average water input temperature. For that I'm using 
something along these lines:

RRDs::create ( "$stem-Water.rrd",
                "--start=$starttime",          # when to start (manually read from sample file)
                "--step=60",                   # expect one value every 60 seconds
                "DS:TotPow:GAUGE:120:0:50000", # total power [W] according to LCP
                "DS:WatFlow:GAUGE:120:0:200",  # water flow [l/min]
                "DS:Valve:GAUGE:120:0:100",    # valve open [%]
                "DS:WatIn:GAUGE:120:0:100",    # water temp in [C]
                "DS:WatOut:GAUGE:120:0:100",   # water temp out [C]
                "RRA:AVERAGE:0.5:1:10080",     # archive of all values, kept for one week (10080 x 1 min)
                "RRA:MIN:0.5:15:4800",         # archive of minima (over 15 minutes), kept for 50 days
                "RRA:MAX:0.5:15:4800",         # archive of maxima (over 15 minutes), kept for 50 days
                "RRA:AVERAGE:0.5:15:4800" );   # archive of averages (over 15 minutes), kept for 50 days

Now I want to compute the average *system* water input temperature based 
on 24 of those RRD files. However, a rack might be powered off from time 
to time, so I may have UNKNOWNs in the archives.

Currently, I'm computing the average like this (streamlined for only 4 
data sources):
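The hand-written DEF/CDEF list for this typically looks like the sketch 
below (file names, the DS name WatIn, and the variable names are 
assumptions): each source is zeroed where it is UNKNOWN, the known 
sources are summed, and the sum is divided by the number of sources that 
actually had data (if all of them are unknown, 0/0 yields UNKNOWN again):

```
rrdtool graph avg.png \
    DEF:t0=rack0-Water.rrd:WatIn:AVERAGE \
    DEF:t1=rack1-Water.rrd:WatIn:AVERAGE \
    DEF:t2=rack2-Water.rrd:WatIn:AVERAGE \
    DEF:t3=rack3-Water.rrd:WatIn:AVERAGE \
    CDEF:sum=0,t0,UN,0,t0,IF,+,t1,UN,0,t1,IF,+,t2,UN,0,t2,IF,+,t3,UN,0,t3,IF,+ \
    CDEF:cnt=0,t0,UN,0,1,IF,+,t1,UN,0,1,IF,+,t2,UN,0,1,IF,+,t3,UN,0,1,IF,+ \
    CDEF:avg=sum,cnt,/ \
    LINE1:avg#0000ff:"avg water in [C]"
```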
Of course this looks clumsy (and I need to generate this by script, 
otherwise I make too many errors).

Any suggestions on how to make this better/more readable/more efficient?

Thanks a lot!

