[rrd-users] Usual RRD question on AVERAGE, MAX and consolidation :)

James Bensley jwbensley at gmail.com
Thu Dec 19 12:08:27 CET 2013


Hi all,

Short version:
Why is AVERAGE producing different values from MAX for the same RRD,
when I am keeping 2 years' worth of data at full resolution and
querying data gathered only a month or two ago?


Full version:

I have some RRDs that keep 5-minute samples for 2 years, for tracking
router/switch/firewall/server interface speeds.

They look like this:

me at server:~$ rrdtool info /var/lib/cacti/rra/router1_traffic_in_11666.rrd
filename = "/var/lib/cacti/rra/router1_traffic_in_11666.rrd"
rrd_version = "0003"
step = 300
last_update = 1386606906
ds[traffic_in].type = "COUNTER"
ds[traffic_in].minimal_heartbeat = 600
ds[traffic_in].min = 0.0000000000e+00
ds[traffic_in].max = 1.0000000000e+09
ds[traffic_in].last_ds = "1601976753"
ds[traffic_in].value = 1.6551294400e+06
ds[traffic_in].unknown_sec = 0
ds[traffic_out].type = "COUNTER"
ds[traffic_out].minimal_heartbeat = 600
ds[traffic_out].min = 0.0000000000e+00
ds[traffic_out].max = 1.0000000000e+08
ds[traffic_out].last_ds = "72738702"
ds[traffic_out].value = 1.5563000000e+04
ds[traffic_out].unknown_sec = 0
rra[0].cf = "AVERAGE"
rra[0].rows = 230400
rra[0].pdp_per_row = 1
rra[0].xff = 5.0000000000e-01
rra[0].cdp_prep[0].value = NaN
rra[0].cdp_prep[0].unknown_datapoints = 0
rra[0].cdp_prep[1].value = NaN
rra[0].cdp_prep[1].unknown_datapoints = 0
rra[1].cf = "MAX"
rra[1].rows = 230400
rra[1].pdp_per_row = 1
rra[1].xff = 5.0000000000e-01
rra[1].cdp_prep[0].value = NaN
rra[1].cdp_prep[0].unknown_datapoints = 0
rra[1].cdp_prep[1].value = NaN
rra[1].cdp_prep[1].unknown_datapoints = 0


As you can see, there are 230400 rows in each RRA, recording data in
5-minute steps: 230400 * 5 = 1,152,000 minutes, and
1,152,000 / 60 / 24 / 365 ≈ 2.19 years.
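The retention arithmetic above can be checked with a few lines of shell (the numbers are taken straight from the rrdtool info output above):

```shell
# Verify that 230400 rows at a 300-second step cover roughly 2 years.
step=300        # seconds per row (from "step = 300")
rows=230400     # rows per RRA (from "rra[0].rows = 230400")

minutes=$(( rows * step / 60 ))   # total minutes of retention
days=$(( minutes / 60 / 24 ))     # total days of retention

echo "$minutes minutes = $days days"   # prints: 1152000 minutes = 800 days
```

800 days / 365 is about 2.19 years, matching the figure quoted above.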

I want full accuracy of data for 2 years. I have a problem, though,
when I generate graphs from this RRD file. I am using Cacti to gather
data from devices via SNMP, and I have modified Cacti to create these
two-year RRDs for me.

Cacti generates a graph like this:

http://i.imgur.com/UZR4QYU.png

When I generate a graph by hand on the CLI, I get this:

http://i.imgur.com/jcXHYu5.png

The difference is that I use "DEF:in="my.rrd":traffic_in:MAX" whereas
Cacti uses "AVERAGE". If I change my data source to AVERAGE, I get the
same graph Cacti produces. Simple enough.
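For reference, the two graph commands differ only in the consolidation function named in the DEF line. A minimal sketch of both invocations, using the RRD path and DS name from the info output above (output filenames and line colours are illustrative):

```shell
RRD=/var/lib/cacti/rra/router1_traffic_in_11666.rrd

# My hand-made graph: request the MAX consolidation function
rrdtool graph max.png \
    DEF:in=$RRD:traffic_in:MAX \
    LINE1:in#FF0000:"traffic_in (MAX)"

# What Cacti requests: the AVERAGE consolidation function
rrdtool graph avg.png \
    DEF:in=$RRD:traffic_in:AVERAGE \
    LINE1:in#0000FF:"traffic_in (AVERAGE)"
```

Since both RRAs here have pdp_per_row = 1, I would have expected the two DEFs to pull back identical values for recent data.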

So why is AVERAGE producing a different value from MAX here, when I am
keeping 2 years' worth of data and querying data gathered only a month
or two ago?


Many thanks all,
James
