[rrd-users] "wrong" highest resolution?

Marco Marongiu brontolinux at gmail.com
Tue Nov 30 11:27:40 CET 2010


Hi all

I have an RRD file from cacti whose info is summarized below (non-relevant
RRAs left out). If my understanding is correct, I would expect to be able
to fetch data with the following parameters:

CF == AVERAGE
--start 1288566000
--end 1291158000
--resolution 300

In fact, if my understanding is right, rra[0] is the archive with the
highest resolution, given that:

rra[0].pdp_per_row = 1
rra[1].pdp_per_row = 6
rra[2].pdp_per_row = 2
rra[3].pdp_per_row = 288
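
Multiplying those by the 300-second step gives what I believe are the
per-RRA resolutions (just a back-of-envelope shell check, assuming
resolution = step * pdp_per_row, as in my PS below):

$ for ppr in 1 6 2 288; do echo "300 * $ppr = $((300 * ppr)) seconds"; done
300 * 1 = 300 seconds
300 * 6 = 1800 seconds
300 * 2 = 600 seconds
300 * 288 = 86400 seconds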

What actually happens is that fetch returns data as if the highest
resolution were 600, not 300 (this happens whether or not I use -r, and
is consistent with what the manual says):

$ rrdtool fetch /tmp/x*.rrd AVERAGE -s 1288566000 -e 1291158000 -r 300 | head
                       cpu_idle

1288566600: nan
1288567200: nan
1288567800: nan
1288568400: nan
1288569000: nan
1288569600: nan
1288570200: nan
1288570800: nan

(the output when not using -r is identical; note that the returned
timestamps are 600 seconds apart).

Start and end times are integer multiples of 300, and so is the interval
(which is trivially true, since it's the difference of two multiples of 300).
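
In case it helps, this is the quick shell check I used for that:

$ echo $(( 1288566000 % 300 )) $(( 1291158000 % 300 )) $(( (1291158000 - 1288566000) % 300 ))
0 0 0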

The info summary is:

filename = "/tmp/x01-03_cpu_idle_3677.rrd"
rrd_version = "0003"
step = 300
last_update = 1290615226
ds[cpu_idle].type = "COUNTER"
ds[cpu_idle].minimal_heartbeat = 600
ds[cpu_idle].min = 0,0000000000e+00
ds[cpu_idle].max = 1,0000000000e+04
ds[cpu_idle].last_ds = "258249960"
ds[cpu_idle].value = 3,5498196667e+05
ds[cpu_idle].unknown_sec = 0
rra[0].cf = "AVERAGE"
rra[0].rows = 600
rra[0].cur_row = 599
rra[0].pdp_per_row = 1
rra[0].xff = 5,0000000000e-01
rra[0].cdp_prep[0].value = NaN
rra[0].cdp_prep[0].unknown_datapoints = 0
rra[1].cf = "AVERAGE"
rra[1].rows = 700
rra[1].cur_row = 699
rra[1].pdp_per_row = 6
rra[1].xff = 5,0000000000e-01
rra[1].cdp_prep[0].value = 3,2111851609e+03
rra[1].cdp_prep[0].unknown_datapoints = 0
rra[2].cf = "AVERAGE"
rra[2].rows = 9216
rra[2].cur_row = 9215
rra[2].pdp_per_row = 2
rra[2].xff = 5,0000000000e-01
rra[2].cdp_prep[0].value = 0,0000000000e+00
rra[2].cdp_prep[0].unknown_datapoints = 0
rra[3].cf = "AVERAGE"
rra[3].rows = 797
rra[3].cur_row = 796
rra[3].pdp_per_row = 288
rra[3].xff = 5,0000000000e-01
rra[3].cdp_prep[0].value = 3,1040640971e+05
rra[3].cdp_prep[0].unknown_datapoints = 0

What am I doing wrong?

Apologies for the rather newbie question, but after reading through the
manual several times, and running a number of tests, I really can't make
sense of this. Should I blame it on the cold that is plaguing me? :)

Thanks

Ciao
--bronto

PS: I am assuming that resolution[i] = step * rra[i].pdp_per_row, and
that the highest resolution is actually

min{ resolution[i] | i = 0 ... N-1 }

Please correct me if my assumption is wrong.
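
With the step and pdp_per_row values above, that would be

min{ 300*1, 300*6, 300*2, 300*288 } = min{ 300, 1800, 600, 86400 } = 300

which is why I expected 300-second data in the first place.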


