[rrd-users] newbie rrd question - probably a very old issue

Dan Gahlinger dgahling at hotmail.com
Fri Aug 31 19:07:44 CEST 2007

I'm not really a newbie to rrdtool, but this seems like a newbie question,

and it has probably been asked since the beginning of time, but I've never
seen a good explanation of what the graph/consolidation functions are
actually doing.

we have this problem and it seems obvious, but we can't seem to figure out 
how to fix it.
We have a data set, a script collects data every two minutes and puts data 
in the database.

we defined the data set to hold 3 months of data as follows:
93 days (3 months x 31 - worst case) x 24 hours x 60 minutes x 60 seconds.

so that's 3 months' worth of data points at one per second, even though we
only insert one sample every 2 minutes.
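For scale, the arithmetic above can be checked directly (these are the numbers from the post; 120 s is the 2-minute sample interval):

```shell
# Rows needed for 93 days if every second got its own slot,
# versus rows needed at the actual 2-minute (120 s) sample rate.
echo $(( 93 * 24 * 3600 ))        # one row per second: 8035200
echo $(( 93 * 24 * 3600 / 120 )) # one row per 120 s sample: 66960
```

In other words, sizing the archive in seconds allocates 120 times more slots than samples ever arrive to fill.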

now technically we have enough data points that creating a graph for 1 day 
or 1 week or 1 month, should have the ability to plot all the values.

but it doesn't.

for example, we do a 1-day graph and the "maximum" is 90.
we do the same graph for 1 week and the "maximum" has magically changed;
do it again for 1 month, and it gets even worse.

when we query the rrd database directly, we see that rrd gives us these 
values, so it's not the graphing function doing it, it's rrd.

but when we check the raw data, we can see all the data points are there.
if we have so many data points, it shouldn't be consolidating the data, but 
it is.
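If the archives were built with only an AVERAGE consolidation function (an assumption on my part, but it would explain exactly this symptom), then whenever several samples are rolled into one consolidated point the peaks get averaged away. A toy illustration with four 2-minute samples collapsing into one bucket:

```shell
# Four hypothetical samples: one spike of 90 among quiet readings of 10.
# AVERAGE consolidation reports 30.0 for the bucket; the true max, 90, is gone.
printf '%s\n' 90 10 10 10 |
  awk '{s+=$1; if($1>m) m=$1} END {printf "max=%d avg=%.1f\n", m, s/NR}'
# prints: max=90 avg=30.0
```

The wider the graph's time span, the more samples per bucket, which is why the reported "maximum" keeps shrinking as you go from 1 day to 1 week to 1 month.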

and it's driving me crazy trying to figure out why.
we thought of using separate data files for min, max and average, but tripling
the data storage and tripling the overhead is killing our server.

can someone tell us what we're doing wrong or give us a suggestion so that 
the data will be reported properly?

I seem to recall this issue, and that there was some setting we could use,
like ABS, but that doesn't seem to fit properly.
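(For what it's worth, one file can hold several RRAs with different consolidation functions, so separate files shouldn't be necessary. A sketch, assuming a single GAUGE data source with the hypothetical name "value", a 120 s step, and the 66960-row count from the sizing above:

```shell
# One RRD, three consolidation functions side by side --
# cheaper than three separate files with one CF each.
rrdtool create data.rrd --step 120 \
    DS:value:GAUGE:240:U:U \
    RRA:AVERAGE:0.5:1:66960 \
    RRA:MIN:0.5:1:66960 \
    RRA:MAX:0.5:1:66960
```

Graphs would then ask for the matching CF explicitly, e.g. `DEF:peak=data.rrd:value:MAX` when plotting maxima.)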


