[rrd-users] VDEF MINIMUM

Alex van den Bogaerdt alex at vandenbogaerdt.nl
Sat Aug 13 04:12:29 CEST 2011


> Yes, but only by a tenth of a unit, but enough to make me investigate.
> When I did testing and found that it was possible for the last 10,000
> seconds to have a much higher minimum than the last 15,000 seconds, I
> lost confidence in the function and moved to fetch-ing and parsing.

The graph should show the same min and max as reported by VDEF, or by the
old-style GPRINT, which still works.
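
For example, both of these should print the same number; this is a
minimal sketch assuming a DEF named noise, as in your graph script:

VDEF:vmin=noise,MINIMUM
GPRINT:vmin:"min\: %.2lf"
GPRINT:noise:MIN:"old-style min\: %.2lf"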

If you'd like to investigate further, here are two suggestions:

1: create linearly increasing data, which makes it very easy to
investigate. A direct relation to time (e.g. a rate equal to the seconds
since the epoch) works best. The maximum rate should then equal the end
time, and the minimum rate should equal the start time plus the step size
of the graph (see the sketch after these suggestions).

2: try to have a small amount of data on the graph, e.g. 100 pixel
columns per RRA row; in other words, a range of four step sizes on a
400-pixel-wide graph. Now examine the start and end of the graph: is
there one pixel column showing a different rate?
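
Here is a sketch of suggestion 1 as shell commands; the file name, DS
name and timestamps below are made up for the example:

# Create an RRD with a 300 second step and one AVERAGE RRA:
rrdtool create linear.rrd --start 1313100000 --step 300 \
    DS:lin:GAUGE:600:U:U \
    RRA:AVERAGE:0.5:1:1000

# Store each timestamp as its own value, so the stored rate
# increases linearly with time:
for t in $(seq 1313100300 300 1313130000); do
    rrdtool update linear.rrd "$t:$t"
done

# MINIMUM should print the start time plus one step, MAXIMUM the
# end time; anything else points at the graph's consolidation:
rrdtool graph linear.png --start 1313100300 --end 1313130000 \
    DEF:lin=linear.rrd:lin:AVERAGE \
    LINE1:lin#0000ff:rate \
    VDEF:lo=lin,MINIMUM \
    VDEF:hi=lin,MAXIMUM \
    GPRINT:lo:"min\: %.0lf" \
    GPRINT:hi:"max\: %.0lf"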

>
> The DEF is the same: DEF:noise=noise.rrd:noise:AVERAGE

When you collect min and max using fetch and sort, you are probably
getting the data at the best resolution possible. If you then display a
lot of data on your graph, I expect the visible minimum and maximum to
smooth out (due to on-the-fly consolidation). So in the end you have the
same problem: min and max on the graph do not equal min and max as
numbers.
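
To put numbers on that, a sketch assuming a 300 second step and the
names from your DEF:

# Min and max at full resolution, over the last 10 days:
rrdtool fetch noise.rrd AVERAGE --start end-10d --end now

# The same range on a 400 pixel wide graph is about 2160 seconds
# per pixel, so roughly seven 300 second rows are averaged into
# each column; the plotted minimum rises and the plotted maximum
# falls compared to the fetch output:
rrdtool graph noise.png --width 400 --start end-10d --end now \
    DEF:noise=noise.rrd:noise:AVERAGE \
    LINE1:noise#ff0000:noise \
    VDEF:lo=noise,MINIMUM \
    VDEF:hi=noise,MAXIMUM \
    GPRINT:lo:"graph min\: %.2lf" \
    GPRINT:hi:"graph max\: %.2lf"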





