[rrd-users] Strange GAUGE values when parsing MySQL output

Marc Powell marc at ena.com
Tue Aug 11 21:56:18 CEST 2009


On Aug 11, 2009, at 2:28 PM, Arthur Meeks Meeks wrote:


> Using GAUGE helps, but not as much as you believe. Internally to
> RRDtool, they're still rates that fall into very specific interval
> buckets. If you want to get out exactly the values you put in, you
> must input each value on an even rrd STEP from the rrd START.
>
> If you are not inputting the values with a timestamp that falls
> exactly on a step (bucket) from the start time, RRDtool will adjust
> the value based on how early or late you are relative to the bucket
> time. If you input early, the value will be increased. If you input
> late, the value will be decreased.
>
> I would like to input the values with a timestamp, as the piece of  
> log I posted in my first email shows,

You do input the values with a timestamp --
> $RRDTOOL update $RRDFILE $TIMESTAMP:$DATA
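
To see the adjustment in action, here is a minimal sketch (the file
name, timestamps, and values are all made up for illustration). A
single update that lands 30 seconds past a step boundary gets
time-weighted across that boundary, so one of the stored values
matches neither input:

    rrdtool create demo.rrd --start 1250014500 --step 300 \
        DS:delay:GAUGE:600:U:U RRA:LAST:0.5:1:10
    rrdtool update demo.rrd 1250014800:0
    rrdtool update demo.rrd 1250015130:10  # 30s past the 1250015100 step
    rrdtool update demo.rrd 1250015400:0
    # the bucket ending 1250015100 is fully covered at rate 10, so it
    # stores 10; the bucket ending 1250015400 stores
    # (30*10 + 270*0)/300 = 1, which is neither 10 nor 0
    rrdtool fetch demo.rrd LAST --start 1250014500 --end 1250015400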

> but I'm not sure if I understood this correctly (I'm a newbie in
> rrdtool :-) ) -- you mean I should change the --start 1? But to
> what?

>     $RRDTOOL create $RRDFILE -s 300 DS:delay:GAUGE:600:U:U RRA:LAST:0.5:1:10000

When you create the rrd file, you tell it to expect input values every
300 seconds, starting from the second the rrd file is created (you
don't specify a start time with --start). These are the bucket
intervals, each of which stores a single value. Let's say that time is
1250014800 in unix seconds. RRDtool will then expect updates at
1250014800, 1250015100, 1250015400, and so on. You show this in your
dump (see the sketch after it for one way to pin the start time
yourself) --

>     <!-- 2009-08-11 20:20:00 CEST / 1250014800 --> <row><v> 1.6333333333e-01 </v></row>
>     <!-- 2009-08-11 20:25:00 CEST / 1250015100 --> <row><v> 3.6666666667e-02 </v></row>
>     <!-- 2009-08-11 20:30:00 CEST / 1250015400 --> <row><v> 0.0000000000e+00 </v></row>
>     <!-- 2009-08-11 20:35:00 CEST / 1250015700 --> <row><v> 0.0000000000e+00 </v></row>
>     <!-- 2009-08-11 20:40:00 CEST / 1250016000 --> <row><v> 0.0000000000e+00 </v></row>
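
If you want the expected update times to land on even multiples of the
step (as they do in your dump), one option is to truncate the creation
time to the step and pass it as an explicit --start. A hypothetical
sketch, reusing the variables from your script:

    # truncate "now" to an even 300s boundary and start the rrd there;
    # the first update must then carry a timestamp later than $START
    START=$(( $(date +%s) / 300 * 300 ))
    $RRDTOOL create $RRDFILE --start $START -s 300 \
        DS:delay:GAUGE:600:U:U RRA:LAST:0.5:1:10000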


> > The problem disappears when the value pulled from MySQL is
> > something like 3989. Then the graph shows 3K, 3.5K, 4K, etc. on
> > the vertical axis, which is correct. But I don't understand why a
> > value like "1" is shown as 100M in the graph.
>
> The value was input very much ahead of the step time?
>
> Probably not; the script runs every 30 seconds, and it took about a
> second to generate the graph.

Certainly so, then. If you tell rrdtool to expect updates every 300
seconds but give it one 30 seconds later, that's very ahead of
time. ;) rrdtool tries to do something intelligent with the 9 or 10
updates you give it between its expected intervals. In the end, you've
given it multiple points of data, earlier than expected, and it tries
to do *something* with that. That *something* will almost certainly
not be what you're expecting, as it will consolidate those points into
one value for storage (I'm fuzzy on the exact mechanics, but I'm
certain I've seen documentation/tutorial material about it; see the
sketch below). The same interpolation happens on the large values too,
but because of their magnitude it's just not obvious to you, I expect.
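
As an example of that consolidation, a minimal sketch (made-up file
name and values) of ten updates, 30 seconds apart, all falling inside
one 300-second step -- the stored data point is the time-weighted
average of the ten inputs, not any single one of them:

    rrdtool create many.rrd --start 1250014800 --step 300 \
        DS:delay:GAUGE:600:U:U RRA:LAST:0.5:1:10
    for i in 1 2 3 4 5 6 7 8 9 10; do
        rrdtool update many.rrd $(( 1250014800 + i * 30 )):$i
    done
    # the bucket ending 1250015100 holds (1+2+...+10)/10 = 5.5
    rrdtool fetch many.rrd LAST --start 1250014800 --end 1250015100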

In the end, if you can't input on exact steps from the start time (and
you likely won't), you just need to get as close as you can and accept
that the stored values will be very close, but not exact.
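
One cheap way to get close, assuming your updater stays a shell script
like the one you posted (the truncation line is my addition): round
the timestamp down to the current step before updating, and make sure
the script fires only once per step, since rrdtool rejects a second
update carrying the same timestamp:

    # align the update to the 300s bucket boundary at or before "now"
    TIMESTAMP=$(( $(date +%s) / 300 * 300 ))
    $RRDTOOL update $RRDFILE $TIMESTAMP:$DATA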

--
Marc


