[rrd-users] [rrd] Can't I turn off data smoothing for GAUGE?

Philip Peake philip at vogon.net
Thu Aug 19 18:43:10 CEST 2010


 Yes, you are right ... that was a bit strong :-)

However, I started by reading the intro, the tutorial etc. and came away
with a certain impression.
I then went through the rather steep learning curve (well, started on it;
I'm not finished by a long way yet!), got something running, and it
didn't behave at all like the mental picture I had built up while working
through those intro/tutorial notes.

I was ready to give up, or hack the code (which, although I am an open
source believer, I don't think is really the right answer, unless my
needs are really far away from the original objectives).

Anyway, thanks to your suggestion I now have it doing what I want (still
think the way it actually behaves for GAUGE is broken...).

My data-gathering script is in Perl.
It has an inner loop which sleeps for 30 seconds, so the processing time
on top of the sleep pushes each update past the 30-second boundary.

What I did was:

    $t = time();
    $t = $t - ($t % 30);    # round down to the previous 30-second boundary

and use $t in place of "N".

Looking at the stored values, they are now whole numbers (well,
floating-point representations of them).
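
For anyone else hitting the same thing, the loop now looks roughly like the
sketch below. This is only a sketch: it assumes the RRDs perl bindings, and
the filename, value and the read_sensor() stub are placeholders rather than
my actual script.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use RRDs;    # perl bindings shipped with rrdtool

    my $rrd  = 'mydata.rrd';    # placeholder filename
    my $step = 30;              # must match the --step the RRD was created with

    sub read_sensor { return 42 }    # stand-in for however the real value is fetched

    while (1) {
        my $value = read_sensor();
        my $t = time();
        $t -= $t % $step;                    # snap back to the previous step boundary
        RRDs::update($rrd, "$t:$value");     # aligned timestamp instead of "N"
        if (my $err = RRDs::error) {
            warn "rrdtool update failed: $err\n";
        }
        sleep $step;
    }

Since every update lands exactly on a step boundary, rrdtool has no interval
to interpolate across, which is why the stored values now come out whole.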

Thanks again for the suggestion.

Philip

---------------------------
On 8/19/2010 9:04 AM, Simon Hobson wrote:
> At 08:37 -0700 19/8/10, Philip Peake wrote:
>
>>>  And something no-one has mentioned so far - you do not have to use
>>>  "now" in your update statement. Taking the example above, you can
>>>  compute what the time was at the last integral multiple of "step" and
>>>  use that. Eg, if you end up calling rrd tool at 00:00:07, then
>>>  instead of being 7 seconds late, you could use a timestamp of
>>>  00:00:00.
>>>
>>>  Obviously you may lose some precision in timing, but if your primary
>>>  concern is to avoid normalisation then overall you gain.
>>>
>> That may actually be the best solution.
>>
>> I have no control over how fast the remote systems respond with their
>> data, and under heavy load they will be slower.
>>
>> Doing it this way, I can tolerate quite slow response ...
>>
>> The other problem I was mulling over in my mind is that the data
>> gathering process might (will) be re-started from time to time -- how
>> would it ever find out the exact time of the first update so that it
>> could time its own to be exactly N minutes, to the second, later.
>>
>> Forcing the first entry on some convenient boundary, then calculating
>> boundary multiples and using those would fix my problem.
>
> And adding to what you wrote earlier :
> At 07:07 -0700 19/8/10, Philip Peake wrote:
>> Tell me how I synchronize a data source to RRD's concept of sample times?
>> Actually, what is RRD's concept of sample times? How does it determine
>> the start?
>> I would assume time starts with the first entry (or is it the time it is
>> created?).
>
> As Tobias wrote earlier, all times in rrd are relative to Unix epoch 
> - midnight 1st Jan 1970. You need know nothing of when the database 
> was created to know when the next/previous step boundaries occurred - 
> they are simply an integer number of step periods from Unix epoch. 
> Thus 300s steps will always be on the hour, 5 past, 10 past and so 
> on. 2 hour steps will always be at midnight, 2am, 4am, and so on (all 
> UTC).
>
> A quick look shows that in my graphing routines (Bash script), I use 
> the following to get graph end times that are on a step boundary 
> (${Step} is set elsewhere according to the graph being drawn):
> Etime=`/bin/date +%s`
> Etime=$(( ${Etime} / ${Step} * ${Step} ))
>
>
>
>>  Then for a huge class of problems, actually, IMHO, most real world
>> problems not involving rate of change, it's useless.
> That's a bit strong I think. RRD Tool isn't intended to be the right 
> tool for all jobs; it is designed to do ONE job WELL. It is known and 
> already acknowledged that it isn't the correct tool for a great many 
> jobs - and I'd suggest it is NOT the right tool for the task that 
> started this thread.
>
> The fact that other outside constraints mean that using another tool 
> is either difficult or not possible doesn't give you grounds for 
> complaining that RRD Tool isn't the right tool for your job.
>
> And I would very much dispute your assertion that "most real world 
> problems not involving rate of change" are not suitable for RRD Tool. 
> It handles gauge data types just fine - as long as your requirements 
> fit in with the normalisation and consolidation model RRD Tool uses. 
> I use RRD Tool for two main jobs at work - one is rate based (network 
> traffic flows), the other is gauge based (temperatures). It works 
> equally well for both.
>
> Your argument is like saying "I don't have any drills, it's the 
> hammer's fault that it doesn't make tidy holes".
>
