[smokeping-users] Centralized Monitoring

King, Michael MKing at bridgew.edu
Tue Dec 21 00:31:00 MET 2004


I have 6 sites where we want to monitor the load speed of certain web
pages (Google, Yahoo, etc.; pages that provide a decent baseline).

Somewhere in the docs it is mentioned that the --filter option was
written for just this reason.

Create 7 boxes with identical config files, filter the data collection
on each site, and somehow get the RRD data back to the central host
(rsync is what I think the docs said).
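If I'm reading the docs right, each remote box would run something like
this (the section name and paths here are made up, so adjust to taste):

/usr/bin/smokeping --config=/etc/smokeping/config --filter=/Site1/

with a cron job on each box pushing the RRD files back to the central
host, e.g.:

rsync -az /var/lib/smokeping/ central.host.com:/var/lib/smokeping/site1/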

I was wondering, however: would creating a RemoteCurl probe accomplish
pretty much the same thing?  Granted, the 6 boxes would still have to be
there, but they would no longer be running an instance of SmokePing.

Granted, I have very little experience with Perl, but with the
RemoteFPing and Curl probes already written, I figure I should be able
to create some mutant progeny of the two.
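As I understand it, the Curl probe boils down to running something like
this locally (the -w format is my guess at the right way to get a
single latency number out of curl; I haven't checked exactly what the
probe passes):

/usr/bin/curl -o /dev/null -s -w '%{time_total}\n' http://www.google.com/

so the mutant would just wrap that in ssh the same way RemoteFPing
wraps fping, and parse the time_total value off stdout.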

Before I run off and spend a few days doing this, I wanted to make sure
that:

A.  This can't be accomplished some other, "better" way (i.e., I'm
forgetting some crucial point and the rsync solution is better).
B.  The functionality doesn't already exist.

I was playing around, and on one of my RemoteFPing hosts I passed it a
command line of:
/usr/bin/ssh -l root remote.host.com /usr/local/bin/echoping -t 50 -h /
-t 50 -A -a -n 20 www.google.com:80

That returned data, so I know the approach should fundamentally work.

Unfortunately, when I tried curl at the command line the same way, it
gave errors.
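My guess is that this is a quoting problem: ssh hands the remote
command to a shell, so curl's -w format string needs an extra layer of
quotes to survive, along the lines of:

/usr/bin/ssh -l root remote.host.com "/usr/bin/curl -o /dev/null -s -w '%{time_total}\n' http://www.google.com/"

That is just a guess, though; the path to curl is whatever it is on the
remote box.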


Thanks

Mike

--
Unsubscribe mailto:smokeping-users-request at list.ee.ethz.ch?subject=unsubscribe
Help        mailto:smokeping-users-request at list.ee.ethz.ch?subject=help
Archive     http://www.ee.ethz.ch/~slist/smokeping-users
WebAdmin    http://www.ee.ethz.ch/~slist/lsg2.cgi


