[rrd-users] Re: Updating 1000's of rrd files
Jason Fesler
jfesler at gigo.com
Mon Jul 5 17:54:03 MEST 2004
> I need to collect some data from ~2,000 to 3,000 devices (perhaps more)
> and put it into rrd files. about 100 files per device. The code to
> collect all the data can run in ~2-3 min. Also, code to update one
> rrd file ~5,000 times can run in ~20 sec.
Can you consolidate the 100 files per device into a smaller number
of files with multiple DS's? That cuts the I/O cost considerably.
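A minimal sketch of that consolidation, building one `rrdtool create` command with several DS's rather than one file per metric. The DS names, step, and RRA here are illustrative assumptions, not anything from the original post:

```python
# Sketch: fold several per-device metrics into one RRD with multiple
# DS's instead of one file each. Names, step, and RRA are assumptions.
import shutil
import subprocess

def create_cmd(path, ds_names, step=60):
    """Build an 'rrdtool create' command defining one GAUGE DS per metric."""
    cmd = ["rrdtool", "create", path, "--step", str(step)]
    for name in ds_names:
        # heartbeat = 2 * step, unbounded min/max
        cmd.append("DS:%s:GAUGE:%d:U:U" % (name, step * 2))
    cmd.append("RRA:AVERAGE:0.5:1:1440")  # one day at 1-minute resolution
    return cmd

cmd = create_cmd("device42.rrd", ["ifInOctets", "ifOutOctets", "cpu"])
if shutil.which("rrdtool"):          # only execute if rrdtool is installed
    subprocess.run(cmd, check=True)
```

One file with 100 DS's means one open/seek/write per device instead of 100, which is where the savings come from.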
> I assume this is due to the time to update so many files (in the test case
> it's just one file over and over, so it's very fast).
That's because you're hitting the OS cache: updating one file over and
over stays in memory, while updating thousands of distinct files forces
real disk seeks.
> I assume it would go much faster if I could put all of the data
> sources for each device in one file, but new data sources will get
> added all the time, and I see no way to add a DS to an rrd file.
That's a serious flaw, and a serious thorn in my ass.
That said, you do need to consolidate at least some of those down to
improve performance, and add as many spindles as you can afford.
You can also try running with a certain amount of parallelism. Depending on
your gear and OS this might buy you some improvement. Try breaking the
job up into 2-4 pieces and benchmark running them in parallel.
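A sketch of that split-and-benchmark idea, under the assumption that each worker would shell out to rrdtool per file (the worker body here is a placeholder so the sharding logic stays visible):

```python
# Sketch of "break the job into 2-4 pieces and run them in parallel".
# update_shard is a stand-in for the real per-file rrdtool update.
from concurrent.futures import ThreadPoolExecutor

def shard(files, n):
    """Deal the file list round-robin into n roughly equal pieces."""
    return [files[i::n] for i in range(n)]

def update_shard(files):
    # real code would run e.g. subprocess.run(["rrdtool", "update", f, "N:..."])
    # for each f in files; threads work here because the real work happens
    # in child processes and disk I/O, not in Python
    return len(files)

files = ["dev%03d.rrd" % i for i in range(1000)]
with ThreadPoolExecutor(max_workers=4) as ex:
    done = list(ex.map(update_shard, shard(files, 4)))
print(sum(done))  # every file lands in exactly one shard
```

Benchmark with 2, 3, and 4 workers; past the point where the disks are saturated, more parallelism just adds seek contention.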
Or, start throwing money at the problem. I run my big rrd boxes as RAID 10,
and I've currently got 56 IDE spindles distributed over 4 hosts to do our
collection. I'm updating fifty thousand rrd files once a minute; the
average file has 10 DS values in it.
Last thing that comes to mind is to make sure that you're not loading
rrdtool once per file :-) (I doubt you are, but it is worth checking!).
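One way to avoid paying that per-file startup cost is rrdtool's pipe mode (`rrdtool -`), which reads commands from stdin in a single long-lived process. A hedged sketch, with placeholder paths and values:

```python
# Sketch: keep one long-lived "rrdtool -" process and stream update
# commands to its stdin, instead of forking rrdtool once per file.
import shutil
import subprocess

def update_line(path, timestamp, values):
    """Format one 'update' command for the rrdtool pipe."""
    return "update %s %s:%s\n" % (
        path, timestamp, ":".join(str(v) for v in values))

line = update_line("device42.rrd", "N", [1234, 5678])
if shutil.which("rrdtool"):          # only execute if rrdtool is installed
    proc = subprocess.Popen(["rrdtool", "-"],
                            stdin=subprocess.PIPE, text=True)
    proc.stdin.write(line)           # in real use, stream thousands of these
    proc.stdin.close()
    proc.wait()
```

The same applies to scripting languages: load the rrdtool binding (or start the pipe) once, then loop over files, rather than exec'ing a fresh rrdtool for every update.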