On Thu, Jun 21, 2012 at 1:55 PM, Sean Carolan <scarolan(a)gmail.com> wrote:
> I wrote a basic Python CGI script to parse cobbler data and generate
> some graphs. The script works fine, but it takes a really long time to
> run due to the large number of hosts stored in cobbler's *.json files.
> Any suggestions for improving performance? I know I can convert the
> *.json data and import it into a MySQL database, but thought I'd check
> here first before rewriting the code.
How many systems are we talking about, and what kind of disk is the
server running on? If it's a slower SATA type, you might look at
relocating the /var/lib/cobbler directory to another partition,
especially if your server is busy (lots of HTTP requests to /var/www,
for instance).
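Before moving anything around, it might be worth timing the raw parse to
confirm that's really where the time goes. Something quick and dirty like
this might help (just a sketch; it assumes the stock
/var/lib/cobbler/config/systems.d layout, so adjust the path if your
install differs):

    import glob
    import json
    import time

    # Assumed layout: cobbler 2.x keeps one JSON file per system here.
    SYSTEMS_DIR = "/var/lib/cobbler/config/systems.d"

    start = time.time()
    records = []
    for path in glob.glob(SYSTEMS_DIR + "/*.json"):
        with open(path) as fh:
            records.append(json.load(fh))

    print("parsed %d system records in %.2f seconds"
          % (len(records), time.time() - start))

If that loop alone accounts for most of the runtime, faster disk (or
fewer re-reads per request) will help; if not, the bottleneck is
somewhere else in the script.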
Beyond that, you might look at one of the cobbler NoSQL backends
(couch and mongo); however, those are not widely used and probably not
ready for prime time yet. Dumping the data into one of the various
NoSQL solutions would probably be easier anyway, since JSON is well
supported there and you wouldn't have to worry about MySQL schemas
(unless you plan to use MySQL as a bastardized key/value store, which
is quite common too).
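For what it's worth, the dump-into-mongo route could be as small as
something like this (again just a sketch: it assumes a reasonably recent
pymongo, a mongod running locally, and made-up database/collection
names):

    import glob
    import json

    from pymongo import MongoClient

    # Hypothetical names: a "cobbler" database with a "systems" collection.
    client = MongoClient("localhost", 27017)
    systems = client["cobbler"]["systems"]

    docs = []
    for path in glob.glob("/var/lib/cobbler/config/systems.d/*.json"):
        with open(path) as fh:
            docs.append(json.load(fh))

    if docs:
        systems.insert_many(docs)
    print("loaded %d system records" % len(docs))

After that the CGI script can query mongo for whatever it needs instead
of re-reading every file on each request.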
A sample of the script might be useful too, to see how you're
requesting the data.