Generating a report from 14 other SQL Servers

  • I have 14 SQL Server 2000 instances. (Yes, I know: old stuff, and we should upgrade.)

    Each server has an automated task that executes a stored procedure, which writes a very simple report to a text file. The report is stored at c:\foo\report.txt.

    I need to consolidate this to a single point and email the results to a distribution list. How I do this is entirely up to me. I have both a Unix and a Windows machine available that can talk to each SQL Server. I know Perl. I know the database side of SQL Server pretty well (strong Sybase background), but I don't know much about SSIS or some of the other cool tools in SQL Server.

    My first task is to simply get the 14 text files and mail them out.

    My second task is to take the text data (or the output from the stored procedure) and create a summary table in another database (my preference would be MySQL).

    So how can I do this?

    My first thought was a Perl program (on either Windows or Unix) using ODBC to talk to the 14 servers, run the stored procedure, pull down the data, store it in MySQL, and generate the email. Is there a better way?
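
    Roughly what I'm picturing, as an untested sketch (assumes DBD::ODBC and DBD::mysql are installed; the server names, proc name, credentials, and MySQL table are placeholders I made up):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        my @servers = ('SQLBOX01', 'SQLBOX02');   # ...hypothetical names for the 14 instances
        my $mysql = DBI->connect('dbi:mysql:database=reports;host=localhost',
                                 'reportuser', 'secret', { RaiseError => 1 });

        for my $srv (@servers) {
            # DSN-less ODBC connection to each SQL Server 2000 instance
            my $mssql = DBI->connect("dbi:ODBC:Driver={SQL Server};Server=$srv;Database=foo",
                                     'reader', 'secret', { RaiseError => 1 });

            # run the stored procedure and pull back its result set
            my $sth = $mssql->prepare('EXEC dbo.usp_daily_report');   # hypothetical proc name
            $sth->execute;
            while (my @row = $sth->fetchrow_array) {
                # assumes the proc returns exactly two columns (col1, col2)
                $mysql->do('INSERT INTO summary (source_server, col1, col2) VALUES (?, ?, ?)',
                           undef, $srv, @row);
            }
            $mssql->disconnect;
        }
        $mysql->disconnect;
        # emailing the consolidated result would hang off the end of this (e.g. via MIME::Lite)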

    Thanks,

    John

  • A better way? No... different ways, yes.

    You are going to use the tools you know and get the job done. I don't see a problem with that at all.

    My background is limited to VB6/.NET and SQL, so I'd do something similar with ODBC linked servers and .NET. The steps you're planning would be the same; I'd just use a different set of tools.

    Others might argue that you could use this tool or that to replace a certain step in your plan, but if you KNOW how to use a tool, I'd use it before trying something else. Only if there were a performance or maintenance issue down the road would I consider trying something else out. Just make sure that if you were to get hit by lightning, or get promoted or something, it's easy and straightforward for someone else to take over.

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. Along with your description of the problem, that lets us provide a tested, verifiable solution to your question. Asking the question the right way gets you a tested answer the fastest way possible!

  • Thanks Lowell,

    Yeah, I'm just concerned about ODBC from Perl to SQL Server; I've never done that. But I guess using the tools I know (Perl, DBI, MySQL, etc.) is likely the best way to go.
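
    From what I can tell from the DBD::ODBC docs, a bare-bones connectivity check would look something like this (untested on my end; the driver name, server, and credentials are placeholders):

        use strict;
        use warnings;
        use DBI;

        # minimal "can I even connect?" test against one instance
        my $dbh = DBI->connect(
            'dbi:ODBC:Driver={SQL Server};Server=SQLBOX01;Database=foo',
            'reader', 'secret',
            { RaiseError => 1, PrintError => 0 },
        );
        my ($version) = $dbh->selectrow_array('SELECT @@VERSION');
        print "Connected: $version\n";
        $dbh->disconnect;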

    Thanks and I'm still looking for other ideas.

    John.

  • By the way, have you tried using a linked server to consolidate them?

    Since your instances are part of the MS SQL Server Database Engine, I think you can use a linked server to solve this issue.

    I would recommend using a synonym if you go the linked server route.

    Regards,

    Eko Indriyawan

  • Yes, I had thought of that and discarded it as too much of a hassle, primarily because of network issues. The 14 servers are pretty isolated from each other, and I'd have to offer many dead chickens to our network people to get firewall holes opened (and even then the answer might be no, or "...maybe by Christmas").

    But (go figure) I do have both a Unix and a Windows box (the Windows box has no SQL Server on it) that can get to each of the 14 SQL Servers.

    I'm wondering whether ODBC + Perl from Unix will be easier than ODBC + Perl from Windows. Any thoughts?
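
    My understanding (untested) is that from Unix I'd either go through unixODBC + FreeTDS with DBD::ODBC, or skip ODBC entirely and use DBD::Sybase built against FreeTDS, which would feel familiar given my Sybase background. A rough sketch of the DBD::Sybase route, with a made-up freetds.conf entry name:

        use strict;
        use warnings;
        use DBI;

        # 'SQLBOX01' would be a server entry defined in freetds.conf,
        # pointing at the SQL Server 2000 instance's host and port 1433
        my $dbh = DBI->connect(
            'dbi:Sybase:server=SQLBOX01;database=foo',
            'reader', 'secret',
            { RaiseError => 1 },
        );
        my $sth = $dbh->prepare('EXEC dbo.usp_daily_report');   # hypothetical proc name
        $sth->execute;
        while (my @row = $sth->fetchrow_array) {
            print join("\t", @row), "\n";
        }
        $dbh->disconnect;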

    But thanks.

    John.

  • I'm sorry, I have never used Perl before.

    These days I only use Delphi for programming.

    I hope you or someone else here can solve your issue.

    Regards,

    Eko Indriyawan

  • If you know Perl and have connectivity, why not just copy each text file to your local machine? I'd consolidate the data on that one machine first. Make it a set of single-purpose jobs (loosely coupled), so if one fails it doesn't impact the others, and it's easily extensible. Have a parameter to your Perl job that tells it to go to machine xxx and grab its file.
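
    Something along these lines, as a rough, untested sketch (it assumes the admin shares are reachable as UNC paths; the server and path names are made up):

        use strict;
        use warnings;
        use File::Copy;

        # pass the server name in as a parameter: perl grab_report.pl SQLBOX01
        my $server = shift @ARGV or die "usage: grab_report.pl <server>\n";

        my $remote = "\\\\$server\\c\$\\foo\\report.txt";   # admin share; a mapped drive would also work
        my $local  = "C:\\reports\\$server-report.txt";

        copy($remote, $local)
            or die "copy of $remote failed: $!\n";
        print "pulled $remote -> $local\n";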

    Second, I'd make a local database and then import the data into it. I assume you have a standard set of data, so as you import it, add the server name (source server) and a timestamp.

    Now you can summarize that data, and more importantly, you can detect if you don't get data from a machine on a particular day.
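
    A rough, untested sketch of that load-and-check step against MySQL (the table, columns, and file names are placeholders you'd define yourself):

        use strict;
        use warnings;
        use DBI;

        my @servers = ('SQLBOX01', 'SQLBOX02');   # hypothetical list of the 14 instances
        my $dbh = DBI->connect('dbi:mysql:database=reports;host=localhost',
                               'reportuser', 'secret', { RaiseError => 1 });

        # load each server's copied report file, tagging rows with source and date
        for my $srv (@servers) {
            my $file = "C:\\reports\\$srv-report.txt";
            open my $fh, '<', $file or do { warn "no file for $srv: $!\n"; next };
            while (my $line = <$fh>) {
                chomp $line;
                $dbh->do('INSERT INTO report_lines (source_server, report_date, line) VALUES (?, CURDATE(), ?)',
                         undef, $srv, $line);
            }
            close $fh;
        }

        # flag any server that produced no rows today
        for my $srv (@servers) {
            my ($n) = $dbh->selectrow_array(
                'SELECT COUNT(*) FROM report_lines WHERE source_server = ? AND report_date = CURDATE()',
                undef, $srv);
            warn "WARNING: no data from $srv today\n" if $n == 0;
        }
        $dbh->disconnect;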

  • Thanks, that's a great suggestion. I can just pull the files over using scp/FTP or whatever works on these Windows boxes (hoping I can find something that'll work), and then I don't even need to fool with DBI. I'll give this a try Monday.

    John.

  • Yep, that's the approach I'm going to take. I haven't had time to tackle this yet; I'll probably get serious with it later this week.
