August 11, 2005 at 2:53 am
I am relatively new to using DTS, so I will give you a breakdown of what I am trying to achieve.
I have two SQL servers, one on a local machine, and one in a remote location acting as our live server.
I have an admin application which amends the local database and when these amendments are complete I want the editors to instigate a DTS package to upload the changes.
I have tried using DTS Import/Export wizard to export the whole local database, but this always seems to conflict with my FK constraints.
Is there an easy way to just upload differences between two databases?
I have also tried deleting the data from the relevant tables and then uploading it again, but I cannot seem to get an 'on success' workflow item to link two data transformation tasks.
Any advice is gratefully sought.
August 11, 2005 at 3:49 am
I presume that you are not capturing the changes themselves - you just know which tables are affected and want to send those to your 'live' server? This sounds a bit strange - almost as if you want to overwrite what is on your live server, which doesn't sound very live to me.
If you really are just overwriting what is on the live server, why not just automate a backup/restore process using an Agent, having your live server as a 'linked' server to make this easier?
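As a rough sketch of that backup/restore idea (the database name, file share and paths here are all invented, and the live server would need read access to wherever the backup file lands):

```sql
-- On the local server: back up the admin database to a shared location.
BACKUP DATABASE AdminDB
    TO DISK = '\\fileshare\backups\AdminDB.bak'
    WITH INIT;

-- On the live server, e.g. as a scheduled Agent job step:
-- kick out any connections, then restore over the existing copy.
ALTER DATABASE AdminDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE AdminDB
    FROM DISK = '\\fileshare\backups\AdminDB.bak'
    WITH REPLACE;
ALTER DATABASE AdminDB SET MULTI_USER;
```

Note this replaces the whole database, not just the affected tables, so it only fits if nothing on the live copy needs to survive.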
Alternatively, perhaps you could look into replication?
I do not know why deleting everything from the live server's tables and then reloading does not work. Perhaps there are FK problems (you are not performing the deletion in the correct logical order). Perhaps there are identity columns that do not match. We would have to go into a lot more detail to work it out.
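To illustrate the FK-ordering point with made-up tables (Order/OrderItem are purely hypothetical, with OrderItem referencing Order): deletes must run child-to-parent, and reloads parent-to-child, or the constraints will fire:

```sql
-- Delete in child-to-parent order so no FK is ever left dangling.
DELETE FROM OrderItem;   -- child table (references [Order])
DELETE FROM [Order];     -- parent table

-- Reload in the reverse, parent-to-child order.
INSERT INTO [Order]    SELECT * FROM LocalDB.dbo.[Order];
INSERT INTO OrderItem  SELECT * FROM LocalDB.dbo.OrderItem;
```

If the tables have identity columns, the inserts would also need explicit column lists and SET IDENTITY_INSERT to keep the key values matching.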
Are the differences purely new records? Or is it a mixture of new, amended and deleted records? Is there a modify/creation date on the tables that you can use to select newly created/amended records?
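If such a column does exist, picking up the new and amended rows could be as simple as this (ModifiedDate, the Product table and the sync-log idea are all assumptions):

```sql
-- Hypothetical: each table carries a ModifiedDate column, and the
-- time of the last successful upload is remembered somewhere.
DECLARE @LastUpload datetime;
SET @LastUpload = '2005-08-10';  -- in practice, read from a sync-log table

SELECT *
FROM dbo.Product
WHERE ModifiedDate > @LastUpload;  -- only rows touched since the last upload
```

A modified-date column cannot tell you about deleted rows, though - those would need separate tracking, such as a trigger writing to an audit table.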
I'm starting to go round in circles here. A bit more info please!
Phil
August 11, 2005 at 5:47 am
Phil
Basically the admin system makes changes to the local database (inserts, updates and deletes). Then when they are happy they will commit these changes to the live database. Now all I want to happen is that the changes from the relevant tables are uploaded (I won't know which changes they have made, just the tables which may have been affected.)
If I just do a straight copy transformation it will add all the new records but duplicate all the existing information in the tables. So that is why I set up a delete task first.
At one time I had set up a task which got 90% of the way to achieving what I wanted. In order to comply with the FK constraints I had to delete several tables' worth of data from Live and then upload all the new data. But in DTS Designer, where I had two data transformation tasks one after another, I could not tell DTS to only copy the second table after all the data was copied to the first table. It would not let me apply an 'on success' between two data transformation tasks.
I hope this makes what I am trying to achieve slightly clearer.
August 11, 2005 at 6:07 am
That is somewhat clearer, though I am still a bit puzzled about how you are prepared to wipe all of the data that exists in some of the tables in the 'live' database. Aren't any changes being made independently to this database that you would like to keep?
I am not a DTS expert, so cannot help resolve your problem with parallel transformation execution. But if it were me, I think I would write the whole lot in a stored procedure - which does give you the required control over the execution flow - and then schedule that to run as often as required.
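A rough sketch of that stored-procedure idea, assuming the live server has been set up as a linked server called LIVESRV and that the table names are invented; unlike the DTS designer, a procedure runs its statements strictly in order:

```sql
CREATE PROCEDURE dbo.usp_PushToLive
AS
BEGIN
    SET XACT_ABORT ON;   -- abort the whole batch on any error
    BEGIN TRANSACTION;

    -- 1. Clear the live tables, children before parents,
    --    so the FK constraints are never violated.
    DELETE FROM LIVESRV.LiveDB.dbo.OrderItem;
    DELETE FROM LIVESRV.LiveDB.dbo.[Order];

    -- 2. Reload them, parents before children. The procedure
    --    guarantees the sequencing that the designer would not.
    INSERT INTO LIVESRV.LiveDB.dbo.[Order]
        SELECT * FROM dbo.[Order];
    INSERT INTO LIVESRV.LiveDB.dbo.OrderItem
        SELECT * FROM dbo.OrderItem;

    COMMIT TRANSACTION;
END
```

A cross-server transaction like this relies on MSDTC being available between the two servers; without it, drop the transaction wrapper and accept that a mid-run failure leaves the live tables partially loaded.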
Phil
August 11, 2005 at 6:15 am
Not in the tables affected - these only contain information for display. The other tables, which may be updated on the 'Live' server, would remain unaffected, and the batch process which then copies these back down to the 'Local' database would ensure that they remain intact.