Copy tables from one db to another db

  • I have a user who needs to truncate a number of tables in DB2 and then make a backup of some tables from DB1 before a lengthy import process begins.

    What I would like to do is have the user call an SP that truncates the destination tables in DB2 and then copies the data from DB1.

    I have read quite a bit about least privilege and how to set up a truncate_proxy user without a login. That part I got working. What does not work is the actual copying of the data. I keep getting:

    "The server principal "truncate_proxy" is not able to access the database "DB1" under the current security context.

    What users/permissions do I need to set up on DB1 in order to be able to copy the data?

    I have total control of the database and can set whatever permissions/rights/users/etc. are necessary.

    Thanks.

    __________________________________________________________________________________________________________
    How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/

  • Don't give the privs to the user.  Instead, write a stored procedure that uses EXECUTE AS OWNER and give the user privs to execute that stored procedure.
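
    A minimal sketch of that pattern, assuming a hypothetical dbo.Orders table that exists in both databases (adjust the names and column lists to suit):

    -- Created in DB2 by its owner; the table and column names are hypothetical.
    USE DB2;
    GO
    CREATE PROCEDURE dbo.CopyTablesFromDB1
    WITH EXECUTE AS OWNER
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE dbo.Orders;

        INSERT INTO dbo.Orders (OrderID, CustomerID, OrderDate)
        SELECT OrderID, CustomerID, OrderDate
        FROM   DB1.dbo.Orders;
    END;
    GO

    -- The caller needs nothing beyond EXECUTE on the procedure:
    GRANT EXECUTE ON dbo.CopyTablesFromDB1 TO truncate_proxy;

    One caveat: by default the impersonated context is sandboxed to DB2, which is exactly the error you're getting.  The cross-database SELECT only works once DB2 is trusted (TRUSTWORTHY or certificate signing, covered in the Sommarskog article linked below).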

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I like Jeff's approach here, but my preference when moving large amounts of data from table A to table B in a different database is to do it outside of SQL Server.  The main reason is that I don't want to bog down tempdb or my main system with tasks that can be offloaded to a different tool.

    Just my opinion on this, but I feel this is a good use case for SSIS.  SSIS first truncates the destination tables, then pulls the source tables into memory, and finally pushes them to the destination.  There are some things to watch out for with SSIS (such as schema changes), but you can set this up as a job and schedule it, or have the end user run the job to move the data.  You take on the same permissions risk here with regards to moving the data, but you get the hidden benefit of some built-in logging.

    Another nice thing about going with the SSIS approach is that you could use SSIS to do your new data import as well.
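
    If you go the job route, the end user needs no rights on DB1 or DB2 at all.  A sketch, assuming the package has been deployed and wired up to an Agent job (the job name and login below are hypothetical):

    -- Members of SQLAgentOperatorRole in msdb can start (but not edit) any job.
    USE msdb;
    GO
    CREATE USER [YourDomain\ImportUser];   -- assumes a matching server login exists
    ALTER ROLE SQLAgentOperatorRole ADD MEMBER [YourDomain\ImportUser];
    GO

    -- The end user can then kick off the copy on demand:
    EXEC msdb.dbo.sp_start_job @job_name = N'Copy DB1 tables to DB2';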

    Personally, I am not a big fan of the "copy data to another table" backup method/process and much prefer the full database backup approach.  If the method the end user is using to push data in is untested or not trusted, I prefer having the end user push the data in on a test system and make sure the import worked as expected before repeating it on live.  That way I have a known good version of the table (on live), a testable version of the table (on the test system), and test results I can use to validate that things worked as expected.  If I hose the test database, I restore from backup.

     

    The above is all just my opinion on what you should do. 
    As with all advice you find on a random internet forum, you shouldn't blindly follow it.  Always test on a test server to see if there are any negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

  • Jeff Moden wrote:

    Don't give the privs to the user.  Instead, write a stored procedure that uses EXECUTE AS OWNER and give the user privs to execute that stored procedure.

    That is the approach I am taking. It is the easiest way given what I need to do. In my research on this, I came across a deep-dive article from Erland Sommarskog: http://www.sommarskog.se/grantperm.html#execasserver.  I'm still reading and making my way through the examples!

    Thank you.
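
    For anyone who finds this later, the quick-and-dirty route from that article is the TRUSTWORTHY setting.  A sketch, assuming both databases are owned by the same login (sa here is an assumption); certificate signing is the safer alternative Erland recommends for production:

    -- Assumption: DB1 is also owned by sa, so the dbo of DB2 maps to a login
    -- that has rights in DB1.
    ALTER AUTHORIZATION ON DATABASE::DB2 TO sa;
    ALTER DATABASE DB2 SET TRUSTWORTHY ON;
    -- EXECUTE AS OWNER inside DB2 can now reach across into DB1.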


  • Mr. Brian Gale wrote:

    Personally, I am not a big fan of the "copy data to another table" backup method/process and much prefer the full database backup approach.  If the method the end user is using to push data in is untested or not trusted, I prefer having the end user push the data in on a test system and make sure the import worked as expected before repeating it on live.  That way I have a known good version of the table (on live), a testable version of the table (on the test system), and test results I can use to validate that things worked as expected.  If I hose the test database, I restore from backup.

    I agree with you that the full database backup is the way to do backups. I have daily full backups with incremental backups every hour and even a differential thrown in there. But in this case, the import is not just a straight import. There is a lot of processing on the data being read in before it is written to the database. It is easier to copy the 8 affected tables to another database and then do the import. If there is a problem with the import or it crashes, I can quickly restore just those 8 tables.

     


  • For the cost of a little disk space, you can create staging tables in DB2.  Load to the staging tables.  You then don't need a separate backup of the original tables because they're still there, and accessible, during the load.

    Once you've verified the load, you can swap the staging tables in as the "real" tables.  You can do this with object renames or a SWITCH.  Or, if you really wanted to, you could TRUNCATE the main table and load the staging table into it, but that would be a lot slower.

    If you use object renames, my preferred method, you can put the staging tables on a separate filegroup, so they never even affect the physical storage of the main table.  You just need very brief exclusive locks, long enough to rename the objects.
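
    A sketch of the rename swap, with a hypothetical dbo.Orders loaded into dbo.Orders_Staging first:

    -- Swap the freshly loaded staging table in as the live table.
    BEGIN TRAN;
        EXEC sp_rename 'dbo.Orders',         'Orders_Old';
        EXEC sp_rename 'dbo.Orders_Staging', 'Orders';
    COMMIT;

    -- Or, if both tables are identical in structure and share a filegroup,
    -- a metadata-only switch (the target must be empty):
    -- ALTER TABLE dbo.Orders_Staging SWITCH TO dbo.Orders;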

    SQL DBA, SQL Server MVP (07, 08, 09)
    A socialist is someone who will give you the shirt off *someone else's* back.

  • I would probably use BCP.
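
    Something along these lines, with hypothetical server and table names (-n keeps the data in native format, -T uses Windows authentication); truncate the DB2 tables first, e.g. with sqlcmd:

    rem Export from DB1 in native format, then load into DB2.
    bcp DB1.dbo.Orders out Orders.dat -n -S MyServer -T
    bcp DB2.dbo.Orders in  Orders.dat -n -S MyServer -T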

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/
