Setup SQL Server on Amazon RDS

  • Great, Tim.

    That is basically what I did. The only issue I had was with SQL Data Compare. Now that I've gotten feedback on how to relocate the deployment script, I may just give this a try, at least on some of the larger tables.

    I am aware of some of the quirks with SSIS import/export and have dealt with them in the past. I did more than my share of "data only" restores from one database to another.

    I need to run some benchmarks with SSIS and BCP to see which one is going to perform faster. My working time window is 12 hours. The last time I initialized the database it took about 36 hours, non-stop.
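    For reference, the kind of bcp round trip I plan to benchmark looks roughly like this (server names, paths, and credentials below are placeholders; the RDS side needs SQL authentication, so -U/-P instead of -T):

        REM export from the on-premises source in SQL Server native format
        bcp SourceDb.dbo.BigTable out C:\staging\BigTable.dat -n -S OnPremServer -T

        REM import into the RDS instance in batches so the transaction log can keep up
        bcp TargetDb.dbo.BigTable in C:\staging\BigTable.dat -n -b 10000 -h "TABLOCK" -S myinstance.XXXXXXXX.us-east-1.rds.amazonaws.com -U admin_user -P <password>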

    I'm up for the challenge.

    Thanks for your input.

    Kurt

    Kurt W. Zimmerman
    SR DBA
    Lefrak Organization
    New York, NY

    http://www.linkedin.com/in/kurtwzimmerman

  • Good luck!

    Best Regards

    Tim


  • Kurt W. Zimmerman (1/20/2015)


    I need to run some benchmarks with SSIS and BCP to see which one is going to perform faster. My working time window is 12 hours. The last time I initialized the database it took about 36 hours, non-stop.

    Hi Kurt. If you are moving data from a regular instance of SQL Server into either Amazon RDS or Azure SQL Database, one option is to try the DB_BulkCopy stored procedure in the SQL# SQLCLR library (which I wrote). This proc is in the Free version and is essentially the same thing as bcp. It might be slightly faster since it doesn't need to transfer the data through an intermediary server (assuming that SSIS / BCP / anything else would not be running on the source server itself). And being a stored proc makes it very easy to run for several tables from a wrapper proc and/or a SQL Agent Job 🙂

    Take care,

    Solomon..

    SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
    Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
    Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
    Info sites: Collations • Module Signing • SQLCLR

  • Solomon Rutzky (1/21/2015)


    Kurt W. Zimmerman (1/20/2015)


    I need to run some benchmarks with SSIS and BCP to see which one is going to perform faster. My working time window is 12 hours. The last time I initialized the database it took about 36 hours, non-stop.

    Hi Kurt. If you are moving data from a regular instance of SQL Server into either Amazon RDS or Azure SQL Database, one option is to try the DB_BulkCopy stored procedure in the SQL# SQLCLR library (which I wrote). This proc is in the Free version and is essentially the same thing as bcp. It might be slightly faster since it doesn't need to transfer the data through an intermediary server (assuming that SSIS / BCP / anything else would not be running on the source server itself). And being a stored proc makes it very easy to run for several tables from a wrapper proc and/or a SQL Agent Job 🙂

    Take care,

    Solomon..

    Thanks Solomon... I will have to check it out. I'll test it this evening.

    Kurt

    Kurt W. Zimmerman
    SR DBA
    Lefrak Organization
    New York, NY

    http://www.linkedin.com/in/kurtwzimmerman

  • Solomon, just wondering: how does your DB_BulkCopy compare in performance to creating an SSIS EXPORT/IMPORT package?

    Kurt

    Kurt W. Zimmerman
    SR DBA
    Lefrak Organization
    New York, NY

    http://www.linkedin.com/in/kurtwzimmerman

  • Kurt W. Zimmerman (1/21/2015)


    Solomon, just wondering: how does your DB_BulkCopy compare in performance to creating an SSIS EXPORT/IMPORT package?

    Kurt, I have no idea what the performance difference is. I do know that most of the actual work of moving the data is handled by the SqlBulkCopy functionality in the .NET Framework, which might be the same code that SSIS uses. If so, then it should be about the same. Obviously a stored procedure is not going to be multi-threaded like SSIS, but as I said before, the data transfer goes directly from SQL Server to the destination without passing through SSIS and/or another server that might be hosting SSIS.

    I also know that I can write a simple cursor to loop through the tables in a database and call DB_BulkCopy for each one (or even a subset or whatever) in the time it would take me to open BIDS (Business Intelligence Development Studio), create a new SSIS package, and place the first task in it ;-).
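    Something along these lines is all it takes; the parameter names I'm showing for DB_BulkCopy are just illustrative placeholders, so check the SQL# documentation for the exact signature:

        DECLARE @SchemaName sysname,
                @TableName  sysname,
                @FullName   nvarchar(300),
                @DestConn   nvarchar(4000) = N'Server=my-rds-endpoint;Database=TargetDb;...';  -- placeholder connection string

        -- walk every user table in the source database
        DECLARE TableList CURSOR LOCAL FAST_FORWARD FOR
            SELECT s.name, t.name
            FROM sys.tables AS t
            INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
            ORDER BY s.name, t.name;

        OPEN TableList;
        FETCH NEXT FROM TableList INTO @SchemaName, @TableName;

        WHILE @@FETCH_STATUS = 0
        BEGIN
            SET @FullName = QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName);

            -- push this table directly to the destination
            -- (parameter names below are placeholders, not the proc's documented signature)
            EXEC SQL#.DB_BulkCopy
                 @SourceTable           = @FullName,
                 @DestinationTable      = @FullName,
                 @DestinationConnection = @DestConn;

            FETCH NEXT FROM TableList INTO @SchemaName, @TableName;
        END;

        CLOSE TableList;
        DEALLOCATE TableList;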

    I like the concept of SSIS and think it is really powerful, but if I can avoid using SSIS then I avoid it.

    Take care,

    Solomon..

    SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
    Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
    Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
    Info sites: Collations • Module Signing • SQLCLR

  • Solomon Rutzky (1/22/2015)


    Kurt W. Zimmerman (1/21/2015)


    Solomon, just wondering: how does your DB_BulkCopy compare in performance to creating an SSIS EXPORT/IMPORT package?

    Kurt, I have no idea what the performance difference is. I do know that most of the actual work of moving the data is handled by the SqlBulkCopy functionality in the .NET Framework, which might be the same code that SSIS uses. If so, then it should be about the same. Obviously a stored procedure is not going to be multi-threaded like SSIS, but as I said before, the data transfer goes directly from SQL Server to the destination without passing through SSIS and/or another server that might be hosting SSIS.

    I also know that I can write a simple cursor to loop through the tables in a database and call DB_BulkCopy for each one (or even a subset or whatever) in the time it would take me to open BIDS (Business Intelligence Development Studio), create a new SSIS package, and place the first task in it ;-).

    I like the concept of SSIS and think it is really powerful, but if I can avoid using SSIS then I avoid it.

    Take care,

    Solomon..

    Thanks, Solomon. The neat thing about creating the SSIS package is that you don't need to go into BIDS to do it; you can just use the IMPORT/EXPORT wizard. However, I did find that this has limitations and may not always work.

    I agree with you; simply spinning through a cursor is easy to write. What I did was develop a control table that contains the name of each table and its status. This way I can run multiple copies of the same script without table contention.

    I did this with the syncing process I wrote and had as many as 8-10 processes running at the same time. I had to watch the disk I/O, memory usage, and NIC utilization because it was easy to swamp those resources and bring everything to a halt.
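    A rough sketch of the control table and the "claim the next table" step (names here are just illustrative, not the exact ones I used): each copy of the script grabs one pending row under UPDLOCK/READPAST so concurrent copies skip rows another session has already claimed.

        -- illustrative control table; one row per table to be copied
        CREATE TABLE dbo.CopyControl
        (
            TableName  sysname     NOT NULL PRIMARY KEY,
            Status     varchar(20) NOT NULL DEFAULT ('Pending'),  -- Pending / InProgress / Done
            ClaimedBy  sysname     NULL,
            ClaimedAt  datetime    NULL
        );

        -- claim step run by each copy of the script: atomically take one pending table
        DECLARE @TableName sysname;

        UPDATE TOP (1) cc
           SET Status     = 'InProgress',
               ClaimedBy  = ORIGINAL_LOGIN(),
               ClaimedAt  = GETDATE(),
               @TableName = cc.TableName
          FROM dbo.CopyControl AS cc WITH (UPDLOCK, READPAST, ROWLOCK)
         WHERE cc.Status = 'Pending';

        -- @TableName is now either NULL (nothing left to do) or the table this session
        -- should copy; mark the row 'Done' once the copy finishes.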

    I attempted to use your DB_BulkCopy last evening with no success. Maybe I can drop you a PM with the errors I was receiving so we can sort this out.

    Again thanks.

    Kurt

    Kurt W. Zimmerman
    SR DBA
    Lefrak Organization
    New York, NY

    http://www.linkedin.com/in/kurtwzimmerman

  • Kurt W. Zimmerman (1/22/2015)


    I agree with you; simply spinning through a cursor is easy to write. What I did was develop a control table that contains the name of each table and its status. This way I can run multiple copies of the same script without table contention.

    I did this with the syncing process I wrote and had as many as 8-10 processes running at the same time. I had to watch the disk I/O, memory usage, and NIC utilization because it was easy to swamp those resources and bring everything to a halt.

    Great! Then you already have the perfect setup for automating this.

    I attempted to use your DB_BulkCopy last evening with no success. Maybe I can drop you a PM with the errors I was receiving so we can sort this out.

    Yes, that would be great.

    Take care,

    Solomon..

    SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
    Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
    Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
    Info sites: Collations • Module Signing • SQLCLR
