SQL Database Migration: Lower version of SQL Server to Higher version

  • Comments posted to this topic are about the item SQL Database Migration: Lower version of SQL Server to Higher version

  • Dathuraj Pasarge (2/9/2015)


    Comments posted to this topic are about the item SQL Database Migration: Lower version of SQL Server to Higher version

    >> What if we have to migrate 50+ or 100+ user databases from one SQL instance to another SQL instance, well, the best possible method and solutions are outlined over here to ease the administrator activity.

    Even if you have that many databases, I would still go for backup/restore. Why do you think detach/attach will save time? You still end up copying the .mdf and .ldf files.

    Backup/restore is more secure and reliable: if anything goes wrong (in the worst case), at least you still have a backup you can restore to recover your database(s).

    For backup/restore, you can use third-party software such as Redgate's SQL Backup Pro, or, if you are on SQL Server 2008 R2, native backup compression is supported in Standard Edition as well.

    There are many pre- and post-migration steps that you should mention as part of your "SQL Database Migration: Lower version of SQL Server to Higher version".
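
    A minimal sketch of the backup/restore path discussed above (database name, file paths, and logical file names are all placeholders, not from the article):

    ```sql
    -- On the source instance: take a full backup
    -- (COMPRESSION requires 2008 R2 Standard Edition or later)
    BACKUP DATABASE [MyUserDB]
    TO DISK = N'D:\Backups\MyUserDB.bak'
    WITH COMPRESSION, CHECKSUM, STATS = 10;

    -- Copy the .bak file to the destination server, then restore there.
    -- The logical names below are assumptions; check them with RESTORE FILELISTONLY.
    RESTORE DATABASE [MyUserDB]
    FROM DISK = N'E:\Backups\MyUserDB.bak'
    WITH MOVE N'MyUserDB'     TO N'E:\Data\MyUserDB.mdf',
         MOVE N'MyUserDB_log' TO N'F:\Logs\MyUserDB_log.ldf',
         STATS = 10;
    ```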

    ______________________________________________________________________________________________________________________________________________________________________________________
    HTH !
    Kin
    MCTS : 2005, 2008
    Active SQL Server Community Contributor 🙂

  • Great post, and I learned a few things -- like updating usage and statistics. Since reading this I've determined that my SQL Server 2000 DBs have many usage discrepancies, which I will remedy after migrating them to 2008.

    Interesting point about backup/restore -- it's the method I've always used. I will soon be migrating about 200 DBs in one instance, and if I have time, I'll try to benchmark both methods.

    Detach/attach only copies each file once, whereas backup/restore has to copy them twice. Furthermore, if you use Robocopy instead of Xcopy, it has a multi-threading option which may speed up the copy even further.

    However I already have a backup/restore procedure working and so that's my default method if I'm not allocated time to experiment.
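
    For the file-copy step, a Robocopy invocation along these lines enables multi-threading (paths are hypothetical, and this assumes both files sit in one source directory; /MT is available on Windows 7 / Server 2008 R2 and later):

    ```
    :: Copy the detached data and log files to the destination share with 8 threads
    robocopy D:\Data \\DestServer\Data$ MyUserDB.mdf MyUserDB_log.ldf /MT:8
    ```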

    Meanwhile, I was trying out some of the scripts, and ran into a problem on #6. Trying to save the generated script into a procedure (which seems the logical place to save it), I got:

    Msg 154, Level 15, State 1, Procedure Migrate6_Usage, Line 9

    a USE database statement is not allowed in a procedure, function or trigger.

    So instead of generating code like this

    USE [dbname]

    DBCC updateusage(0)

    the script should generate

    DBCC updateusage([dbname])

    I suppose this issue may pop up in some of the other scripts also.
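
    The fix above can be generalized. This is only a sketch of how script 6 could generate the per-database statements without any USE; iterating sys.databases is my assumption, not necessarily how the article's script is structured:

    ```sql
    -- Generate one DBCC UPDATEUSAGE statement per user database,
    -- avoiding USE (which is not allowed inside a stored procedure)
    DECLARE @SQL nvarchar(max) = N'';
    SELECT @SQL = @SQL + N'DBCC UPDATEUSAGE([' + name + N']);' + CHAR(13)
    FROM sys.databases
    WHERE database_id > 4;   -- skip master, tempdb, model, msdb
    PRINT @SQL;              -- review first, then EXEC (@SQL)
    ```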

  • Hi Kin,

    Thanks for the reading.

    The time spent taking a database backup is probably about equal to detaching a database and copying its files to the destination server. With detach/attach we avoid the restore time, because attaching a database takes only a few seconds. Importantly, if your destination server doesn't have the same drives and folder structure for the database files, we have to spend a lot of extra time with the WITH MOVE option while restoring the database.

    The given Xcopy does a copy & paste, not a cut & paste, so reattaching the databases on the source server takes just a second or two.
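
    A minimal sketch of the detach/copy/attach sequence described here (database name and file paths are placeholders):

    ```sql
    -- On the source instance: detach, leaving the .mdf/.ldf files in place
    EXEC sp_detach_db @dbname = N'MyUserDB';

    -- Copy the files to the destination server (Xcopy/Robocopy keep the originals,
    -- so the source database can be reattached in seconds), then on the destination:
    CREATE DATABASE [MyUserDB]
    ON (FILENAME = N'E:\Data\MyUserDB.mdf'),
       (FILENAME = N'F:\Logs\MyUserDB_log.ldf')
    FOR ATTACH;
    ```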

    Thanks,

    Dathuraj

  • OK, shortly after my earlier post, I discovered the USE-inside-stored-procedure issue did indeed occur elsewhere -- in script 7. Here the solution isn't quite as simple, because EXEC sp_updatestats doesn't have a variant (that I can find) that allows passing the database as a parameter. However a little Googling produced a solution, described here:

    http://stackoverflow.com/questions/8166608/use-database-inside-a-stored-procedure

    It turns out you CAN actually have a USE statement inside a stored procedure, if it's embedded inside an EXEC, like this:

    EXEC('USE [dbname]; sql-statement-here')

    So to generate this code, all that's needed is to change this line in script 7

    SELECT @SQL = 'USE [' + @DB +']' + CHAR(13) + 'EXEC sp_updatestats' + CHAR(13)

    to this:

    SELECT @SQL = 'EXEC(''USE [' + @DB +']; EXEC sp_updatestats'')' + CHAR(13)

    It looks like the same sort of approach can be used in script 8, which also generates USE statements.
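
    Putting the corrected line into context, here is a sketch of how script 7 might iterate the user databases with the EXEC('USE ...') workaround; the cursor structure is my assumption about the script, not a quote from it:

    ```sql
    -- Run sp_updatestats in every user database from one stored procedure
    DECLARE @DB sysname, @SQL nvarchar(max);
    DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
        SELECT name FROM sys.databases WHERE database_id > 4;
    OPEN dbs;
    FETCH NEXT FROM dbs INTO @DB;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- USE is legal here because it is inside the dynamic batch
        SELECT @SQL = 'EXEC(''USE [' + @DB + ']; EXEC sp_updatestats'')' + CHAR(13);
        EXEC (@SQL);
        FETCH NEXT FROM dbs INTO @DB;
    END
    CLOSE dbs;
    DEALLOCATE dbs;
    ```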

  • Hi, first of all thank you for sharing.

    I'm on the way to implement the same kind of thing but all under stored procedures.

    I have a problem with the xcopy command: it asks me whether it's a file or a directory but does not actually copy the file... (Does xxx specify a file name…)

    Any idea why it does this?

    Edit: I found this post, which says to use copy instead.

  • jelias 69485 (4/1/2015)


    I have a problem with the xcopy command: it asks me whether it's a file or a directory but does not actually copy the file... (Does xxx specify a file name…)

    Any idea why it does this?

    If you put a slash after the directory name it should know it's a directory and not ask.

    Alternatively, you can use the /I switch, which, according to the Xcopy help text, acts as follows: "If destination does not exist and copying more than one file, assumes that destination must be a directory."

    This implies yet another way: Apparently if you create the destination directory first, it won't have to ask.

    Alternatively, check out Robocopy (short for Robust Copy) which comes with newer versions of Windows and is kind of like a super-Xcopy.

  • Thanks for the reply.

    Instead, I used PowerShell's Copy-Item cmdlet.

    Excellent scripts! Thanks for sharing them. I did run into one issue, however: if the database has multiple data files, the generated sp_attach_db statements do NOT include the additional files. Thanks.

    Lee
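
    One way to handle the multi-file case is to generate the attach statement from sys.master_files instead of hard-coding one .mdf and one .ldf. This is only a sketch (the database name is a placeholder, and note that sp_attach_db is deprecated in favor of CREATE DATABASE ... FOR ATTACH):

    ```sql
    -- Build a CREATE DATABASE ... FOR ATTACH that lists every file of the database
    DECLARE @SQL nvarchar(max);
    SELECT @SQL = N'CREATE DATABASE [MyUserDB] ON ' + STUFF((
        SELECT N', (FILENAME = N''' + physical_name + N''')'
        FROM sys.master_files
        WHERE database_id = DB_ID(N'MyUserDB')
        ORDER BY file_id
        FOR XML PATH('')), 1, 2, N'') + N' FOR ATTACH;';
    PRINT @SQL;   -- review, then run on the destination after copying the files
    ```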

  • Lots of good info, thanks.

    Again, thank you for this incredible document. Very helpful. I've had to perform several migrations/upgrades recently and can suggest a few additions to an already useful document:

    01. In addition to checking and scripting linked server connections on the original SQL instance you should check to see if any OTHER SQL instances have linked connections TO the original SQL instance. These would need to be updated after the migration and coordinated with impacted users.

    (You could run this query on the other SQL instances: select * from sys.servers where name = '<Original SQL Instance>')

    02. Check for and script any SERVER triggers. (select * from sys.server_triggers)

    03. In addition to scripting any user objects that exist in the system databases, you should also script any database users that exist in the system databases (i.e. members of SQL Agent roles or DatabaseMailUserRole in msdb).

    04. I'm not sure of an easy way to check this one, but I had a problem where a SQL Agent job on another SQL instance had a step that referenced the SQL instance I was migrating. Just something to consider in planning.
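
    Items 01 and 02 above can be combined into one pre-migration sweep to run on each related instance (the instance name is a placeholder):

    ```sql
    -- 01: linked servers on this instance that point at the instance being migrated
    SELECT name, data_source
    FROM sys.servers
    WHERE is_linked = 1
      AND (name = N'OldSqlInstance' OR data_source = N'OldSqlInstance');

    -- 02: server-level triggers that would need to be re-scripted on the new instance
    SELECT name, type_desc, is_disabled
    FROM sys.server_triggers;
    ```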

    Thanks again for sharing your work with us.

    Lee

  • For future visitors, you can use https://dbatools.io/ - it's a PowerShell-based tool that helps migrate an entire instance or selected databases from one server to another, including everything: linked servers, logins, jobs, etc.

    Very powerful and I use it as a default tool for migrations.
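
    A typical invocation looks something like the sketch below; server names and the share path are placeholders, and this assumes the dbatools module is already installed (Install-Module dbatools):

    ```powershell
    # Migrate one database via backup/restore, staging the backup on a shared path
    Copy-DbaDatabase -Source OldSqlInstance -Destination NewSqlInstance `
        -Database MyUserDB -BackupRestore -SharedPath '\\fileserver\migrate$'
    ```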

    Note: This is a good tool too, but PowerShell makes life easy.

    Thanks,

    Kin

