Scripting objects into TFS automatically

  • I'm looking for a way to automatically script database objects into TFS nightly. I've looked at ApexSQL Diff, Red Gate, and Randolph, yet none of them even work: Randolph is riddled with bugs, ApexSQL doesn't work with native TFS, and Red Gate just doesn't have anything to do this.

    Any suggestions?

    Thanks

  • Are you trying to get the CREATE statements checked into TFS every night? I'm not sure why you'd do this. Really, you only check in changes, so you'd just be looking for changed objects.

    The Red Gate tool requires the developer to do this, though any developer can see whether an item in a db has changed against VCS.

    If you want to automate some check to see whether things are not getting checked in, really you are auditing. You might like this article: https://www.simple-talk.com/sql/database-administration/auditing-ddl-changes-in-sql-server-databases/. It can probably be adapted for TFS.

    Disclosure: I work for Red Gate

  • Steve Jones - SSC Editor (2/5/2014)


    Are you trying to get the CREATE statements checked into TFS every night? I'm not sure why you'd do this. Really, you only check in changes, so you'd just be looking for changed objects.

    The Red Gate tool requires the developer to do this, though any developer can see whether an item in a db has changed against VCS.

    If you want to automate some check to see whether things are not getting checked in, really you are auditing. You might like this article: https://www.simple-talk.com/sql/database-administration/auditing-ddl-changes-in-sql-server-databases/. It can probably be adapted for TFS.

    Disclosure: I work for Red Gate

    Thanks Steve. The problem is that the developers don't want to be bothered with remembering to check in their code from Visual Studio. So they want all objects (changed only, of course) checked into TFS on a nightly basis. I've looked at the Red Gate tool, and it requires the developer to do something. I'm also looking at database projects in Visual Studio, which basically seem to do the same thing as the Red Gate tool.

  • Yeah, most of the tools are integrated with the developers' actions. However, this is what VS requires: they make commits manually in VS. If they don't, or they don't add comments, that's an issue, and it should be addressed by management. That's some of the power of VCS: comments that explain things.

    The same thing occurs with the VS/RG tools. You make a change, you test it, you think it works, so you commit it.

    The automatic thing can work, but it loses comments, and when you're looking back for issues you have to review all the code rather than seeing which things were changed. IMHO, if you're not adding those comments, you're not being professional; it's lazy. Auto check-ins are a bad idea. Auditing is a good idea, because people forget, and we understand that, but choosing to just not try is bad.

  • Agreed. I was just writing a C# app using the TFS API to do the automatic check-ins and figured, why can't they just do it themselves?

    So my question to you then (if I were to look at Red Gate's tool) is: say we're doing agile development, can multiple C# developers utilize a single database project at the same time? Even if some developers are using different C# projects but pointing to the same database?

  • Yes. We usually recommend dedicated databases so schema changes from one developer don't break another's work, but it does fine with shared databases. Objects changed by other developers will appear on each developer's system, so they might get confused about whether they changed an object, but the check-in process will show who committed things once they're changed.

  • So there isn't a need to script the entire contents of a database. Basically each developer would only script into their project those objects which they will utilize, correct?

  • No, or not really.

    Here's how the items should work. When I implement VCS, I check in my entire C# project or script out and check in my entire database schema. We're in sync.

    The DB exists on a server instance. It's there, and it's a central point of access for everyone. I make a change to a schema object in SSMS. That change only exists in my editor.

    Once I execute the change, I've then changed the server. If this is my own server, I only affect my own work. If it's a shared server, I potentially impact other developers and slow them/interrupt them.

    In both cases, I should have a script file (even using the GUI, save the script file) that I can save and commit to VCS. This is either a new commit or an update to an existing object. This happens the same as a C# commit from VS. The RG tooling (or VS tooling) makes this easier from one front end as opposed to the VCS client, but in either case, I manually commit.

    At this point VCS looks like the database (or the C# project). We're now in sync.

    The idea is I repeat this over and over, making changes that cause my project or database to become out of sync with VCS and then I sync it back up. Tooling makes this easier, but it needs to be done.

    There's no need to script my project or db objects constantly and determine what's changed. I check in things when I change them and the compile/testing says they're good enough to check in.

    Auditing is a process that periodically (probably daily) looks at VCS and my project/database and determines what's out of sync. If something is, I send a note to the manager/developer to check things in.

    Tooling is there to make it easy for a developer to see what he/she has changed, or what others have changed by looking at VCS -> project/db comparisons and determining differences.

    If you work with a dedicated db for each developer, the RG tools will easily pull the entire project from VCS and drop it onto an instance/db for the developer.
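The audit step described above (periodically compare VCS against the live database and flag what's out of sync) can be sketched generically. This is a minimal, hypothetical illustration, not any Red Gate or TFS tooling: it assumes you can already get each object's definition as a name-to-script mapping from both the working copy and the database, and the `find_drift` helper and its drift categories are my own names for the comparison.

```python
import hashlib

def _digest(script: str) -> str:
    # Hash normalized text so whitespace-only differences don't count as drift.
    normalized = "\n".join(line.rstrip() for line in script.strip().splitlines())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_drift(vcs_objects: dict, db_objects: dict) -> dict:
    """Compare object scripts checked into VCS against scripts from the live db."""
    drift = {"changed": [], "missing_from_vcs": [], "missing_from_db": []}
    for name, script in db_objects.items():
        if name not in vcs_objects:
            drift["missing_from_vcs"].append(name)   # created in db, never checked in
        elif _digest(vcs_objects[name]) != _digest(script):
            drift["changed"].append(name)            # checked in, but out of date
    for name in vcs_objects:
        if name not in db_objects:
            drift["missing_from_db"].append(name)    # dropped, or never deployed
    return drift
```

A nightly job could run this and mail the non-empty lists to the manager/developer, which is the "send a note to check things in" step without any automatic commits.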

  • Here's what I think you could do:

    Use PowerShell to check out a folder, and then script out any objects.

    Then use PowerShell again to add any new items, and check in the whole folder with a comment like "changes found" plus the date; it would automatically ignore items that did not change.

    I googled "powershell check in TFS" and this was the first worthy link:

    http://social.msdn.microsoft.com/Forums/en-US/f796a935-9760-4590-9335-9213be764f2f/documentation-for-using-tfs-with-powershell?forum=tfspowertools

    Once that is working, it should be easy to create a scheduled task that runs once a day or so.

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!
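The "ignore items that did not change" part of the approach above can be sketched independently of PowerShell. A minimal sketch, assuming the objects have already been scripted out into a name-to-script mapping; the `sync_folder` helper and the one-file-per-object layout are my own illustration, not part of any TFS tooling. It rewrites only files whose content actually changed, so a follow-up check-in of the folder (e.g. `tf add` for the new files, then `tf checkin` with a comment) picks up just the real differences:

```python
from pathlib import Path

def sync_folder(objects: dict, folder: str) -> dict:
    """Write scripted objects into a working folder, touching only real changes.

    Returns which files were added, updated, or left alone, so the caller
    can decide what to hand to the version-control client.
    """
    root = Path(folder)
    root.mkdir(parents=True, exist_ok=True)
    result = {"added": [], "updated": [], "unchanged": []}
    for name, script in objects.items():
        target = root / f"{name}.sql"
        if not target.exists():
            target.write_text(script, encoding="utf-8")   # brand-new object
            result["added"].append(target.name)
        elif target.read_text(encoding="utf-8") != script:
            target.write_text(script, encoding="utf-8")   # definition changed
            result["updated"].append(target.name)
        else:
            result["unchanged"].append(target.name)       # leave untouched
    return result
```

Run from a scheduled task, the returned "added" and "updated" lists are exactly what a check-in step would need; everything in "unchanged" can be skipped.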

  • I read some recent SQL blogs where the comments indicated at least a few people had put together DDL triggers and PowerShell scripts to extract changes from a database each night and check them into source control. Unfortunately, I don't think I kept track of them, and people hadn't published the scripts themselves.

    If I were in your position, assuming the database structure was simple enough (i.e. not multiple databases referencing each other, < 1000 tables/procedures), I would probably investigate switching to the free SQL Server Data Tools: importing the existing structure into a project, getting it up and running, and changing all of your developers over (depending on how many there are).

    It provides an SSMS-style GUI for modifying tables, maintains the corresponding scripts itself, and has full source control support. I'm also fairly certain it can compare what's in the repository against the existing database structure to find people who have made unauthorized changes (though its intended purpose there is to generate change scripts for you to deploy).

    The only problem is that this could be a lot of work, and some of the imported scripts might need editing (i.e. SSDT doesn't like ".." references and wants a ".dbo." in between). There is also very little published documentation on using it in larger scenarios and managing the whole SDLC process... sure, they tell you about generating a DACPAC, but I really need a more comprehensive workflow than that.
