My opinions about version control systems are surely controversial, and that is fine by me.
Let me start with this. I consider complete models, including the associated seed data, part of a solution's source code, and as such they should be storable and versionable, isolated from the actual operational data.
As for VCS itself, every use I have seen over the last few decades worked as a change-log system and did virtually nothing with respect to maintaining and/or defining versions (let alone upfront).
Let us be real and admit that versions in practice are very ill-defined. Hell, even individual changes/issues often are. In database terms, does adding a non-clustered index for performance reasons constitute a new version? Does it even count as official when it was not part of the design during the modelling phase? What about configured values in lookup tables, which are usually scripted at design time? Do such changes always matter to the applications running on top of a database? Do the applications and the database always need to be in sync and only together form a version?
There really is no "one" good answer here! Even when one thinks there is, there are huge "buts", and there will be costs/compromises associated with that belief. And even then, reality will pop the dream eventually!
Yet I really do like the idea of having easy access to snapshots or an automated change log of the models, constraint definitions, indexes, and base tables with important lookup values of my databases. It really should be a feature in SQL Server itself. Call it "pure model backup": simply tag specific tables (and/or records) as an essential part of the model/design itself. Automated check-in of snapshot scripts, or of deltas between said scripts, would be a logical extension to this, and it would be fairly straightforward to build.
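The snapshot-plus-delta idea needs no vendor tooling to sketch. Assuming you already have two scripted-out model snapshots (the SQL text below is purely illustrative, as are the file names), Python's standard-library `difflib` can produce the change-log delta that would be checked in:

```python
import difflib

# Two hypothetical "pure model backup" snapshots, e.g. scripted out
# from the database's tables, constraints and indexes at two points in time.
snapshot_v1 = """\
CREATE TABLE dbo.Customer (
    CustomerId INT NOT NULL PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);
"""

snapshot_v2 = """\
CREATE TABLE dbo.Customer (
    CustomerId INT NOT NULL PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);
CREATE NONCLUSTERED INDEX IX_Customer_Name ON dbo.Customer (Name);
"""

def schema_delta(old: str, new: str) -> str:
    """Return a unified diff between two model snapshot scripts."""
    return "".join(difflib.unified_diff(
        old.splitlines(keepends=True),
        new.splitlines(keepends=True),
        fromfile="model_snapshot_v1.sql",
        tofile="model_snapshot_v2.sql",
    ))

print(schema_delta(snapshot_v1, snapshot_v2))
```

The diff surfaces exactly the kind of change discussed above (a non-clustered index added after the fact), which a human can then judge: is this a new version, or merely an operational tweak?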
Overall, and here comes the controversy, current version control systems are really used as file and file-collection revision systems. People submit changes to a certain branch, complete or not, working or not, just to "not lose their work". It is a form of saving one's work regardless of its finalization state. Even when done in specific branches, those branches are either very coarse (debug/release) or very ill-defined themselves. It all ends up being rather cosmetic, and as ill-defined as the definition of versions.
As such, over the years I developed a dislike for certain so-called "best practice" techniques and tools. More often than not, following/using them costs a ton of time and brings lots of discussion and new problems, while honestly they are just liked for the "feel good" / "feel safe" factors, and thus loved mostly by the most inexperienced.
I have come to see such things as distractions from the actual substance people need to bring to their work. Writing software of questionable quality, but doing so following an almost religious recipe that most will follow and then feel good/superior about, has become an epidemic in my eyes.
Tools are nice and all, but they should not define our work. Tools are a means to an end; professionals need to be able to define their own process, and they need to be in control (not the tool)! When a tool brings high costs in price, preconditions, or technical constraints, a less advanced, more down-to-earth solution looks preferable to me.
We do not live in a perfect world, and it would be much better if more people realized that and did not try to make it perfect, one way or another. Because rest assured, every one of us will get "perfect" wrong in some crucial way!