3

I've got a reasonably large and complicated DB which I need to upgrade in the field from version 1 to version 2. There are a lot of changes in schema and, importantly, data between the two.

Yes, I know this should have been version controlled, à la:
http://www.codinghorror.com/blog/2008/02/get-your-database-under-version-control.html but it wasn't - it will be when I am done.

So, current problem: I'm faced with the choice of either going through all the commits or trying to diff between the two versions of the DB. So far I've tried:
http://opendbiff.codeplex.com/
http://www.sqldelta.com/
http://www.red-gate.com/

However, none of them seem to be able to successfully generate schema upgrade scripts, because they don't also handle the data at the same time. This results in foreign key violations when adding new keys to tables: the table being referenced is new, and while its schema has been created, the data it contains has not been inserted yet. Well, the data could be there, but that requires me to use a different part of the tool and then stitch the two scripts together.
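To make the failure mode concrete, here's a minimal sketch (hypothetical names, not my actual schema) of the kind of statement the generated scripts choke on:

    -- Status is a new lookup table in v2; Orders already has data in the field.
    CREATE TABLE dbo.Status (
        StatusId int NOT NULL PRIMARY KEY,
        Name     nvarchar(50) NOT NULL
    );

    -- Suppose Orders.StatusId has already been populated for existing rows.
    -- This fails: the schema script runs before the data script loads Status,
    -- so the still-empty Status table can't satisfy the WITH CHECK validation.
    ALTER TABLE dbo.Orders WITH CHECK
        ADD CONSTRAINT FK_Orders_Status
            FOREIGN KEY (StatusId) REFERENCES dbo.Status (StatusId);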

I know this may look like a duplicate of What is best tool to compare two SQL Server databases (schema and data)?, which is where I found most of the existing tools I've tried. But so far I've not managed to get any of them to produce a working schema migration script. (I'm really not too fussed about the data in general, but I do need the data the foreign keys depend on - which, to be honest, is all the difference there is, as I've deployed both the old and the new version.)

Am I expecting too much? Should I give up and start manually stitching together what I do have? Or should I go through all the commits and manually create upgrade scripts?

jmc
  • You can try my tools - you can find them at http://www.atlantis-interactive.co.uk. They might not work any better, but they may be able to help. Certainly you can create a schema sync that doesn't include the foreign keys, then sync the data, then include the keys. – Matt Whitfield Jul 26 '12 at 10:17
  • @MattWhitfield, wow, I never saw your products before. But back in 2008/2009 when I did my research there weren't any good and affordable products around. Thumbs up for sharing your work! – Lucero Jul 26 '12 at 10:39
  • Thanks Matt, I think your and Lucero's solution about syncing the schema minus FK's, then data, then FK's is going to have to be it. I'm just a little surprised that no-one else has this issue. Does no-one use FK's? Or do people just write proper upgrade scripts during development? – jmc Jul 26 '12 at 10:41
  • @JamieMcNaught - actually, the number of people that email me asking about that is pretty high. Usually though it's because they haven't been using FK's, write them in DEV, sync them to production and realise at that point that their production database has inconsistent data - so the data sync bit doesn't normally come into it. – Matt Whitfield Jul 26 '12 at 10:45
  • @JamieMcNaught Everyone should have these issues... When using my toolset the FKs are handled automatically (temporarily disabled) during initial data set-up and when running table migration scripts. – Lucero Jul 26 '12 at 10:46

2 Answers

3

I can't think of any more powerful tools than the ones you seem to have tried already. If those fail, my homegrown versioning system probably won't help you much either.

However, you should be able to generate an update script and then manually edit it to add the data transformations to it.

And/or you could disable the foreign key constraints while the update script runs.
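For example, something along these lines - a sketch using the undocumented (but widely used) sp_MSforeachtable procedure; you could equally disable the constraints table by table:

    -- Disable all FK constraints so the data portion can run in any order.
    EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

    -- ... run the generated schema/data update script here ...

    -- Re-enable and re-validate; WITH CHECK marks the constraints trusted again.
    EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';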

Lucero
  • Thanks Lucero. I'll hold out for a while. I'm still hoping someone will say "Didn't you tick this box in the app?"!! – jmc Jul 26 '12 at 10:43
  • Final solution was to generate the schema and data separately (the only option) using SQL Delta and then run all of the schema script excluding the FK's, run the data script and then finally run the FK's. Helpfully, SQL Delta leaves all the FK's till the end of the schema script it generates, so I only needed to split the schema script in two, rather than having to go through the whole thing finding FK's littered everywhere. – jmc Aug 01 '12 at 11:12
  • @JamieMcNaught, thanks for the feedback. Glad you found a solution. Is everything under source control now? ;) – Lucero Aug 01 '12 at 14:06
  • We're getting there. Schema has always been version controlled, but not in a way which is easily upgradable. Now have version numbers in the schema, upgrade scripts and recently added an SMO script + MD5 hashing function to check that the schema is actually the version number it claims to be before upgrading. – jmc Sep 05 '12 at 16:43
  • @JamieMcNaught, ha, that sounds pretty similar to what my versioning system does - did you ever have a look at it? I also create hashes (SHA1 though) of the schema objects, and store them in the database to quickly find out if the DB is the "same" as the program schema. And I use my own scripting, since SMO has proven to be impractical (different releases of SMO script things slightly differently, and having SMO as a dependency with the unmanaged parser component is a pain). – Lucero Sep 05 '12 at 20:41
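A rough sketch of that hashing idea (the catalog-view approach here is an assumption for illustration, not either of their actual implementations):

    -- Hash every scripted module (procedures, views, functions, triggers) so
    -- the result set can be compared against the hashes stored for the
    -- expected schema version. Tables aren't in sys.sql_modules and would
    -- need separate scripting (e.g. via SMO). Note: before SQL Server 2016,
    -- HASHBYTES rejects input over 8000 bytes, so very long definitions
    -- would need chunked hashing.
    SELECT  o.name,
            CONVERT(varchar(40), HASHBYTES('SHA1', m.definition), 2) AS DefinitionHash
    FROM    sys.sql_modules AS m
    JOIN    sys.objects     AS o ON o.object_id = m.object_id
    ORDER BY o.name;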
0

There is no such thing as doing schema and data "at the same time". Even if you have them in one big script, you would still be applying the schema first and then the data. If the schema script creates a new table and adds a constraint to it, there is no reason you should get a referential integrity violation error, as there are no rows in those tables yet.
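A minimal sketch of what I mean (hypothetical table names): a constraint added to a brand-new, still-empty table validates trivially, and violations only appear if the data is then loaded in the wrong order:

    -- Both tables are new in this script, so adding the FK up front is safe:
    -- there are no rows yet for the constraint check to reject.
    CREATE TABLE dbo.OrderStatus (
        StatusId int NOT NULL PRIMARY KEY,
        Name     nvarchar(50) NOT NULL
    );

    CREATE TABLE dbo.NewOrders (
        OrderId  int NOT NULL PRIMARY KEY,
        StatusId int NOT NULL
            CONSTRAINT FK_NewOrders_OrderStatus
            REFERENCES dbo.OrderStatus (StatusId)
    );

    -- Load parent rows before child rows and the constraint never complains:
    INSERT INTO dbo.OrderStatus (StatusId, Name) VALUES (1, N'Open');
    INSERT INTO dbo.NewOrders  (OrderId, StatusId) VALUES (1, 1);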

In any case, you should give our xSQL Schema Compare and Data Compare tools a try; you will be impressed with the performance and the level of control you get.