Manual T-SQL Scripting vs. SSDT (SQL Server Data Tools)

Started by Mr. Analog, October 28, 2014, 12:12:10 PM

Previous topic - Next topic

Mr. Analog

SSDT Pros:
-Lets you easily compare database schemas
-Generate a targeted merge script

SSDT Cons:
-BATCHES every goddamn THING
-Generated script cannot be produced as a single batch
-Generated script does not add IF EXISTS checks
-Generated script only targets the specific DB you compared against, not an arbitrary target DB

VS

Manual Script Pros:
-You control the batch statements
-Add control flow where needed
-No weird flip flop toggling of DB features

Manual Script Cons:
-You need to know every database change and include it
-Takes a long time

I've used SSDT a lot in the past, but there are two key changes Microsoft needs to make to put other schema compare tools out of business:

1. The ability to automatically batch the whole script

See, SSDT puts a GO (end of batch) statement after literally everything. Do I need an individual batch for a PRINT statement? No! Obviously some things need their own batch, like SPROC creation, but then why couldn't SSDT give you an option to use EXEC sp_executesql for proc creation? Le sigh
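(For what it's worth, the reason SSDT batches proc creation is that CREATE PROCEDURE must be the first statement in its batch. Here's a sketch of the sp_executesql workaround I mean; the procedure and table names are made up for illustration:)

```sql
-- Wrapping the DDL in EXEC sp_executesql sidesteps the
-- "CREATE PROCEDURE must be alone in its batch" rule, so the whole
-- deploy script can run as one batch with normal control flow.
-- dbo.usp_GetWidgets and dbo.Widget are hypothetical names.
IF OBJECT_ID(N'dbo.usp_GetWidgets', N'P') IS NULL
BEGIN
    EXEC sp_executesql N'
        CREATE PROCEDURE dbo.usp_GetWidgets
        AS
        BEGIN
            SELECT WidgetId, Name FROM dbo.Widget;
        END';
    PRINT 'Created dbo.usp_GetWidgets';
END
```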

2. The ability to automatically add IF EXISTS statements

This should be obvious but it's not, and it's very frustrating. Obviously there could be collisions between things like indexes, but I'd rather find out about name conflicts up front: adding PRINT statements would let me print out the name of something I think exists but doesn't. At least it would make debugging easier!
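Something like this is the guard I'm talking about (index and table names here are hypothetical):

```sql
-- Check sys.indexes before touching the object, and PRINT the name when
-- the assumption doesn't hold, so a failed deploy is easy to debug.
IF EXISTS (SELECT 1 FROM sys.indexes
           WHERE name = N'IX_Widget_Name'
             AND object_id = OBJECT_ID(N'dbo.Widget'))
BEGIN
    DROP INDEX IX_Widget_Name ON dbo.Widget;
END
ELSE
BEGIN
    PRINT 'Expected index IX_Widget_Name on dbo.Widget but it does not exist';
END
```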

THE VERDICT?

Both ways suck, but in different ways

UGH
By Grabthar's Hammer

Thorin

If it's Sql Server only, you'd probably enjoy using DB Ghost.

I will continue to use manual scripts.

You haven't covered the problem of something going wrong in the middle of a script with GO statements in it. If you're in a transaction you get booted out, so even though the transaction got rolled back and the first thirty changes in your script didn't occur, the rest of the changes will still occur, and they'll run outside of a transaction.  This could completely screw up your database!  In short, making db update scripts safe is not as simple as starting them with a Begin Transaction statement.
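One common way to mitigate this (a sketch, not a complete solution): turn on XACT_ABORT so any runtime error dooms and rolls back the transaction, then guard each later batch so it refuses to run if the transaction is already gone:

```sql
SET XACT_ABORT ON;   -- any runtime error rolls back the whole transaction
BEGIN TRANSACTION;

-- ...first batch of schema changes...
GO

-- Guard at the top of each subsequent batch: if an earlier error killed
-- the transaction, don't run the remaining changes outside of it.
IF @@TRANCOUNT = 0
BEGIN
    PRINT 'Earlier batch failed; skipping remaining changes.';
    SET NOEXEC ON;   -- compile but do not execute the rest of the script
END
GO

-- ...further schema changes...
GO

IF @@TRANCOUNT > 0 COMMIT TRANSACTION;
SET NOEXEC OFF;
```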
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful

Mr. Analog

Believe me, I've seen this in action more than a few times and it's a load of horse pucky

Like I say, the SSDT schema compare tool is 90% of the way to being useful for exactly one thing, and with just a few tiny tweaks it could be an outstanding tool for building scripts.

The reason why it's not? Microsoft wants you to make updates via packages and their own brand of Kool-Aid, so no good schema merge tools for YOU
By Grabthar's Hammer

Darren Dirt

Sometimes after a long, mentally exhausting day I read the forum for a change in focus, and reading stuff like this makes me thankful that I'm not in a hardcore DBA role.

And sometimes threads are so tech-heavy or full of TLAs and ETLAs that I feel like this guy:


Also I love lamp.
_____________________

Strive for progress. Not perfection.
_____________________

Mr. Analog

I hear ya

I just get so frustrated when I see stuff that's mere inches from being everything I wanted.

Maybe as a side project I'll write my own comparison / script generation tool, except with hookers and blackjack... in fact forget about the script tool!
By Grabthar's Hammer

Thorin

Darren, we're not in hardcore DBA roles, either.

In fact, we barely need to remember how to manipulate the SGA settings in Oracle...
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful

Mr. Analog

I'm guessing we're a lot closer to those roles than most
By Grabthar's Hammer

Lazybones

Finding a DBA who can explain *why* "max out the RAM" is the only solution they'll ever give, or a developer who bothers to take two seconds to optimize their query or index their damn table, appears to be rare.

In the meantime it's my fault that the DB that is over 4 TB is slow, even though it is on high-speed disk and write latencies are in the single digits.


Mr. Analog

Ugggg

Y'know what's funny? Or rather not funny? For a fraction of the hardware upgrade cost you can just buy DB analysis software that can point out optimisations.

Then again, I see people trying to use things like ORMs to do seriously heavy operations that should be in PROCs or rethought completely.

ORMs are great for flat structures; for everything else build Views or SPROCs
By Grabthar's Hammer

Lazybones

Quote from: Mr. Analog on October 30, 2014, 09:57:01 AM
Ugggg

Y'know what's funny? Or rather not funny? For a fraction of the hardware upgrade cost you can just buy DB analysis software that can point out optimisations.

Then again, I see people trying to use things like ORMs to do seriously heavy operations that should be in PROCs or rethought completely.

ORMs are great for flat structures; for everything else build Views or SPROCs

Please send me the link to this tool NOW.

Mr. Analog

SQL Server Management Studio has a built in tool for tuning that works well:
SQL Server Database Engine Tuning Advisor

You can combine this with SQL Server Profiler and, even using some of the default trace templates, learn where the bottlenecks are, then feed the results into the Tuning Advisor for suggestions on what to add or change. Sometimes all it takes is the right index here and there.
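If you just want a quick peek without setting up a trace, the plan-cache DMVs work too. This is a common query shape over sys.dm_exec_query_stats, surfacing the most CPU-hungry cached statements:

```sql
-- Top 10 cached statements by average CPU time per execution.
SELECT TOP (10)
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time DESC;
```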

Redgate also has tools for optimising and analysing SQL Server that I've used in the past; they work pretty well too.

There are more out there for sure!
By Grabthar's Hammer

Thorin

Lazy, are you running a single 4TB database on Sql Server?  Or is that multiple databases on a single instance?

If you are using Sql Server, then Database Engine Tuning Advisor is a useful tool.  But you're right, there aren't enough developers who really, truly understand how their queries are more or less effective.  And DBAs do tend to dumb down the message when talking to people they perceive as not technically competent.  But if you have a decent DBA and a decent developer, tell them to work together to figure out ways to make the queries more efficient, and tell the developer to make any changes the two of them decide need to be made, suddenly you have empowered, knowledgeable employees who have just been told to do their job and do it well.
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful

Mr. Analog

Developers need to learn what a Query Execution Plan is and why it's important to examine it for frequently run queries.

That .001 millisecond collation step on a join might not mean much for 100 records, but when your clients are pulling 100k+ in parallel the cost multiplies significantly.
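The easiest way to start measuring is right in SSMS: turn on "Include Actual Execution Plan", or use the statistics switches (the tables below are just for illustration):

```sql
SET STATISTICS IO ON;    -- report logical reads per table
SET STATISTICS TIME ON;  -- report parse/compile and execution CPU time

-- Hypothetical join to profile:
SELECT o.OrderId, c.Name
FROM dbo.[Order] AS o
JOIN dbo.Customer AS c ON c.CustomerId = o.CustomerId;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```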
By Grabthar's Hammer

Lazybones

Ya, I thought you were referring to something more than the query analyzer; been there, done that. More problems than I can solve for the external developers.

I currently have one application that has a dedicated SQL instance / VM with several databases totaling over 4 TB; some of the individual ones are between 1 and 2 TB.  I have another application with a dedicated SQL DB instance / VM with a DB around 128 GB.

In both cases we have found HIGHLY inefficient queries (via the query analyzer), but that isn't the whole performance story. We nailed some big ones, but digging down further would take too much of my team's time when this is supposed to be the developer's job; however, they seem lazy, and their solution is to toss more resources on the fire.

We do not have a dedicated DBA. I contracted one in who highlighted a few of the bad queries, but I can't afford a full-time resource. I also can't afford to keep providing hundreds of gigs of RAM to DBs just to try to cache the ENTIRE DB in memory once they get to those larger sizes. That's why we have them on SSD-accelerated / high-speed disk.

Thorin

Wow, sounds like you have crappy devs.  I would take it personally if someone told me my queries were garbage.
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful