The benefits of T-SQL three-part naming

So yesterday I made the second ever major technical boo-boo of my career - I accidentally ran a dynamic "DROP TABLE" script against an extremely large, high-availability system. The last time I did something of this magnitude was when I first started out with T-SQL: I ran a load of DROP / CREATE PROC statements on a similarly sized system, right in the middle of rush hour, instantly breaking the 500 or so clients connected to it. Obviously this kind of thing is very embarrassing, not to mention a massive inconvenience to every department using the system. Luckily the database is backed up hourly (thank you, Mr DBA), and although I made the mistake 6 minutes before the next backup, the impact was relatively minimal because the main users were in a departmental meeting. I'm blogging to outline my mistake and document the changes to my T-SQL script, which up until 13:34 yesterday made my life considerably easier and which I thought was the dog's plums.

So yeah... as previously mentioned, I write my table creation scripts by hand and run them against a database - it makes database development, setup and deployment a cinch. I have, however, always had issues with the amount of time it takes to write the DROP TABLE statements, thanks to constraint, trigger and key restrictions. Take the following example:

CREATE TABLE UserStatus (
    UserStatusID                INT                NOT NULL    IDENTITY    PRIMARY KEY,
    [Description]                VARCHAR(30)        NOT NULL    UNIQUE
)

CREATE TABLE Users (
    UserID                        INT                NOT NULL    IDENTITY    PRIMARY KEY,
    UserStatusID                INT                NOT NULL    FOREIGN KEY REFERENCES UserStatus(UserStatusID),
    UserName                    VARCHAR(30)        NOT NULL    UNIQUE
)

The FOREIGN KEY reference on the UserStatusID column of the Users table means that you cannot run a DROP TABLE command on the UserStatus table before the Users table is dropped. You have to run the statements in the following order:

IF OBJECT_ID('Users', 'U') IS NOT NULL
    DROP TABLE Users
IF OBJECT_ID('UserStatus', 'U') IS NOT NULL
    DROP TABLE UserStatus
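For reference, attempting the drops in the other order fails with a constraint error along these lines (the exact message text may vary slightly between SQL Server versions):

```sql
-- Dropping the referenced (parent) table first fails:
DROP TABLE UserStatus
-- Msg 3726, Level 16, State 1
-- Could not drop object 'UserStatus' because it is referenced
-- by a FOREIGN KEY constraint.
```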

While running two DROP TABLE statements in a pre-defined order is not a major concern, maintaining an ordered list of a few hundred tables, each with its own constraints, is a nightmare. Enter the Script of Doom: it loops through every table in the current database, disables and drops all constraints, then drops all the tables:

-- Delete all tables
DECLARE @TABLE_SCHEMA NVARCHAR(2000), @TABLE_NAME NVARCHAR(2000)
DECLARE @CONSTRAINT_NAME NVARCHAR(2000)
DECLARE @table NVARCHAR(MAX), @command NVARCHAR(MAX)

DECLARE constraints CURSOR READ_ONLY FAST_FORWARD FOR
SELECT DISTINCT c.TABLE_SCHEMA, c.TABLE_NAME, c.CONSTRAINT_NAME
FROM sysobjects s
INNER JOIN INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE c ON s.[name] = c.CONSTRAINT_NAME
WHERE s.xtype IN('F', 'C')

OPEN constraints

FETCH NEXT FROM constraints INTO @TABLE_SCHEMA, @TABLE_NAME, @CONSTRAINT_NAME

WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @table = QUOTENAME(@TABLE_SCHEMA) + '.' + QUOTENAME(@TABLE_NAME)

        SET @command = N'ALTER TABLE ' + @table + ' NOCHECK CONSTRAINT ' + QUOTENAME(@CONSTRAINT_NAME)
        EXECUTE sp_executesql @statement = @command
        PRINT @command
        
        SET @command = N'ALTER TABLE ' + @table + ' DROP CONSTRAINT ' + QUOTENAME(@CONSTRAINT_NAME)
        EXECUTE sp_executesql @statement = @command
        PRINT @command
        
        FETCH NEXT FROM constraints INTO @TABLE_SCHEMA, @TABLE_NAME, @CONSTRAINT_NAME
    END
CLOSE constraints
DEALLOCATE constraints

-- Drop tables, now constraints are deleted
SET @command = 'DROP TABLE ?'
EXEC sp_MSforeachtable @command1 = @command, @command2= 'PRINT ''? dropped'''

As exciting as this script is, it was the one responsible for yesterday's issues. You can run it against any SQL Server database, and it will indiscriminately drop every table constraint, followed by every table in the database - preventing the query engine from raising constraint errors during the DROP TABLE statements. Excellent when dealing with the issues above; not so excellent when you accidentally execute it against a live database, used by God's secretary to log how many paedophiles are accidentally let into Heaven. Suffice it to say, I have learned my lesson.
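A quick aside for anyone unfamiliar with sp_MSforeachtable: it is an undocumented, unsupported procedure that runs the supplied command once per user table, substituting each table's quoted two-part name for the ? placeholder - which is exactly why it makes a script like the one above so dangerously convenient:

```sql
-- sp_MSforeachtable substitutes each table's quoted two-part name
-- (e.g. [dbo].[Users]) for the ? placeholder.
-- Undocumented and unsupported - it may change or disappear in any release.
EXEC sp_MSforeachtable 'PRINT ''?'''
```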

Whilst discrimination is usually a bad thing, in this instance it is more than welcome. There are a few simple ways to avoid the monumental catastrophe I caused yesterday - the first being the simple addition of a USE statement at the top of the file, with more elaborate measures involving transactions, TRY...CATCH blocks and RAISERROR to halt script execution. Here's my new, overly safe version of the same script:

BEGIN TRY

    DECLARE @DBName VARCHAR(100), @command NVARCHAR(MAX)
    -- HARD CODE TO DB
    USE [MyDatabase]
    SET @DBName = '[MyDatabase]'

    BEGIN TRANSACTION DelTables

        -- Delete all tables
        DECLARE @TABLE_SCHEMA NVARCHAR(2000), @TABLE_NAME NVARCHAR(2000)
        DECLARE @CONSTRAINT_NAME NVARCHAR(2000)
        DECLARE @table NVARCHAR(MAX)

        DECLARE constraints CURSOR READ_ONLY FAST_FORWARD FOR
        SELECT DISTINCT c.TABLE_SCHEMA, c.TABLE_NAME, c.CONSTRAINT_NAME
        FROM sysobjects s
        INNER JOIN INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE c ON s.[name] = c.CONSTRAINT_NAME
        WHERE s.xtype IN('F', 'C')

        OPEN constraints

        FETCH NEXT FROM constraints INTO @TABLE_SCHEMA, @TABLE_NAME, @CONSTRAINT_NAME

        WHILE @@FETCH_STATUS = 0
            BEGIN
                SET @table = @DBName + '.' + QUOTENAME(@TABLE_SCHEMA) + '.' + QUOTENAME(@TABLE_NAME)

                SET @command = N'ALTER TABLE ' + @table + ' NOCHECK CONSTRAINT ' + QUOTENAME(@CONSTRAINT_NAME)
                EXECUTE sp_executesql @statement = @command
                PRINT @command
                
                SET @command = N'ALTER TABLE ' + @table + ' DROP CONSTRAINT ' + QUOTENAME(@CONSTRAINT_NAME)
                EXECUTE sp_executesql @statement = @command
                PRINT @command
                
                FETCH NEXT FROM constraints INTO @TABLE_SCHEMA, @TABLE_NAME, @CONSTRAINT_NAME
            END
        CLOSE constraints
        DEALLOCATE constraints
        
        -- Drop tables, now constraints are deleted
        SET @command = 'DROP TABLE ' + @DBName + '.?'
        EXEC sp_MSforeachtable @command1 = @command, @command2= 'PRINT ''? dropped'''

    COMMIT TRANSACTION DelTables
END TRY
BEGIN CATCH

    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION DelTables

    SET @command = 'Error occurred, transaction rolled back' + CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10) + ERROR_MESSAGE()
    -- Severity 20 terminates the connection and requires sysadmin rights
    -- plus WITH LOG; drop to severity 16 if you just want the error reported
    RAISERROR(@command, 20, 1) WITH LOG

END CATCH

What we have here is an almost foolproof version of my dynamic DROP TABLE script. The USE [MyDatabase] statement is now present, with a variable at the top of the file holding the database name, which is used to qualify every object reference in the script. This alone should be enough, to be honest - the additional error handling (the TRY...CATCH and the transaction management statements) is belt and braces. Contrary to what I first assumed, DDL statements such as DROP TABLE are transactional in SQL Server, so if the script fails part-way through, the ROLLBACK in the CATCH block will restore any tables and constraints dropped before the error; all the TRY...CATCH then adds is reporting the failure in a managed way.
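Incidentally, it is worth knowing that DDL such as DROP TABLE is in fact fully transactional in SQL Server - you can convince yourself with something like this against a scratch copy of the database (using the Users table from earlier):

```sql
BEGIN TRANSACTION
    DROP TABLE Users
    -- Inside the transaction the table is gone...
ROLLBACK TRANSACTION
-- ...but the rollback brings it back:
SELECT OBJECT_ID('Users', 'U')  -- non-NULL: the table still exists
```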

Now, if you drag this script into a server that does not have a database called "MyDatabase" and run it, you'll get the following error:

Msg 911, Level 16, State 1, Line 14
Database 'MyDatabase' does not exist. Make sure that the name is entered correctly.

And script execution immediately halts, which is almost ideal. Now the only mistake left to make is running the script against the wrong instance of the database, i.e. LIVE instead of TEST. That risk is always present and cannot be entirely avoided (unless database names are suffixed with the instance, which creates its own deployment issues), but the USE statement certainly stops you from running this script against a completely unrelated database, destroying entire infrastructures in the process :(
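One extra belt-and-braces guard worth considering is checking the server name as well as the database name at the top of the script. DEV-SQL01 below is a made-up instance name - substitute your own:

```sql
-- Refuse to run anywhere other than the known development server.
-- 'DEV-SQL01' is a hypothetical instance name - change it to suit.
IF @@SERVERNAME <> 'DEV-SQL01'
BEGIN
    RAISERROR('This script may only be run on the development server.', 16, 1)
    SET NOEXEC ON  -- skip compilation/execution of the remaining batches
END
```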

Luckily (to those who care), I jumped on it and rectified the mistake as quickly as I could, rather than brush it under the carpet and deny all knowledge (as I have seen many people do before). I held my hand up and admitted responsibility; something a lot of developers tend to shirk if they can get away with it. My issue only occurred because I left a connection to the high-availability database open in SSMS after helping out a colleague, then got distracted with some other work, and an hour later I dragged the script on to what I thought was my local SQL instance. One press of F5 later and boom, my whole world crumbled around me for the next few hours...
