
Azure SQL Blog

Logging Schema-changes in a Database using DDL Trigger

AndreasWolter
Dec 02, 2020


Every so often I have been asked: “How is it possible to log and review the audit logs when Developers change objects but do not have access to the SQL Audit?”

Now, even a security person like me should try to understand the real motivation for this before jumping to the conclusion that Auditing is the only right solution.

So first let’s understand the potential motivations:

 

What is the purpose?

Very often, the background is simply to keep a record of who changed which database objects, to answer typical questions like:

  • “Did anyone change any index lately?”
  • “I thought we had an Index here on this table, did anyone drop it?”
  • “Who changed my procedure?” (Although I would advise to keep a record of all changes within the header of any module as a best practice)
  • “When did the last deployment run/stop?”
  • “My old SQL-Function code is gone; does anyone have a copy?”

The motivation is roughly about troubleshooting schema-changes. The aim is not to detect security breaches or even to support tamper-proof security investigations.

If that is the case, my advice is not to use Auditing at all for this purpose.
The Auditing feature in SQL Server and Azure SQL Database is really meant to be used by security personnel, or Administrators with high privileges:

  • The ability to create, change, stop, and remove Audits is covered by a single permission: either ALTER ANY SERVER AUDIT or ALTER ANY DATABASE AUDIT, depending on the scope.
  • To read the audit results, at a minimum the ALTER ANY SERVER AUDIT permission is required for Audits written to the Application or Security Log, but CONTROL SERVER is required for Audits written to a file. In Azure SQL Database, CONTROL DATABASE is required when using the system function sys.fn_get_audit_file to read from a File target in Blob Storage. Event Hub and Log Analytics targets are accessed outside of the SQL engine and are not controlled with SQL permissions.
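For example, reading a file-based Audit via sys.fn_get_audit_file looks roughly like this (the file path is illustrative) and, on SQL Server, already requires the high CONTROL SERVER permission:

```sql
-- Requires CONTROL SERVER on SQL Server (CONTROL DATABASE on Azure SQL Database).
-- The file path/pattern below is illustrative.
SELECT event_time,
       action_id,
       session_server_principal_name,
       statement
FROM sys.fn_get_audit_file(N'D:\AuditLogs\MyAudit_*.sqlaudit', DEFAULT, DEFAULT);
```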

The above request is much more easily fulfilled with a simple solution that stores the results of the DDL activities directly in a database table. This provides a convenient way to understand which scripts or ad-hoc changes to a database schema happened – but it will not serve as a measure against malicious actors/attackers.

 

Security-Note
If the requirement is to provide an Audit trail for Security and Compliance reasons, out of reach of normal Database Developers or anyone with access to SQL Server alone, then SQL Auditing is the right solution. DDL Triggers and local tables can easily be tampered with once an attacker has elevated to db_datawriter- or db_ddladmin-level permissions.

 

The solution

If you need to support your Engineering team with a simple record of all DDL (Data Definition Language) statements run against a database, then I recommend using DDL Triggers.

A DDL Trigger is comparable to a stored procedure except it gets activated when any DDL event for which it was created occurs. The list of Events that can be used is here: DDL Events - SQL Server | Microsoft Docs.

 

Personal experience
Over the years of working on customer systems, I have personally found it invaluable and, as a best practice, equipped every database that I designed with such a small trigger and DDL-log-table, just in case. It has helped many times to quickly solve issues with deployment scripts, non-scripted changes to systems, problems with Source Control, and simply to get answers quickly.
The concept is almost trivial, and because DDL changes are usually not in performance-critical code-paths, the theoretical overhead on DDL statement runtimes is not relevant. (Unless frequent schema-changes are part of a performance-sensitive workload – in which case I would question whether using DDL is a good idea at all in such a place. By the way: DDL on temporary tables is not caught by DDL Triggers.)

 

If we want to log “almost any DDL statement”, we can use the event group DDL_DATABASE_LEVEL_EVENTS and then, within the body of the trigger, ignore certain events.
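A minimal sketch of such a trigger is shown below; the trigger name and the log table (LogObjects.DDLEvents) are illustrative, not the attached script itself:

```sql
-- Minimal database-level DDL trigger sketch; names are illustrative.
CREATE TRIGGER TR_DDLEventLog
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @EventData xml     = EVENTDATA();
    DECLARE @EventType sysname = @EventData.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)');

    -- Ignore high-volume maintenance events inside the body:
    IF @EventType = N'UPDATE_STATISTICS'
        RETURN;

    INSERT INTO LogObjects.DDLEvents (EventType, EventData)
    VALUES (@EventType, @EventData);
END;
```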

 

Note
Events that typically make sense to ignore are the regular Index Rebuild, Reorganize, and Update Statistics routines.
- Rebuild and Reorganize are both covered by the ALTER_INDEX event. But I recommend keeping an eye on ALTER INDEX … DISABLE; therefore, in my code example you will find a bit of logic that excludes only those ALTER INDEX statements not caused by the DISABLE option. (“Enable” would be done via a Rebuild, so unfortunately there is a small gap: we will not know when the Index is enabled again. If this turns out to be important, include the Rebuild option as well, but expect a high volume of events in that case.)

 

Inside the Trigger we have access to the EVENTDATA() function, which returns the details of the event in XML format. The trigger code extracts the most useful data from it.
The captured events can then be written to a local database table, which is much more convenient for troubleshooting than having to go to a Security Audit.

 

Note
If you are interested in the Set Options that were in effect when the command was run (this can be useful for debugging purposes), then the SetOptions node below TSQLCommand comes in handy.

 

Finally, we need to make sure that the Trigger succeeds in writing to the table even if the User who triggered the event does not have explicit permissions to write to any table. To keep this example simple, my code grants INSERT on the dedicated table to public. This is the easy way and sufficient for many scenarios. If you need extra security, I recommend using a special account for impersonation during the trigger execution only, via the EXECUTE AS clause. At the end of the script, you will find an example of how to go about that. That way you can ensure that least privileges are applied, and only when really needed.
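Both variants can be sketched as follows; the user, trigger, and table names are illustrative, and the attached script is the authoritative version:

```sql
-- Easy way: let any database user insert into the dedicated log table.
GRANT INSERT ON LogObjects.DDLEvents TO public;
GO
-- Stricter alternative: a minimally privileged, login-less user
-- that the trigger impersonates via EXECUTE AS.
CREATE USER DDLLogWriter WITHOUT LOGIN;
GRANT INSERT ON LogObjects.DDLEvents TO DDLLogWriter;
GO
CREATE TRIGGER TR_DDLEventLog
ON DATABASE
WITH EXECUTE AS 'DDLLogWriter'
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO LogObjects.DDLEvents (EventData) VALUES (EVENTDATA());
END;
```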

 

Attached you can find the T-SQL Script which will create the following objects:

  • A Database-Level DDL Trigger
  • A Table in a dedicated Schema
  • A stored procedure to purge data from the table when needed.
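A rough sketch of what the table and purge procedure could look like (illustrative only; the attached script may define these objects differently):

```sql
-- Illustrative sketch of the log table in a dedicated schema plus a purge procedure.
CREATE SCHEMA LogObjects AUTHORIZATION dbo;
GO
CREATE TABLE LogObjects.DDLEvents
(
    EventID     int IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_DDLEvents PRIMARY KEY,
    PostTime    datetime2(3)      NOT NULL,
    EventType   sysname           NOT NULL,
    LoginName   sysname           NULL,
    ObjectName  sysname           NULL,
    CommandText nvarchar(max)     NULL
);
GO
-- Purge logged events older than a given point in time.
CREATE PROCEDURE LogObjects.PurgeDDLEvents
    @OlderThan datetime2(3)
AS
BEGIN
    SET NOCOUNT ON;
    DELETE FROM LogObjects.DDLEvents
    WHERE PostTime < @OlderThan;
END;
```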

Object Explorer with the objects accompanying the trigger

 

I have also attached a script with some queries for testing.

This is how the results in the table look after running some DDL commands:

Result table

 

Documentation-links:

 

I hope you find this script a useful example.

Let me know how you solve similar requirements in the comments below.

Andreas

 

 

Updated Jan 20, 2021
Version 13.0
  • nLightJC (Copper Contributor)

    My database is a replication subscriber. While initializing the replication, I got the error "SELECT failed because the following SET options have incorrect settings: 'ANSI_PADDING'". After disabling this DDL trigger, the process of initializing works. Any suggestions?

  • mbourgon_optum (Copper Contributor)

    nLightJC ever figure out a way around this issue with DDL Triggers? Log Analytics doesn't work well for DDL changes, and Event Notifications don't exist on Azure SQL Database, so I'm back to using DDL Triggers, which I had abandoned for the same reason you found. AndreasWolter any suggestions?