Forum Discussion

AayushS1935
Copper Contributor
Oct 22, 2024

Capturing Historic SQL with session info in metadata tables.


I'm looking to capture historical SQL queries in SQL Server, along with the associated user/session/application information, CPU time, execution time, and table size, for auditing purposes. I've explored several methods, with a focus on minimizing performance impact and keeping external logging to a minimum:

  1. SQL Server Audit: Efficient and built-in, suitable for capturing DML statements with moderate overhead. However, it can generate large audit files. (A minimal setup is sketched after this list.)

  2. Extended Events: Lightweight and highly configurable for detailed DML event logging with low overhead, though setup can be complex.

  3. Change Data Capture (CDC): Captures data changes by reading the transaction log, suitable for ETL and data warehousing with moderate performance impact. (Enablement is sketched after this list.)

  4. DML Triggers: Provides real-time logging but can significantly impact performance on high-transaction tables, making them less ideal for general use.

  5. Query Store: Focuses on aggregated query performance over time, but doesn’t capture a complete per-execution history with user/session context. (Enablement and a sample query are sketched below.)
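
For option 1, this is roughly the setup I have in mind; a minimal sketch where the audit name, the C:\SqlAudit\ folder, and the [MyAppDb] database are placeholders:

USE master;
GO
-- Server-level audit writing to rolling files (placeholder folder)
CREATE SERVER AUDIT [Dml_History_Audit]
TO FILE ( FILEPATH = N'C:\SqlAudit\', MAXSIZE = 100 MB, MAX_ROLLOVER_FILES = 10 )
WITH ( ON_FAILURE = CONTINUE );
GO
ALTER SERVER AUDIT [Dml_History_Audit] WITH ( STATE = ON );
GO
USE [MyAppDb];
GO
-- Capture SELECT/INSERT/UPDATE/DELETE against the dbo schema by any principal
CREATE DATABASE AUDIT SPECIFICATION [Dml_History_AuditSpec]
FOR SERVER AUDIT [Dml_History_Audit]
ADD ( SELECT, INSERT, UPDATE, DELETE ON SCHEMA::[dbo] BY [public] )
WITH ( STATE = ON );
GO
-- Read the audit files back with statement text and session/user details
SELECT event_time, session_id, server_principal_name, database_name, statement
FROM sys.fn_get_audit_file(N'C:\SqlAudit\*.sqlaudit', DEFAULT, DEFAULT);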
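
For option 3, enabling CDC would look something like this (dbo.Orders is a placeholder table, and SQL Server Agent must be running; one caveat is that the change tables record what changed, not which login/host/application made the change, so CDC on its own doesn't give me the session details I'm after):

USE [MyAppDb];
GO
-- Enable CDC at the database level, then per table
EXEC sys.sp_cdc_enable_db;
GO
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',   -- placeholder table
     @role_name     = NULL;        -- no gating role
GO
-- Changed rows land in the change table for the default capture instance,
-- with __$operation indicating insert/update/delete
SELECT __$start_lsn, __$operation, *
FROM cdc.dbo_Orders_CT;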
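
And for option 5, Query Store is simple to enable and query, though as noted it keeps aggregated query text and runtime statistics rather than per-execution rows with login or host details ([MyAppDb] is again a placeholder):

ALTER DATABASE [MyAppDb] SET QUERY_STORE = ON;
ALTER DATABASE [MyAppDb] SET QUERY_STORE ( OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = ALL );
GO
-- Historical query text with aggregated CPU and duration (microseconds)
SELECT qt.query_sql_text,
       rs.count_executions,
       rs.avg_cpu_time,
       rs.avg_duration,
       rs.last_execution_time
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
ORDER BY rs.avg_cpu_time DESC;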

I've also created an Extended Events session for detailed logging of executed batches. Here’s the SQL for setting it up:


CREATE EVENT SESSION [Detailed_Execution_Logs] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION
    (
        sqlserver.sql_text,
        sqlserver.database_name,
        sqlserver.server_principal_name,
        sqlserver.client_hostname,
        sqlserver.client_app_name
    )
)
ADD TARGET package0.event_file
(
    SET filename = N'./Detailed_Execution_Logs.xel',
        max_file_size = 10,
        max_rollover_files = 5
)
WITH
(
    EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
    MAX_DISPATCH_LATENCY = 30 SECONDS,
    MEMORY_PARTITION_MODE = NONE,
    TRACK_CAUSALITY = OFF,
    STARTUP_STATE = OFF
);
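
To pull the captured batches back out of the .xel files and into a metadata table, I'm planning something along these lines (the file pattern mirrors the session's filename and may need adjusting to the instance's LOG directory; cpu_time and duration are reported in microseconds):

WITH xe AS
(
    SELECT CAST(event_data AS xml) AS event_xml
    FROM sys.fn_xe_file_target_read_file(N'Detailed_Execution_Logs*.xel', NULL, NULL, NULL)
)
SELECT
    event_xml.value('(event/@timestamp)[1]', 'datetime2')                                       AS event_time,
    event_xml.value('(event/action[@name="server_principal_name"]/value)[1]', 'nvarchar(128)')  AS login_name,
    event_xml.value('(event/action[@name="client_hostname"]/value)[1]', 'nvarchar(128)')        AS host_name,
    event_xml.value('(event/action[@name="client_app_name"]/value)[1]', 'nvarchar(128)')        AS app_name,
    event_xml.value('(event/action[@name="database_name"]/value)[1]', 'nvarchar(128)')          AS database_name,
    event_xml.value('(event/data[@name="cpu_time"]/value)[1]', 'bigint')                        AS cpu_time_us,
    event_xml.value('(event/data[@name="duration"]/value)[1]', 'bigint')                        AS duration_us,
    event_xml.value('(event/action[@name="sql_text"]/value)[1]', 'nvarchar(max)')               AS sql_text
FROM xe;

The idea would be to insert those rows into a permanent history table on a schedule (for example via an Agent job) so the .xel files can be kept small.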


I’d appreciate any insights or recommendations on capturing SQL statements effectively while minimizing performance impact!

