This is the fifth post in my short five-part series in which I demonstrate five different ways to track database schema changes. Thanks to this, you can always easily answer questions about what has changed, who changed it, and when. Today, I describe how to use SQL Server Audit for this purpose.
Here you can find my previous posts from this series:
The fifth option is SQL Server Audit. This functionality was introduced in SQL Server 2008. It is based on Extended Events and allows you to audit a SQL Server instance or an individual database. The results of an audit can be saved to a file, the Windows Security event log, or the Windows Application event log. An audit can be created to log a single event or all events from a predefined event group. Sounds familiar? This mechanism is very similar to the one used with DDL Triggers.
The SQL Server Audit consists of several elements:
To test this solution, we must create a SQL Server Audit that captures the group of object modification events. In our case, it is SCHEMA_OBJECT_CHANGE_GROUP.
This group tracks CREATE, ALTER, and DROP events on a schema and its objects, such as tables, views, stored procedures, functions, etc.
As a first step, we have to create the SQL Server Audit object. This can be done using the wizard or using T-SQL. To use the GUI, in Object Explorer expand the instance and then Security. Right-click on “Audits” and choose “New Audit…”.

In the “Create Audit” window, provide the audit name, destination, and data retention. Then click “OK”.

By default, the audit is created in a disabled state.

As the next step, we must create an Audit Specification. If you want to log object schema change events in all databases, it will be a Server Audit Specification. If you’re interested in logging events from only one particular database, it will be a Database Audit Specification.
For demo purposes, I choose a Database Audit Specification. To create it, in Object Explorer expand Instance -> Databases -> DB -> Security. Right-click on “Database Audit Specifications” and choose “New Database Audit Specification…”.

In the “Create Database Audit Specification” window, we need to do a few things: provide a name for the new Database Audit Specification, select the audit created in the previous step, and finally choose the actions (or groups of actions) that should be traced. In our case, it is SCHEMA_OBJECT_CHANGE_GROUP. Then click “OK”.

By default, the Database Audit Specification is created in a disabled state. To start auditing, we must enable it and then enable SQL Server Audit.
The same steps can be performed using T-SQL. You can use the following script to create such a SQL Server Audit together with a Database Audit Specification.
USE [master]
GO
CREATE SERVER AUDIT [ObjectsModification]
TO FILE
( FILEPATH = N'C:\Program Files\Microsoft SQL Server\MSSQL14.SS2017\MSSQL\Log\'
,MAXSIZE = 10 MB
,MAX_ROLLOVER_FILES = 10
,RESERVE_DISK_SPACE = OFF
)
WITH
( QUEUE_DELAY = 1000
,ON_FAILURE = CONTINUE
)
GO
USE [TrackMyChanges]
GO
CREATE DATABASE AUDIT SPECIFICATION [DatabaseAuditSpecification]
FOR SERVER AUDIT [ObjectsModification]
ADD (SCHEMA_OBJECT_CHANGE_GROUP)
WITH (STATE = ON)
GO
ALTER SERVER AUDIT [ObjectsModification] WITH (STATE = ON)
GO
To view logged events in SSMS, right-click the SQL Server Audit you want to check and choose “View Audit Logs”.
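If the audit writes to files, you can also query them directly with the built-in sys.fn_get_audit_file function. A minimal sketch, assuming the FILEPATH used in the script above (adjust the path to your environment):

```sql
-- Read all audit files produced by the audit created above.
-- The path below is an assumption; match it to your FILEPATH setting.
SELECT event_time,
       session_server_principal_name,  -- who made the change
       database_name,
       schema_name,
       [object_name],
       [statement]                     -- the DDL statement that was executed
FROM sys.fn_get_audit_file(
    N'C:\Program Files\Microsoft SQL Server\MSSQL14.SS2017\MSSQL\Log\*.sqlaudit',
    DEFAULT, DEFAULT)
ORDER BY event_time;
```

This is handy when you want to filter or aggregate audit data with plain T-SQL instead of browsing the log viewer.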
SQL Server Audit works similarly to the Extended Events Sessions and DDL Triggers and logs very comparable data. We know what object was changed, when it was done, and by whom. We also have access to the SQL query that was executed.
If you log to the Windows Application Log or the Windows Security Log, you cannot configure data retention. When logging to files, you can specify the number of files and their maximum size.
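As a sketch, the file retention settings of an existing audit can be changed with ALTER SERVER AUDIT; the size limits below are illustrative, and note that the audit must be disabled before it can be altered:

```sql
-- Adjust file retention for the audit created earlier.
-- An audit must be in the OFF state before it can be altered.
ALTER SERVER AUDIT [ObjectsModification] WITH (STATE = OFF);
GO
ALTER SERVER AUDIT [ObjectsModification]
TO FILE (MAXSIZE = 20 MB, MAX_ROLLOVER_FILES = 20);
GO
ALTER SERVER AUDIT [ObjectsModification] WITH (STATE = ON);
GO
```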
In the next post (yes – there will be another one), I will provide a general summary of all five ways to track database schema changes.
-Marek
This is the fourth post in my short five-part series in which I demonstrate five different ways to track database schema changes. Thanks to this, you can always easily answer questions about what has changed, who changed it, and when. Today, I describe how to use a DDL Trigger for this purpose.
Here you can find my previous posts from this series:
The fourth option is a DDL Trigger. These triggers are executed in response to Data Definition Language (DDL) events. They can be created at the database or server scope. Database-scoped DDL triggers are stored as objects in the database in which they are created. Server-scoped DDL triggers are stored in the master database. They can be created to fire in response to a single particular event or to events from a predefined event group.
In our case, we are interested in capturing all events from the DDL_DATABASE_LEVEL_EVENTS group.
To get information about the event that fired our DDL trigger, we use the EVENTDATA() function. It returns an XML value that contains details about the event that triggered the execution of the trigger.
To log object changes, we first have to create a table to which we will save the captured events. The code snippet below shows the simplest possible table structure that can be used for this purpose.
CREATE TABLE [dbo].[DatabaseLogs](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [DateTime] [datetime] NOT NULL
        CONSTRAINT DF_DatabaseLogs_DateTime DEFAULT (GETDATE()),
    [EventData] [xml] NULL,
    CONSTRAINT [PK_DatabaseLogs] PRIMARY KEY CLUSTERED ( [Id] ASC )
)
GO
CREATE NONCLUSTERED INDEX nix_DatabaseLogs ON [dbo].[DatabaseLogs] ([DateTime] ASC) INCLUDE ([Id]);
GO
The trigger can be created using this query.
CREATE TRIGGER [tr_DatabaseLogs]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;
    IF OBJECT_ID('dbo.DatabaseLogs') IS NOT NULL
    BEGIN
        BEGIN TRY
            DECLARE @Eventdata XML;
            SET @Eventdata = EVENTDATA();
            INSERT dbo.DatabaseLogs (
                [DateTime]
                , [EventData]
            )
            VALUES (
                GETUTCDATE()
                , @Eventdata
            );
        END TRY
        BEGIN CATCH
            SET @Eventdata = NULL;
        END CATCH
    END
END
GO
Pay attention to the additional checks implemented in case the table is dropped or the insert statement fails. It is critical to keep such a DDL trigger transparent to users and applications, so that it does not impact their work if something goes wrong.

Execute this simple SELECT statement to view the log of captured object modifications.
SELECT * FROM dbo.DatabaseLogs ORDER BY [DateTime];
Here is an example of one of the inserted XML documents.
<EVENT_INSTANCE>
<EventType>CREATE_PROCEDURE</EventType>
<PostTime>2018-11-17T17:52:51.700</PostTime>
<SPID>54</SPID>
<ServerName>MAREK-PC\SS2017</ServerName>
<LoginName>Marek-PC\Marek</LoginName>
<UserName>dbo</UserName>
<DatabaseName>TrackMyChanges</DatabaseName>
<SchemaName>dbo</SchemaName>
<ObjectName>usp_NewProc</ObjectName>
<ObjectType>PROCEDURE</ObjectType>
<TSQLCommand>
<SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE" />
<CommandText>CREATE PROCEDURE dbo.usp_NewProc
AS
BEGIN
SELECT 'Version 1';
END
</CommandText>
</TSQLCommand>
</EVENT_INSTANCE>
Of course, it is possible to extract this detailed information in the trigger itself and insert it into additional columns in the logging table.
As a first step, we have to extend our logging table.
CREATE TABLE [dbo].[DatabaseLogs](
[Id] [int] IDENTITY(1,1) NOT NULL,
[DateTime] [datetime] NOT NULL CONSTRAINT DF_DatabaseLogs_DateTime DEFAULT (GETDATE()),
[ServerName] [nvarchar](128) NULL,
[ServiceName] [nvarchar](128) NULL,
[SPID] [int] NULL,
[SourceHostName] [nvarchar](128) NULL,
[LoginName] [nvarchar](128) NULL,
[UserName] [nvarchar](128) NULL,
[SchemaName] [nvarchar](128) NULL,
[ObjectName] [nvarchar](128) NULL,
[TargetObjectName] [nvarchar](128) NULL,
[EventType] [nvarchar](128) NULL,
[ObjectType] [nvarchar](128) NULL,
[TargetObjectType] [nvarchar](128) NULL,
[EventData] [xml] NULL,
CONSTRAINT [PK_DatabaseLogs] PRIMARY KEY CLUSTERED ( [Id] ASC )
)
GO
CREATE NONCLUSTERED INDEX nix_DatabaseLogs ON [dbo].[DatabaseLogs] ([DateTime] ASC) INCLUDE ([Id]);
GO
Such a table structure should be sufficient to meet most needs. As the next step, we must implement XML parsing logic in our DDL trigger. To do this, we use SQL Server XQuery methods.
CREATE TRIGGER [tr_DatabaseLogs]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;
    IF OBJECT_ID('dbo.DatabaseLogs') IS NOT NULL
    BEGIN
        BEGIN TRY
            DECLARE @Eventdata XML;
            SET @Eventdata = EVENTDATA();
            INSERT dbo.DatabaseLogs (
                [DateTime]
                , [ServerName]
                , [ServiceName]
                , [SPID]
                , [SourceHostName]
                , [LoginName]
                , [UserName]
                , [SchemaName]
                , [ObjectName]
                , [TargetObjectName]
                , [EventType]
                , [ObjectType]
                , [TargetObjectType]
                , [EventData]
            )
            VALUES (
                GETUTCDATE()
                , @@SERVERNAME
                , @@SERVICENAME
                , @Eventdata.value('(/EVENT_INSTANCE/SPID)[1]', 'int')
                , HOST_NAME()
                , @Eventdata.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/UserName)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/TargetObjectName)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(128)')
                , @Eventdata.value('(/EVENT_INSTANCE/TargetObjectType)[1]', 'nvarchar(128)')
                , @Eventdata
            );
        END TRY
        BEGIN CATCH
            SET @Eventdata = NULL;
        END CATCH
    END
END
GO
Thanks to this solution we can easily filter logged events without the need to write complicated ad-hoc queries using XQuery.
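For example, a query like the following returns only procedure-related changes without touching the XML payload (the filter value is illustrative):

```sql
-- Filter logged DDL events by the extracted columns
-- instead of querying the XML payload with XQuery.
SELECT [DateTime], [LoginName], [ObjectName], [EventType]
FROM dbo.DatabaseLogs
WHERE [EventType] LIKE '%PROCEDURE%'   -- e.g. CREATE_PROCEDURE, ALTER_PROCEDURE
ORDER BY [DateTime];
```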
DDL Triggers capture very similar pieces of information to the Extended Events Sessions. We know what object was changed, when it was done, and by whom. We also have access to the SQL query that was executed.
In this case, there is no default mechanism that cleans up old data. If you need one, you have to implement it yourself. Otherwise, the logging table will contain all events captured since the DDL trigger was created. That, of course, has both advantages and disadvantages.
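A minimal cleanup sketch, assuming a 90-day retention period (the period, and scheduling it for example as a SQL Server Agent job, are your choice):

```sql
-- Delete log entries older than 90 days.
-- The trigger stores GETUTCDATE(), so compare against UTC as well.
DELETE FROM dbo.DatabaseLogs
WHERE [DateTime] < DATEADD(DAY, -90, GETUTCDATE());
```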
In the next part, I will demonstrate how to use SQL Server Audit to capture database schema changes.
-Marek
Last week I published the second post in my short five-part series in which I demonstrate five different ways to track database schema changes. Thanks to this, you can always easily answer questions about what has changed, who changed it, and when. Today’s post describes how to use an Extended Events Session for this purpose.
You can find my previous post here:
The third option is an Extended Events Session. This functionality was introduced in SQL Server 2008. It provides a very lightweight monitoring and event tracking system that can be very helpful in performance troubleshooting and (in our case) change monitoring.
Microsoft announced it as the successor to SQL Trace (which was marked as deprecated in SQL Server 2012). That means SQL Trace may be removed in a future version. However, SQL Trace and SQL Server Profiler are still very popular and widely used, so I really doubt that will happen anytime soon.
In the beginning, Extended Events didn’t gain much popularity because they were tough to use. Now, with the help of SSMS, it’s much easier.
To test this solution, we have to create an Extended Events Session that captures object modification events. In our case, these are the object_created, object_altered, and object_deleted events.
You can create such a session in two ways: using T-SQL or using the wizard in SQL Server Management Studio. I find the latter much more convenient.
Create a Session using Wizard
To create a new session, in Object Explorer expand Instance -> Management -> Extended Events. Right-click on Sessions and choose New Session Wizard.

On the “Set Session Properties” page, provide a name for your session and decide whether this session should start at server startup. Then click the “Next” button.

On the next page, you can decide whether you want to use a predefined template. In our case, there is no template we can use to track object schema changes. Choose the “Do not use template” option and click “Next”.

On the “Select Events To Capture” page, select object_altered, object_created, and object_deleted events.
It should look like this:
On the “Capture Global Fields” page, you can decide what data you want to collect. My recommendation is to select the following ones:
This gives you an overview of what really happened: you know who performed a change, from which machine, and from which application. Most importantly, you also know what SQL statement was executed. When you have set this, click “Next”.
On the “Set Session Event Filters” page, you can add additional filters. That enables you, for example, to capture events just for one database instead of for all databases on your instance. Then click “Next”.

On the “Specify Session Data Storage” page, you need to decide whether you want to log data to files (the event_file target) or keep it in memory (the ring_buffer target). For real-life usage, you should choose files. Here, for demo purposes, I use the ring_buffer.

Once the session is created, you can start it immediately and watch live data on the screen as events are captured.

Create a Session using T-SQL
The same session can be created using this script.
CREATE EVENT SESSION [CaptureObjectModifications] ON SERVER
ADD EVENT sqlserver.object_altered(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.server_principal_name, sqlserver.session_id, sqlserver.sql_text)),
ADD EVENT sqlserver.object_created(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.server_principal_name, sqlserver.session_id, sqlserver.sql_text)),
ADD EVENT sqlserver.object_deleted(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.server_principal_name, sqlserver.session_id, sqlserver.sql_text))
ADD TARGET package0.ring_buffer(SET max_events_limit=0,max_memory=102400)
GO
Once the session is created and started, it captures all object modifications. To see changes live, start the “Watch Live Data” view in SSMS.

You can right-click on the column names to add additional columns to this view.

Now, let’s test it by executing the whole test case from the beginning of the article. Captured events are automatically displayed in the window.
To review historical data from the ring_buffer target, you need to use a T-SQL query. When you log to files, you can review their content in SSMS with the “View Target Data” option. Use this query to select the events captured by our session.
;WITH raw_data(t) AS
(
SELECT CONVERT(XML, target_data)
FROM sys.dm_xe_sessions AS s
INNER JOIN sys.dm_xe_session_targets AS st
ON s.[address] = st.event_session_address
WHERE s.name = 'CaptureObjectModifications'
AND st.target_name = 'ring_buffer'
),
xml_data (ed) AS
(
SELECT e.query('.')
FROM raw_data
CROSS APPLY t.nodes('RingBufferTarget/event') AS x(e)
)
SELECT *
FROM
(
SELECT
[timestamp] = ed.value('(event/@timestamp)[1]', 'datetime'),
[database_id] = ed.value('(event/data[@name="database_id"]/value)[1]', 'int'),
[database_name] = ed.value('(event/action[@name="database_name"]/value)[1]', 'nvarchar(128)'),
[object_type] = ed.value('(event/data[@name="object_type"]/text)[1]', 'nvarchar(128)'),
[object_id] = ed.value('(event/data[@name="object_id"]/value)[1]', 'int'),
[object_name] = ed.value('(event/data[@name="object_name"]/value)[1]', 'nvarchar(128)'),
[session_id] = ed.value('(event/action[@name="session_id"]/value)[1]', 'int'),
[login] = ed.value('(event/action[@name="server_principal_name"]/value)[1]', 'nvarchar(128)'),
[client_hostname] = ed.value('(event/action[@name="client_hostname"]/value)[1]', 'nvarchar(128)'),
[client_app_name] = ed.value('(event/action[@name="client_app_name"]/value)[1]', 'nvarchar(128)'),
[sql_text] = ed.value('(event/action[@name="sql_text"]/value)[1]', 'nvarchar(max)'),
[phase] = ed.value('(event/data[@name="ddl_phase"]/text)[1]', 'nvarchar(128)')
FROM xml_data
) AS x
WHERE phase = 'Commit'
ORDER BY [timestamp];
In terms of what is captured, the Extended Events Session looks very good. It has a variety of configuration options that allow you to customize the logged details. Viewing the data collected by the session, we know what was changed, when, and by whom. We also have the SQL statement that was executed to perform the change.
The Extended Events Session has many retention options for both targets. For files, we can specify the maximum file size; for the ring buffer, the maximum event count and memory size. That gives users a lot of flexibility.
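As a sketch of the file-based alternative, the ring_buffer target in the script above could be replaced with an event_file target; the path and size limits below are assumptions, not recommendations:

```sql
-- event_file target with retention limits:
-- up to 5 rollover files of 10 MB each.
ADD TARGET package0.event_file(
    SET filename = N'C:\Temp\CaptureObjectModifications.xel',
        max_file_size = (10),      -- in MB
        max_rollover_files = (5))
```

With this target, historical data can be reviewed in SSMS via “View Target Data” or queried with sys.fn_xe_file_target_read_file.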
In the next part, I will demonstrate how to use DDL Triggers to capture database schema changes.
-Marek