T-SQL Tuesday #98 – Technical challenge

This month’s TSQL2sday is hosted by BlobEater (Arun Sirpal b|t) and the topic is about solving a technical challenge you faced in the past.

It’s another great subject for writing a blog post. Troubleshooting, issue solving, and root cause analysis are very exciting aspects of every DBA’s work. It will be a great experience to read all these posts describing various kinds of issues that DBAs from around the world have to deal with.

However, it’s not an easy task to write such a post. I had plenty of ideas about what I could describe, but after a moment of thought, everything seemed obvious and not interesting at all. Finally, after a few days of intensive thinking, I decided what to write about.

“Application is freezing”

Three years ago I was working with a customer who was reporting an application freeze issue. A few weeks before that, they had upgraded the application to a newer version. Everything was fine for some time, but then they started to observe application freezes. Each time one occurred, the application was unresponsive for most of the users. The behavior was very non-deterministic: freezes appeared once every few days at different times of day, usually lasted a few minutes, and were always resolved without any manual intervention.

At the very beginning, the investigation focused mainly on the network and the Citrix farm. Unfortunately, a few days of checks gave nothing. After eliminating every subsystem of the application infrastructure one by one, only the database server was left.

“It has to be the database!”

The database server was quite powerful, with 160 logical processors and 512 GB of memory. The attached SAN storage was also very fast, with 0–2 ms of latency. According to the logs, the server was never utilized at 100%, so hardware was not our main concern…

The biggest problem with investigating the database was the irregular occurrence of the freezes. Every time one occurred and the customer called us, the issue was already gone before we connected. That was irritating. We were never able to check what was happening on the system during the impact.

At that time this particular database server was not monitored by any SQL Server monitoring tool, so we didn’t have many logs describing server activity. We reviewed all available logs, scheduled jobs, and other regular activities to verify whether anything could be involved. Unfortunately, it didn’t give us any new findings.

Finally, we decided to deploy a monitoring mechanism which would collect data and give us a good overview of the situation for after-the-fact analysis.

Diagnostic data collection

We decided to use PSSdiag. It’s a free data collection utility developed as a side project at Microsoft, used mainly by the Microsoft Customer Support Services and Product Support Services teams. It’s not the simplest tool to configure, but it collects all the required information: PerfMon counters, SQL Trace, error logs, and diagnostic data about SQL Server state, health, and performance.

It took us a while to create a proper configuration because the volume of data being collected was huge. We also had to test it very well, because we couldn’t affect the customer’s SQL Server performance. Finally, we ended up with a configuration which, in this particular environment, was collecting 0.5 GB of data per minute on average. Data retention was driven by a fixed maximum data size because we didn’t want to consume all the free space on one of the SQL Server drives. Because of that, the collected time frame varied from 1 hour up to a maximum of 2 hours. That gave us about 30 to 90 minutes to copy this data out after the customer called us.

When the customer called us a few days later to report another freeze occurrence, we had bad luck: the data from the issue’s time period was already gone, overwritten by newer data. “Fortunately”, two days later there was one more occurrence, and this time it was captured properly. I copied the data off the server and was ready for analysis.

Captured Data Analysis

To analyze the data captured by PSSdiag, I used a tool called SQLNexus. It’s another tool created as a side project at Microsoft. It processes the output from PSSdiag into a database and runs reports on top of it.

The result was worth all the configuration and testing efforts. At first glance, I knew that we had captured what we needed.

[Screenshot: SQLNexus performance overview – Batches Started/Completed and Attentions]

As you can see in the screenshot above, something went wrong around 16:05 and it lasted until 16:20. During this period, the number of started and completed batches dropped significantly and the average execution duration increased. A large number of Attention events was also noticeable. When I opened another report with blocking information, everything became clear.

There was a blocking situation that lasted for almost 15 minutes and blocked a total of 15,314 other SQL Server sessions! This is madness…
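For context, a blocking chain like this is visible in the dynamic management views while it is happening. Had we been connected at the right moment, a live snapshot could have been pulled with something like the sketch below (a generic query, not the one we used back then):

-- Sessions that are blocked right now, with their blockers and current statements
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS current_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;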

The Blocking Chain Detail report revealed additional important information. One of our small utility applications used to manage user data (let’s call it App02) was identified as the head blocker (SPID 137). The application was not used very often by the customer, which explained why the freezes occurred in such a non-deterministic way. However, the most interesting information was hidden somewhere else: the transaction name was implicit_transaction. What…?

Let’s dig a little deeper and check what code the App02 application executed.

An INSERT statement into the [dbo].[User] table was identified as the last query that our application executed. But why would an INSERT statement cause such behavior?

That bothered me a lot. I needed to know what had happened in session 137 before this INSERT statement. Because I wasn’t able to find this kind of data in the SQLNexus reports, I decided to get it directly from the database. I had to spend some time familiarizing myself with the table structure created by the ReadTrace tool from RML Utilities (used by SQLNexus during the data import process). Finally, I developed the code snippet below:

DECLARE @startTime datetime
DECLARE @endTime datetime
DECLARE @blockThresholdInMs int

SET @startTime = '2014-07-08 15:00'
SET @endTime = '2014-07-08 17:00'
SET @blockThresholdInMs = 20000 -- 20 sec

-- Get the blocking chain
SELECT * FROM dbo.tbl_BLOCKING_CHAINS
WHERE blocking_start BETWEEN @startTime AND @endTime
AND max_wait_duration_ms > @blockThresholdInMs

DECLARE @blockingStartTime datetime
DECLARE @blockingEndTime datetime
DECLARE @blockingSession int

-- (assumes a single chain matches; if more rows match, the last one wins)
SELECT @blockingStartTime = blocking_start,
       @blockingEndTime = blocking_end,
       @blockingSession = head_blocker_session_id
FROM dbo.tbl_BLOCKING_CHAINS
WHERE blocking_start BETWEEN @startTime AND @endTime
AND max_wait_duration_ms > @blockThresholdInMs

-- Get connection info for the head of the blocking chain
SELECT * FROM ReadTrace.tblConnections
WHERE Session = @blockingSession
AND @blockingStartTime BETWEEN StartTime AND EndTime

DECLARE @connectionStartTime datetime
DECLARE @connectionEndTime datetime

SELECT @connectionStartTime = StartTime,
       @connectionEndTime = EndTime
FROM ReadTrace.tblConnections
WHERE Session = @blockingSession
AND @blockingStartTime BETWEEN StartTime AND EndTime

-- Get the statements executed by the head blocker during that connection
SELECT ub.NormText, ub.OrigText, *
FROM ReadTrace.tblBatches b
LEFT JOIN ReadTrace.tblUniqueBatches ub ON b.HashID = ub.HashID
WHERE b.Session = @blockingSession
AND b.StartTime BETWEEN @connectionStartTime AND @connectionEndTime
ORDER BY b.BatchSeq

-- Get additional events (like exceptions) for this session in this time frame
SELECT * FROM ReadTrace.tblInterestingEvents
WHERE Session = @blockingSession
AND StartTime BETWEEN @connectionStartTime AND @connectionEndTime
AND Error IS NOT NULL
Maybe it’s not the most beautiful piece of code I’ve ever written, but it did its job. It displays data about the head blocker session, the queries it executed, and other interesting events (like exceptions) that took place. Here are the results:

You can see that after connecting to the database, the App02 application executed SET IMPLICIT_TRANSACTIONS ON, then executed one more SELECT statement, and that’s it. Nevertheless, there is one more thing. At 16:03:52 error 2601 was reported. That is the time when the INSERT statement into the [dbo].[User] table was executed. So what is error 2601? Some of you probably already know it. I didn’t, so I had to look it up in the documentation. It turned out to be a unique index violation. OK, let’s summarise what we already know:

  1. App02 connects to the database.
  2. It executes some statements.
  3. It sets IMPLICIT_TRANSACTIONS to ON.
  4. Then it tries to insert data into a table but fails because of a unique index violation.

So far so good, but why was the transaction not rolled back?
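Before blaming the application, it’s worth seeing the server-side mechanics. With implicit transactions enabled, a failed INSERT does not roll anything back by default. Here is a minimal repro sketch (the table and index names are hypothetical):

CREATE TABLE dbo.DemoUser (Id int NOT NULL);
CREATE UNIQUE INDEX UX_DemoUser_Id ON dbo.DemoUser (Id);

SET IMPLICIT_TRANSACTIONS ON;

INSERT INTO dbo.DemoUser (Id) VALUES (1); -- implicitly opens a transaction
INSERT INTO dbo.DemoUser (Id) VALUES (1); -- fails with error 2601 (unique index violation)

SELECT @@TRANCOUNT; -- still 1: the transaction stays open and its locks are held
                    -- until the session issues COMMIT or ROLLBACK

Error 2601 terminates only the statement, not the batch or the transaction (unless SET XACT_ABORT ON is in effect), so it was up to the application to roll back.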

“The problem is the code”

I had to look for the answer to this question elsewhere: namely, in the application source code. It took me a while, but I was able to find the C++ code that was responsible for it. A simplified version looked like this:

GetDatabase()->BeginTrans();

TRY
{
    result = Data::Put();
}
CATCH (CDBException, e)
{
    // A modal dialog pops up here while the transaction is still open,
    // so all of its locks are held until the user closes the window.
    ShowMessageBox(e->m_strError);
    result = FALSE;
}
END_CATCH

if (result)
{
    TRY
    {
        // Do sth else
    }
    CATCH (CDBException, e)
    {
        // Same problem: dialog first, rollback later.
        ShowMessageBox(e->m_strError);
        result = FALSE;
    }
    END_CATCH
}

if (result)
    GetDatabase()->CommitTrans();
else
    GetDatabase()->Rollback(); // reached only after the dialog is dismissed

Do you already see it? I hope you do. Whenever something goes wrong, an error message is displayed to the application user in a modal dialog window, and the open transaction is not rolled back until the user closes that window.

And this is exactly what happened. An application user wanted to create a new user in the system. Unfortunately, due to a unique constraint, the process failed because the new entry he or she created was not unique. An error message was displayed, but the user didn’t close it immediately. Instead, he or she started browsing the application user manual for additional information and a problem resolution.

After this discovery, a new application patch was ready the next day and was immediately deployed to the customer’s environment.

Summary

Guys, remember to educate application developers and teach them not to implement any unnecessary steps within a database transaction’s scope, especially steps that require user action or input. An even better idea is to not open database transactions from application code at all, but that’s another story. My next piece of advice is to familiarize yourself with the tools I mentioned in this post. All of them are free, and half of them are open source and available on GitHub. If you cannot afford a professional SQL Server monitoring tool, they may be your salvation.
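For the database side of that advice, the pattern is: roll back first, talk to the user later. Here is a minimal T-SQL sketch of the idea, reusing the hypothetical dbo.DemoUser table from the repro above:

BEGIN TRY
    BEGIN TRANSACTION;
    INSERT INTO dbo.DemoUser (Id) VALUES (1);
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION; -- release the locks immediately...
    THROW;                    -- ...and only then surface the error to the caller/UI
END CATCH;

This way the locks are gone the moment something fails, no matter how long the user stares at the error dialog.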

PS: This whole story reminds me of a tweet I saw recently.

-Marek


QuickQuestion: Stored Procedure with multiple result sets

QuickQuestion is a series of short posts in which I answer database-related questions asked by my colleagues, friends, and co-workers, mainly application developers.

Today’s question:

Is it OK to create a Stored Procedure that returns two result sets?

Continue reading “QuickQuestion: Stored Procedure with multiple result sets”


SQL Operations Studio – Comprehensive guide to the new database DevOps tool!

SQL Operations Studio is a new tool created by Microsoft to make life easier for Database Administrators and Developers. It was first announced at the beginning of November this year, during the PASS Summit conference in Seattle. Two weeks later, on November 15th, the preview version was made available to the whole SQL Server community. The tool is available on three platforms (Windows, Linux, macOS) and can be downloaded for free here: Download and install Microsoft SQL Operations Studio (preview).

In this blog post, I’m going to describe SQL Operations Studio features. Continue reading “SQL Operations Studio – Comprehensive guide to the new database DevOps tool!”


T-SQL Tuesday #97 – My learning plan for 2018

This month’s TSQL2sday is hosted by Malathi Mahadevan (b|t) and the topic is about setting a learning plan and goals for 2018. It is a perfect time to think about how I would like to develop next year, what skills I should improve, and what to change in my career path.

I never used to make New Year’s resolutions. I never felt such a need and never thought of it as a good idea. However, I also never considered it as a simple way to create a learning plan. A plan that I can pursue to become a better database specialist. A plan which will help me get rid of my inner procrastinator (Mr. P: “keep dreaming…”).

This month’s T-SQL Tuesday topic encouraged me to try… I sat down for a while and this is the plan I came up with… Continue reading “T-SQL Tuesday #97 – My learning plan for 2018”


QuickQuestion: Can I add a column with the IDENTITY property to an existing table with data?

QuickQuestion is a series of short posts in which I answer database-related questions asked by my colleagues, friends, and co-workers, mainly application developers.

Today’s question:

Can I add a column with the IDENTITY property to an existing table with data?

Continue reading “QuickQuestion: Can I add a column with the IDENTITY property to an existing table with data?”


Temporary Stored Procedures – a good, old, and little-known feature

Almost every T-SQL developer knows temporary tables. However, I recently found that only a few people know about the existence of Temporary Stored Procedures. In this short blog post, I’m going to describe this functionality and explain how you can create and use such procedures. Continue reading “Temporary Stored Procedures – a good, old, and little-known feature”
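Before you click through, here is a minimal sketch of what such a procedure looks like (the names are made up for illustration). A local temporary stored procedure gets a # prefix, just like a temporary table, and is dropped automatically when the session that created it ends:

CREATE PROCEDURE #HelloTemp
AS
    SELECT 'Hello from a temporary stored procedure' AS Msg;
GO

EXEC #HelloTemp; -- visible only in the current session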


Import Flat File wizard – the new feature available in SSMS v17.3


Recently I wrote a post about XE Profiler – the new feature in SQL Server Management Studio 17.3. You can find it here. However, it is not the only feature added in this version of SSMS. Microsoft has also added the new Import Flat File wizard. It is a response to the many complaints about the inconvenient flat file import process that it received from SSMS users.

The new Import Flat File wizard is supposed to make flat file import easier and more comfortable.

What is a Flat File?

A flat file is a plain text file with no internal hierarchy. It’s often used to store tabular data because it is simple to create. The two most popular types of flat files are:

  • Delimited (such as comma- and tab-separated)
    Id, Car, Price
    1, Ford, 1500
    2, Chevrolet, 2000
  • Fixed width (when values in one column have the same length in each row)
    Id   Car         Price
    1    Ford        1500
    2    Chevrolet   2000

One of the most well-known and most often used types of delimited flat files is the comma-separated values (CSV) file.

Basic CSV rules (from Wikipedia):

  • Adjacent fields must be separated by a single comma. However, CSV formats vary greatly in this choice of separator character. In particular, in locales where the comma is used as a decimal separator, a semicolon, TAB, or other character is used instead.
  • Any field may be quoted (that is, enclosed within double-quote characters). Some fields must be quoted, as specified in the following rules.
  • Fields with embedded commas or double-quote characters must be quoted.
  • Each of the embedded double-quote characters must be represented by a pair of double-quote characters.
  • Fields with embedded line breaks must be quoted (however, many CSV implementations do not support embedded line breaks).
  • The first record may be a “header”, which contains column names in each of the fields.

Example:

Year,Make,Model,Description,Price
1997,Ford,E350,"ac, abs, moon",3000.00
1999,Chevy,"Venture ""Extended Edition""","",4900.00
1999,Chevy,"Venture ""Extended Edition, Very Large""",,5000.00
1996,Jeep,Grand Cherokee,"MUST SELL!
air, moon roof, loaded",4799.00
Import and Export Data wizard

In older versions of SQL Server Management Studio, we had the Import and Export Data wizard available. It works pretty well when importing data between tables or databases, but it’s not so comfortable to use when importing data from flat files.

[Screenshot: SQL Server Import and Export wizard]

[Screenshot: SQL Server Import and Export wizard – details]

This wizard requires detailed information about the source file format from the user. We have to specify a delimiter, a separator, a text qualifier, a code page, and a few other things. In one of the next steps, we have to review whether the wizard correctly recognized the columns’ data types and lengths. It’s a very cumbersome process which often ends with switching back and forth between the wizard’s tabs to correct errors and finally import the data successfully.

This is how the CSV example above looks after being imported by this wizard:

[Screenshot: Import and Export Data wizard – result]

Import Flat File wizard

In SQL Server Management Studio 17.3, Microsoft gave users a new piece of functionality: the Import Flat File wizard.

[Screenshot: Import Flat File – menu]

This wizard is much simpler to use.

  1. Introduction: The welcome page briefly describes how to use the Import Flat File wizard. If you don’t want to see this page every time, check the “Do not show this page again” option. [Screenshot: Import Flat File – Introduction]
  2. Specify Input File: Use this page to choose the source file and the destination table name. Click the browse button to search for *.txt and *.csv files. The new table name must be unique; otherwise, the wizard will not let you continue. [Screenshot: Import Flat File – Specify Input File]
  3. Preview Data: Here you can take a look at a preview of your data. It shows up to the first 50 rows. [Screenshot: Import Flat File – Preview Data]
  4. Modify Columns: On this page you can correct the column names, data types, and additional attributes recognized and set by the wizard. [Screenshot: Import Flat File – Modify Columns]
  5. Summary: This page presents a summary of the import process. [Screenshot: Import Flat File – Summary]
  6. Results: On this last page, you receive information on whether the import was successful. [Screenshot: Import Flat File – Results]

That’s it! Just a few steps and the file is imported. As you probably already noticed, there is one huge difference between this import process and the previous one: this time we didn’t have to describe the file format at all. That is thanks to a completely new import framework called PROSE. The Microsoft Program Synthesis using Examples SDK can analyze text data, search for patterns, and “learn” the structure of the file. This is how it discovers column names, data types, delimiters, and other properties.

Let’s take a look at the import results.

[Screenshot: Import Flat File – import result]

As you can see, it didn’t work well for quoted fields with embedded double-quote characters. On the other hand, it recognized the NULL value correctly.

Other Examples

OK, we already know the framework recognizes NULL values and can parse comma-separated values. What about other possible formats?

Semicolon separated values

Year;Make;Model;Description;Price
1997;Ford;E350;"ac, abs, moon";3000.00
1999;Chevy;"Venture ""Extended Edition""";"";4900.00
1999;Chevy;"Venture ""Extended Edition, Very Large""";;5000.00
1996;Jeep;Grand Cherokee;"MUST SELL!
air, moon roof, loaded";4799.00

[Screenshot: Import Flat File – import result with semicolon]

No surprises – it works in exactly the same way.

Fixed width file

Year Make  Model                         Description   Price  
1997 Ford  E350                          ac, abs, moon 3000.00
1999 Chevy Venture "Extended Ed."                      4900.00
1999 Chevy Venture "Extended Ed., Large"               5000.00

[Screenshot: Import Flat File – fixed width import result]

Nope. It didn’t work.

TRUE/FALSE values

Year,Make,Model,Description,Price,Available
1997,Ford,E350,"ac, abs, moon",3000.00,TRUE
1999,Chevy,"Venture ""Extended Edition""","",4900.00,FALSE
1999,Chevy,"Venture ""Extended Edition, Very Large""",,5000.00,TRUE
1996,Jeep,Grand Cherokee,"MUST SELL!
air, moon roof, loaded",4799.00,FALSE

[Screenshot: Import Flat File – boolean import result]

TRUE and FALSE values are recognized and correctly mapped to bit values.

Report File (*.rpt)

What about files with query results? Maybe this is the perfect solution for loading such data back into the database?

[Screenshot: Import Flat File – rpt]

Unfortunately, such a file format is not supported.

Unsupported file formats

What happens if we provide an unsupported file format, like JSON or a SQL Server Error Log file (after writing last week’s blog post, I had to try this)? Such attempts end with the error below.

[Screenshot: Import Flat File – error]

Summary

The new Import Flat File functionality can decrease the time required to import simple flat files. Unfortunately, it doesn’t work well yet for more complex ones. It recognizes TRUE, FALSE, and NULL values well. However, it cannot parse fixed-width files, and while it can handle quoted string values, it has issues parsing quoted fields that contain embedded double-quote characters. If you find this annoying, I encourage you to vote up my Connect item: https://connect.microsoft.com/SQLServer/feedback/details/3144015
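Until the wizard matures, a scripted alternative worth knowing for files like the CSV examples above is BULK INSERT with the CSV support added in SQL Server 2017. A minimal sketch (the file path and table are hypothetical):

CREATE TABLE dbo.Cars (
    [Year] int,
    Make varchar(50),
    Model varchar(100),
    [Description] varchar(200),
    Price decimal(10, 2)
);

BULK INSERT dbo.Cars
FROM 'C:\Temp\cars.csv'
WITH (
    FORMAT = 'CSV',   -- RFC 4180 parsing: quoted fields, doubled double-quotes
    FIELDQUOTE = '"',
    FIRSTROW = 2      -- skip the header record
);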

-Marek


SQL Server 2017 – Oracle JRE 7 Update 51 (64-bit) or higher is required for Polybase

At the beginning of October, SQL Server 2017 was officially released. A few days after the official announcement, I wanted to install it in my virtual environment to take a look at the newest features. During the SQL Server 2017 installation, I faced the “Oracle JRE 7 Update 51 (64-bit) or higher is required for Polybase” error and was not able to proceed. Continue reading “SQL Server 2017 – Oracle JRE 7 Update 51 (64-bit) or higher is required for Polybase”


QuickQuestion: When does a Stored Procedure break execution on error?

QuickQuestion is a series of short posts in which I answer database-related questions asked by my colleagues, friends, and co-workers, mainly application developers.

Today’s question:

When does a Stored Procedure break execution on error?

Continue reading “QuickQuestion: When does a Stored Procedure break execution on error?”

