
Alexander Kuznetsov

Wrapping related changes in a transaction may use less CPU.

Wrapping related changes in a transaction is a good way to ensure data integrity. Besides, in some cases it just runs noticeably faster, using less CPU. As usual, I am posting repro scripts, which you can run, tweak, and see for yourself.

Environment

I've run my scripts on SQL Server 2008 R2 Developer Edition. Snapshot isolation is enabled; READ_COMMITTED_SNAPSHOT is not enabled.
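If you want to reproduce the same environment, here is a minimal sketch of the settings I used. The database name Test is an assumption of mine; substitute your own:

-- Assumption: the benchmark runs in a database named Test; replace with yours.
ALTER DATABASE Test SET ALLOW_SNAPSHOT_ISOLATION ON ;
ALTER DATABASE Test SET READ_COMMITTED_SNAPSHOT OFF ;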

Test data

After applying Occam's razor, all we need is two tables with one row in each:

CREATE TABLE dbo.Toggle1
    (
      id INT NOT NULL
             PRIMARY KEY ,
      i INT NOT NULL
    ) ;
GO
INSERT  INTO dbo.Toggle1
        ( id, i )
VALUES  ( 1, 0 ) ;
GO
CREATE TABLE dbo.Toggle2
    (
      id INT NOT NULL
             PRIMARY KEY ,
      i INT NOT NULL
    ) ;
GO
INSERT  INTO dbo.Toggle2
        ( id, i )
VALUES  ( 1, 0 ) ;

Benchmarking

The following two stored procedures run the same pair of updates 50,000 times. The first one does not use an explicit transaction:

CREATE PROCEDURE dbo.Toggle50kTimesX3WithoutTransaction
AS
    BEGIN ;
        SET NOCOUNT ON ;
        SET XACT_ABORT OFF ;
        DECLARE @i INT ;
        SET @i = 0 ;
        WHILE @i < 50000
            BEGIN ;
                UPDATE  dbo.Toggle1
                SET     i = 1 - i
                WHERE   id = 1 ;
                UPDATE  dbo.Toggle2
                SET     i = 1 - i
                WHERE   id = 1 ;
                SET @i = @i + 1 ;
            END ;
    END ;

The second procedure does use an explicit transaction:

CREATE PROCEDURE dbo.Toggle50kTimesX3WithXactAbortOff
AS
    BEGIN ;
        SET NOCOUNT ON ;
        SET XACT_ABORT OFF ;
        DECLARE @i INT ;
        SET @i = 0 ;
        WHILE @i < 50000
            BEGIN ;
                BEGIN TRAN ;
                UPDATE  dbo.Toggle1
                SET     i = 1 - i
                WHERE   id = 1 ;
                UPDATE  dbo.Toggle2
                SET     i = 1 - i
                WHERE   id = 1 ;
                COMMIT ;
                SET @i = @i + 1 ;
            END ;
    END ;

Let's run the following script a few times:

EXEC dbo.Toggle50kTimesX3WithoutTransaction ;
GO
EXEC dbo.Toggle50kTimesX3WithXactAbortOff ;
 

Typical CPU and duration figures are as follows:

dbo.Toggle50kTimesX3WithoutTransaction

CPU: 1900-2340, Duration: 5900

dbo.Toggle50kTimesX3WithXactAbortOff

CPU: 1500-1700, Duration: 3200-3300
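These numbers come from my own measurements. If you prefer to read them from the server itself, here is one possible sketch using sys.dm_exec_procedure_stats (available in SQL Server 2008 and later; the counters are cumulative, so compare snapshots taken before and after a run):

-- A sketch: read cumulative CPU and elapsed time for the two test procedures
-- from the plan cache (times are reported in microseconds, converted to ms here).
SELECT  OBJECT_NAME(ps.object_id, ps.database_id) AS procedure_name ,
        ps.execution_count ,
        ps.total_worker_time / 1000 AS total_cpu_ms ,
        ps.total_elapsed_time / 1000 AS total_elapsed_ms
FROM    sys.dm_exec_procedure_stats AS ps
WHERE   OBJECT_NAME(ps.object_id, ps.database_id) IN
        ( 'Toggle50kTimesX3WithoutTransaction',
          'Toggle50kTimesX3WithXactAbortOff' ) ;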

Conclusion

As we have seen, wrapping related changes in a transaction may improve performance. By related, I mean changes that should all succeed or all fail together when we are ensuring atomicity.
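To illustrate the atomicity part, here is a sketch of the pattern (it is not part of the benchmark): with SET XACT_ABORT ON, if the second update fails, the first one is rolled back as well, so the two toggles never end up out of sync.

-- A sketch of the atomic pattern: both updates succeed, or neither does.
SET XACT_ABORT ON ;
BEGIN TRAN ;
    UPDATE  dbo.Toggle1
    SET     i = 1 - i
    WHERE   id = 1 ;
    UPDATE  dbo.Toggle2
    SET     i = 1 - i
    WHERE   id = 1 ;
COMMIT ;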

However, we typically need to keep transactions short, so we should not bundle unrelated changes in transactions - that may hurt concurrency.

Also, I am being careful not to make any blanket statements here. I did not research this effect with due diligence; I noticed it in one case and came up with a repro that works in one particular case on one server, and that is all.

I would really appreciate it if you would run the repro, tweak it as needed, and post the results here. TIA!

In general, we should not assume that larger transactions always mean better performance - the opposite is true in many cases. We should always benchmark the specific case at hand and decide on a case-by-case basis.

Edit: I have simplified the scripts a little, which changed the numbers slightly.

Published Wednesday, February 01, 2012 5:02 PM by Alexander Kuznetsov


Comments

 

Kevin said:

You have an extra SET @i = @i + 1 ;, which would cause the stored procedure to do a bit less work. I don't think this would cause all the time savings, though. Will need to play around a bit more.

February 3, 2012 5:54 PM
 

Alexander Kuznetsov said:

Good catch, Kevin!

In fact, I intended to have only one SET @i = @i + 1 ; per loop iteration. As a result, the first loop ran approximately 33K times, but the second one ran 25K times.

I will rerun my benchmarks soon and fix my post.

February 3, 2012 10:10 PM
 

Alexander Kuznetsov said:

Kevin,

I fixed the procedures, fixed the post, and reran the benchmarks on my laptop. The big difference in CPU is still there, as expected.

Thanks again!

February 3, 2012 10:23 PM
 

Dave Ballantyne said:

Isn't this expected?

Every UPDATE statement would be working in an implicit transaction, therefore you are actually committing 6 * more than with the explicit transaction. I would imagine the difference in time is due to SQL Server being able to batch up the work in the longer-running (explicit) transaction rather than the bitty work in the implicit transactions.

February 7, 2012 8:18 AM
 

Alexander Kuznetsov said:

Dave,

This was kind of expected, but in this example the overhead of beginning transactions and committing is more than the overhead of actually doing the modifications. This was new to me.

February 7, 2012 9:11 AM
 

Dave Ballantyne said:

Ah, OK, that didn't come across to me.

Also, don't discount the cost of acquiring and releasing locks, too.

February 7, 2012 11:59 AM
 

Alexander Kuznetsov said:

Dave,

I have refactored my benchmarks to address your concern about the cost of acquiring and releasing locks: now each object is accessed only once per transaction. Still, I clearly observe a difference.

February 7, 2012 3:05 PM


About Alexander Kuznetsov

Alex Kuznetsov has been working with object oriented languages, mostly C# and C++, as well as with databases for more than a decade. He has worked with Sybase, SQL Server, Oracle and DB2. He regularly blogs on sqlblog.com, mostly about database unit testing, defensive programming, and query optimization. Alex has written a book entitled "Defensive Database Programming with Transact-SQL" and several articles on simple-talk.com and devx.com. Currently he works at DRW Trading Group in Chicago, where he leads a team of developers, practicing agile development, defensive programming, TDD, and database unit testing.
