THE SQL Server Blog Spot on the Web


Lara Rubbelke

Interesting Things in the World of SQL Server

  • Policy-Based Management Configuration Policy

    I have been working on some new materials for an upcoming webcast on leveraging Policy-Based Management using the Enterprise Policy Management Framework.  As I was working through some sample policies, I noticed that my SQL Server 2008 instances were not consistently configured for history retention and logging on success.  Well, I thought this was a good thing to put in a policy, as policies allow me to easily manage configuration settings for my instances.  Much to my surprise, SQL Server 2008 does not currently have a Policy-Based Management Configuration facet!  I logged a suggestion on connect.microsoft.com (https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=419574), and hopefully we will see a new facet in the future.  Please feel free to vote if you think this would be a helpful addition.

    In the meantime, we still need to manage our PBM configuration in our SQL Server 2008 installations.  So I developed a policy which will execute some scripts and report the state of our environments.  I cannot enforce configuration through a policy when the conditions are defined with scripts, but at least I have a scalable solution to report on those instances which do not conform. 

    The attached policy defines the PBM management settings.  I typically like to change the PBM Management properties such that the policy history is only retained for 14 days and successful evaluations are logged.  If you care to look at these settings in Management Studio, right-click on Policy Management and select Properties. 

    • When LogOnSuccess is set to false (the default), successful evaluations are not logged.  Setting LogOnSuccess to true will log all policy evaluation results.  You will want to log on success if you decide to centralize policy history and perform enterprise management reporting.  Without logging on success, you will only have failed evaluations.  This makes it difficult to determine whether a policy is currently in a failed state or was in a failed state in the past but is currently in compliance with your specifications.
    • HistoryRetentionInDays will determine how many days of policy history will be saved in the policy history tables in MSDB. 
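
    Both settings can also be read directly from msdb through the syspolicy_configuration view.  A quick check (a sketch; run it against your own instance to see the current values):

    ```sql
    -- Inspect the current Policy-Based Management configuration values in msdb
    SELECT name, current_value
    FROM msdb.dbo.syspolicy_configuration
    WHERE name IN ('HistoryRetentionInDays', 'LogOnSuccess');
    ```

    This is the same data surfaced by the Policy Management properties dialog, which makes it easy to compare settings across instances.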

    The attached policy is not configured to run on a schedule.  You may want to change this setting prior to importing the policy to your SQL Server 2008 instances. 

  • Documentation Updated: Enterprise Policy Management Framework on CodePlex

    Hello all.  I made a major mistake.  When I released the EPM Framework on CodePlex (http://www.codeplex.com/EPMFramework), I accidentally uploaded the wrong document for the setup notes.  The document I had placed was simply a series of bullets, and not the full document with configuration steps and definitions.  The correct document has been uploaded and is now available.  I am sure this new document will be much easier to follow when setting up an environment. 

    Sorry for any confusion from the original poorly structured document!   

  • Enterprise Policy Management Framework on CodePlex

    It has been a long time since I have blogged, and I promise it is for a good reason.  The Enterprise Policy Management Framework has been published to CodePlex:  Enterprise Policy Management Framework - Home

    I will be presenting this solution at the PASS Summit this week.  Stop by on Friday at 1 pm to learn more and see the full solution in action. 

    The Enterprise Policy Management Framework is a reporting solution on the state of the enterprise against a desired state defined in a policy. Extend Policy-Based Management to all SQL Server instances in the enterprise. Centralize and report on the policy evaluation results.

    The Enterprise Policy Management Framework (EPM) is a solution to extend SQL Server 2008 Policy-Based Management to all versions of SQL Server in an enterprise, including SQL Server 2000 and SQL Server 2005.  The EPM Framework will report the state of specified SQL Server instances against policies that define the intent, desired configuration, and deployment standards.

    When the Enterprise Policy Management Framework (EPM) is implemented, policies will be evaluated against specified instances of SQL Server through PowerShell. This solution will require at least one instance of SQL Server 2008. The PowerShell script will run from this instance through a SQL Server Agent job or manually through the PowerShell interface. The PowerShell script will capture the policy evaluation output and insert the output to a SQL Server table. SQL Server 2008 Reporting Services reports will deliver information from the centralized table.
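
    The core of such a script can be sketched in a few lines of PowerShell.  This is only an illustrative outline, not the published EPM script; the server list file, policy folder, central server, database, and history table names are all assumptions:

    ```powershell
    # Illustrative sketch of the EPM evaluation loop (not the shipped script).
    # Assumptions: servers.txt lists target instances, C:\Policies holds policy
    # XML files, and CentralServer/MDW/dbo.PolicyHistory is a hypothetical
    # central management instance, database, and archive table.
    add-pssnapin SqlServerCmdletSnapin100
    add-pssnapin SqlServerProviderSnapin100

    foreach ($server in Get-Content "servers.txt") {
        foreach ($policy in Get-ChildItem "C:\Policies\*.xml") {
            # Evaluate the policy against the target instance, capturing XML output
            $result = Invoke-PolicyEvaluation -Policy $policy.FullName `
                          -TargetServerName $server -OutputXml
            # Archive the evaluation result on the central instance
            Invoke-Sqlcmd -ServerInstance "CentralServer" -Database "MDW" `
                -Query "INSERT INTO dbo.PolicyHistory (EvalServer, PolicyResult)
                        VALUES (N'$server', N'$($result -replace "'","''")')"
        }
    }
    ```

    Reporting Services reports can then query the archive table to render compliance across the enterprise.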

    This solution requires that the following components be configured in your environment. All SQL Server 2008 requirements listed below may be executed from and managed on the same instance:
    • SQL Server 2008 instance to store policies
    • SQL Server 2008 instance to act as the Central Management Server
    • SQL Server 2008 instance to execute the PowerShell script
    • SQL Server management database and policy history table to archive policy evaluation results
    • SQL Server 2008 Reporting Services to render and deliver policy history reports

    [Image: EPMReport.jpg]

  • Evaluating Policies On Demand Through PowerShell

    In my previous blog I reviewed the different policy evaluation modes available in SQL Server 2008.  In this next installment I will detail how to initiate a policy on demand using PowerShell.

    PowerShell integration with SQL Server 2008 is getting more press lately.  In past blogs I noted a few resources to help learn PowerShell, and since that time additional resources (http://blogs.technet.com/industry_insiders/pages/powershell-in-sql-server-2008.aspx and http://blogs.msdn.com/mwories/archive/2008/06/14/SQL2008_5F00_Powershell.aspx to call out a few) continue to surface.  This blog series is focused on Policy Evaluation, so I will be specifically looking at the new PowerShell cmdlet available in SQL Server 2008 to support an on demand evaluation of policies. 

    The cmdlet Invoke-PolicyEvaluation will allow you to specify a policy (or set of policies) to be evaluated against a target server.  The general syntax is as follows:

    Invoke-PolicyEvaluation [-Policy [(<string[]> | <Policy> | <FileInfo>)]] -TargetServerName <(string | ISfcConnection)>  [-TargetExpression  <string>] [-OutputXml] [-AdHocPolicyExecutionMode <AdHocPolicyExecutionMode>]

    To open the SQL Server PowerShell provider through Management Studio, right-click on the Server and select Start PowerShell. 

    Note: You can also load the SQL Server snap-ins into a standard PowerShell session.  Open PowerShell and run the following commands to load them.  These are documented incorrectly in SQL Server 2008 RC0 BOL.

    add-pssnapin SqlServerCmdletSnapin100

    add-pssnapin SqlServerProviderSnapin100

    Once you are in PowerShell, you can execute commands to evaluate a policy.  I encourage you to issue the following command, which will return a very detailed report on the cmdlet:

    Get-Help Invoke-PolicyEvaluation -Detailed

    A very simple example assumes that you have the SQL Server 2008 sample policies installed in the default location.  On my laptop, they are located at "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033". 

    #Set the working location to the folder with our sample policies

    sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"

    Invoke-PolicyEvaluation -Policy "Database Auto Close.xml" -TargetServer "MyServerName"

    Executing these commands will return the output on the console with the results of the policy evaluation.  Note that these results are NOT stored in the MSDB database (as other evaluation modes will do).   

    If you would like to evaluate multiple policies, they are passed as a comma separated list as follows:

    #Set the working location to the folder with our sample policies

    sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"

    Invoke-PolicyEvaluation -Policy "Database Auto Close.xml", "Database Auto Shrink.xml" -TargetServer "MyServerName"

    The Invoke-PolicyEvaluation cmdlet can accept inputs from a pipe.  The following example will loop through and evaluate each policy stored in the folder against the server.

    sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"

    gci | Invoke-PolicyEvaluation -TargetServer "MyServerName"

    By default, the -AdHocPolicyExecutionMode parameter is "Check", which will simply evaluate the policy and report back through the console or through an XML output file.  The "Configure" option will reconfigure properties that failed compliance so they comply with the policy, as long as they are deterministic and settable. 

    sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"

    Invoke-PolicyEvaluation -Policy "Database Auto Close.xml", "Database Auto Shrink" -TargetServer "MyServerName" -AdHocPolicyExecutionMode "Configure"

    The final parameter I want to point out is the -OutputXml, which will produce an XML report.  As mentioned above, the policy evaluation results are not stored in the MSDB database.  Using this parameter, you can produce an output file which will be very useful when you want to review the information at a later time. 

    sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"

    Invoke-PolicyEvaluation -Policy "Database Auto Close.xml" -TargetServer "MyServerName" -OutputXml >"Report.xml"

    So, yes, you can evaluate a policy on demand through PowerShell.  You can also create a SQL Server Agent job with a step that executes your PowerShell script (PowerShell job steps are new in 2008). 
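
    A minimal sketch of such a job, using the new PowerShell job step subsystem (the job name, step name, and script body are illustrative):

    ```sql
    -- Sketch: a SQL Server Agent job with a PowerShell job step (new in 2008).
    -- The job name, step name, and script body below are illustrative.
    EXEC msdb.dbo.sp_add_job @job_name = N'Evaluate Policies';

    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'Evaluate Policies',
        @step_name = N'Run policy evaluation',
        @subsystem = N'PowerShell',
        @command   = N'sl "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies\DatabaseEngine\1033"
    gci | Invoke-PolicyEvaluation -TargetServer "MyServerName"';

    -- Target the local instance so the job can run
    EXEC msdb.dbo.sp_add_jobserver @job_name = N'Evaluate Policies';
    ```

    Attach a schedule with sp_add_jobschedule (or through the Management Studio job dialog) to run the evaluation at regular intervals.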

    I can hear what many of you are thinking - "<insert your name here>, why would I need to use this feature if I can simply create a policy on SQL Server 2008 and let the policy manage the server and have the output stored in MSDB?"  In my opinion, the policy should be on the server when your server is SQL Server 2008.  The server should be managed by the policy whenever possible.  However... What if you want to evaluate a policy against a SQL Server 2000 or 2005 instance?  Aha!  Now you have a real use case for invoking policy evaluation through PowerShell.   

    This is just a brief overview of some of the ways to use this cmdlet, but as with all PowerShell and SQL implementations there are multiple ways to put these into action.  In the next parts of the series, I will explore central management servers, data collectors, and examples on how to extend this cmdlet for an enterprise solution.

  • Policy Evaluation on a Single SQL Server 2008 Instance

    SQL Server 2008 Books Online and the Policy Based Management (PBM) blog have documented how to create a policy.  Please review these valuable resources if you are looking for an introduction to creating policies in SQL Server 2008.  Through the remainder of this series, I will be focusing on the different methods of evaluating a policy against a target (server, database, database object).  This first part will focus on policy evaluation on a single SQL Server 2008 instance.  As we journey through the series, I will deliver methods to evaluate policies across multiple servers and against down-level versions of SQL Server (2000 and 2005). 

    As a brief introduction, SQL Server 2008 includes a new Policy Based Management (PBM) framework.  This new framework allows you to specify intent in a policy, store the policy on a server and have the server managed by that policy (note: a policy may also be stored in a file and I will touch on this later in the series).  Intent may be defined as a server configuration, database configuration, connectivity requirements, security requirements, business rules, naming conventions, and more.  A policy will identify the targets (servers, databases, database objects) that must be managed by this intent and how they are managed by the intent.  How the target is managed by the policy is dictated by the evaluation mode specified on the policy. 

    I think it is very important that we set expectations:  Policy-Based Management provides a level of prevention and enforcement, but the current version is best considered a framework to report on compliance and, in some cases, prevent and enforce it.  This is a tool to more easily manage deployed servers, report on the state of SQL Server, and maintain consistency and compliance with your business and regulatory requirements.

    SQL Server 2008 policies can be evaluated, or invoked, in a number of ways on a single instance.  Not all evaluation modes are available to all policies; the facet will determine the evaluation modes available for your policy.  I have pointed out the Policy Based Management blog in the past, and I encourage you to spend a few minutes reviewing its post on Facets.  That post is particularly enlightening as it points directly to the relationship between each facet and the supported evaluation modes.  The team has developed a chart (hint: print this out and keep it nearby) which will be very useful when you are designing policies.

    The following are available evaluation mode options when you save the policy on a SQL Server 2008 instance:

    On Schedule

    On Schedule will create a SQL Server Agent job to execute the policy at scheduled intervals.  This option is available for all policies created for the database engine.  The results of the policy evaluation are available by viewing the SQL Server event log, the Windows NT Application Log, or the policy views in MSDB outlined here.

    Consider the following scenario:  Many organizations are challenged with maintaining a consistent deployment of SQL Server.  The process to build a new SQL Server may specify the appropriate settings, but over time these settings may be inadvertently updated.  A DBA may change an AWE setting, alter configuration settings, or alter a backup schedule - any of which can increase the time managing the server and identifying the cause of an issue.  As a result the organization either spends energy managing these configuration settings (manually reviewing periodically or building a heavily architected process to manage this process) or the enterprise has many different deployed builds in production (which leads to issues when standardizing the management of the database tier).

    A policy can be deployed on these servers with an evaluation mode On Schedule.  The schedule may run once per day (for example) to determine the status of the deployed instance.  This mode will provide an ideal method for automating the evaluation of deployment standards and reporting on instances which violate these standards. 

    Check on Change: Log

    Check on Change: Log will write output to the event log and to MSDB when an action is executed against one of the properties that are controlled by the policy. 

    For example, you may have a policy that requires that xp_cmdshell be disabled.  If a DBA were to enable xp_cmdshell, an event is logged indicating that the policy has been violated.  These logs may be viewed in the SQL Server event log, the Windows NT Application Log, or through the policy views in MSDB. 
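
    For illustration, this is the kind of change that would trip such a policy.  Assuming an "xp_cmdshell must be disabled" policy is deployed with Check on Change: Log, running the familiar configuration script generates the logged violation:

    ```sql
    -- A change like this against a monitored instance would be logged as a
    -- violation by a Check on Change: Log policy requiring xp_cmdshell disabled.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;   -- this is the policy violation
    RECONFIGURE;
    ```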

    Rounding out this picture, consider an alerting mechanism for violated policies that are critical.  In some cases, you may want to be immediately notified of a violation to your policy.  You may set up an alert triggered by these event logs through SQL Server or System Center which will send an email to a manager or lead administrator that an action was executed against your server that violates the policy.

    Check on Change: Prevent

    Check on Change: Prevent will disallow any changes that violate the policy.  When a policy is created with this option, a DDL trigger is generated on the server.  If you have multiple policies which are set to Check on Change: Prevent, they will all use the same DDL trigger. 

    As an example, you may have a business rule that requires all stored procedures in a particular application database follow a consistent naming convention.  A policy is set up with this requirement and enforced on the server with the Check on Change: Prevent evaluation mode.  When a developer issues a CREATE PROCEDURE statement, the server DDL trigger is started.  SQL Server will evaluate the procedure against the conditions specified in the policy.  If the procedure violates the condition, the CREATE PROCEDURE statement is rolled back and an error is returned to the developer.  The policy will also write an event to the SQL Server event log, Windows NT Application log and the policy views in MSDB. 
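
    As a concrete sketch (the naming convention here, a usp_ prefix, is an assumption for illustration):

    ```sql
    -- Assuming a policy condition requiring procedure names LIKE 'usp_%',
    -- enforced with Check on Change: Prevent, this statement would be
    -- rolled back by the server DDL trigger and an error returned:
    CREATE PROCEDURE dbo.GetOrders
    AS
        SELECT 1;
    GO

    -- A conforming name would succeed:
    CREATE PROCEDURE dbo.usp_GetOrders
    AS
        SELECT 1;
    GO
    ```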

    On Demand

    On Demand is an option where you manually run the policy.  Policies which are stored on SQL Server 2008 and evaluated On Demand are disabled and cannot be enabled.  This option is available for all policies stored on a SQL Server 2008 instance or stored as an XML file.  The results of policies executed On Demand are not written to the SQL Server Event Log, but are available in the policy views in MSDB.

    Evaluating a policy On Demand on a single instance can include multiple flavors.  The most obvious is manually running the policy in SQL Server Management Studio, as in the following examples. 

    • Right-click on any policy stored in SQL Server 2008 and select Evaluate to manually evaluate the policy.
    • Right-click the Policies folder in SQL Server Management Studio (SQL Server Instance>Management>Policy Based Management>Policies) and select Evaluate. The Evaluate Policies dialog will appear. By default, the policies stored on the SQL Server 2008 instance are available and you can check which policies you would like to evaluate. If the policies are stored as XML files, select the Browse (...) button. Navigate to the folder where the policy XML files are stored and select one or more policies (hold the Shift key to select more than one policy).

    In my next post in this series I will detail one of the lesser known (but very cool and useful) methods of evaluating a policy on demand. 

  • SQL Server 2008 Policy Based Management Resources

    I will be releasing a series of blogs related to Policy Based Management and PowerShell in SQL Server 2008.  As a precursor to these blogs, I wanted to be sure that everyone is aware of the many great resources available online.  

    Policy-Based Management Blog

    The SQL Server Manageability Team has started a new blog related to Policy-Based Management.  This site will provide some great learning content for the terminology and development of policies.  http://blogs.msdn.com/sqlpbm/ 

    SQL Server 2008 Feature Pack

    The following site will provide a number of additional features that you can download and use with SQL Server 2008.  Of particular interest is the download for the Microsoft SQL Server 2008 Policies, which are policies that were developed for all of the best practices represented in previous versions of the Microsoft SQL Server Best Practice Analyzer.  I find these policies to be great tools to learn about the different ways to develop, deploy and evaluate policies.  http://www.microsoft.com/downloads/details.aspx?FamilyId=089A9DAD-E2DF-43E9-9CD8-C06320520B40&displaylang=en 

    SQL Server 2008 RC0 Books Online

    SQL Server 2008 RC0 SQL Server Books Online can be downloaded and installed on your desktop (without negatively impacting other versions of Books Online).  The documentation contains many great learning resources for Policy Based Management and PowerShell (and loads of other stuff).  http://www.microsoft.com/downloads/details.aspx?familyid=19DB0B42-A5B2-456F-9C5C-F295CDD58D7A&displaylang=en 

    If you are reluctant to install Books Online on your desktop, you can access the entire contents of BOL online: http://msdn.microsoft.com/en-us/library/bb543165(SQL.100).aspx

    Watch for some very interesting material coming soon...

     

  • Where did Activity Monitor go?

    When you begin working with RC0, you may find yourself searching for Activity Monitor.  I personally spent about 10 minutes looking for Activity Monitor when I started working with RC0.  Luckily, I happen to work at Microsoft and can directly email people when I have these types of questions (well, lucky for me but perhaps not lucky for them:-))

    In the past, you could only find Activity Monitor by opening the Management folder in SQL Server Management Studio.  This is no longer the place to find Activity Monitor.  Here are the two new ways to launch Activity Monitor:

    1. Open Management Studio.  Select the icon on the toolbar that is the furthest right (It looks like a chart with green and blue lines).

    2. In Object Explorer, right-click on the server and select Activity Monitor.  Please note that Activity Monitor is not available in the context menu unless you select the server.  You cannot launch Activity Monitor when you select a database, another folder, etc. 

    I am certain many of you are questioning why there was a change in location for the Activity Monitor. I was told that the intention was to make it "more discoverable".  This provided me a great deal of humor as I spent 10 minutes looking for it and ended up asking the product team. 

    Happy monitoring!

  • Saving SQL Server 2008 Policy Based Management Execution History

    The SQL Server 2008 Policy Based Management framework includes views in MSDB to track evaluation history.  The syspolicy_policy_execution_history view displays information about when the policy was executed and the results of the execution.  The syspolicy_policy_execution_history_details view will display more details including the results of each condition.  None of this is very new - and all of this is available in Books Online. 
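
    A simple join of the history view to the policy list gives a quick evaluation report (a sketch; see Books Online for the full column lists of these views):

    ```sql
    -- Sketch: recent policy evaluation results from the msdb history views
    SELECT p.name,
           h.start_date,
           h.result
    FROM msdb.dbo.syspolicy_policy_execution_history AS h
    JOIN msdb.dbo.syspolicy_policies AS p
        ON p.policy_id = h.policy_id
    ORDER BY h.start_date DESC;
    ```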

    What is interesting - and new in SQL Server 2008 RC0 - is a property which will allow you to configure whether you want to collect all history or only history of failed policies in syspolicy_policy_execution_history_details.  By default, SQL Server 2008 will only collect and display history in this view when the policy fails. 

    To access this option, open Management Studio and select Policy Management in Object Explorer.  Right-click on Policy Management and select Properties.  When LogOnSuccess is set to false (the default), successful evaluations are not logged.  Setting LogOnSuccess to true will log all policy evaluation results.

  • PowerShell Learning Resources Online

    PowerShell is all the rage!  It is deeply integrated in SQL Server 2008, where you can start a PowerShell session from Management Studio, and create job steps which execute PowerShell scripts.  SQL Server 2008 will also introduce some great new cmdlets.  This integration of PowerShell in SQL Server 2008 will introduce a number of useful scenarios for centralizing administration and management of your SQL Server environment.  I will be showing some different solutions in the upcoming months, but thought some of you may be interested in resources to learn the scripting language – and learn it in the context of SQL Server administration. 

    The first resource I would recommend would be a series of articles by my colleague and friend Buck Woody.  Buck is on the team which is delivering Management Studio in SQL Server 2008 (http://blogs.gotdotnet.com/buckwoody/archive/2008/04/03/microsoft-loves-dba-s.aspx).  One of the great things about Buck (besides his movie star name and incredible personality) is that he uses his many years as a DBA, MVP, and team manager in his articles to make features and capabilities real for the database community.  Buck has written a SQL Server Reference Guide, and the PowerShell sections begin here (http://www.informit.com/guides/content.aspx?g=sqlserver&seqNum=253).   This is a great introduction to scripting with PowerShell and perfect if you are very new to scripting and PowerShell.  Mind – you will not walk away from these articles as an expert PowerShell programmer – but it will give you the tools to get started.

    SQL Server 2008 Books Online includes a number of links for learning PowerShell (http://technet.microsoft.com/en-us/library/cc281954(SQL.100).aspx ).  I would anticipate that more Microsoft links will be coming as we near the release of SQL Server 2008. 

    Allen White has also been writing different examples for using PowerShell with SQL Server (http://sqlblog.com/blogs/allen_white/archive/tags/PowerShell/default.aspx).  His examples provide some context to using PowerShell in your SQL Server environment. 

    I hope these few online resources help get you started.  Of course, there are a plethora of books available on the topic of PowerShell, although I have not seen one that is specifically written for SQL Server. 

    On that note: Please share your resources in the comments.  Let us know of those great PowerShell books or articles for DBAs!

  • Great Post on SSIS Destinations + More Information

    Another recycled post from days gone by...  

    One of my favorite blogs on SQL Server Integration Services (SSIS) is by Jamie Thomson.  He does a tremendous job sharing tips, tricks, and lessons learned from working with SSIS.  Recently he posted a blog on three of the destination adapters and the performance implications of using each (http://blogs.conchango.com/jamiethomson/archive/2006/08/14/4344.aspx).  The blog was pretty timely, as I had been having this exact conversation with a client the previous week.  I wanted to add a couple of additional thoughts on the material.

    First, after reading the blog, one would ask oneself "Self, why would I ever use the OLE-DB Destination without FastLoad?"  Well, that is a good question.  Undoubtedly, this method is extremely slow.  I reiterate: extremely slow.  OK, slow compared to its counterparts, "with FastLoad" or the SQL Server Destination.  If you run Profiler while loading into each type of destination, you will quickly see that the OLE-DB Destination without FastLoad inserts each row individually, which is obviously very slow.

    OLE-DB Destinations with FastLoad cannot be set to redirect error rows.  Consider a situation where you would want to trap error records in the destination - such as if a record violates a primary key constraint, or a domain constraint.  If you want to redirect these rows and allow the load to succeed while trapping those bad records for assessment and possible reload at a later time, you CANNOT use FastLoad.  This only leaves OLE-DB without FastLoad, since the SQL Server Destination does not support this behavior either. 

    So you may then ask yourself "Self, why would I choose to use the OLE-DB Destination with FastLoad over the SQL Server Destination?"  Another good question.  Well, to use the SQL Server Destination, the destination database must be on the same server on which you are executing the SQL Server Integration Services package.  From my experience, you will not see a significant performance difference between the two solutions (this is subjective and you should test with your own loads!).  I generally prefer the OLE-DB Destination simply because it gives you better flexibility on where you execute the package.  You may make a decision that the package will always execute from the server where the destination resides - but you are painting yourself into a corner and limiting your ability to easily scale out in the future.

  • Memory Pressure on 64 Bit SQL Server 2005

    I have just learned that my old blog will be disabled, so I will be recycling some of the more meaningful blogs to SQLBlog.  This was written last year during an engagement where I was troubleshooting an issue with procedure cache growth.  The good news appears to be that some relief is on the way (SQL Server Procedure Cache: More Relief on the Way). 

    Do you remember the days when you first started learning about 64 bit capabilities and dreaming of the day when you were no longer confined by the barriers of the 32 bit virtual address space?   Those ideas produced dreams of a truly capable scale up solution.  Yes, 64 bit has been a blessing for many of our SQL Server databases.  But as is often the case, there are some drawbacks.  Such is the case for some 64 bit SQL Server 2005 database applications. 

    In the 32 bit OS, you have a total of 4 GB of virtual address space, with 2-3 GB of virtual address space available for the applications (depending on how you configured your boot.ini file).  The 32 bit version of SQL Server is very busy managing the procedure cache, user cache, log cache, extended stored procedures, binaries and buffer (data) cache within the limited confines of the 2-3 GB virtual address space.  When you enable AWE on 32 bit SQL Server, you are extending the buffer (data) cache, and not able to address the available memory for any of these other items – including the procedure cache.  SQL Server must still manage the procedure cache within the constraints of the 2-3 GB of available virtual address space.

    Now let’s consider 64 bit.  These confines are history – and we can now enjoy the capabilities of a very large user cache, very large log cache, and as many are discovering a very large procedure cache.  As was the case with one customer this week, the procedure cache was using over half the available RAM for the instance.  Note:  This happens in specific scenarios.  I am not saying that 64 bit is bad, only that in some scenarios it is important to understand where you need to adjust your management of a database.

    Consider an application which executes a large number of non-parameterized ad hoc statements and dynamic SQL.  This is particularly an issue when executed with EXECUTE vs. sp_executesql.  See chapter 7 in "Expert SQL Server 2005 Development" for an excellent discussion on the difference (shameless plug!).  When a statement is executed, SQL Server searches the procedure cache to see if a plan exists for the statement.  If one does not exist, SQL Server will save an execution plan in the procedure cache.  This can quickly lead to a large procedure cache.  These statements may never be used again, but SQL Server cannot differentiate between a recently executed ad hoc statement that may be reused and one that will not ever be reused.  If the plan is considered expensive - or one which requires a lot of work to generate - SQL Server will not age this plan out very quickly.  Since the 64 bit world has the luxury of larger volumes of RAM in which to manage the procedure cache, SQL Server may prefer to keep these large expensive plans in cache. 

    Why is this a problem?  For starters, when SQL Server is under memory pressure it may instead flush pages from our valuable buffer (data) cache instead of flushing what it determines to be expensive plans in the procedure cache.  A larger procedure cache may also impact the time for SQL Server to search the procedure cache to determine if a plan exists.  When the procedure cache is full of meaningful and reusable plans, perhaps this is time well spent.   When your procedure cache is full of plans which will never be reused, it is a waste of resources and a waste of time.
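
    To see how much of this is happening on your own instance, the plan cache DMVs (available in SQL Server 2005) expose the single-use ad hoc plans directly:

    ```sql
    -- Single-use ad hoc plans currently in the procedure cache,
    -- with the text of each statement and the memory each plan occupies
    SELECT cp.usecounts,
           cp.size_in_bytes,
           st.text
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    WHERE cp.objtype = 'Adhoc'
      AND cp.usecounts = 1
    ORDER BY cp.size_in_bytes DESC;
    ```

    If this returns thousands of near-identical statements differing only in their constants, you are looking at exactly the scenario described above.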

    Currently, there are a few options to manage the problem if you are not able to fix the code.  Of course, fixing the code is the best option: reduce or eliminate dynamically built statements in application code, and reduce or eliminate ad hoc statements, or at least parameterize them.  When this is not an option:

    On regular intervals execute DBCC FREEPROCCACHE. 

    This will flush the entire procedure cache.  It is the best option when you have exhausted all other options, as flushing the procedure cache may cause your database application to slow down while SQL Server recompiles and stores execution plans for procedures and statements.  In one of my past engagements, this was the only option given their environment.  They had set the database to 8.0 compatibility mode, which did not give them the luxury of the next solutions.  In the end, they fixed their code to parameterize the dynamic SQL and they no longer had problems.  (An interesting note is that this issue was caused by one small batch process that ran nightly - sometimes the little things can hurt a lot!)
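
    As a sketch, the command itself is simple and could be wrapped in a SQL Server Agent job to run on whatever interval suits your workload.  Note that on SQL Server 2008 you can also evict a single plan by passing its plan handle (the handle shown below is purely illustrative):

    ```sql
    -- Flush the entire procedure cache for the instance:
    DBCC FREEPROCCACHE;

    -- SQL Server 2008 only: evict one plan by its handle, which you can
    -- look up in sys.dm_exec_cached_plans.  The handle below is a
    -- placeholder, not a real value.
    -- DBCC FREEPROCCACHE (0x060005001A58C935B8E0AE0B000000000000000000000000);
    ```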

    Use Forced Parameterization on the databases which are causing this issue. 

    Forced parameterization replaces the constants in an eligible statement with parameters when the query is compiled.  This is particularly suited to applications that build the same statement with differing constants.  Consider the following statements:

    SELECT * FROM Production.Product WHERE MakeFlag = 1 AND ListPrice > 0

    SELECT * FROM Production.Product WHERE MakeFlag = 0 AND ListPrice > 1

    A single parameterized statement is stored in the procedure cache:

    (@0 int,@1 int)select * from Production . Product where MakeFlag = @0 and ListPrice > @1

    In one customer situation, the procedure cache contained tens of thousands of large duplicate statements with differing constants in the where clause.  Forced parameterization significantly reduced the number of statements stored in the procedure cache.

    This option is best suited when a large number of statements are regularly executed with varying constants in the WHERE clause.  There are tradeoffs with this method, and I would strongly recommend reviewing the requirements and risks in Books Online.  Note: this is not available if your database is in backward compatibility mode (8.0 or lower).
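
    Enabling this is a single database-level setting.  A minimal sketch, using the AdventureWorks database from the example statements above:

    ```sql
    -- Replace literals with parameters at compile time for all eligible
    -- statements in this database:
    ALTER DATABASE AdventureWorks SET PARAMETERIZATION FORCED;

    -- Revert to the default behavior:
    -- ALTER DATABASE AdventureWorks SET PARAMETERIZATION SIMPLE;
    ```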

    Use a Plan Guide to Force Parameterization at the Statement Level

    This is the ideal option when you know that only a few statements are causing the procedure cache to grow, and it is the least intrusive way to force parameterization.  It allows you to still maintain a procedure cache and enjoy execution plan reuse.  It may not be a reasonable solution when you have hundreds of statements, as you will need to build a plan guide for each statement.  Note: this is not available if your database is in backward compatibility mode (8.0 or lower).
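
    The usual pattern, documented in Books Online, is to let sp_get_query_template derive the parameterized form of the statement and then create a TEMPLATE plan guide from it.  A sketch using the earlier example statement (the plan guide name is my own invention):

    ```sql
    -- Derive the parameterized form of the statement, then create a
    -- TEMPLATE plan guide forcing parameterization for just that statement.
    DECLARE @stmt nvarchar(max), @params nvarchar(max);

    EXEC sp_get_query_template
        N'SELECT * FROM Production.Product WHERE MakeFlag = 1 AND ListPrice > 0',
        @stmt OUTPUT,
        @params OUTPUT;

    EXEC sp_create_plan_guide
        @name = N'ForceParam_ProductSearch',   -- illustrative name
        @stmt = @stmt,
        @type = N'TEMPLATE',
        @module_or_batch = NULL,
        @params = @params,
        @hints = N'OPTION (PARAMETERIZATION FORCED)';
    ```

    After this, any statement matching the template – regardless of the constants used – is parameterized, while the rest of the database keeps the default simple parameterization.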

     

  • What's in a name?

    A good friend recently asked me to explain the difference between the Performance Data Collector, Performance Studio, and the Performance Data Warehouse.  This is an interesting question, and I am certain many people have been trying to figure out how all of these concepts relate to one another.  We at Microsoft have been adjusting the names of many of the new features in an effort to properly show how the different features may be leveraged.  But as these names are adjusted, it leads to confusion about where to find each feature and how to search for different topics online or in Books Online.

    Performance Studio

    Let’s start with Performance Studio.  This is simply the wrapper for four solutions: the Data Collector, performance and diagnostic monitoring, the Management Data Warehouse for historical reporting, and troubleshooting through Policy-Based Management.  This is the vision of a complete Microsoft solution for performance data collection, storage for historical analysis and trending, and diagnostic reporting and dashboards. 

    The term “Performance Studio” has appeared in a number of older Microsoft presentations, although I see fewer people referring to it in more recent material.  Some people have been using the terms Performance Studio and Data Collector interchangeably – but the Data Collector is merely a component of the full Performance Studio vision.  You will not find anything in Books Online for “Performance Studio”, but you will find a plethora of information related to each individual component. 

    Performance Data Collector

    Performance Data Collector is synonymous with Data Collector.  We have dropped “Performance” from the name in later documentation, including Books Online – and for very good reason.  On the surface, the Data Collector appears to be a tool to collect performance-related data, but I truly believe it is a process that can be leveraged to collect all types of data - both system-related and business-related.  The data collectors can be configured on each server to collect data and store it in a central data warehouse. 

    When you configure data collection with Management Studio, you will see three system data collection sets which are installed by default.  These collect data related to disk usage, query statistics (poor-performing queries), and server activity (CPU, memory, I/O, network, etc.).  These data collection sets are detailed very well in Books Online, and in the future I will work on a blog related to each of them.  Each data collection set may be enabled or disabled independently from the others. 

    You can also create your own custom data collection sets which are defined to use a SQL Server Trace, performance counters, or a T-SQL query.  For example, you can set up a custom data collector which will execute a T-SQL statement against a table and load the data into the data warehouse.  Since I am always thinking about compliance (I am a lot of fun at parties :-)), I think about setting up a custom data collector on my servers to centralize the results of policy evaluation across my enterprise – a single source to report on compliance for my entire enterprise!  There is no GUI for defining custom data collectors in this current version - you must create these through scripts.
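
    As a rough sketch of what such a script looks like, the msdb syscollector stored procedures create the collection set and attach a Generic T-SQL Query collector item.  The set name, schedule, query, and output table below are all hypothetical examples, not a tested compliance solution:

    ```sql
    USE msdb;

    DECLARE @set_id int, @item_id int, @type_uid uniqueidentifier;

    -- Create the collection set (name and schedule are illustrative).
    EXEC dbo.sp_syscollector_create_collection_set
        @name = N'Policy Evaluation Results',
        @collection_mode = 1,              -- non-cached: collect and upload together
        @days_until_expiration = 14,
        @schedule_name = N'CollectorSchedule_Every_6h',
        @collection_set_id = @set_id OUTPUT;

    -- Look up the Generic T-SQL Query collector type.
    SELECT @type_uid = collector_type_uid
    FROM dbo.syscollector_collector_types
    WHERE name = N'Generic T-SQL Query Collector Type';

    -- Attach a collection item; the query and output table are placeholders.
    EXEC dbo.sp_syscollector_create_collection_item
        @collection_set_id = @set_id,
        @collector_type_uid = @type_uid,
        @name = N'Policy evaluation history',
        @parameters = N'<ns:TSQLQueryCollector xmlns:ns="DataCollectorType">
      <Query>
        <Value>SELECT * FROM syspolicy_policy_execution_history</Value>
        <OutputTable>policy_history</OutputTable>
      </Query>
    </ns:TSQLQueryCollector>',
        @collection_item_id = @item_id OUTPUT;

    -- Start collecting.
    EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = @set_id;
    ```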

    The Performance Data Warehouse

    That is where we will end our journey.  The Performance Data Warehouse has also been renamed, to the Management Data Warehouse, again to properly depict the vision of a solution for centralizing all types of data – not just performance-related data.  At a very high level, the Management Data Warehouse is the destination for the data from the data collector.  When you configure data collection, you define the server instance and database where you would like to store the data for the data collector.  This database may exist on the same instance, or (more likely) on a separate instance on a different server in your enterprise.  This single database may serve as the destination for one or more instances which are configured for data collection.

    The Management Data Warehouse is initially created with tables in two schemas: core and snapshots.  The core tables are dimension tables and the snapshots tables are the fact tables.  When you create a custom data collector, a third schema, custom_snapshots, is created to support the custom data collection sets. 
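
    If you want to see this layout for yourself, a simple catalog query run against the Management Data Warehouse database shows how the tables break down by schema:

    ```sql
    -- Run in the Management Data Warehouse database to list the collected
    -- tables grouped by their schema (core, snapshots, custom_snapshots).
    SELECT s.name AS schema_name, t.name AS table_name
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE s.name IN ('core', 'snapshots', 'custom_snapshots')
    ORDER BY s.name, t.name;
    ```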

     

  • We are Leaders!

    Some very exciting news was released last week:  Microsoft SQL Server is in the Leaders quadrant for the Gartner Magic Quadrant for Data Warehousing!

    http://www.microsoft.com/presspass/press/2007/oct07/10-12DWMQPR.mspx

    This news reinforces what those of us who work with SQL Server already know:  SQL Server scales!  With the data warehousing enhancements and new capabilities coming in SQL Server 2008, I am sure we are looking forward to a long future as Leaders.

    Read the full report here: http://mediaproducts.gartner.com/reprints/microsoft/article19/article19.html

     

  • PASS Summit Demo Scripts

    Thanks to all who attended my session on SQL Server Encryption this afternoon.  As promised, my demo scripts are available on this site.  I have added an additional script for those looking to estimate the size of a table with a single column encrypted.

    I am now off to more events - I need to pick up all of the free swag I can tonight!  :-)

  • New SQL Server Monitor Gadget for Windows Vista Sidebar

    Jamie Thomson, John Rayner and their team developed a Vista sidebar gadget to monitor SQL Server. 

     

    http://blogs.conchango.com/jamiethomson/archive/2007/08/09/Announcing-SQL-Server-Monitor-Gadget-for-Windows-Vista-Sidebar.aspx

     

    I saw this and my immediate reaction was "What fun!"  Great job building a useful gadget!

     
