THE SQL Server Blog Spot on the Web


John Paul Cook

  • BI Beginner: Installing SSDT BI

    The Business Intelligence Development Studio (BIDS) does not come with SQL Server 2014 as it did in previous versions of SQL Server. Furthermore, it has been renamed SQL Server Data Tools - Business Intelligence (SSDT-BI). If you have Visual Studio 2012, download this. If you have Visual Studio 2013, download this. I recommend that you download and save the file and run the saved copy instead of running it directly from the link.

    A few individuals have reported seeing a “Same architecture installation” error message. This is most likely caused by not taking the default of Perform a new installation of SQL Server 2014. Pick the default and you should be fine. It does not install a new instance of the SQL Server database engine.


    Figure 1. Take the defaults for Installation Type.

    Expect to reboot. Open Visual Studio. Go to FILE | New | Project… to see the new templates that have been added to Visual Studio.


    Figure 2. Select Project to see the new Business Intelligence Templates.

    Pick an appropriate template and get down to business.


    Figure 3. Business Intelligence Templates in Visual Studio 2013.

  • In The Cloud: Creating Your First Azure SQL Database

    Creating your first SQL Database in Azure is quick and easy. There are a few things I’ve discovered that might help you be successful faster. Many people have asked me how to get started, so I’ve provided these steps to success. The steps I show you aren’t the only way to achieve success. This is an adaptation of one of my Azure demos.

    First, you need an Azure account. If you have an MSDN subscription, you should be able to use that. Your employer might have Azure credits. If those options aren’t available, you can get a one-month trial here.

    Go to http://azure.microsoft.com and log in.


    Figure 1. Default appearance of SQL DATABASES in Azure.

    You could simply click CREATE A SQL DATABASE, the prominent option on the default SQL DATABASES page in Azure. But if you're just starting out on Azure, I suggest that you take a look around to see what all of your options are. That will help you see the big picture. Click on SERVERS.


    Figure 2. Click on SERVERS to see the server view and create a new SQL Database server.

    When working with a physical SQL Server, we create a SQL Server instance before we create a database. It's a familiar model, so that's what I'm showing you how to do in Azure. After you click CREATE A SQL DATABASE SERVER, you'll be presented with a CREATE SERVER box.


    Figure 3. Be sure to pick a nearby REGION for better response times.

    Once your SQL Database server has been created, you should click MANAGE at the bottom of the page to configure it.


    Figure 4. Click MANAGE to configure your server.

    What happens next and what you should do depends on your network environment. If you have a simple, uncomplicated network, click YES to include your IP address as a permissible IP address allowed to go through the firewall.


    Figure 5. Adding your IP address as a firewall rule.

    Depending on your network’s configuration, the IP address you are shown may not be your actual public IP address. If that’s the case, you’re not going to achieve success until you add your actual public IP address as a firewall rule. Thank you SQL Server MVP Denny Cherry for pointing that out to me. How do you know if the IP address you were shown is your actual public IP address? There are several websites that will show you what your actual, public IP address is. One such site is found here.

    If you discover that your public IP address is different (my public IP address at work differs in the third octet), you’ll need to click your server’s name to configure it.


    Figure 6. Click on your server’s name to see the list of options available.

    On your server’s page, click on CONFIGURE to add an IP address to your firewall rules.


    Figure 7. Click CONFIGURE to add an IP address.

    You can enter additional IP addresses on your server’s CONFIGURE page. Click SAVE after making changes.


    Figure 8. Adding an IP address. Remember to click SAVE.
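    If you prefer scripting, firewall rules can also be managed with T-SQL by connecting to the master database of your Azure SQL Database server. This is a sketch; the rule name and IP address below are examples, not values from the figures:

    -- run against the master database of the Azure SQL Database server
    EXEC sp_set_firewall_rule
        @name = N'OfficePublicIP',
        @start_ip_address = '203.0.113.42',
        @end_ip_address = '203.0.113.42';

    -- list the current server-level firewall rules
    SELECT * FROM sys.firewall_rules;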

    You can create a database from within the Azure portal. It's the simplest way and avoids surprises. If you choose to deploy an existing database into Azure as a SQL Database, be aware that not all SQL Server data types are implemented in Azure. Notice that within Azure, we have SQL databases, not SQL Server databases. Native Azure database servers and databases have a subset of what is available in SQL Server. There are other differences. In particular, to migrate a SQL Server database to Azure SQL Database, every table must have a clustered index. If you have a small, unindexed table you want to migrate, you'll need to add a clustered index first.
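    For example, a small heap can be given a clustered primary key before migration, which satisfies the clustered index requirement. The table and column names below are hypothetical:

    -- hypothetical heap table that would fail to migrate as-is
    CREATE TABLE dbo.StatusCode (
         StatusCodeID INT NOT NULL
        ,Description  VARCHAR(50)
    );

    -- a clustered primary key satisfies the clustered index requirement
    ALTER TABLE dbo.StatusCode
        ADD CONSTRAINT PK_StatusCode PRIMARY KEY CLUSTERED (StatusCodeID);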

  • Migrating Open Source Databases to SQL Server

    In my recent migration of a PostgreSQL database to SQL Server, I made several generic observations that I hope you will find helpful. I used the official PostgreSQL drivers found here. They did work and I was eventually able to use the Import and Export Wizard to load the data into SQL Server. Drivers that work with the Import and Export Wizard will also work with SSIS.

    If you are dealing with large volumes of data, you may need to use the 64-bit version of the Import and Export Wizard instead of the default 32-bit version. It’s easy to find either version by using the Windows search feature. Three tables did not load because they were too large and the 32-bit version of the Import and Export Wizard ran out of memory. The 64-bit version of the Import/Export Wizard was able to load all of the tables, including the one with 182 million rows.


    Figure 1. Windows 8 search feature showing both versions of the Import/Export Wizard.

    The bitness of the drivers must match the bitness of the tool. I installed both 32-bit and 64-bit versions of the PostgreSQL ODBC drivers. Once again, the Windows search feature comes in handy for selecting the correct version of the ODBC Data Source Administrator.


    Figure 2. Windows 8 search feature showing both versions of the ODBC Data Source Administrator.

    I was not successful in connecting to the PostgreSQL database when I used the server name. Since this was a one-time migration, I took the expedient approach of using the server’s IP address instead of the server name. To obtain the IP address of a Windows server, open a command prompt, enter ipconfig, and press Enter. For a Linux server, open a terminal, enter ifconfig, and press Enter.


    Figure 3. Connecting to the remote database by IP address.

    Although basic connectivity was established, the default option for Specify Table Copy or Query was not exactly what was needed for success. The table names in the PostgreSQL database were all enumerated, but they were in the format TableName instead of a SchemaName.TableName format. It was necessary to select the Write a query to specify the data to transfer option so that the SchemaName.TableName format could be specified manually.
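    The query entered in the wizard is simply a schema-qualified SELECT against the source database; the schema and table names here are hypothetical:

    -- schema-qualified query entered in the wizard (names are hypothetical)
    SELECT * FROM myschema.mytable;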


    Figure 4. Specify Table Copy or Query.

    Success was still elusive after this change. None of the tables could be created on the destination data source, which was SQL Server 2014.


    Figure 5. Improper schema mappings.

    I found a fix to the schema mappings problems in this blog post on MSDN. Notice that the problem database in the linked page was a Pervasive database. That reinforces my original point that there are some generic issues to consider when migrating data.

    You may want to consider purchasing a third-party data migration utility. If you have recurring, mission-critical data migration needs, a third-party vendor can provide support. This is a particularly important consideration when your source or target database is open source, such as PostgreSQL. Some vendors write their own data access code, which gives them control (good for you, because they can fix problems should any arise) and may also give you better performance. I exchanged emails with Damir Bulic at http://www.spectralcore.com/, who was very helpful in informing me about data migration in general. I particularly appreciate his pointing out that although the tables were indexed and some had primary keys, the largest tables did not have a primary key. Next time, I’ll check for primary keys more carefully before starting a migration. As Damir correctly pointed out, a primary key allows keyed access to a table so that the entire table doesn’t have to be loaded into memory. His product takes advantage of primary keys so that data can be read in chunks, avoiding out-of-memory issues. If you have a 32-bit system, that could be very important. You may want to add a primary key to a large table before attempting a migration.

    I also exported all of the tables into CSV files. I knew that my 182 million row table was too big even for the 64-bit version of Excel on a machine with 32 GB of RAM, but I wanted to see what would happen. Excel gracefully informed me that it could not load the entire file. As you can see, Excel has a limit of 1,048,576 rows.


    Figure 6. Row limit in Excel.

  • SQL Server Backup to Azure


    Because so many people have asked me for specific details on backing up a SQL Server database to Azure, this post shows the necessary steps to accomplish that. The database name is MIMIC II, which is a clinical database that is available to researchers. It will be used in subsequent posts on data migration and machine learning. The original copy of MIMIC II is in a PostgreSQL database in an Ubuntu virtual machine. I used the SQL Server Import/Export Wizard to import the data into SQL Server, which I find more convenient to work with. When I completed the migration to SQL Server, of course I wanted a backup. Azure was my first choice. By storing it in Azure, I know it is both secure and accessible to me should I ever need it.

    SQL Server 2014 is shown in the screen captures below. SQL Server versions as far back as SQL Server 2005 can be easily backed up to Azure by using the free Microsoft SQL Server Backup to Microsoft Azure Tool.

    The first step is to create a storage account in Azure. On the left hand side of the page, select STORAGE. Next click either + NEW or CREATE A STORAGE ACCOUNT.


    Figure 1. Pick either of the options to create a storage account.

    Specify the first part of the URL that will be used to access the storage account.


    Figure 2. Entering mimic2 for the URL makes the entire URL mimic2.core.windows.net


    Figure 3. Click on MANAGE ACCESS KEYS.


    Figure 4. You must save the access keys. They are your passwords to the storage account.

    You’ll need to create a security credential to use in SSMS. Here is what your key maps to in T-SQL:


    Figure 5. The storage account name maps to IDENTITY. The primary access key maps to SECRET.
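    A minimal sketch of that T-SQL, with a placeholder for the access key (the credential name is an example):

    -- IDENTITY is the storage account name; SECRET is the primary access key
    CREATE CREDENTIAL AzureBackupCredential
    WITH IDENTITY = 'mimic2',
         SECRET   = '<primary access key from MANAGE ACCESS KEYS>';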


    Figure 6. You can use the SSMS GUI to create the security credential instead of T-SQL. Notice that the primary access key from Azure maps to Password in the New Credential box.

    Now that you have a storage account, the next step is to create a container. Click on CONTAINERS to change the view.


    Figure 7. Click on CONTAINERS to create a container for your backup.

    To create the container, click either + ADD or CREATE A CONTAINER.


    Figure 8. Pick either of the options to create a container within your storage account.


    Figure 9. Enter a name for your new container.


    Figure 10. Your backups will reside in the blob container you’ve created.

    Refer back to Figure 5 or Figure 6 to create a security credential. You’ll have to do that before proceeding to the next step.

    Once you have a security credential created, you are ready to back up to Azure.


    Figure 11. In SQL Server 2014, backing up to Azure is accessed like any other backup method.

    To back up to Azure, select URL from the Back up to dropdown list and select your credential name from the SQL credential dropdown list.


    Figure 12. Select URL and your credential name from the dropdown lists.

    You probably want to select Backup Options so that you can select Compress backup from the dropdown list. A compressed backup will take less time to upload to Azure and it will occupy less storage space.


    Figure 13. Selecting Compress backup will reduce the upload time and Azure storage space.
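    The equivalent T-SQL for the backup shown in the figures looks something like this; the container and credential names are examples:

    -- back up to the blob container using the credential created earlier
    BACKUP DATABASE [MIMIC II]
    TO URL = 'https://mimic2.blob.core.windows.net/backups/MIMICII.bak'
    WITH CREDENTIAL = 'AzureBackupCredential',
         COMPRESSION,
         STATS = 10;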

    You can also connect to Azure directly from SSMS and browse your Azure storage.


    Figure 14. Select Azure Storage to connect to Azure to look at your Azure storage from within SSMS. The Account key is the primary access key in Azure, the SECRET in T-SQL, or the Password on the New Credential box in SSMS.


    Figure 15. View your Azure storage accounts, containers, and backups from within SSMS.


    Figure 16. You can also examine the contents of your container from within the Azure portal.

  • Antivirus and SQL Server

    Antivirus protection and SQL Server don’t mix well. Some good tips on getting antivirus software to play nicely with SQL Server can be found here. Take a look, you might find something you haven’t considered.

  • SQL Server Analytic Functions

    Last week I demonstrated SQL Server analytic functions at Houston TechFest. Below you will find an expanded set of code samples from my original content. As I explained in the presentation, I also used content from Itzik Ben-Gan’s excellent book from Microsoft Press, which I highly recommend. I also recommend Itzik’s columns at http://sqlmag.com. Joe Celko has written some very informative articles on analytic functions at https://www.simple-talk.com/.

    Run the code samples on a machine running SQL Server 2012 or higher. The code is also available as a file attachment.

    -- create a table that has a column for ordering the data
    CREATE TABLE #numbers (
         nbr    INT
        ,tempus DATE  -- used for ordering the data
    );

    -- insert some sample data
    INSERT INTO #numbers
    (tempus,nbr)
    VALUES
    ('2018-01-01',1)
    ,('2018-02-02',2)
    ,('2018-03-03',3)
    ,('2018-04-04',4)
    ,('2018-05-05',5)
    ,('2018-06-06',6)
    ,('2018-07-07',7)
    ,('2018-08-08',8)
    ,('2018-09-09',9)
    ;

    -- run an ordinary query ordering by the tempus columns
    SELECT nbr
    FROM #numbers
    ORDER BY tempus;

    -- show the nbr value in the current row and in the previous row
    -- the first row retrieved has a NULL for the previous nbr

    SELECT nbr
          ,LAG(nbr, 1) OVER (ORDER BY tempus) AS prevNbr

    FROM #numbers
    ORDER BY tempus;

    -- return the nbr value in the following row
    -- the last row retrieved has a NULL for the following nbr

    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr

    FROM #numbers
    ORDER BY tempus;

    -- return the first value
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus) AS firstNbr

    FROM #numbers
    ORDER BY tempus;

    -- return the last value
    -- notice how it is really the last value so far

    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus) AS lastNbr

    FROM #numbers
    ORDER BY tempus;

    -- modify code to always return the last value
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS lastNbr

    FROM #numbers
    ORDER BY tempus;

    -- this returns the same results as the previous query
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS lastNbr

    FROM #numbers
    ORDER BY tempus;

    -- notice that the frame used for lastNbr is not what is needed for firstNbr
    -- with this frame, FIRST_VALUE returns the current row's value
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS lastNbr

    FROM #numbers
    ORDER BY tempus;

    -- fix the previous query to always show the very first value
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS lastNbr

    FROM #numbers
    ORDER BY tempus;

    -- this frame for firstNbr returns the same results as the previous query
    SELECT nbr
          ,LAG (nbr, 1) OVER (ORDER BY tempus) AS prevNbr
          ,LEAD(nbr, 1) OVER (ORDER BY tempus) AS nextNbr
          ,FIRST_VALUE(nbr) OVER (ORDER BY tempus ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS firstNbr
          ,LAST_VALUE(nbr)  OVER (ORDER BY tempus ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS lastNbr

    FROM #numbers
    ORDER BY tempus;
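    The same window frame syntax also applies to aggregate functions. As a further example (an addition to the original demo code), a running total:

    -- a running total using the same windowing machinery
    SELECT nbr
          ,SUM(nbr) OVER (ORDER BY tempus ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS runningTotal
    FROM #numbers
    ORDER BY tempus;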

    DROP TABLE #numbers;

  • AdventureWorks2014 installation script

    The AdventureWorks2014 sample database is downloadable from here. If you choose to run the script and are unfamiliar with sqlcmd, this post shows you what to do. Be sure to extract the zip file’s contents to a new folder because the instawdb.sql script comes with 72 CSV files that contain the actual data. If you extract to a folder other than the default, you’ll have to change a path in the instawdb.sql script to point to your folder.


    Figure 1. instawdb.sql script file in SSMS. Notice the 3 lines that begin with a colon.

    It’s always prudent to read comments and examine script files before executing them. Notice that you are instructed to change the paths if your actual locations differ from what is in the script. Also notice that Full-Text Search should be enabled. The most important instruction in the comments is that the script must be run in SQLCMD Mode, which is available in SSMS but not enabled by default.
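    The colon-prefixed lines are SQLCMD directives that set the data and log paths. They look something like the following; variable names and paths vary by script version, so treat these as illustrative:

    :setvar SqlSamplesDatabasePath "C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\"
    :setvar SqlSamplesSourceDataPath "C:\Samples\AdventureWorks2014\"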


    Figure 2. Enable SQLCMD Mode from the Query menu.

    Once SQLCMD Mode is enabled, the SQLCMD statements are highlighted.


    Figure 3. SQLCMD enabled and SQLCMD statements highlighted.

    It took about a minute and a half for the script to execute on my machine, so be patient. You might want to turn off SQLCMD Mode after you’re done installing AdventureWorks2014 to restore SSMS to the state you’re more familiar with.

  • Surface Pro 3 first impressions

    I traded in my Surface 2 (the trade-in program is now over) and bought a Surface Pro 3 with an i7 processor and 8 GB of RAM. I greatly prefer the 3:2 aspect ratio of the Surface Pro 3. After only one day of ownership, I’ve decided to purchase a docking station. I have a seven-year-old desktop with a quad-core Q6600 processor overclocked to 3.0 GHz and 8 GB of RAM. It has a Plextor 512 GB SSD as the primary drive. It’s a very capable machine, but it does have a little bit, and I do mean only a little bit, of fan noise. I’d like to eliminate even that small amount of noise.

    My subjective initial impression was that my new Surface Pro 3 was significantly outperforming my desktop. Time to use PerformanceTest to get the facts. The Surface Pro 3 outperforms my desktop significantly on every measure except one of the 2D graphics tests. The desktop was slightly better drawing 2D vectors at 18.7k vectors/sec compared to the Surface’s 15.7k vectors/sec.


    Figure 1. PerformanceTest 8.0 test results.

    I have 6 TB of storage on my desktop. The desktop could be moved to the closet where my router and switch are. With the docking station, the Surface can use my gigabit network to access the files on the desktop. My desktop cannot run the latest version of Hyper-V, which is most inconvenient. Installing Visual Studio 2013 installs Hyper-V. As a reader points out below, this alters how Windows runs on the device.

    What I’ve found is that I don’t want to go back to non-touch devices. I use both touch and a Bluetooth mouse with my Surface Pro 3, as I did with my previous Surface 2. Having said that, I use the mouse with SSMS; I’m a little afraid of using touch when doing things to a database. If you’re going to use the Surface Pro 3 for development or for doing demos of SQL Server, I recommend that you get 8 GB of RAM. Both i7 models have 8 GB of RAM. Only the higher i5 model with 256 GB of storage comes with 8 GB of RAM.

    UPDATE: After taking the Surface to work today, I concluded that I want a lunch bag with a padded slot to hold the Surface. That way I can carry everything in one convenient bag. To the entrepreneur who creates such a lunch bag, send me one and we’ll call it even, okay?

  • Unsupported way to secure unsupported XP

    The free software tool nLite can be used to customize an XP installation. It isn’t supported by Microsoft, but neither is XP. It is possible to remove Internet Explorer or networking components from an installation. Sometimes there is no need for the XP machine to ever connect to a network. This is particularly true when XP is used to host software used to run a piece of specialized industrial or medical equipment that requires vendor software that runs only on XP. Or maybe your application needs to connect to a network printer but not surf the web.

    I have used nLite and was pleased with the results. When I used it to customize XP builds for restricted use, I would remove the games. This reduces the temptation for users to use the XP machine for anything other than its intended purpose.

    I recommend that people move off of XP, but I understand that isn’t feasible for everybody. For those who continue to use XP, nLite might provide a means to reduce the attack surface of an XP installation. Using an unsupported tool to remove operating system components of course introduces a new risk: you might remove something that is needed. Test thoroughly and retest.

  • Free Azure HDInsight ebook from Microsoft

    Microsoft has released a free ebook on Azure HDInsight. Full details are available here. HDInsight is Microsoft’s implementation of Hadoop. Azure HDInsight is Hadoop in the cloud. The ebook gives you a quick overview of what Big Data is and what you can do with it.

  • Windows 8.1 Boot into Desktop Mode

    If you primarily work in desktop mode, you can easily reconfigure Windows 8.1 to start in desktop mode when booting up. Once in desktop mode, right-click the Taskbar and select Properties to change the Navigation settings on your machine.


    Figure 1. Right-click on the Taskbar and select Properties.

    On the Navigation tab, check When I sign in or close all apps on a screen, go to the desktop instead of Start to enable booting into desktop mode.


    Figure 2. Check Show my desktop background on Start, Show the Apps view automatically when I go to Start, and List desktop apps first in the Apps view when it’s sorted by Category.

    There are other things you might want to enable. If you check Show the Apps view automatically when I go to Start, you’ll see your applications instead of the tiled view when you switch back to the Modern UI. Since I prefer the category view, I also checked List desktop apps first in the Apps view when it’s sorted by Category. Notice that the desktop background appears because Show my desktop background on Start is enabled.


    Figure 3. The Modern UI with Show my desktop background on Start, Show the Apps view automatically when I go to Start, and List desktop apps first in the Apps view when it’s sorted by Category all checked.

  • Windows Not Sleeping All Night

    Having a computer wake up when you don’t want it to wastes electricity and drains the battery on mobile devices. My desktop had been waking up at night, so I assumed it was some network traffic on my home network. I unchecked Allow this device to wake the computer on my network adapters.


    Figure 1. Network adapter Power Management tab.

    That didn’t solve the problem. I included the screen capture in Figure 1 because it could be part of the solution for someone else.

    To identify the root cause instead of guessing, the System log was examined. As you can see, pressing the sleep button was putting the machine to sleep.


    Figure 2. Timestamp of when the computer was put to sleep.

    To figure out what was waking the machine up, the System log was checked to review all events following the sleep event.


    Figure 3. System log showing what woke the computer up.

    This is what requested the computer to wake up:

    Wake Source: Timer - Windows will execute 'NT TASK\Microsoft\Windows\TaskScheduler\Regular Maintenance' scheduled task that requested waking the computer.

    A Bing search provided more information: http://support.microsoft.com/kb/2799178

    To change this behavior, go to the Action Center and click on Change maintenance settings.


    Figure 4. Automatic Maintenance section in the Action Center.

    There are two options for modifying the Automatic Maintenance schedule: you can change the time, or you can prevent automatic maintenance from waking your device.


    Figure 5. Options for controlling maintenance events waking the computer.

  • Web-based data generator

    One of my coworkers told me about Mockaroo, a web-based data generator. I needed some test data for upcoming blog posts, so I decided to give it a try. It’s pretty good. I had to use Firefox because of problems running Mockaroo in Internet Explorer 11. Using the defaults except for changing the format to SQL, it generated output that looked something like the following. Mockaroo is so good that it generates fake data that could accidentally be real, such as email addresses. Consequently, I edited the values shown below so that they are, I hope, actually fake.

    create table MOCK_DATA (
        id INT,
        first_name VARCHAR(50),
        last_name VARCHAR(50),
        email VARCHAR(50),
        country VARCHAR(50),
        ip_address VARCHAR(20)
    );
    insert into MOCK_DATA (id, first_name, last_name, email, country, ip_address) values (1, 'John', 'Deaux', 'jdeaux@fakeDomain.fakeDomain', 'Made', '255.255.255.255');

    It has lots of options, but some edits to the generated output may be necessary for Unicode compatibility in SQL Server. Look at the following output (which my always helpful friend Greg Low says is politically correct):

    create table TEST_DATA (
        chinese VARCHAR(50)
    );
    insert into TEST_DATA (chinese) values ('空调');

    That’s not what is needed for SQL Server to properly handle Unicode characters. The preceding example needs to be changed to the following:

    create table TEST_DATA (
        chinese NVARCHAR(50)
    );
    insert into TEST_DATA (chinese) values (N'空调');

    Mockaroo is extensible because it can use regular expressions to generate data.

  • TechEd 2014 Day 4

    Many people visiting the SQL Server booth wanted to know how to improve performance. With so much attention being given to COLUMNSTORE and in-memory tables and stored procedures, it is easy to overlook how important tempdb is to performance. Speeding up tempdb I/O improves performance. The best way to do this is to not do the I/O in the first place. With SQL Server 2014, tempdb page management is smarter. Pages are more likely to be released before being unnecessarily flushed to disk. Read more about it here.

    Of course, not all tempdb I/O can be eliminated. Performance can be improved by moving database files to flash memory storage. To maximize the performance benefits of flash storage, buffer pool extensions should be enabled.
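    Enabling a buffer pool extension on flash storage takes a single statement; the file path and size below are examples:

    -- SIZE is typically a multiple of max server memory
    ALTER SERVER CONFIGURATION
    SET BUFFER POOL EXTENSION ON
        (FILENAME = 'F:\SSDCACHE\Example.BPE', SIZE = 32 GB);

    -- verify the configuration
    SELECT * FROM sys.dm_os_buffer_pool_extension_configuration;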

  • TechEd 2014 Day 3

    There is some confusion about the durability of data stored in SQL Server in-memory tables, so some review of the concepts is appropriate. The in-memory option is enabled at the database level. Enabling it there only gives you the option to specify the in-memory feature on a table-by-table basis. No existing or new tables become in-memory tables by default when you enable the feature at the database level.

    If you choose to make a table an in-memory table, by default it is durable with changes being recorded in the transaction log. You do not have to worry about data loss. However, you have an additional option of making an in-memory table not durable. If you actively choose to do this, you will have data loss if, for example, the server crashes. There are legitimate use cases of choosing to override the default behavior and create in-memory tables that are not durable. If you need staging tables for ETL, non-durable in-memory tables will provide high performance. If you need temporary tables for a particular processing need, non-durable in-memory tables can outperform tables in tempdb.
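    A sketch of the two durability options; the table names are hypothetical, and the database must already have a memory-optimized filegroup:

    -- fully durable (the default): data and schema survive a restart
    CREATE TABLE dbo.DurableStage (
        ID INT NOT NULL PRIMARY KEY NONCLUSTERED,
        Payload VARCHAR(100)
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

    -- non-durable: only the schema survives a restart; data is lost
    CREATE TABLE dbo.VolatileStage (
        ID INT NOT NULL PRIMARY KEY NONCLUSTERED,
        Payload VARCHAR(100)
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);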

