
Stacia Misner

  • Working with Reporting Services Filters–Part 5: OR Logic

    When you combine multiple filters, Reporting Services uses AND logic. Once upon a time, there was actually a drop-down list for selecting AND or OR between filters, which confused people because it was often grayed out. That selection is gone now, but no matter. It wouldn’t help us solve the problem that I want to describe today.

    As with many problems, Reporting Services gives us more than one way to apply OR logic in a filter. If I want a filter to include this value OR that value for the same field, one approach is to set up the filter to use the IN operator, as I explained in Part 1 of this series. But what if I want to base the filter on two different fields? I need a different solution.

    Using the AdventureWorksDW2008R2 database, I have a report that lists product sales:

    [Screenshot: product sales report]

    Let’s say that I want to filter this report to show only products that are Bikes (a category) OR products for which sales were greater than $1,000 in a year.

    If I set up the filter like this:

    Expression    | Data Type | Operator | Value
    [Category]    | Text      | =        | Bikes
    [SalesAmount] |           | >        | 1000

    Then AND logic is used which means that both conditions must be true. That’s not the result I want.

    Instead, I need to set up the filter like this:

    Expression                                                                            | Data Type | Operator | Value
    =Fields!EnglishProductCategoryName.Value = "Bikes" OR Fields!SalesAmount.Value > 1000 | Boolean   | =        | =True

    The OR logic needs to be part of the expression so that it can return a Boolean value that we test against the Value. Notice that I have used =True rather than True for the value. The filtered report appears below. Any non-bike product appears only if the total sales exceed $1,000, whereas Bikes appear regardless of sales. (You can’t see it in this screenshot, but Mountain-400-W Silver, 38 has sales of $923 in 2007 but gets included because it is in the Bikes category.)

    [Screenshot: filtered report]

  • The Case of the Missing Date/Time Stamp: Reporting Services 2008 R2 Snapshots

    This week I stumbled upon an undocumented “feature” in SQL Server 2008 R2 Reporting Services as I was preparing a demonstration on how to set up and use report snapshots. If you’re familiar with the main changes in this latest release of Reporting Services, you probably already know that Report Manager got a facelift this time around. Although this facelift was generally a good thing, one of the casualties – in my opinion – is the loss of the snapshot label that served two purposes. First, it flagged the report as a snapshot. Second, it let you know when that snapshot was created.

    As part of my standard operating procedure when demonstrating report snapshots, I point out this label, so I was rather taken aback when I didn’t see it in the demonstration I was preparing. It sort of upset my routine, and I’m rather partial to my routines. I thought perhaps I wasn’t looking in the right place and changed Report Manager from Tile View to Details View, but no – that label was still missing. In the grand scheme of life, it’s not an earth-shattering change, but you’ll have to look at the Modified Date in Details View to know when the snapshot was run. Or hope that the report developer included a textbox to show the execution time in the report. (Hint: this is a good time to add this to your list of report development best practices, whether a report gets set up as a report snapshot or not!)

    A snapshot from the past

    In case you don’t remember how a snapshot appeared in Report Manager back in the old days (of SQL Server 2008 and earlier), here’s an image I snagged from my Reporting Services 2008 Step by Step manuscript:

    [Screenshot: snapshot label in Report Manager, SQL Server 2008]

    A snapshot in the present

    A report server running in SharePoint integrated mode had no such label. There you had to rely on the Report Modified date-time stamp to know the snapshot execution time. So I guess all platforms are now consistent.

    Here’s a screenshot of Report Manager in the 2008 R2 version. One of these is a snapshot and the rest execute on demand. Can you tell which is the snapshot?

    [Screenshot: Report Manager in SQL Server 2008 R2]

    Consider descriptions as an alternative

    So my report snapshot demonstration has one less step, and I’ll need to edit the Denali version of the Step by Step book. Things are simpler this way, but I sure wish we had an easier way to identify the execution methods of the reports. Consider using the description field to alert users that the report is a snapshot. It might save you a few questions about why the data isn’t up-to-date if the users know that something changed in the source of the report. Notice that the full description doesn’t display in Tile View, so keep it short and sweet or instruct users to open Details View to see the entire description.

    [Screenshot: report descriptions in Tile View]

  • Working with Reporting Services Filters - Part 4: Creating a NOT IN Filter

    A question came in from a reader asking how to create a NOT IN filter. As you can see in the first post of this series on Reporting Services filters, NOT IN is not available in the list of filter operators. However, it's still possible to produce a filter that excludes items in a list. The technique is not particularly intuitive, which is why I've added this filter type to my series on filters.

    Preparing the Report

    Let's start with a simple scenario using the AdventureWorksDW2008R2 database. I set up a query that returns reseller sales by category. In my report, I have two tables: one for Bikes and Components, and the other for everything else. I want to create a filter for the first table using the IN operator, but I'll have to create an expression for the second table to achieve the NOT IN effect.

    To hold the list of values for my IN filter, I create a hidden report parameter with multi-values, without available values, and with a list of default values that includes Bikes and Components, like this:

    [Screenshots: hidden multi-value parameter with default values Bikes and Components]

    Of course, I don't need to manually populate the default values. I could use a query instead, but I think you get the idea.

    Using the IN Operator

    This step is not really required to set up the NOT IN filter, but I have included it as another example of working with the IN operator to supplement my first post in this series. In my report, I want the first table to include the values that are defined in the parameter, so I add a filter to the tablix like this:

    Expression | Data Type | Operator | Value
    [Category] | Text      | In       | =Parameters!FilterList.Value

     

    You can also use [@FilterList] as the value if you want to type a simple expression into the Value field. If you use the Expression Editor to set up the Value expression and double-click FilterList in the Parameters category, be sure to remove the (0) from the end of the expression so that the IN operator compares Category to all values in the parameter array, not just the first value.
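
    To illustrate the difference, assuming the FilterList parameter defined above:

    =Parameters!FilterList.Value(0)  (returns only the first value in the array - Bikes in this example)
    =Parameters!FilterList.Value     (returns the entire array, which is what the IN operator requires)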

    Creating a NOT IN Filter

    For the second table, I also add a filter, but I set it up differently. There is no such thing as a NOT IN operator, so I need to get creative. I need to come up with an expression that evaluates as True or False, and then set that up as my Value in the filter definition. Then I’ll set the Expression of the filter to True and use an = operator. The filter will keep rows where Value is True and exclude rows which cause Value to be False.

    Expression | Data Type | Operator | Value
    =True      | Boolean   | =        | =Iif(InStr(Join(Parameters!FilterList.Value,","), Fields!Category.Value)=0,True,False)

     

    For Expression, note that I have =True and not just True. This is important because True by itself will be interpreted as a string instead of a Boolean data type.

    I use the InStr function in the value to compare the current row’s Category to the FilterList, which I convert from an array to a comma-delimited string by using the Join function. The InStr function returns 0 if the category is not found, which is the equivalent of NOT IN. Thus, I have the expression return True, which keeps the row in the second table.
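
    To make that concrete, suppose FilterList contains its two default values. Join(Parameters!FilterList.Value,",") produces the string "Bikes,Components". For a row where Category is "Accessories", InStr("Bikes,Components", "Accessories") returns 0, so the Iif returns True and the row stays in the second table. For a row where Category is "Bikes", InStr returns 1 (the position of the match), the Iif returns False, and the filter excludes the row.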

    Checking the Results

    Here’s the final report:

    [Screenshot: final report with both tables]

    If you’d like to have a look at how I set this up, you can download the RDL.

    Have you found another way to solve NOT IN? Let me know!

  • I Hereby Resolve… (T-SQL Tuesday #14)

    It’s time for another T-SQL Tuesday, hosted this month by Jen McCown (blog | twitter), on the topic of resolutions. Specifically, “what techie resolutions have you been pondering, and why?” I like that word – pondering – because I ponder a lot. And while there are many things that I do already because of my job, there are many more things that I ponder about doing…if only I had the time. Then I ponder about making time, but then it’s back to work! In 2010, I was moderately more successful in making time for things that I ponder about than I had been in years past, and I hope to continue that trend in 2011. If Jen hadn’t settled on this topic, I could keep my ponderings to myself and no one would ever know the outcome, but she’s egged me on (and everyone else who chooses to participate)! So here goes…

    For me, having resolve to do something means that I wouldn’t be doing that something as part of my ordinary routine. It takes extra effort to make time for it. It’s not something that I do once and check off a list, but something that I need to commit to over a period of time. So with that in mind, I hereby resolve…

    To Learn Something New…

    One of the things I love about my job is that I get to do a lot of things outside of my ordinary routine. It’s a veritable smorgasbord of opportunity! So what more could I possibly add to that list of things to do? Well, the more I learn, the more I realize I have so much more to learn. It would be much easier to remain in ignorant bliss, but I was born to learn. Constantly. (And apparently to teach, too – my father will tell you that as a small child, I had the neighborhood kids gathered together to play school – in the summer. I’m sure they loved that – but they did it!) These are some of the things that I want to dedicate some time to learning this year:

    • Spatial data. I have a good understanding of how maps in Reporting Services work, and I can cobble together a simple T-SQL spatial query, but I know I’m only scratching the surface here. Rob Farley (blog | twitter) posted interesting examples of combining maps and PivotViewer, and I think there are many more creative possibilities. I’ve always felt that pictures (including charts and maps) really help people get their minds wrapped around data better, and because a lot of data has a geographic aspect to it, I believe developing some expertise here will be beneficial to my work.
    • PivotViewer. Not only is PivotViewer combined with maps a useful way to visualize data, but it’s an interesting way to work with data. If you haven’t seen it yet, check out this interactive demonstration using the Netflix OData feed. According to Rob Farley, learning how to work with PivotViewer isn’t trivial. Just the type of challenge I like!
    • Security. You’ve heard of the accidental DBA? Well, I am the accidental security person – is there a word for that role? My eyes used to glaze over when having to study security, or when reading anything about it. Then I had a problem long ago that no one could figure out – not even the vendor’s tech support – until I rolled up my sleeves and painstakingly worked through the myriad of potential problems to resolve a very thorny security issue. I learned a lot in the process, and have been able to share what I’ve learned with a lot of people. But I’m not convinced their eyes weren’t glazing over, too. I don’t take it personally – it’s just a very dry topic! So in addition to deepening my understanding of security, I want to find a way to make the subject as it relates to SQL Server and business intelligence more accessible and less boring.

    Well, there’s actually a lot more that I could put on this list, and a lot more things I have plans to do this coming year, but I run the risk of overcommitting myself. And then I wouldn’t have time…

    To Have Fun!

    My name is Stacia and I’m a workaholic. When I love what I do, it’s difficult to separate out the work time from the fun time. But there are some things that I’ve been meaning to do that aren’t related to business intelligence for which I really need to develop some resolve. And they are techie resolutions, too, in a roundabout sort of way!

    • Photography. When my husband and I went on an extended camping trip in 2009 to Yellowstone and the Grand Tetons, I had a nice little digital camera that took decent pictures. But then I saw the gorgeous cameras that other tourists were toting around and decided I needed one too. So I bought a Nikon D90 and have started to learn to use it, but I’m definitely still in the beginning stages. I traveled so much in 2010 and worked on two book projects, so I didn’t have a lot of free time to devote to it. I was very inspired by Kimberly Tripp’s (blog | twitter) and Paul Randal’s (blog | twitter) photo-adventure in Alaska, though, and plan to spend some dedicated time with my camera this year. (And hopefully before I move to Alaska – nothing set in stone yet, but we hope to move to a remote location – with Internet access – later this year!)
    • Astronomy. I have this cool telescope, but it suffers the same fate as my camera. I have been gone too much and so busy with other things that I haven’t had time to work with it. I’ll figure out how it works, and then so much time passes that I forget how to use it. I have this crazy idea that I can actually put the camera and the telescope together for astrophotography, but I think I need to start simple by learning how to use each component individually. As long as I’m living in Las Vegas, I know I’ll have clear skies for nighttime viewing, but when we move to Alaska, we’ll be living in a rain forest. I have no idea what my opportunities will be like there – except I know that when the sky is clear, it will be far more amazing than anything I can see in Vegas – even out in the desert – because I’ll be so far away from city light pollution.

    I’ve been contemplating putting together a blog on these topics as I learn. As many of my fellow bloggers in the SQL Server community know, sometimes the best way to learn something is to sit down and write about it. I’m just stumped by coming up with a clever name for the new blog, which I was thinking about inaugurating with my move to Alaska. Except that I don’t know when that will be exactly, so we’ll just have to wait and see which comes first!

  • Working with Reporting Services Filters – Part 3: The TOP and BOTTOM Operators

    Thus far in this series, I have described using the IN operator and the LIKE operator. Today, I’ll continue the series by reviewing the TOP and BOTTOM operators.

    Today, I happened to be working on an example of using the TOP N operator and was not successful on my first try because the behavior is just a bit different than we find when using an “equals” comparison as I described in my first post in this series. In my example, I wanted to display a list of the top 5 resellers in the United States for AdventureWorks, but I wanted it based on a filter. I started with a hard-coded filter like this:

    Expression            | Data Type | Operator | Value
    [ResellerSalesAmount] | Float     | Top N    | 5

    And received the following error:

    A filter value in the filter for tablix 'Tablix1' specifies a data type that is not supported by the 'TopN' operator. Verify that the data type for each filter value is Integer.

    Well, that puzzled me. Did I really have to convert ResellerSalesAmount to an integer to use the Top N operator?

    Just for kicks, I switched to the Top % operator like this:

    Expression            | Data Type | Operator | Value
    [ResellerSalesAmount] | Float     | Top %    | 50

    This time, I got exactly the results I expected – I had a total of 10 records in my dataset results, so 50% of that should yield 5 rows in my tablix.

    [Screenshot: Top % filter results - 5 of 10 rows]

    So thinking about the problem with Top N some more, I switched the Value to an expression, like this:

    Expression            | Data Type | Operator | Value
    [ResellerSalesAmount] | Float     | Top N    | =5

    And it worked!

    So the value for Top N or Top % must reflect a number to plug into the calculation, such as Top 5 or Top 50%, and the expression is the basis for determining what’s in that group. In other words, Reporting Services will sort the rows by the expression – ResellerSalesAmount in this case – in descending order, and then filter out everything except the topmost rows based on the operator you specify.

    [Screenshot: Top N filter results - top 5 resellers]

    The curious thing is that, if you’re going to hard-code the value, you must enter the value for Top N with an equal sign in front of the integer, but you can omit the equal sign when entering a hard-coded value for Top %. This experience is why working with Reporting Services filters is not always intuitive!

    When you use a report parameter to set the value, you won’t have this problem. Just be sure that the data type of the report parameter is set to Integer. Jessica Moss has an example of using a Top N filter in a tablix which you can view here.

    Working with Bottom N and Bottom % works similarly. You just provide a number for N or for the percentage and Reporting Services works from the bottom up to determine which rows are kept and which are excluded.
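
    For example, a filter to keep the five lowest-selling resellers would look like this (presumably with the same equal-sign quirk for a hard-coded Bottom N value that Top N has):

    Expression            | Data Type | Operator | Value
    [ResellerSalesAmount] | Float     | Bottom N | =5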

  • Working with Reporting Services Filters – Part 2: The LIKE Operator

    In the first post of this series, I introduced the use of filters within the report rather than in the query. I included a list of filter operators, and then focused on the use of the IN operator. As I mentioned in the previous post, the use of some of these operators is not obvious, so I'm going to spend some time explaining them as well as describing ways that you can use report filters in Reporting Services in this series of blog posts.

    Now let's look at the LIKE operator. If you write T-SQL queries, you've undoubtedly used the LIKE operator to produce a query using the % symbol as a wildcard for multiple characters like this:

    select * from DimProduct where EnglishProductName like '%Silver%'

    [Screenshot: query results]

    And you know that you can use the _ symbol as a wildcard for a single character like this:

    select * from DimProduct
    where EnglishProductName like '_L Mountain Frame - Black, 4_'

    [Screenshot: query results]

    So when you encounter the LIKE operator in a Reporting Services filter, you probably expect it to work the same way. But it doesn't. You use the * symbol as a wildcard for multiple characters as shown here:

    Expression           | Data Type | Operator | Value
    [EnglishProductName] | Text      | Like     | *Silver*

    Note that you don’t have to include quotes around the string that you use for comparison.

    Books Online has an example of using the % symbol as a wildcard for a single character, but I have not been able to successfully use this wildcard. If anyone has a working example, I’d love to see it!

    UPDATE: Thanks to loyal reader Pravin, I can confirm that the ? character is the wildcard for a single character.
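
    If that’s the case, the single-character T-SQL example above should translate to a filter like this (untested on my end, so treat it as a sketch):

    Expression           | Data Type | Operator | Value
    [EnglishProductName] | Text      | Like     | ?L Mountain Frame - Black, 4?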

  • Working with Reporting Services Filters–Part 1

    There are two ways that you can filter data in Reporting Services. The first way, which usually provides faster performance, is to use query parameters to apply a filter in the WHERE clause of a SQL statement. In that case, the structure of the filter depends on the syntax recognized by the source database. The other way is to apply a filter to a dataset, data region, or group within the report itself. Using this latter method, you can even apply multiple filters. However, the use of filter operators or the setup of multiple filters is not always obvious, so in this series of posts, I'll provide some more information about the configuration of filters.
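
    For comparison, the query-parameter approach embeds the filter in the dataset query itself. Here’s a minimal T-SQL sketch against the AdventureWorksDW2008R2 database (Reporting Services maps the @Color query parameter to a report parameter automatically):

    select EnglishProductName, Color, ListPrice
    from DimProduct
    where Color = @Color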

    First, why not use query parameters exclusively for filtering? Here are a few reasons:

    • You might want to apply a filter to part of the report, but not all of the report.
    • Your dataset might retrieve data from a stored procedure that doesn't allow you to pass a query parameter for filtering purposes.
    • Your report might be set up as a snapshot on the report server and, in that case, cannot be dynamically filtered based on a query parameter.

    Next, let's look at how to set up a report filter in general. The process is the same whether you are applying the filter to a dataset, data region, or a group. When you go to the Filters page in the Properties dialog box for whichever of these items you selected (dataset, data region, group), you click the Add button to create a new filter. The interface looks like this:

    [Screenshot: Filters page in the Properties dialog box]

    The Expression field is usually a field in the dataset, so to make it easier for you to make a selection, the drop-down list displays all of the current dataset fields. But notice the expression button to the right, which means that you can set up any type of expression, not just a dataset field. To the right of the expression button, you'll find a data type drop-down list. It's important to specify the correct data type for the field or expression you're using.

    Now for the operators. Here's a list of the options that you have:

    This Operator             | Performs This Action
    =, <>, >, >=, <, <=, Like | Compares expression to value
    Top N, Bottom N           | Compares expression to Top (Bottom) set of N values (N = integer)
    Top %, Bottom %           | Compares expression to Top (Bottom) N percent of values (N = integer or float)
    Between                   | Determines whether expression is between two values, inclusive
    In                        | Determines whether expression is found in list of values

    Last, the Value is what you're comparing to the expression using the operator. The construction of a filter using some operators (=, <>, >, etc.) is fairly simple. If my dataset (for AdventureWorks data) has a Category field, and I have a parameter that prompts the user for a single category, I can set up a filter like this:

    Expression | Data Type | Operator | Value
    [Category] | Text      | =        | [@Category]

    But if I set the parameter to accept multiple values, I need to change the operator from = to In, just as I would have to do if I were using a query parameter. The parameter expression, [@Category], which translates to =Parameters!Category.Value, doesn’t need to change because it represents an array as soon as I change the parameter to allow multiple values. The “In” operator requires an array.
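
    In other words, the multi-value version of the filter looks like this:

    Expression | Data Type | Operator | Value
    [Category] | Text      | In       | [@Category]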

    With that in mind, let’s consider a variation on Value. Let’s say that I have a parameter that prompts the user for a particular year – and for simplicity’s sake, this parameter only allows a single value, and I have an expression that evaluates the previous year based on the user’s selection. Then I want to use these two values in two separate filters with an OR condition. That is, I want to filter either by the year selected OR by the year that was computed. If I create two filters, one for each year (as shown below), then the report will only display results if BOTH filter conditions are met – which would never be true.

    Expression     | Data Type | Operator | Value
    [CalendarYear] | Integer   | =        | [@Year]
    [CalendarYear] | Integer   | =        | =Parameters!Year.Value-1

    To handle this scenario, we need to create a single filter that uses the “In” operator, and then set up the Value expression as an array. To create an array, we use the Split function after creating a string that concatenates the two values, as shown below.

    Expression                       | Data Type | Operator | Value
    =Cstr(Fields!CalendarYear.Value) | Text      | In       | =Split(CStr(Parameters!Year.Value) + "," + CStr(Parameters!Year.Value-1), ",")

    Note that in this case, I had to apply a string conversion on the year integer so that I could concatenate the parameter selection with the calculated year. Pay attention to the second argument of the Split function—you must use a comma delimiter for the result to work correctly with the In operator.
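
    For example, if the user selects 2008, the concatenation produces the string "2008,2007", and Split converts it into a two-element array containing "2008" and "2007" - exactly the kind of array the In operator expects.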

    I also had to change the filter’s Expression from [CalendarYear] (that is, =Fields!CalendarYear.Value) to =Cstr(Fields!CalendarYear.Value) so that the expression returns a string that I can compare with the values in the string array.

    More fun with filter expressions in future posts!

  • My Right-to-Left Foot (T-SQL Tuesday #13)

    As a business intelligence consultant, I often encounter the situation described in this month's T-SQL Tuesday, hosted by Steve Jones (blog | twitter) – “What the Business Says Is Not What the Business Wants.” Steve posed the question, “What issues have you had in interacting with the business to get your job done?”

    My profession requires me to have one foot firmly planted in the technology world and the other foot planted in the business world. I learned long ago that the business never says exactly what the business wants, because the business doesn't have the words to describe what it wants accurately enough for IT. Not only do barriers of technical know-how exist, but there are also linguistic barriers between the two worlds. So how do I cope?

    The adage "a picture is worth a thousand words" is particularly helpful when I'm called in to help design a new business intelligence solution. Many of my students in BI classes have heard me explain ("rant") about left-to-right versus right-to-left design. To understand what I mean about these two design options, let's start with a picture:

    [Diagram: data sources on the left, the data warehouse/BI solution in the middle, users on the right]

    When we design a business intelligence solution that includes some sort of traditional data warehouse or data mart design, we typically place the data sources on the left, the new solution in the middle, and the users on the right. When I've been called in to help course-correct a failing BI project, I often find that IT has taken a left-to-right approach. They look at the data sources, decide how to model the BI solution as a _______ (fill in the blank with data warehouse, data mart, cube, etc.), and then build the new data structures and supporting infrastructure. (Sometimes, they actually do this without ever having talked to the business first.) Then, when they show what they've built to the business, the business says that is not what we want. Uh-oh.

    I prefer to take a right-to-left approach, preferably at the beginning of a project. But even if the project starts left-to-right, I'll do my best to swing it around so that we’re back to a right-to-left approach. (When circumstances are beyond my control, I carry on, but it’s a painful project for everyone – not because of me, but because the approach just doesn’t get to what the business wants in the most effective way.) By using a right-to-left approach, I try to understand what it is the business is trying to accomplish. I do this by having them explain reports to me, along with the decision-making process that relates to those reports. Sometimes I have them explain their business processes to me, or better yet show me their business processes in action, because I need pictures, too. I (unofficially) call this part of the project "getting inside the business's head." This is starting at the right side of the diagram above.

    My next step is to start moving leftward. I do this by preparing some type of prototype. Depending on the nature of the project, this might mean that I simply mock up some data in a relational database and build a prototype report in Reporting Services. If I'm lucky, I might be able to use real data in a relational database. I'll either use a subset of the data in the prototype report by creating a prototype database to hold the sample data, or select data directly from the source. It all depends on how much data there is, how complex the queries are, and how fast I need to get the prototype completed.

    If the solution will include Analysis Services, then I'll build a prototype cube. Analysis Services makes it incredibly easy to prototype. You can sit down with the business, show them the prototype, and have a meaningful conversation about what the BI solution should look like. I know I've done a good job on the prototype when I get knocked out of my chair so that the business user can explore the solution further independently. (That's really happened to me!) We can talk about dimensions, hierarchies, levels, members, measures, and so on with something tangible to look at and without using those terms. It's not helpful to use sample data like Adventure Works or to use BI terms that they don't really understand. But when I show them their data using the BI technology and talk to them in their language, then they truly have a picture worth a thousand words. From that, we can fine tune the prototype to move it closer to what they want. They have a better idea of what they're getting, and I have a better idea of what to build.

    So right-to-left design is not truly moving from the right to the left. But it starts from the right and moves towards the middle, and once I know what the middle needs to look like, I can then build from the left to meet in the middle. And that’s how I get past what the business says to what the business wants.

  • Creating a Linked Server for Analysis Services

    I think I’ve finally recovered (in more ways than one!) from PASS Summit 2010 in Seattle a mere few weeks ago. I seem to be eternally plagued with computer problems at every conference I attend, but always in different ways, and accordingly I have devised different strategies for anticipating and coping with them. I foiled the latest disaster by having my slides and demos duplicated on Erika’s laptop, which I had to call into action literally seconds before the session began due to a hard drive failure. But I missed one little detail, which was actually a last-minute bonus addition to my session on Analysis Services Stored Procedures (ASSP). So it wasn’t the end of the world, but it was a forehead-slapping moment!

    As part of that session, I told a story of how I came to appreciate the use of ASSP for a client project in which I had to dynamically construct MDX queries on the fly using T-SQL stored procedures and pass the resulting query into an OPENQUERY function. The problem that I had in that situation was the limitation imposed by the OPENQUERY function – I had to write my MDX query in 8000 characters or less! Erika and I worked out all kinds of ways to live within that constraint before we tried ASSPs, but ultimately the complexity of these queries kept pushing the query string over the limit, and thus began our adventures with ASSP. I’ll add a post on this technique to my series on Using Dynamic MDX in Reporting Services soon. (The first post describes the use of StrToMember() and StrToSet(), and the second post shows how to use MDX in a dynamic OLE DB query.)

    I had a few extra minutes of time in my session, so I thought I’d show the audience “the rest of the story” by demonstrating how I could use the OPENQUERY function with the ASSP that I created earlier in the session. I had the OPENQUERY working on my laptop just a few minutes before the session started, but in preparing Erika’s laptop, I didn’t think about setting up a linked server to Analysis Services in advance. No worries – I’ll just do it live! Well, my memory failed me on how to fill out the New Linked Server dialog box in Management Studio, so I asked the audience if anyone knew what to do, and the reply I got was, “It’s a mystery!” (It took me a few moments to realize that’s what they said – I was suffering from the “SQL plague” and my ears were really plugged up!)

    Nothing a little search engine won’t resolve, right? Well, I turned up a few hits (like the Books Online entry Adding a Linked Server), but nothing was particularly helpful for interpreting which parameters are really needed and which are optional – especially when I was on the spot! So my post today is my contribution to resolving the mystery. Of course, now that I’m offstage and working on my replaced hard drive and with a much less congested head, I was able to set up a linked server quite quickly.

    To set up a linked server for Analysis Services, follow these steps:

    1. Connect to the Database Engine in Management Studio.
    2. Expand Server Objects, right-click Linked Servers, and click New Linked Server.
    3. Define a name for the linked server, and set the following options as shown in the screenshot below:
    • Provider: Microsoft OLE DB Provider for Analysis Services 10.0
    • Product name: MSOLAP.4
    • Data source: (the name of your server)
    • Catalog: (the name of the Analysis Services database)

    [Screenshot: New Linked Server dialog box]
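
    If you’d rather script the linked server than use the dialog box, sp_addlinkedserver can do the same job. Here’s a minimal sketch, assuming a local default instance of Analysis Services (the server and catalog names are placeholders - adjust @datasrc and @catalog for your environment):

    -- assumes a local default instance of Analysis Services
    exec master.dbo.sp_addlinkedserver
        @server = 'AdventureWorksOLAP',
        @srvproduct = 'MSOLAP.4',
        @provider = 'MSOLAP.4',
        @datasrc = 'localhost',
        @catalog = 'Adventure Works DW 2008R2'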

    To test the linked server, you can write a query like this:

    select * from openquery(AdventureWorksOLAP, 'select [Measures].[Sales Amount] on columns from [Adventure Works]')

    A couple of things to note: this query executes as a database engine query, not as an MDX query, and the first argument of OPENQUERY is the name of your linked server.

  • PASS Summit 2010: Epilogue

    Another PASS Summit has come and gone, and a good time was had by all. The song at the opening keynote sums up the experience quite nicely, "Simply the Best." My favorite quote of the week comes from Andy Leonard (blog | twitter), who tweeted on November 9, "#sqlpass is a family reunion. :{>" Oddly enough, Andy is one of the few people that I DIDN'T get to see last week, but not for lack of trying. Sorry, Andy!

    As an independent consultant, I don't get to interact regularly with my peers, so I love the reunion aspect of PASS. And now that I've started following people on Twitter, my universe of colleagues has expanded even more, and I was delighted to meet the people behind the avatars.

    The joy of PASS is not limited to those of us who have attended for many years. As I was sitting in the airport last Friday, I recognized an attendee who was a first-timer and asked about his experience. He replied that it exceeded expectations. He was so anxious to get back to work to put what he learned into practice. He felt that the value of the knowledge that he is bringing back to the office from PASS was so much greater than the price of the conference. Now that's a ringing endorsement if I ever heard one.

    There is so much more that could be said about PASS, and many in the community have already posted their thoughts on Twitter and on their blogs. There has been quite a stir in the business intelligence community this past week about the future of Analysis Services, about which I will comment in a future post. Today, however, I will focus on a few follow-up comments and links to resources related to activities in which I participated last week.

    Getting Started In Blogging And Technical Speaking

    Kendal Van Dyke (blog | twitter) presented a session with tips for getting started in blogging or technical speaking. He invited a panel of experts to join him, including myself, Rob Farley (not pictured below) (blog | twitter), Aaron Bertrand (blog | twitter), Buck Woody (blog | twitter), Todd McDermid (blog | twitter), Mike Walsh (blog | twitter), Thomas LaRock (blog | twitter), Ted Krueger (blog | twitter), Patrick LeBlanc (blog | twitter), Andy Warren (blog | twitter), and Brent Ozar (blog | twitter).

    (Photo courtesy of Brent Ozar) 

    Who had control of this very interactive session? You'll have to buy the PASS Summit DVD to find out!

    I waited patiently for my turn to speak. With a room full of speakers, getting your own turn can be a challenge! I heard a few attendees express concern that they didn't know what to write about, or that someone else had already covered the topic. My response to this concern is that everyone has a unique take on a topic, and that's why T-SQL Tuesday is such a great way both to learn and to contribute. T-SQL Tuesday was started by Adam Machanic (blog | twitter) and gives everyone with an opinion (and who doesn't have one?) or a specific experience a chance to add to the topic. You can see the first invitation to T-SQL Tuesday here to see how it all started, and the most recent T-SQL Tuesday posts hosted by Paul Randal (blog | twitter). The best way to keep tabs on who's hosting the next round is to follow the #TSQL2sDay hash tag on Twitter.

    Unfortunately, I didn't get to hang out for the entire presentation, because I had to move on to the…

    Women in Technology Panel

    I was honored to participate as a panelist for this year's Women in Technology luncheon. It was well attended, and I heard so many positive comments after the event from both men and women. I drew inspiration from my fellow panelists as well as the stories shared with me by other women attending PASS this year. You can view a recording of the event here if you're a registered member of PASS (which is free to join).

    It's so difficult to say what the right answer is for increasing the number of women in technology. The numbers are diminishing at a deplorable rate (as I discussed in a previous post). It seems to me that to foster change we need to start laying the groundwork with our children. By "our children", I mean society in general, not me specifically, although I have tried to do my part! On the one hand, I don't recommend forcing children into a career path that they can't embrace enthusiastically. On the other hand, I believe that one reason girls don't pursue technology as an option is lack of exposure to the possibilities. Lynn Langit (blog | twitter) is a role model for showing kids (not just girls!) how to explore these possibilities through www.teachingkidsprogramming.org. Check it out!

    Along these lines, I proposed that maybe - as great a community as PASS is - we should collectively think about what we can do for our kids. Someone tweeted that I suggested we should bring our daughters to SQLSaturday, but actually I wondered aloud if we could do something similar to SQLSaturday that focused on the kids (and not just girls). Maybe we could get some sponsors to help, too?

    Demystifying MDX in Reporting Services

    In this session, I explained some of the nuances of working with MDX in Reporting Services. I have posted my demo reports here.

    In addition, you might want to refer back to some of my recent posts about using dynamic MDX in Reporting Services: Part 1 and Part 2.

    If you're not familiar with MDX, I presented Session 07: Intro to MDX for 24 Hours of PASS: Summit Preview, which you can view if you have a free PASS membership. MDX is not going away any time soon, contrary to recent rumors, so invest some time learning it if you plan to work with real Analysis Services cubes, which will continue to have their place in the BI stack for several years to come.

    Real World Analysis Services Stored Procedures

    This topic drew a larger audience than I expected, as it's a fairly specialized topic. For years, I never needed to use Analysis Services Stored Procedures (ASSP) and avoided them because folklore warned of performance hits. However, some things just can't be done any other way, and I ran into such things this past year. To date, I haven't found much written about ASSP other than BOL, but you can find some excellent examples to download at CodePlex. Plus I've uploaded the very simple (non-production-ready) C# example that I used in my session demonstration for you to peruse.

    What's Next?

    So now that I've completed all the training and Webcasts and conference sessions that I've been focused on the last couple of months (with one exception - Delivering Information with Reporting Services, a free Webcast at 12 pm Pacific on Wednesday, November 17), I plan to get back to a more regular blogging schedule. There are certainly plenty of topics on my "to do" list!

  • Using Dynamic MDX in Reporting Services: Part 2

    In this post, I continue my exploration of approaches to working with dynamic MDX in a Reporting Services dataset when you are using Analysis Services as a data source. I began this series with a look at string conversion functions in Part 1. In Part 1, the dynamic MDX relied on parameters that work as a filter on the query results.

    But what if you want to change the structure of the query itself? A parameter won't help with that. Instead, you need to create the query string at run-time. However, the Analysis Services data source in Reporting Services doesn't allow you to use an expression to define the query string. That's okay - I'll just do an end run around that problem. I'll use an OLE DB provider to connect to my cube, and then I can build up the query string by using an expression. In this post, I walk you through the process.

    The context for this demonstration is a report that allows the user to specify the sets that appear on rows and columns of a matrix and to select one measure. It's a very simple example that focuses on the dataset construction, and doesn't spend as much time on the beautification of the report. Hopefully, it will give you some ideas to leverage for your own reports.

    Create a data source

    First, I need to create an OLE DB source. In the Type drop-down list, I select "OLE DB" and then I provide a connection string like this:

    Provider=MSOLAP.4;Data Source=.;Initial Catalog="Adventure Works DW 2008R2"

    The Edit button allows you to use a UI to generate the string if you don't want to remember how to construct it manually.

    

    Parameters

    This set of steps is necessary to create the lists from which the user makes the selection. In my very simple example, I have created two parameters - Rows and Columns - and hard-coded possible lists. You can do more interesting things here, of course. Just make sure that the user can't make the same selection for both parameters - whether you enforce that by manually providing the values or by doing something clever with a dynamically generated list based on a query.

    I created the Measure parameter with the following values:

    Label               | Value
    Sales Amount        | ="[Measures].[Sales Amount]"
    Order Quantity      | ="[Measures].[Order Quantity]"
    Gross Profit Margin | ="[Measures].[Gross Profit Margin]"

    I created the Columns parameter like this:

    Label                  | Value
    Calendar Year          | ="[Date].[Calendar Year].[Calendar Year].Members"
    Reseller Business Type | ="[Reseller].[Business Type].[Business Type].Members"

    And the Rows parameter like this:

    Label                   | Value
    Product Category        | ="[Product].[Category].[Category].Members"
    Sales Territory Country | ="[Sales Territory].[Sales Territory Country].[Sales Territory Country].Members"

    Dataset

    Although the ultimate goal is to produce a dynamic MDX query, it's actually easier to start the design of the dataset using a static query and then to switch it out later. By using a static query, the fields for the dataset are autogenerated. The less work I have to do, the better.

    Now, one thing about dynamic MDX in Reporting Services is the need to make sure the number of fields in the dataset is the same each and every time. Therefore, I need to structure the query differently than I would if I were writing it for a "normal" Analysis Services client. That is, I need to rewrite a query like this:

    select
    non empty [Date].[Calendar Year].[Calendar Year].Members on columns,
    non empty [Product].[Category].[Category].Members on rows
    from [Adventure Works]
    where [Measures].[Sales Amount]

    which produces a result like this:

    [Screenshot: one column per calendar year, one row per category]

    to a query that looks like this:

    select
    [Measures].[Sales Amount] on columns,
    non empty
    ([Product].[Category].[Category].Members,
    [Date].[Calendar Year].[Calendar Year].Members)
    on rows
    from [Adventure Works]

    and produces a result like this:

    [Screenshot: a single measure column, with category and year pairs on rows]

    However, while structurally the result set is what I need, the fields generated for the dimensions above will change each time that I run the dynamic query with different specifications for rows and columns. So I need to modify the query one more time, like this:

    with
    member [Measures].[Measure] as [Measures].[Sales Amount]
    member [Measures].[RowValue] as [Product].[Category].CurrentMember.Name
    member [Measures].[ColumnValue] as [Date].[Calendar Year].CurrentMember.Name
    select {[Measures].[Measure], [Measures].[RowValue], [Measures].[ColumnValue]} on columns,
    non empty ([Product].[Category].[Category].Members, [Date].[Calendar Year].[Calendar Year].Members) on rows
    from [Adventure Works]

    to get a result like this:

    [Screenshot: generic Measure, RowValue, and ColumnValue columns]

    Now I can reference the generic Measure, RowValue, and ColumnValue fields in the report layout.

    Matrix

    I like to test things out before I start introducing more complexity. So my next step is to add a matrix layout to the report design, and put fields into the layout and apply a little formatting as shown below.

    One extra step related to formatting is required in my example. I have three possible measures, each of which uses a different format string. If I were displaying detail records, I could use the formatted_value cell property as an extended property in the textbox expression, replacing Fields!Measure.Value with Fields!Measure.FormattedValue. However, the use of a matrix here doesn't work with that approach, so... I need to create a conditional expression to set the Format property correctly:

    =Switch(Parameters!Measure.Label = "Sales Amount", "C2", Parameters!Measure.Label = "Order Quantity", "N0",
    Parameters!Measure.Label = "Gross Profit Margin", "P2")

    Then I preview the report to make sure all is well, which it is.

    Query expression - step 1

    Now it's time to do the deed - convert the query string to an expression. To do this, I open Dataset Properties and click the expression button (fx) next to the Query box.

    The first step is to enclose the query in double quotes, prefix it with an equal sign, and eliminate all the line feeds in the query. The expression needs to be one long string. If you really must add line feeds to make it easier to read, you can set up the expression like this:

    ="with"
    + " member [Measures].[Measure] as [Measures].[Sales Amount]"
    + " member [Measures].[RowValue] as [Product].[Category].CurrentMember.Name"
    + " member [Measures].[ColumnValue] as [Date].[Calendar Year].CurrentMember.Name"
    + " select {[Measures].[Measure], [Measures].[RowValue], [Measures].[ColumnValue]} on columns,"
    + " non  empty ([Product].[Category].[Category].Members, [Date].[Calendar Year].[Calendar Year].Members) on rows"
    + " from [Adventure Works]"

    Just make sure to allow for a space between words on separate lines. I put it at the beginning of each new line so that I can see it easily. I then preview the report again to make sure that the expression works before I add in the next layer of complexity.

    Query expression - step 2

    Next I plug in parameter values in the appropriate sections of the query, like this:

    ="with"
    + " member [Measures].[Measure] as " + Parameters!Measure.Value
    + " member [Measures].[RowValue] as"
    + " " + Split(Parameters!Rows.Value,"]")(0) + "]" + Split(Parameters!Rows.Value,"]")(1)+ "].CurrentMember.Name"
    + " member [Measures].[ColumnValue] as"
    + " " + Split(Parameters!Columns.Value,"]")(0) + "]"
    + Split(Parameters!Columns.Value,"]")(1)+ "].CurrentMember.Name"
    + " select {[Measures].[Measure], [Measures].[RowValue], [Measures].[ColumnValue]} on columns,"
    + " non empty (" + Parameters!Rows.Value + ", " + Parameters!Columns.Value + ") on rows"
    + " from [Adventure Works]"

    Then I preview again. Here's the report with the default parameter settings: products on rows, dates on columns, and sales amount as the measure.

    And here's the report with sales territory on rows, business type on columns, and gross profit margin as the measure.

    Mission accomplished!

  • Using Dynamic MDX in Reporting Services: Part 1

    If you're using Analysis Services as a data source for Reporting Services reports, you can build a simple dataset using the graphical query designer, but you'll want to switch to the generic query designer to create the MDX query string manually when you have more advanced requirements. Using the generic query designer, you can:

    · Impose greater control over the sets that you want to add to the rows axis by using set functions.

    · Add query-scoped named sets to the query in addition to calculated members. (Calculated members can also be added in the graphical query designer, but not named sets.)

    · Build dynamic MDX queries.

    When would you need a dynamic MDX query? Whenever you want to modify the query based on a condition known only at run-time, typically based on a parameter value. If you're using the graphical query designer, you can auto-generate the report parameter's query for available values by selecting the Parameter checkbox. When the user selects a value during report execution, Reporting Services passes the unique name for the selection to the query and all is well. However, there might be situations when the user selection doesn't come from the cube, so you must find a way to convert the parameter value into a value that will work with the query.

    In a series of posts, I will explore the available options for working with dynamic MDX queries in Reporting Services. By dynamic MDX, I mean that the query can be different each time it executes. In this post, I cover the use of StrToMember() and StrToSet() functions in parameters.

    A very common scenario is the requirement to pass dates into a query. If you have a date filter for the report, do you really want users to navigate through a list of dates from the cube as shown below?

    [Screenshot: date parameter list]

    This list of dates - even if it's arranged hierarchically by month, quarter, and year - is what you get when you build the parameter directly from the date hierarchy in the query designer as shown below.

    [Screenshot: graphical query designer]

    Wouldn't a more user-friendly experience allow the user to select a date from a calendar control? I can do this by changing the auto-generated report parameter's data type to Date/Time and clearing the "Allow multiple values" check box. I must also change the Available Values setting for the parameter to None. I can set the default value to "No default value" to force the user to make a selection, or do something nice like define an expression to set a date, such as =Today().

    So far, so good. But the problem now is that the date data type returned by the calendar control cannot be used by the MDX query without some intervention. I need to change the Parameter Value mapped to the query parameter in the Dataset Properties to an expression, like this:

    ="[Date].[Calendar].[Date].[" + Format(CDate(Parameters!DateCalendar.Value), "MMMM d, yyyy") + "]"

    The expression that you use to convert a date like 2/1/2008 to a valid unique name in your Date dimension might look different. My example is specific to the Adventure Works 2008 R2 cube, which requires the date member to look like this: [Date].[Calendar].[Date].[February 1, 2008].

    That's fine so far, but the result of this expression is a string, and the MDX query requires a member or a set. Fortunately, the autogenerated query already makes this conversion for you. However, if you're creating your query manually, you should understand what it's doing, especially if you need to make changes to it.

    The autogenerated query looks like this before I make changes:

    SELECT NON EMPTY { [Measures].[Sales Amount] } ON COLUMNS,
    NON EMPTY { ([Product].[Category].[Category].ALLMEMBERS ) }
    DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS
    FROM ( SELECT ( STRTOSET(@DateCalendar, CONSTRAINED) ) ON COLUMNS
    FROM [Adventure Works])
    WHERE ( IIF( STRTOSET(@DateCalendar, CONSTRAINED).Count = 1, 
    STRTOSET(@DateCalendar, CONSTRAINED), [Date].[Calendar].currentmember ) ) 
    CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS

    I prefer to simplify the query as shown below by removing the WHERE clause and the DIMENSION PROPERTIES and CELL PROPERTIES lines from the query above. The StrToSet() function does what it says - changes the string (represented by the parameter @DateCalendar) into a set object. I can remove the WHERE clause because the FROM clause adequately restricts the query results to cell values related to the selected date. If I need the dimension properties in the report to display something, or if I need the cell properties for report formatting, I'll include only the ones I need, but for this example I have removed them all from the query.

    SELECT NON EMPTY { [Measures].[Sales Amount] } ON COLUMNS,
    NON EMPTY { ([Product].[Category].[Category].ALLMEMBERS ) } ON ROWS
    FROM ( SELECT ( STRTOSET(@DateCalendar, CONSTRAINED) ) ON COLUMNS
    FROM [Adventure Works])

    You could change the StrToSet() function to a StrToMember() function and get the same result. It's not harmful to leave StrToSet() as it is - it just returns a set of one member in this case, the date from the calendar control, which is a valid set. The CONSTRAINED flag is used to prevent an injection attack and requires the expression to resolve to a valid member before the query executes.
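
    In that case, only the set expression in the subquery changes:

    FROM ( SELECT ( STRTOMEMBER(@DateCalendar, CONSTRAINED) ) ON COLUMNS
    FROM [Adventure Works])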

    One challenge that often confounds people working with MDX queries in the generic query designer is the inability to copy and paste the query into Management Studio for testing when parameters are in the query as shown above. The MDX query editor doesn't support parameters. Teo Lachev (blog | twitter) posted some advice for working with parameterized MDX queries in Management Studio which I encourage you to check out.

    In my next post, I'll explain how to use the OLE DB for OLAP provider with dynamic MDX to create a dataset.

  • 24 Hours of PASS Summit Preview: Introduction to MDX

    Many thanks to everyone who attended my session today, Introduction to MDX. There was so much more I wanted to say, but the constraint of an hour required me to be very selective and, as you might have noticed, to sacrifice the demos. After all, how exciting can it be to watch me execute a query? I thought it more important to provide you with the key concepts that you can use to get started.

    When I first learned MDX, I didn't learn it the way that I explained it today. Consequently, I struggled for a long time until the proverbial light bulb turned on in my head about how to think about the language. I hope that my explanation saves you some of the grief that I experienced, although I can't promise that you aren't going to have (and lose) a few wrestling matches as you build up your MDX muscles.

    As promised, I have sample queries that illustrate the concepts that I discussed today. Before you can run the queries, I assume that you have both a SQL Server 2008 R2 database instance and an Analysis Services instance. You will need to download and install the AdventureWorks 2008 R2 sample database. Then open the Intro to MDX solution included in the zip file and deploy the project to create the simple cube for the queries. Then open the MDX files in SQL Server Management Studio. You can run the entire script, and thereby execute multiple queries, or run one at a time by highlighting the query text as described in the header of each MDX file. These queries and the SSAS solution might work with earlier versions of SQL Server, but are untested.

    Download the zip file containing the queries and SSAS solution here.

    If you plan to attend the PASS Summit 2010 in Seattle, be sure to see my session, Demystifying MDX in Reporting Services, where I explain how to use MDX with Reporting Services. There are a few differences from the query structure that I described today that you'll need to know.

  • Building a Data Mart with Integration Services

    Last week I presented a Webcast, sponsored by Idera, that received some favorable feedback, plus some requests for my sample files. To recap, my focus was on the process of building the data mart, rather than trying to explain (in 45 minutes!) all the things that you can do with Integration Services. In some cases, your data is relatively straightforward and small, but still doesn't provide an optimal experience for reporting and analysis. I explain how you can use some rapid development techniques to create a simple data mart for these scenarios, and point you to some resources for learning more about best practices that are necessary for larger BI projects.

    Click here to view the Webcast (after registering with Idera) and click here to download the sample package. Although I created the package using Business Intelligence Development Studio for SQL Server 2008 R2, it will probably work with SQL Server 2008 also, but definitely not with SQL Server 2005.

    To execute the package, you will need to download and install the AdventureWorks 2008 R2 sample database and then follow my example in the Webcast to create a data mart. It might work with earlier versions, but is untested.

    I will confess to a blooper in the demonstration. I forgot to add in the RegionSourceKey (which is visible in the slides that summarize the steps) to the SalesTerritory dimension, which is required for the FactSales package to execute.

  • Do-It-Yourself Map Gallery in SQL Server 2008 R2 Reporting Services

    A common question that I get when introducing the map feature in SQL Server 2008 R2 Reporting Services is why the map gallery is limited to maps of the United States and individual states in the US. The reason, as I've heard it explained, is that the political boundaries for other countries are sometimes in dispute, and it was better to stay away from any legal issues that might arise from publishing a map as part of a software installation. Presumably that means that boundaries between states (and even towns) are not in dispute, but that's not entirely true, as a CBS News report highlights.

    Border integrity notwithstanding, the good news is that even though Microsoft doesn't give you international maps in the map gallery, you can do it yourself. The files that the Reporting Services installation provides are simply reports created using TIGER/Line Shapefiles available free from the United States Census Bureau and stored in the Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\MapGallery folder. (Look in the Program Files (x86) folder if you're developing on a 64-bit computer.) You can add your own reports to this same folder to update the map gallery.

    You can't add just any report to the Map Gallery for this technique to work. The report must contain a map with embedded spatial data. You can create such a report by creating a map using any of the three options for spatial data - map gallery, ESRI Shapefile, or a SQL Server spatial query. Then, on the "Choose spatial data and map view options" page of the Map Wizard, you must select the "Embed map data in the report" option as shown below.

    Using the map that I created for my post earlier this year about the map control, I eliminated the point layer and then saved the RDL file to the map gallery folder. Then whenever I want to create a new map for France, I can select it in the Map Wizard as shown below.

    If you plan to use this option for making additional maps available to report developers, be aware that you must do this on every computer using Business Intelligence Development Studio to create reports. If you have users creating reports with Report Builder 3.0, you also need to store the map RDL in the following location: C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\ReportBuilder\RptBuilder_3\MapGallery where MSRS10_50.MSSQLSERVER is the name of the Reporting Services instance.
