A place for my thoughts and experiences on SQL Server, Business Intelligence and .NET
Over the last few months, in my spare time, I’ve been studying PHP in order to use it on a Windows + SQL Server box. Why would you do such a thing, you may be wondering? The point is that Wordpress is, IMHO, the state of the art among free CMSs that can be used as the backend for a community site: it’s feature-rich, it has a *lot* of plugins and themes, and it can be used to host blogs and to power a “thematic” website. In my case I’d like to refresh the engine used to publish the Italian SQL Server User Group website.
I’ve looked at a lot of alternatives in the .NET world, and over the last year I evaluated:
- Community Server
For one reason or another, none of the mentioned platforms, which are great platforms BTW, was the right one for us. We needed something:
- Capable of managing a community portal with news, articles, events, calendars and so on
- Capable of managing members’ blogs, allowing new blog sub-sites to be generated on the fly
- Fully customizable with a minimum effort for the end user
- Enabled to use HTML5 and CSS3
- Stable and Mature, with a good documentation and/or forum support
- Easy to extend, modify and adapt to our needs
- Compatible with Windows Live Writer
- Compatible with SQL Server
- Capable of hosting forums, or of being integrated with a 3rd-party forum platform
And in the end, the platform that suits all our needs is…Wordpress!
Of course this decision brings some challenges into the game:
- I need to be sure that Wordpress can work *well* with SQL Server
- I need to integrate Wordpress with a forum software.
Luckily, Microsoft has written a cool abstraction layer for Wordpress that makes it compatible with SQL Server. And, even more luckily, there is a mainstream forum solution, PHPBB, natively compatible with SQL Server.
Of course not everything is as smooth as one would like it to be, so there are some “attention points” that need to be taken into account when going this way. And since there isn’t a lot of documentation available on running Wordpress together with PHPBB on SQL Server, I thought that writing some posts could be helpful to the community. After all, Wordpress and PHPBB are two *great* solutions, and having them available on SQL Server is something desirable in my opinion.
So, in the next months, I’ll write a series of four (maybe five) posts to describe how to have a Wordpress + PHPBB on IIS + SQL Server solution up and running.
Here’s the agenda of the next posts:
I hope you’ll enjoy the topics!
Already 10 days have passed since SQL Bits X in London. I really enjoyed it! These kinds of events are great not only for the content but also for meeting friends that – due to distance – it’s not possible to meet every day. Friends from PASS, SQL CAT, Microsoft, the MVP program and so on, all in one place, drinking beer and whisky and having fun. A perfect mixture for a great learning and sharing experience!
I’ve also enjoyed a lot delivering my session on Temporal Snapshot Fact Tables. Given that the subject is very specific, I was not expecting a lot of attendees…but I was totally wrong! It seems that the problem of handling daily snapshots of data is more common than I expected.
I’ve also already had feedback from several attendees who applied the explained technique to their existing solutions with success. This is just what a speaker at such a conference wishes to hear! :)
If you want to take a look at the slides and the demos, you can find them on SkyDrive:
The demo is available both for SQL Server 2008 and for SQL Server 2012. With the latter, you can also simplify the ETL process using the new LEAD analytic function. (This is not done in the demo; I’ve left this option as a little exercise for you :) )
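As a hint for that exercise: LEAD can compute, for each snapshot row, the date of the next snapshot of the same entity, which gives you the upper bound of the row’s validity interval without any self-join. A minimal sketch, using hypothetical table and column names (not the ones in the demo):

```sql
-- Turn daily snapshot rows into [ValidFrom, ValidTo) validity intervals.
-- LEAD(SnapshotDate, 1, '99991231') returns the next snapshot date for the
-- same entity, or the "open-ended" date 9999-12-31 for the current row.
SELECT
    EntityId,
    SnapshotDate AS ValidFrom,
    LEAD(SnapshotDate, 1, '99991231')
        OVER (PARTITION BY EntityId ORDER BY SnapshotDate) AS ValidTo,
    MeasureValue
FROM dbo.DailySnapshot;
```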
If you’re using MDS and DQS with the Excel integration, you may get an error when trying to use the “Match Data” feature, which uses DQS to help identify duplicate data in your data set.
The error is quite obscure and you have to enable WCF error reporting in order to get the error details; you’ll then discover that they are related to some missing permissions in the MDS and DQS_STAGING_DATA databases.
To fix the problem you just have to grant the needed permissions, as the following script does:
GRANT SELECT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
GRANT INSERT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
GRANT DELETE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
GRANT UPDATE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
ALTER AUTHORIZATION ON SCHEMA::[db_datareader] TO [VMSRV02\mdsweb]
ALTER AUTHORIZATION ON SCHEMA::[db_datawriter] TO [VMSRV02\mdsweb]
ALTER AUTHORIZATION ON SCHEMA::[db_ddladmin] TO [VMSRV02\mdsweb]
Where “VMSRV02\mdsweb” is the user you configured for MDS Service execution. If you don’t remember it, you can just check which account has been assigned to the IIS application pool that your MDS website is using:
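If you prefer the command line over the IIS Manager UI, appcmd can show the identity directly; the application pool name below is just a placeholder for the one your MDS site actually uses:

```
%windir%\system32\inetsrv\appcmd.exe list apppool "MDS Application Pool" /text:processModel.userName
```

(If the output is empty, the pool is running under a built-in identity rather than a custom account.)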
I think I can say that we all agree that SQL Server 2012 brings a revolution to our database world, since the number of new features and new possibilities, IMHO, is comparable to what SQL Server 2005 brought several years ago.
Interestingly enough, even the well-known analyst firm IDC thinks that SQL Server 2012 can be a game changer. From a technical point of view, the highlighted technologies are:
- ColumnStore Index
- Windows Server Core Support
- Power View
- BI Semantic Model
- Data Quality Services
- Hadoop Support
- Cloud & On-Premise Integration
- Support for PHP, Java and Linux
The full document (less than 10 pages) is very interesting – definitely worth reading – and can be extremely helpful in raising awareness of what SQL Server 2012 can offer and how it can help the business, also among non-technical people:
IDC WHITE PAPER - Microsoft SQL Server 2012- Potential Game Changer
A couple of hours ago I released SP2 of my DTLoggedExec tool.
For those who don’t know it, it’s a DTEXEC replacement, useful for executing SSIS packages while having logging provided right by the engine and not by the package itself.
More info can be found here:
This SP2 release adds an important feature to the CSV Log Provider: it’s now possible to store a personalized label in each log, in order to make it easy to identify or group logs.
Let’s say, for example, that you have 10 packages in your ETL solution, and each time you have to load your data you need to execute all those 10 packages. In other words, you have a batch made of 10 packages. It would be nice if all the logs – one for each package – could be identified as a whole and grouped together, in order to quickly identify all the logs of a particular batch.
This makes things easier when you want to know the overall time consumed by each batch execution.
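Once the CSV logs have been loaded into the log database, a query like the following gives the per-batch duration; the dbo.Logs table and its columns here are hypothetical placeholders for wherever you import the log files:

```sql
-- Overall duration of each batch: from the first package start to the
-- last package end among all the logs that share the same ExecutionLabel.
SELECT
    ExecutionLabel,
    MIN(StartTime) AS BatchStart,
    MAX(EndTime)   AS BatchEnd,
    DATEDIFF(SECOND, MIN(StartTime), MAX(EndTime)) AS BatchDurationSec
FROM dbo.Logs
GROUP BY ExecutionLabel
ORDER BY BatchStart;
```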
The new "ExecutionLabel" attribute helps to achieve this. A useful ExecutionLabel can be obtained using SQL Server Agent tokens. For example:
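A label built from the tokens that identify a job run could look something like the value below. How exactly the label is passed on the DTLoggedExec command line depends on the release, so check its documentation; the tokens themselves are standard SQL Server Agent ones:

```
DailyLoad-$(ESCAPE_NONE(STRTDT))-$(ESCAPE_NONE(STRTTM))
```

Since STRTDT and STRTTM hold the date and time the job started, every step of the same job run gets the same label, which is exactly what’s needed to group the 10 package logs into one batch.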
- Updated the CSV Log Provider in order to write the status of the log file in the header.
- An OPEN status means that the log file is being written.
- A CLOSED status means that the log file has been written correctly.
- A file can be loaded into the log database only if it is in the CLOSED state.
- CSV Log files have 2 additional rows in the header: one for the FileStatus and one for the ExecutionLabel values.
- The file format has been updated from 3 to 4. (Only the header section of the file has changed.)
- The CSV Log Provider now displays the EndTime value on the console.
- Updated the import-log-data.sql script to correctly load files with format 3 (the old one) and 4 (the new one).
- Updated database schema to version 19 in order to store the new ExecutionLabel value
- Updated samples in order to show how to use the new ExecutionLabel option
As usual the download is available for free here:
I’ve updated my SYS2 scripts:
- Added a new script to see how much buffer cache memory is used by each database
- Updated the sys2.stats script in order to have only one row per statistics
- Updated the sys2.query_stats script to use the sys.dm_exec_plan_attributes dmv to get better information on which database was used by the cached plans
As usual they are available from CodePlex:
IBM doesn’t like MS. That’s a fact. And that’s why you can get your machine.config file (!!!) corrupted if you try to install the IBM DB2 data providers on your server machine.
If at some point, after having installed the IBM DB2 data providers, your SSIS packages, SSAS cubes or SSRS reports start to complain that the 'DbProviderFactories' section can only appear once per config file,
you may want to look into your machine.config, located in %runtime install path%\Config.
Almost surely you’ll find the IBM DB2 provider in an additional DbProviderFactories section, all alone. Poor guy. Remove the duplicate DbProviderFactories entry, merging everything into one single DbProviderFactories section, and after that everything will start to work again.
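After the merge, the relevant part of machine.config should look something like this. The attribute values of the IBM entry are only indicative (they vary with the driver version); the important thing is that there is one single DbProviderFactories element containing all the providers:

```xml
<system.data>
  <DbProviderFactories>
    <!-- the providers that were already registered stay where they are -->
    <add name="SqlClient Data Provider"
         invariant="System.Data.SqlClient"
         description=".Net Framework Data Provider for SqlServer"
         type="System.Data.SqlClient.SqlClientFactory, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    <!-- the IBM entry, moved here from the duplicate section -->
    <add name="IBM DB2 .NET Data Provider"
         invariant="IBM.Data.DB2"
         description="IBM DB2 Data Provider for .NET Framework"
         type="IBM.Data.DB2.DB2Factory, IBM.Data.DB2, Version=9.7.4.4, Culture=neutral, PublicKeyToken=7c307b91aa13d208" />
  </DbProviderFactories>
</system.data>
```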
Today I had to schedule a package stored in the shiny new SSIS Catalog that comes with SQL Server 2012. (http://msdn.microsoft.com/en-us/library/hh479588(v=SQL.110).aspx)
Once your packages are stored there, they are executed using the new stored procedures created for this purpose. The script that gets executed when you run a package right from Management Studio, or through a SQL Server Agent job, will be similar to the following:
Declare @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name='my_package.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'BI', @project_name=N'DWH', @use32bitruntime=False, @reference_id=Null
DECLARE @var0 smallint = 1
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=@var0
DECLARE @var1 bit = 0
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'DUMP_ON_ERROR', @parameter_value=@var1
EXEC [SSISDB].[catalog].[start_execution] @execution_id
The problem here is that the procedure will simply start the execution of the package and will return as soon as the package has been started…thus giving you the opportunity to execute packages asynchronously from your T-SQL code. This is just *great*, but what happens if I want to execute a package and WAIT for it to finish (and thus have a synchronous execution of it)?
You just have to add the “SYNCHRONIZED” parameter to the package execution, before calling the start_execution procedure:
exec [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'SYNCHRONIZED', @parameter_value=1
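Putting it all together, the complete synchronous version of the script shown above becomes:

```sql
DECLARE @execution_id BIGINT;

EXEC [SSISDB].[catalog].[create_execution]
    @package_name = N'my_package.dtsx',
    @execution_id = @execution_id OUTPUT,
    @folder_name = N'BI',
    @project_name = N'DWH',
    @use32bitruntime = False,
    @reference_id = NULL;

-- The SYNCHRONIZED parameter is what makes start_execution return only
-- when the package has finished, instead of right after it has started.
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
    @execution_id,
    @object_type = 50,
    @parameter_name = N'SYNCHRONIZED',
    @parameter_value = 1;

EXEC [SSISDB].[catalog].[start_execution] @execution_id;
```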
And that’s it.
From RC0 onwards, the SYNCHRONIZED parameter is automatically added each time you schedule a package execution through SQL Server Agent. If you’re using an external scheduler, just keep this post in mind.
One of the major criticisms of DAX is the lack of a decent editor and, more generally, of a dedicated IDE like the ones we have for T-SQL or MDX.
Well, this is no longer true. At the beginning of November 2011, a very interesting and promising Visual Studio 2010 extension was released on CodePlex:
Intellisense, Syntax Highlighting and all the typical features offered by Visual Studio are available also for DAX.
Right now you have to download the source code and compile it, and that’s it!
Just reading all the posts and tweets, it’s quite clear that this year’s PASS has been a great success. Despite the bad cold I’ve had since Monday, I enjoyed it a lot. All the parties, the friends, the community and the great content make PASS an appointment that should not be missed.
I’ve also enjoyed a lot delivering my “Temporal Snapshot Fact Table” session. From the feedback I’ve had, attendees enjoyed it too.
All the related material will soon be available here:
Temporal Snapshot Fact Table [BIA-406-S]
Upgrading SSIS to Denali - Management Considerations and Best Practices [BIA-311-S]
For all those who cannot wait to download the slides and the demos from the PASS website, I’ve uploaded everything to my SkyDrive folder. If you need them, just send me an email and I’ll be happy to send you the public link.
A brilliant idea and an opportunity that no one who works with databases (DBAs or developers) should miss:
A free online course on databases:
This course covers database design and the use of database management systems for applications. It includes extensive coverage of the relational model, relational algebra, and SQL. It also covers XML data including DTDs and XML Schema for validation, and the query and transformation languages XPath, XQuery, and XSLT. The course includes database design in UML, and relational design principles based on dependencies and normal forms. Many additional key database topics from the design and application-building perspective are also covered: indexes, views, transactions, authorization, integrity constraints, triggers, on-line analytical processing (OLAP), and emerging "NoSQL" systems.
The instructor will be Professor Jennifer Widom, an ACM Fellow and a member of the National Academy of Engineering and the American Academy of Arts & Sciences; she also received the ACM SIGMOD Edgar F. Codd Innovations Award. I’ve already registered: I’m sure I’ll learn something new and useful, and I’ll get a refresher on good old concepts…which is always a good thing.
A praise to Stanford University for this excellent initiative!
A big thanks to my friend and colleague Luca Zavarella for pointing out this opportunity!
Yesterday I discovered the possibility of running an application under the credentials of a domain user, even if you’re not in a domain. This is a very useful feature for me: being a consultant, I work with a lot of different customers, each one with its own domain, and each one (of course) with a different user account for me.
I cannot join all their domains, so I either have to work outside the domain, or I have to create a virtual machine with all the tools I need and then join their domain. This, unfortunately, means a lot of installation and maintenance work.
But what I discovered yesterday simply changed my life: to execute an application using a domain user, even if you’re not in a domain, all you have to do is use the /netonly option of the runas command!
To launch Excel, for example:
runas /netonly /user:THEDOMAIN\theuser "C:\Program Files (x86)\Microsoft Office\Office14\EXCEL.EXE"
And that’s it! Now you can browse the cubes on SSAS (for example) using the software on your machine. Cool!
Today I’ve released the first Service Pack of DTLoggedExec (for those who don’t know what it is: DTLoggedExec is a DTExec replacement to run Integration Services packages):
This Service Pack fixes some little problems with the .bat and .sql files that come with DTLoggedExec. All the fixes were already published as single changesets (86188, 86299, 87778, 88124 and 91054), and the Service Pack puts them all together for users’ convenience.
You can download the full DTLoggedExec package with SP1 already integrated, or just Service Pack 1, which you can integrate manually into your existing installations (all you have to do is overwrite the existing files):