Adam Jorgensen, Executive Vice President Finance and Governance, welcomed the crowd. He pointed out we're providing 391 hours of education and technical training this year. Adam's excited about the future growth of PASS and the need to develop new leaders throughout the organization. He shared the financial basis for the organization. Revenue for FY is $8.3M, a 10% increase over the previous year. Summit attendees have doubled since 2007. Revenue has grown consistently since 2008. PASS has delivered 27 '24 Hours of PASS' events since its inception. Sponsorship sales have continued to grow. Since 1999, PASS has grown to over 150,000 members from 163 countries. 78% of the funds that PASS raises go directly back to the community.
Denise McInerney, Vice President of Marketing, presented the PASS Outstanding Volunteer award to Bill Graziano, outgoing PASS Immediate Past President, for everything he's done over the years for PASS. She then spoke about the community and our outstanding volunteers. 71 people have been nominated for Outstanding Volunteer since 2012. Ten people were nominated for this year's PASSion award, which was given to Lance Harra.
PASS Summit will be back in Seattle next year, October 25-28, 2016.
The Keynote is Data Management for the Internet of Things (IoT), co-presented by David DeWitt, Technical Fellow, Microsoft Jim Gray Systems Labs, and Rimma Nehme, Principal Software Engineer, Microsoft Jim Gray Systems Labs.
Dr. Nehme came out and thanked the PASS community for the new clicker, a replacement for the one of hers that broke last year. She explained that she'll begin the talk (as the "appetizer") and Dr. DeWitt will present the main course.
IoT involves taking a physical object, adding analytics, and providing value that the object couldn't have without the combination. There are two types of IoT - the consumer type and the industrial type. Consumer includes things like Fitbit, Nest, etc. You can use that information to identify credit risk, education, general health, etc., of any of us.
Companies gain value from IoT based on unconventional revenues, incremental revenues, and operational efficiency. IoT is still in its infancy.
There are four types of IoT capabilities - monitoring, control, optimization, and autonomy. We're right at the peak of the "hype" cycle of IoT right now. Around 2008 the number of devices connected to the internet exceeded the number of people in the world, and it keeps growing. The value to customers is huge. With that many devices connected, an efficiency savings of just 1% across the board is tremendous.
IoT - How?
Dr. DeWitt came out to discuss the technical challenges of managing IoT. One of the biggest challenges is device/sensor security. From IoT we want to provide messaging, so we can learn things, and we need an easy way to deploy large numbers of implementations. On the consumer side you have to worry about battery life, whereas on the industrial side power is not an issue. On the consumer side cost is a real issue, while on the industrial side cost isn't really a factor. Consumer devices need to be embedded in a product and wireless, whereas the industrial side can have standalone devices with wired connections.
IoT today is truly a DIY (do it yourself) process. The state of the art is rather primitive. In the field we've got devices with a sensor and an actuator to adjust the device. In the cloud, there's an event/data aggregator, connected to the device via D2C (device to cloud) events. The aggregator can feed an application, send data to storage, and feed a real-time processing engine, which sends data to a cloud-based device controller, which in turn sends commands back to the field device via C2D (cloud to device) events.
Two main components are Azure IoT Hubs and Azure Event Hubs. For data management, we can use Stream Analytics, DocumentDB, HDInsight, etc. to gather, store, and analyze the data. Stream Analytics gathers the data and stores a vast amount of it, HDInsight allows querying of that data, and Machine Learning can then act on those events.
The IoT hub can manage and control the devices. Sensors push their events into the hubs via endpoints. Hash functions are applied to each event to send it to the proper handler, and the handler can act on those events. Messages can then be routed from the hub back to devices via send endpoints and message queues.
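The hash-based routing described above can be sketched in a few lines; this is a minimal Python illustration of the idea (the device names and partition count are invented, and this is not the actual Azure IoT Hub implementation):

```python
import hashlib

def partition_for(device_id: str, partition_count: int) -> int:
    """Map a device's events to a stable partition via a hash of its ID."""
    digest = hashlib.sha256(device_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % partition_count

# Events from the same device always hash to the same handler,
# so per-device ordering is preserved within a partition.
events = [{"device": "boiler-07", "temp": 212}, {"device": "pump-03", "temp": 88}]
routed = {e["device"]: partition_for(e["device"], 4) for e in events}
```

The key property is determinism: one device's event stream always lands on the same handler, without any central lookup table.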
Azure Machine Learning, using data from SQL Azure, Azure storage, etc., will "predict when the boiler will explode". The IoT hub will send events to a real-time query engine which can then process the results. A streaming system takes in a sequence of events, queries them, and sends out results across multiple hubs depending on the content. There's no long-term storage of data; it's the queries that are stored long-term.
Field devices require a field gateway to send the sensor readings to the IoT hub. (A Raspberry Pi serves as a good field gateway.) It also makes sense to connect multiple sensors to a single gateway for security and ease of configuration. Gateways may be capable of local processing as well. Per-device metadata can be stored in local storage on the device.
Device security is a major concern. Per-device identities are used to authenticate, and devices must pull from the hub to obtain the C2D commands.
These devices all push to the cloud. But there are some problems with pushing everything to the cloud: bandwidth, connectivity, latency, data deluge, storage constraints, speed, and especially "non-interesting" events.
So why not exploit the capability of the field gateway? In the field gateway we could put a streaming engine to process events so that only the interesting events get sent to the IoT hub.
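A minimal sketch of that gateway-side filtering idea, in Python (the thresholds and reading format here are invented for illustration, not part of the presented architecture):

```python
def interesting(reading, low=60.0, high=90.0):
    """An event is 'interesting' only when it falls outside the normal band."""
    return not (low <= reading["temp"] <= high)

def gateway_filter(readings, low=60.0, high=90.0):
    """Run on the field gateway: forward only the interesting events upstream."""
    return [r for r in readings if interesting(r, low, high)]

readings = [{"id": 1, "temp": 72.0}, {"id": 2, "temp": 95.5}, {"id": 3, "temp": 70.1}]
to_cloud = gateway_filter(readings)  # only the out-of-range reading is uploaded
```

Even this trivial filter shows the payoff: the steady-state readings never consume bandwidth or cloud storage.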
Dr. Nehme came back to talk about Fog Computing or Edge Computing.
You never move the data to the computation, you move the computation to the data. The device does some data gathering, the gateway does some data filtering and processing, and the cloud provides the analytics. This gives us better real-time response, scalability, and metadata management.
Polybase for IoT.
We need a Declarative Language, a complex object model, scalable metadata management, discrete and continuous queries, and multi-query processing. She talked about three query types: ExecuteOnce, ExecuteForever, ExecuteAction, where ExecuteOnce sends a query to the device and the device sends a response. ExecuteForever sends a query that continues to send results until given a stop response from the source, and ExecuteAction sends a query with an Action statement, and that action can be done once or done forever, until a stop response is sent.
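The ExecuteOnce/ExecuteForever distinction can be sketched with a toy Python model (this is my own illustration, not PASS or Microsoft code; the stop condition here stands in for the stop response from the source):

```python
def execute_once(query, device_state):
    """One-shot query: the device answers once and the query completes."""
    return query(device_state)

def execute_forever(query, readings, stop):
    """Continuous query: stream results until a stop condition is signaled."""
    for r in readings:
        if stop(r):
            break
        yield query(r)

# ExecuteOnce: ask for the current temperature, get a single answer.
temp_now = execute_once(lambda s: s["temp"], {"temp": 71.0})

# ExecuteForever: keep flagging hot readings until the stop sentinel arrives.
stream = list(execute_forever(lambda r: r["temp"] > 90,
                              [{"temp": 88}, {"temp": 93}, {"temp": -1}],
                              stop=lambda r: r["temp"] < 0))
```

ExecuteAction would follow the same two shapes, but with a command to the device attached to each result.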
Why should Data Professionals really care? When new technologies are introduced you can be either the steamroller, or the street. More solutions are required for the growth in IoT that will be coming.
They announced that this will be their last PASS keynote. They've done many keynotes, and there are many new speakers who can provide new ideas. Dr. DeWitt will be retiring, and Dr. Nehme will be moving on, either within Microsoft or elsewhere.
Thank you, Dr. DeWitt and Dr. Nehme, for the enormous insight you've provided us over the years.
After this morning's refreshing three mile SQL Run - a much smaller event this year to avoid the problems inherent in having to get permits for a larger event - and a quick breakfast at the Summit, it's time for the opening keynote presentation.
Tom LaRock, PASS President, greeted the crowd and shared the metrics of the event. Over 58 countries and over 2,000 companies are represented here today. He introduced the current board members and the new board member, Ryan Adams.
In Tom's first Summit in 2004, there were 1,740 registrations, and this year over 5,500. He had the First Timers raise their hands and gave them some friendly advice on how to make the most out of this event.
As Data Professionals, we have what many consider to be a "dream job", and PASS works to provide the resources necessary to make it the best job we could ever have. We all work together to make something great. PASS provides a worldwide standard for community organizations. He encouraged everyone to volunteer, to get involved, and to help others learn and grow.
Joseph Sirosh, Corporate Vice President Data Group, comes on stage to talk about Microsoft's Cloud and Data Strategy.
We live in the age of data. We extract intelligence from every bit of data and use it to transform our daily lives. Data started out in analog sources, and as computers came in it became digital data. As we move forward, that digital data is being moved to the cloud, and by 2020 we'll have 50 zettabytes of data available via an IP address.
By having digital history of patient statistics we can predict when that patient will make a trip to the emergency room. By keeping track of the data we can monitor blood pressure and other key data points to know how to help avoid these emergencies. Advances in genome sequence analysis have made it possible to develop life plans to minimize the problems we can expect to have with medical problems.
DocuSign uses the power of data to change the way contracts are signed, improving the efficiency of contract completion by 90 percent. Eric Fleischman, Chief Architect and VP Platform Engineering, described their process of transforming paper processes into digital ones. Eric looked at open source solutions, decided they weren't interested in writing a database platform, and found that Microsoft provided the best solution for their business needs. Their data volume doubles every year, and that creates some significant scaling issues. Using Always On technologies and all-flash storage arrays allows them to perform efficiently and has allowed them to continue to grow as they need.
SQL Server 2016 provides all the necessary engines of data needed for today's business. It's available both on-premises and in the cloud. The new features are built first in the cloud and then translated into the box product. "We live in a planet that has its feet on the ground, and its head in the sky." Gartner rated Microsoft as the leader in both Vision and Execution this year.
Shawn Bice is the General Manager of the Database Systems Group at Microsoft. SQL Server 2016 embraces the entire global implementation of Azure.
Shawn talked about seven big bets. Everything is built in - from OLTP, the most secure database, and the highest performing data warehouse, to end-to-end mobile BI on any device and in-database advanced analytics. 1) Dramatically simplify HA & DR. DocuSign is using some of the fastest disk systems, built on flash storage. That experience is translated into the SQL Server HA/DR solutions to improve performance. They now provide easy setup of on-prem and hybrid cloud HA & DR. They provide load balancing on readable secondaries, and they have fast failover on-prem or to the cloud.
Removing the complexity of big data, via T-SQL over Hadoop. The PolyBase solution is now built directly into SQL Server. They also added JSON support.
Real-time Operational Analytics. In-memory technology is built into SQL Server. "Real time" is the ability to learn and adjust. Every business can benefit from that. This provides up to 30x faster transactions, queries go from minutes to seconds.
In-database Advanced Analytics. They built intelligent applications using R, which is a standard language for scientists and statisticians. Shawn introduced Rohan Kumar, the Partner Director, Engineering, to demonstrate the application of these advanced analytics capabilities.
Shawn returned to talk about security. Security is a staple of the platform, and it's not one-size-fits-all. We start with layers of security: begin with TDE (Transparent Data Encryption), control access through Windows Authentication, row-level security, and dynamic data masking, then incorporate Always Encrypted to protect the data from man-in-the-middle attacks. SQL Server has been the least vulnerable database platform six years running, and it's the most used platform in the world. Always Encrypted is the first technology of its kind in the industry.
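The dynamic data masking layer mentioned above can be illustrated with a toy partial mask, similar in spirit to SQL Server's partial masking of a column for unprivileged readers (this Python helper is purely hypothetical, not the product's implementation):

```python
def mask_partial(value: str, visible_suffix: int = 4, pad: str = "X") -> str:
    """Hide everything except the last few characters of a sensitive value."""
    if len(value) <= visible_suffix:
        return value
    return pad * (len(value) - visible_suffix) + value[-visible_suffix:]

# An unprivileged query would see the masked form; the stored data is unchanged.
masked = mask_partial("4111111111111111")  # "XXXXXXXXXXXX1111"
```

The point of the feature is that masking happens at query time based on the caller's permissions, so the application code doesn't change.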
Cut storage costs with Stretch Database. With both hot and cold data in the data center, the cold data is on the same, expensive storage as the hot data. Stretch Database allows you to stretch a table directly into Azure, and move cold data into inexpensive, cloud based storage. It's available via normal queries, which will reach into the cloud storage when the data has been moved there. The data is encrypted and queryable, and doesn't require any application changes.
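Conceptually, Stretch Database splits a table by a "cold" predicate and lets one query span both tiers. A toy sketch of that split, with invented row and column names:

```python
from datetime import date

def split_hot_cold(rows, cutoff):
    """Rows older than the cutoff are candidates to migrate to cheap cloud storage."""
    hot = [r for r in rows if r["order_date"] >= cutoff]
    cold = [r for r in rows if r["order_date"] < cutoff]
    return hot, cold

def query_all(hot, cold):
    """A normal query transparently spans both tiers, local and remote."""
    return sorted(hot + cold, key=lambda r: r["id"])

rows = [{"id": 1, "order_date": date(2013, 5, 1)},
        {"id": 2, "order_date": date(2015, 9, 30)}]
hot, cold = split_hot_cold(rows, cutoff=date(2014, 1, 1))
```

In the real feature the cold tier lives in Azure and the query engine reaches into it for you; the sketch just shows why applications don't need to change.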
End-to-end mobile BI on any device. Provides lightning fast queries and reports on all mobile devices.
Joseph returned to the stage to wrap up, and to say that building great products is a journey. It's really important to know what we believe in when it takes 20 years to build a product. We must have constant innovation, and innovation based on the customer. It's important to innovate for the future, and to do it with the customer. The cloud allows us to innovate with the customer in a very agile way; innovations then make their way into an on-premises product. We've gone from the age of hardware, through the age of software, to the age of data, providing better human experiences and creating new human experiences, all powered by the cloud.
I can't begin to tell you how honored I am to have been selected to present at the PASS Summit 2015 in Seattle this year. It's especially noteworthy for me in that this will be the tenth consecutive year I will have presented at the Summit. The Program Committee works hard to put together a slate of sessions that will help all data professionals working with the SQL Server platforms learn more so they can provide greater value to their employers and customers. (I know personally now, as I was a member of two of the subcommittees this year.) The fact that I've been invited to present again is both amazing and humbling.
This year I will be presenting two sessions - Automate Your ETL Infrastructure with SSIS and PowerShell and Scalable Application Design with Service Broker.
The first session is one where I'll walk my audience through the process of automatically generating SSIS packages from a PowerShell script, using BIML as the intermediary step. I'll essentially generate an ETL flow from an OLTP database into a Kimball-style data warehouse. Yes, there are other ways to do it, but I like PowerShell, and it works really well.
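The pattern behind the session is metadata-driven code generation: table metadata goes in, one package definition comes out per table. The real session does this with PowerShell and BIML; here's the same idea sketched in Python with a deliberately simplified, hypothetical template and table list:

```python
# Hypothetical table metadata; the real session derives this from the OLTP schema.
tables = ["Customer", "Orders"]

TEMPLATE = """<Package Name="Load_{table}">
  <Source Table="dbo.{table}" />
  <Destination Table="dw.Dim{table}" />
</Package>"""

# One generated package definition per table, driven entirely by metadata.
packages = {t: TEMPLATE.format(table=t) for t in tables}
```

Add a table to the metadata and the next run of the script produces its package for free; that's the whole appeal of generating ETL rather than hand-building it.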
The second session will demonstrate a number of ways you can use Service Broker's asynchronous messaging technology to offload database work that normally has to be done immediately to a (slightly) later time when the server is less busy. Or you can send some of that work to other servers. The limit here is your imagination, but Service Broker's technology ensures that your messages are always delivered, and, if you configure it properly, always in the order in which they were sent.
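Service Broker's in-order delivery within a conversation can be modeled with a toy buffer that holds back out-of-order messages until the gap is filled (a conceptual Python sketch of the guarantee, not how Service Broker is implemented internally):

```python
import heapq

class OrderedConversation:
    """Toy model of in-order delivery within a single conversation."""
    def __init__(self):
        self._next_seq = 0
        self._pending = []      # min-heap of (sequence, payload)
        self._delivered_to = -1

    def send(self, payload):
        msg = (self._next_seq, payload)  # stamp each message with a sequence number
        self._next_seq += 1
        return msg

    def receive(self, msg):
        heapq.heappush(self._pending, msg)
        out = []
        # Deliver only contiguous messages; hold back anything after a gap.
        while self._pending and self._pending[0][0] == self._delivered_to + 1:
            seq, payload = heapq.heappop(self._pending)
            self._delivered_to = seq
            out.append(payload)
        return out

conv = OrderedConversation()
m0, m1, m2 = conv.send("a"), conv.send("b"), conv.send("c")
# Even if m1 arrives first, nothing is delivered until m0 fills the gap.
delivered = conv.receive(m1) + conv.receive(m0) + conv.receive(m2)
```

The payoff is that the receiving service can process work items in the order they were sent, even over an unreliable transport.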
Here's a link to the sessions so you can view all the great sessions available at this year's event.
I've attended every PASS Summit since 2003, and could spend hours sharing how this singular event has helped me grow, both technically and in my career. It's well worth it to talk with your company about making your way to Seattle. You'll meet people who've been where you are now in your career, and make friends who will both help you solve problems in ways that'll make you look like a rock star, and who will help you grow your career in ways you might never expect.
Thank you to the PASS Program Committee for this honor, and I look forward to seeing all of you at the 2015 PASS Summit in Seattle!
It's been ten days since our SQL Saturday, when I announced that I was stepping down from my position as President of the Ohio North SQL Server Users Group, and I feel I should share some of my thoughts with you, the people who've made our group the envy of many in the worldwide SQL Server community.
Roughly ten years ago I started attending the user group meetings, and taking an active part. Over those ten years the group evolved into an organization independent of any one company, affiliated with the worldwide PASS organization, and one that has a large number of members who are willing, not only to share their knowledge and experience with SQL Server, but who are willing to help spread that knowledge and experience through events like the SQL Saturday we just held.
My appreciation for the efforts everyone invested is unbounded. You all are very much an active part of the wonderful organization we have.
Specifically, I have many people to thank. The people who've volunteered as officers of the group when we put formal words behind the group - Brian Davis, Erin Stellato and Tim Cepelnik - thank you for helping in so many ways, from taking responsibility for any number of tasks, to being my sounding boards and friends.
To the people who've volunteered in so many different ways to make this group function smoothly, including Paul Hiles, Adam Belebczuk, Craig Purnell, Colleen Morrow, Sam Nasr, Paul Popovitch, Jason Willis, Mike Rachocki, Steve Smith, Jeff Mlakar, Cory Stevenson, Dave Gabele, and many others over the years - thank you. We wouldn't be where we are today without you.
To the people who helped us organize the group legally (even though I didn't finish the task) - Michael Slade and Sarah Dutkiewicz - thank you so much. Your experience and guidance helped us get off on the right foot, and your volunteer efforts with our early events was invaluable!
To the people at Microsoft who supported us, gave us a space to meet, and provided the connection with Microsoft that helped legitimize the group - John Miller, Bruce Szabo and Lori Olson - thank you. It's been a real pleasure getting to know you, both personally and professionally.
To everyone who's stepped up to speak to our group, people like Mike Hays, Carlton Ramsey, Jim Arko and so many more, thank you. We are a better group because of your willingness to overcome that fear of speaking in public and share your knowledge with all of us.
To say that I'm proud of this group is indeed an understatement. Every year PASS makes a big deal about the user group sending the most members to the Summit (which, naturally, groups in the Pacific Northwest tend to dominate), but this past year we had FIVE members present at the Summit. I don't have any way to verify this, but I believe that no other group has had so many individual members present in a single year!
Now, I'm not going away. I'll be around, and will help in any way I can to keep this group moving forward. It's time, though, for others to grow in the SQL Server community in ways that I've been able to over the last ten years. You grow by challenging yourself to take on responsibilities you don't know you can achieve. The cool thing is that if you set a goal you have a pretty good chance of reaching it, so I'm stepping aside so others can achieve goals they've set.
And it's your job, individually, to help the group continue to grow, and in ways I can't possibly imagine. Please, step up and volunteer to help. Come to the meeting on March 3 and pick a new set of officers who'll set a new course for the group, to bring SQL Server to more people.
I'll bring an end to this long-winded ramble, but I'll do so by saying it once again: thank you.
Yesterday I received the news that I was the winner of the Tribal Awards in the category of Best Free PowerShell Script. This award means a great deal to me, as it's for something I gave to the community to share both a useful tool and a way to teach people more of the benefits of PowerShell.
Thank you to Red Gate and Simple Talk for putting the awards together, to promote the efforts of those who make the SQL Server community the best technical community available, and thank you to everyone who voted, who took the time to acknowledge what people are doing to make their lives better in our own unique way.
Adam Jorgensen, PASS Executive Vice President of Finance, came out to talk about the financial health of the PASS organization. PASS gets $5.9M revenue from the annual Summit, and $1.3M from the BA Conference. Other than that it receives $82K from Chapters and Events, and $260K from other sources. The money raised by the community goes back to the community, via the Summit, the BA Conference, SQL Saturdays, Virtual Chapters, etc.
Tom LaRock came out to thank Sri Sridharan and Olivier Matrat for their service on the PASS board. He then introduced Sanjay Mishra as the Microsoft board rep, and Grant Fritchey as the newly elected board member.
Denise McInerney came out to talk about all the viewers watching from all over the world. She talked about the personal growth path that brought her from a lonely DBA to where she is today as PASS Executive Vice President of Marketing, and how everyone here at the Summit can do the same thing by just reaching out to others.
She announced that this year's PASSion award winner is Andrey Korshikov, a SQL Server MVP and PASS Volunteer for the last 3 years. She also recognized those nominated for the award, and the monthly Outstanding Volunteer nominees.
Denise talked about the Business Analytics conference scheduled for April 20-22, 2015 in Santa Clara, California, and a board discussion about that conference on Friday afternoon at 2:15pm. She also talked about the Community Zone and how you can become an active part of a local chapter, or even starting one.
She announced that next year's Summit will be in Seattle, October 27-30, 2015.
Dr. Rimma Nehme, Principal Research Engineer, Microsoft Jim Gray Systems Lab
Cloud Databases 101
Dr. Nehme talked about her background, why she has an accent (born in Belarus), how she knows about databases, and how she's learning about business administration, and how much respect she has for the PASS community.
Cloud technology is surrounded by all kinds of misconceptions, which she referred to as "Shiny Object Syndrome".
Cloud Computing is defined as computing and software resources that are delivered on demand, as a service, that is always on, and accessible from anywhere. (You can blame networking people for the name "cloud", based on the old network diagrams indicating a network cloud for WANs.)
The characteristics of cloud computing include on-demand self-service, location transparent resource pooling, ubiquitous access and elastic capacity. It offers quick and easy deployment for solutions with almost no need for provisioning. It doesn't require any capital expenditure, so ramp up is quick and easy. With the pay as you go model, it makes the cost benefit analysis simple.
She talked about the history of computing for the "masses", starting with the mainframe world of the 1960s, the arrival of Salesforce in the 1990s, Amazon Web Services in 2002, and on. She then gave us insight into the cloud data center. Pictures of the Chicago Data Center look like an indoor trailer park, but each of those containers contains thousands of servers. A data center is evaluated based on its efficiency, and improving the Power Usage Effectiveness (PUE) reduces the effective cost of the computing resources we're using in the cloud. Traditional data centers have a PUE value of 2.0, while the modular systems have a PUE value of 1.15.
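Since PUE is total facility power divided by IT equipment power, the difference between those two PUE figures is easy to quantify (the 1 MW IT load below is an illustrative number, not from the talk):

```python
def total_power_kw(it_load_kw: float, pue: float) -> float:
    """PUE = total facility power / IT equipment power."""
    return it_load_kw * pue

traditional = total_power_kw(1000.0, 2.0)   # 2000 kW drawn for 1 MW of IT load
modular = total_power_kw(1000.0, 1.15)      # 1150 kW for the same IT load
savings = traditional - modular             # 850 kW of overhead eliminated
```

At a PUE of 2.0, every watt of computing costs a second watt of cooling and distribution overhead; the modular design cuts that overhead by more than 80%.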
Why Cloud? Elasticity, No Cap Ex, Pay per use, focus on business, and fast time to market.
Cloud service has three main layers. Infrastructure, platforms and applications.
She equated cloud services to a model she called 'Pizza as a Service'. Self-managed is like buying all the ingredients and making the pizza and enjoying it at home. Infrastructure-as-a-Service is like buying a pre-packaged set of ingredients, but you make it and eat it at home. Platform-as-a-Service is like ordering a pizza for delivery to your home, and Software-as-a-Service is like going out to the restaurant and enjoying your pizza at their location.
Dr. Nehme then explained virtualization and how it brings efficiency to the use of servers, using a house example, where the resources available in the house can be scaled up as required. Cloud services use virtualization to maximize use of the servers in the data center. She continued her analogies talking about Service Level Agreements. Azure SQL DB SLAs are 99.99% (four nines), which translates to about 53 minutes of down time per year. Azure SQL Database is designed with high availability in mind. A single database has up to three replicas at any given time.
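The "about 53 minutes" figure for a four-nines SLA follows directly from the arithmetic:

```python
def downtime_minutes_per_year(sla: float) -> float:
    """Allowed downtime implied by an availability SLA."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
    return (1.0 - sla) * minutes_per_year

four_nines = downtime_minutes_per_year(0.9999)  # about 52.6 minutes per year
```

The same function shows why each extra nine matters: three nines allows nearly nine hours of downtime a year, while five nines allows barely five minutes.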
The question is asked, "do we still need a DBA in the Cloud era?" and the answer is unequivocally Yes! With the idea of augmenting on-premises infrastructure with cloud resources, with the ability to "stretch" on-prem database historical data into the cloud, a DBA is just as critical to the process now as ever before.
After a standing ovation for Dr. Nehme's discussion, Dr. David DeWitt came out to acknowledge what a great presentation she gave.
Tom LaRock opened up the keynote with his official "Hi Tom!" - welcoming not just the thousands in the room but the people logged in from around the world, representing over 50 countries and over 2,000 companies. This is our community, and those present will help you grow your careers. Since 2008, PASS has provided 1.3 million technical training hours.
TK "Ranga" Rengarajan - CVP Data Platform
Ranga started with a bit of background, growing up in India, then studying under Dr. David DeWitt, then joining first Digital Equipment Corporation, then Sybase.
There's been an explosion of data sources, which drives an explosion of data, which drives businesses to learn more. Every year the amount of data generated grows by 40%, and there has to be a way to manage that data. It's an enormous opportunity for us. This data is going to drive future productivity. We have an opportunity and a challenge to provide solutions to the problems that this data generates. The new data culture will allow everyone to do more and achieve more in their careers.
The Microsoft data platform allows you to capture and manage that data. It's a comprehensive data platform that encompasses all the ways you can capture and store that data. The platform works in memory and on disk, on premises and in the cloud. It handles operational data and real-time data, structured and unstructured data, scale-up and scale-out solutions.
We need to capture diverse data with no limits on what you can do, via elastic scale. Maximize the performance and availability, and simplify with cloud solutions.
Azure DocumentDB is a NoSQL database service that's schema-free, supporting consistency models ranging from ACID-style strong consistency to eventual consistency. Azure HDInsight is a 100% Hadoop service, with the flexibility and scalability Hadoop provides. The Analytics Platform System provides the PolyBase appliance, combining SQL and Hadoop in a single platform.
The SQL Server platforms available are SQL Server 2014 on Windows Server 2012 R2, SQL Server in Azure VMs, and Azure SQL Database solutions, all to provide elastic scale.
Ranga announced a major update to Azure SQL Database, allowing you to do more with the cloud. Improved T-SQL compatibility, larger index handling, parallel queries, extended events, and in-memory columnstore for data marts are features included in the new update.
Joseph Sirosh - CVP Machine Learning & Information Management
Joseph spent about 9 years at Amazon before joining Microsoft and hasn't ever seen anything like the PASS community. The "PASS Community Rocks!"
With data you want to understand the past, analyze the present, and predict what's next. Azure Data Factory is a platform like SSIS in the cloud to manage the data you have both on-prem and in the cloud. Azure Stream Analytics allows you to manage data in motion, analyzing the data in the present. Azure Machine Learning allows predictive analytics to be available to more organizations.
Sanjay Soni demonstrated how Pier One uses Microsoft Kinect to analyze the traffic patterns in its Seattle store. The Kinect sensor allows them to see exactly where customers are spending their time in the store. Using Azure Data Factory, they can manage the Kinect data to provide quality analytics on that location data.
James Phillips - GM Data Experiences
James joined Microsoft two years ago after starting two companies in Silicon Valley.
Data is just a bucket of potential until you get it to users. We're not only removing the overhead, but continuing to provide oversight capabilities.
Simplify the data discovery with PowerQuery and PowerPivot. Deliver faster time to insight via Power BI and Q&A, which is a natural language query in Power BI. Connect to on-premises data via Data Refresh to schedule the Power BI data refresh and Interactive Query to view Analysis Services data via Power BI. Finally, enable a data culture using Live Dashboards and Drill Through capabilities, all provided via Power BI.
Ranga came back out and shared that Azure Machine Learning is now available for free; all you need is a login via Microsoft Live ID.
There's a lot of discussion about the process of bringing new speakers to a level that allows them to be ready and able to present at major conferences like the PASS Summit. Andy Warren (@sqlandy) wrote a blog post about a speaker challenge and Brent Ozar (@BrentO) wrote about Speaker 47. Erin Stellato (@erinstellato) responded to An Open Letter To SQLSaturday & User Group Organizers by Nic Cain (@SirSQL) with a post about Helping First Time Presenters.
The most important thing to remember when helping develop and improve the breadth, depth and range of presentations is that we all started someplace. At every user group meeting of my group, the Ohio North SQL Server Users Group, I share what others call my "spiel". I share it in every user group meeting of other groups I attend, and in every presentation I give, be it a SQL Saturday, the PASS Summit or any other event I've been invited to speak. Here's what I say:
There isn't a person in this room who doesn't have some knowledge that we can all learn from. In other words, every one of you has something that I can learn from, but the only way that can happen is if you get up here and share it with the rest of us. It does two things. One, we get to learn from you. Two, you get to learn more about something you're already passionate about. You have to know more about something to present it, than to just do it every day. By sharing it with us we learn from you and you learn it better.
Now I don't mean for someone to get up the very first time and expect to be at a level that's ready for a major conference. That takes experience. That takes understanding that someone in the audience isn't really interested in your topic, and it's OK if they get up and leave. That also takes understanding that someone in the audience wants to prove that they know more than you know about your subject. I've seen this happen to both new speakers and to very experienced ones. Those of us who have been on the speaking circuit for a while have dealt with those people, and I encourage this group to help the newbie by letting the offending audience member know that their comments can wait until after the presentation is over. (There's no "good" way to handle this kind of heckler, and it's best to get them to shut up or leave.)
I like Erin's idea about a "buddy" system, to help each other out. It allows us to provide new speakers the kind of feedback they won't get on an evaluation form, and it provides moral support. I feel extremely proud that five people from our user group in Cleveland will be presenting at this year's PASS Summit, including both Erin and me. I think this stems from my "spiel" and the supportive approach we take during user group meetings where new speakers present.
Brent has some good points about the PASS Summit requiring the best speakers. The rating system in place isn't objective enough for ratings to be used exclusively, though. Speakers often get bad ratings because of things out of the speaker's control - the temperature of the room, random disturbances outside the room, poor audio or video projection systems, etc. There also doesn't seem to be a way to let attendees know what to expect, and even when there is, attendees often pay little attention to prerequisites or session goals. Everyone has their own agenda, and that's the criteria by which the speaker is rated. I don't know how to fix this, but it deserves some attention.
Most importantly, while we need to see the speakers we know will "deliver the goods", we also need fresh faces and new ideas. My "spiel" is my way of encouraging new speakers, and I think we're successful. SQL Saturdays offer a great avenue for new and experienced speakers to learn from each other. I ask my experienced colleagues to lend a hand and help new people wherever possible, and to attend their sessions, even on topics they already know thoroughly. (I once attended a "Basic T-SQL Backup" session by my friend and SQL Server MVP/MCM Sean McCown and learned things about backup I hadn't known, after using backup for 20 years.) By attending these sessions you provide support to the new speaker, you can intervene in the case of a negative attendee scenario, and you also just might learn something.
We're all in this together, and we all grow with each other's help.
We're anxiously waiting to hear from PASS which sessions were selected for the 2014 Summit in November. It's a big job to go through the hundreds of submissions and pick the sessions that will appeal to the people who will be paying over $1,000 to attend this annual event. While waiting to hear the results myself, I saw this article addressed to actors who didn't get cast for the part they worked so hard to audition for, and it seemed appropriate to raise the same issues for would-be Summit speakers.
So, given that I've been a speaker at many events, been rejected from many others, and have been, as a PASS Chapter leader and SQL Saturday organizer, in a position to select the sessions for an event, I hope I bring a little bit of perspective to the process. With that in mind, here's a list of reasons why your submission may not have been selected. (Note that I am not and never have been on the selection committee for the PASS Summit, so nothing that I say here reflects discussions that have directly impacted the Summit selection process.)
1) Your abstract was interesting, and your title was engaging, but Microsoft submitted a session almost identical to the one you submitted, and as a co-founder of PASS, and the reason we're all able to attend this great event, they have some pull. If they want to present the session in question, their session will take precedence over yours. There's nothing wrong with your submission; it just got bumped by another.
2) Your abstract was interesting, your title was engaging, and it seems like it'd be a good session, but another session almost identical to yours was submitted by an industry leader, someone who has strong name recognition and a great reputation for delivering sessions that fill the session rooms and consistently rate high in evaluation scores. Remember that the goal of the conference is to get as many paying attendees as possible, and having a person who's known to put "butts in the seats" is going to take precedence over your session. This isn't about you, this is about what's best for the conference.
3) Your abstract was interesting, your title was engaging, but there were too many sessions submitted for that particular track, and since they only had so many slots to fill, they had to draw the line somewhere. It may not seem fair, and it may seem to you that there should be more sessions in that particular track, but the conference organizers determined, before submissions opened, how many sessions would be selected for each track, and there were just too many submissions in that one.
4) Your abstract was good, the title was interesting, but you'd had some problems in the past meeting what attendees expected from the presentation. Remember that they're paying a lot of money, both in conference fees and travel expenses, to be at this event. The conference organizers have to know that the presentations will be at the top level to justify those expenses, and they chose another session that more closely aligns with that goal. You can work on those issues at user group meetings and SQL Saturdays, and that will reflect well in future events.
5) Your abstract was good, but the title was dull. This is hard. How do you come up with a title that'll grab people's attention, but without going over the top? The best thing to do is to look at the sessions over previous events and see what wording grabs your attention. It has to reflect what you're planning to deliver, but a session title like "Improving Query Performance" just isn't going to attract many people to your session. Remember, the title will attract people to your abstract, and that will bring people to your session. (Unless you're Conor Cunningham, and then everyone will come to your session because you're Conor Cunningham regardless of the title or abstract.)
6) Your title was good, but your abstract was dull. Dull is hard to define, but it could be uninteresting, too long, or written in a way that tells the attendee this session may not live up to the title's promise. It's important to be concise, but accurately convey what the attendee should expect to gain by attending your session. It's also important to be enthusiastic about your subject, because if you aren't, why should they be?
7) Your title was dull, your abstract was worse. Sorry, but this happens, too. Look at the sessions from Summits past and work on developing titles and abstracts that will appeal to the selection committee. Remember, this conference isn't about you, it's about getting people using SQL Server to come to the biggest SQL Server conference on the planet, and it needs to be the best. You have good ideas, you just need to work on presenting them in a way that's attractive.
So, those are my thoughts. I hope the sessions I submitted will be selected this year. One of the things I love to do is to share what I've learned with others, to help them grow as SQL Server professionals. Hopefully I'll get to do that again this year.
Best of luck to all of you.
In the 1980s one of my principal responsibilities was enabling communications between retail point-of-sale systems and the host computer where we processed those transactions. Communications protocols were many and varied, and I had to figure out their nuances and get the registers to talk to the hosts. Success was most often achieved when, after sending a message to the remote system, I received back a message called an Ack, an acknowledgement that the message had been received successfully.
In recent attempts at communication (via email, mostly), I've been finding that the receiving party doesn't feel the overwhelming need to let me know that the communication was received, and this is extremely frustrating to me. I have taken to asking questions that need to be answered, just to ensure that the message is being delivered. (I really already know the answer, but it gets the respondent to acknowledge the message.)
Communication is key to success, whether in a project, a business relationship, or any type of relationship, really. Without two-way communication, assumptions get made, and assumptions can cause that relationship (or your database servers) to break down. That's generally a bad thing, and I try to avoid bad things.
So really, send an Ack. It's not hard and lets the sender know you're there, and the project is still on track.
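The Ack pattern from my point-of-sale days is easy to sketch. Here's a minimal toy in Python (not the protocol I actually used back then, just the idea): the sender posts a message, then blocks until the receiver acknowledges it, so silence is detectable rather than ambiguous.

```python
import queue
import threading

def receiver(inbox: queue.Queue, outbox: queue.Queue) -> None:
    """Receive one message and send back an Ack so the sender knows it arrived."""
    message = inbox.get()
    outbox.put(("ACK", message))

# The sender posts a message, then waits for the Ack to come back.
inbox, outbox = queue.Queue(), queue.Queue()
worker = threading.Thread(target=receiver, args=(inbox, outbox))
worker.start()

inbox.put("transaction batch 42")
status, echoed = outbox.get(timeout=5)  # raises queue.Empty if no Ack arrives
worker.join()

print(status, echoed)  # ACK transaction batch 42
```

The `timeout` is the important part: if no Ack arrives, the sender finds out and can resend or escalate, instead of assuming the message landed.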
On Saturday I'll be presenting one of my favorite sessions, Manage SQL Server Efficiently with PowerShell Remoting, at the Las Vegas SQL Saturday. Here's the abstract:
You have more and more servers to manage and less time to accomplish everything. You're writing scripts to automate those tasks, but they still take time to run. PowerShell remoting allows you to manage servers without the overhead of Remote Desktop, and allows you to run processes on all your servers simultaneously. In this session we'll walk through how PowerShell remoting works, how to set it up, and how it can help you get things done more quickly.
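The time savings in that abstract come from fan-out: instead of visiting servers one at a time, you run the same task against all of them at once. Here's a language-agnostic sketch of that idea in Python (not PowerShell remoting itself; the server names and the simulated per-server task are made up for illustration):

```python
import time
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["sql01", "sql02", "sql03", "sql04"]  # hypothetical server names

def check_server(name: str) -> str:
    """Stand-in for a real per-server task (e.g. collecting a health report)."""
    time.sleep(0.2)  # simulate a task that takes a fixed amount of time
    return f"{name}: OK"

# Serial: total time is roughly 0.2s times the number of servers.
start = time.perf_counter()
serial = [check_server(s) for s in SERVERS]
serial_secs = time.perf_counter() - start

# Fan-out: all servers are checked at once, so total time is roughly 0.2s.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
    parallel = list(pool.map(check_server, SERVERS))
parallel_secs = time.perf_counter() - start

print(serial == parallel, parallel_secs < serial_secs)
```

Same results, a fraction of the wall-clock time; and the gap grows with every server you add, which is exactly why remoting "just feels right" at scale.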
I've been talking about using PowerShell to manage SQL Server for a long time now, but the ability to manage multiple servers simultaneously just feels right. I even built a new set of VMs to demonstrate the scripts using SQL Server 2014, which just became available yesterday!
I look forward to seeing you at SQLSaturday #295!
Good morning! It's Day 2 of the PASS Summit 2013 and it should be a busy one.
Douglas McDowell, EVP of Finance for PASS, opened the keynote by welcoming people and talking about the financial status of the organization. Last year's Business Analytics Conference left the organization $100,000 ahead, and he went on to show the overall financial health of PASS, which is very good at this point. Bill Graziano came out to thank Doug, Rob Farley and Rushabh Mehta for their service on the board as they step down from their positions.
Tom LaRock introduced the new executive board, including Adam Jorgensen as the Executive Vice President and Denise McInerney as the VP of Marketing, and he introduced the new incoming board members, Jen Stirrup, Tim Ford and Amy Lewis.
The PASS Business Analytics Conference will be in early May in San Jose, California, and next year's PASS Summit will be in Seattle from November 4-7. Tom invited everyone to the WIT luncheon here in the Cisco Crown Ballroom, to the Birds of a Feather lunch tomorrow, and to the Community Appreciation Party tonight at the NASCAR Hall of Fame.
Today's keynote speaker is David DeWitt, Technical Fellow at the Microsoft Jim Gray Systems Lab, talking about "Hekaton: What, Why and How."
Dr. DeWitt seems to think we'll be bored with his talk, and he couldn't be further from the truth. He always explains really complicated things in a way the rest of us can really understand.
He calls Hekaton an "OLTP Rocket Ship". It's memory-optimized, but durable, and fully integrated into SQL Server 2014. It's architected for modern CPUs.
OLTP performance has benefitted from CPU performance improvements over the years, but those hardware improvements have pretty much maxed out. "Hekaton" is the Greek word for 100, and the name reflects the goal: a 100X performance improvement. How do we get there?
Pinning tables in memory still has problems: performance is still limited by latches and locks, and by the interpretation of query plans. Latches must be used to protect data in the buffer pool, but they cause contention for other processes attempting to read the same data.
Hekaton uses lock-free data structures, it versions rows with timestamps under optimistic concurrency control, and queries are compiled into a DLL to improve performance dramatically. SQL Server now has three query engines under the hood: Apollo (the column store index processor), the classic relational query processor, and Hekaton. Hekaton essentially uses versioned views of the data under the optimistic model to provide high-speed throughput. Dr. DeWitt's discussion was detailed and thorough, and it would greatly benefit you to view the recording if you didn't see it live.
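The optimistic, versioned approach Dr. DeWitt described can be sketched in a few lines. This is a toy illustration of the general technique (version validation at commit time), not Hekaton's actual implementation: readers take a lock-free snapshot of a row plus its version, and a writer's commit succeeds only if the version is unchanged; otherwise the writer retries from a fresh read.

```python
class Row:
    """A toy versioned row: readers never block, writers validate at commit."""
    def __init__(self, value):
        self.value = value
        self.version = 0

    def read(self):
        # Readers take a snapshot (value + version) without any locks.
        return self.value, self.version

    def try_commit(self, new_value, version_read):
        # Validation: commit only if no other writer got in first.
        if self.version != version_read:
            return False  # conflict: the caller must retry from a fresh read
        self.value = new_value
        self.version += 1
        return True

row = Row(100)

# Two transactions read the same version...
v1 = row.read()
v2 = row.read()

# ...the first commits, bumping the version...
assert row.try_commit(v1[0] + 10, v1[1]) is True

# ...so the second fails validation and retries against the new version.
assert row.try_commit(v2[0] + 5, v2[1]) is False
retry_value, retry_version = row.read()
assert row.try_commit(retry_value + 5, retry_version) is True
print(row.value)  # 115
```

The optimistic bet is that conflicts are rare, so the occasional retry is far cheaper than making every reader and writer take latches and locks up front.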
Today at 4:45PM EDT I'm presenting a new session using PowerShell to auto-generate SSIS packages via the Biml language. The really cool thing is that this session will be broadcast live on PASS TV! You can view the session by clicking on this link.
If you have questions for me during the session, you can send them to me via Twitter using this hashtag:
Brian Davis, my good friend from the Ohio North SQL Server Users Group, will be monitoring that hashtag and feeding me the questions that I can answer during the session.
I look forward to hearing from you on this great topic.
Update: The session (to me) went really well, and I appreciate everyone who attended. I've uploaded the slides and demo scripts to this post. AW.
It's SQL Server Geek Week once again! Every year at the PASS Summit the SQL Server faithful descend on the city of choice for the annual Summit, and this year it's Charlotte, North Carolina. Once again I've been given the privilege of sitting at the bloggers table, so my laptop is on a table!
So far this week it's been great seeing people I get to see just once a year. I attended Red Gate's SQL in the City event on Monday, and saw some great sessions from Grant Fritchey, Steve Jones and Nigel Sammy. On Tuesday I was invited to attend the Biml Workshop, put on by Varigence, and you'll see a lot of great things happening in the BI space in the near future from them.
This morning started off with a 3.3 mile run, organized by Jes Borland, and sponsored by SQL Sentry, called #sqlrun, and that was a great way to start off the event.
Bill Graziano pointed out that over 700,000 technical training hours have been provided by the PASS organization, including chapters, virtual chapters, SQL Saturday, 24 Hours of PASS and SQL Rally events. Without the volunteers who make these events happen, we couldn't reach nearly as many people. He also introduced Amy Lewis as this year's PASSion award winner, for outstanding volunteer effort. Amy was one of the people recently elected to the PASS Board of Directors for the coming year. He also thanked Ryan Adams in a special "honorable mention" for his volunteer work, and thanked all the PASS volunteers for making these events happen.
The keynote speaker this year is Quentin Clark, Corporate VP at Microsoft for the Data Platform Group. He starts out by saying that today's talk is about "listening to you". Instead of everything being about "the cloud", they're now talking hybrid solutions, and that's great, because not everything should be pushed to cloud solutions. He announced that SQL Server 2014 CTP2 is now public and available for download. He also confirmed that these are the final "production ready" bits, so it should be feature-complete. (At last night's get-together I was told that it's "almost" ready, so I'm sure anything glaringly wrong can still be fixed, but it had better be critical.)
With the new features in SQL Server 2014, including the Hekaton bits, SQL Server can provide up to 30X OLTP performance gains, up to 100X faster star join queries, and up to 90% disk space savings over previous offerings. There's no need to rewrite existing apps, and it's incorporated into the core engine, not a special add-in.
Tracy Daugherty, Program Manager at Microsoft, came on stage to demonstrate some of the new in-memory features of SQL Server 2014. He built a demo that simulated 20,000 users simultaneously performing the same actions he performed during the demo. In the baseline run, game recommendations were generated in 6.2 seconds and the purchase completed in 4.0 seconds. After converting to the in-memory features, the exact same activity saw roughly a 9x performance boost: recommendations came up in 0.7 seconds and the purchase completed in 0.1 seconds. The hot list generation baseline took 26 minutes to build; after conversion to in-memory technology, it took 0.4 minutes.
Quentin also announced the ability to back up all supported versions of SQL Server (2005-2014) to Windows Azure, so you've got instant off-site backup without having to spin up your own off-site storage solution. AND they've added the ability to encrypt the backups as they're created! (This is really a great new feature, protecting your backups from unauthorized access.) Tracy demonstrated a feature called smart backup, which automatically figures out whether a "significant" amount of change has occurred and, if so, automatically kicks off a backup. They've provided a free download that allows you to back up SQL Server 2005, 2008 and 2012 databases to Azure storage, with encryption. Yes!
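The "significant change" idea behind smart backup is a simple threshold trigger. Here's a toy Python sketch of that general pattern; the 20% threshold and page-counting approach are my own assumptions for illustration, not Microsoft's actual heuristic, which wasn't detailed in the keynote:

```python
def should_backup(changed_pages: int, total_pages: int,
                  threshold: float = 0.2) -> bool:
    """Toy change-threshold trigger: back up once a 'significant' fraction
    of pages has changed. The 20% default is an arbitrary assumption."""
    if total_pages <= 0:
        return False  # nothing to protect yet
    return changed_pages / total_pages >= threshold

print(should_backup(50, 1000))   # False: only 5% of pages changed
print(should_backup(300, 1000))  # True: 30% of pages changed
```

The appeal of automating this is that backup frequency tracks the rate of change in the database, rather than a fixed clock schedule that backs up idle databases and under-protects busy ones.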
Data warehousing and "big data" are also being targeted for hybrid solutions, with a focus on using HDInsight, data warehouse virtual machines and PDW spread across on-premises systems and the cloud for better performance. Using the new technology, one company has reduced DW load times from days to hours.
The Power Suite (Power Query, Power Pivot, Power View and Power Map) provides "real-time" insights for everyone, according to Quentin. He said that "everyone can ask the question", so they're trying to simplify the ability to get answers to those questions. Kamal Hathi, Program Manager, came out to demonstrate the Power BI features. The simplicity of the way he pulled data from the source with simple questions reminds me of what they tried to do with "English Query" back in the SQL Server 2000 days, but this looks effective.
If you go to www.facebook.com/microsoftbi, you can participate in a Power BI contest, to show how you are pushing boundaries with Power BI. Top ten winners get the new Xbox One.
This week I had the honor of presenting two sessions at the IT/Dev Connections conference in Las Vegas. My two sessions were Manage SQL Server 2012 on Windows Server Core with PowerShell and Manage SQL Server Efficiently with PowerShell Remoting. I think both sessions went well, and the attendees indicated that they will be able to use what I presented as soon as they get back to the office, and to me that's the best praise I could get. I meant to post the session materials the next day, but I was busy taking advantage of the conference to attend other great sessions myself.
So here's the slide deck and demo scripts from both sessions, and thank you for attending my sessions!