PASS Processes and Results posted

Openness

I've been encouraged to blog openly about the volunteer work that I do for PASS and the processes we use to get it done. This post is the first in what I hope will be a long series outlining the different things that have to happen to bring a >somewhat< seamless experience to the SQL community.

PASS Processes – what you (don't) want to know

The final PASS Summit session evaluation results have been emailed out to the speakers, bringing an interesting month and a half of PASS work to a close for me. Back on the 24th of November I asked for some help to pull the session evaluations together and generate some results. As it turns out, I had a huge outpouring of support from everyone wanting to help (thanks again!). In the end, though, I wound up working with two volunteers, Tim Mitchell and Christina Leo, as well as Elena Sebastiano from PASS HQ, to make this work…

To get to the end you have to start at the beginning

I've been involved with the Program Committee in various ways since 2006, so I have eval counts going back to 2005. We've tried various ways of raising the evaluation return rate over the years, but until 2009 we had little luck improving it. This is a classic example of "be careful what you wish for, because you just may get it": the 2009 return was an amazing 336% of the 2008 count.

2005 — 3518

2006 — 2114

2007 — 2991

2008 — 2379

2009 — 8008
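As a quick sanity check on that jump, here is the 2009 count relative to 2008, computed straight from the numbers above:

```python
# Evaluation counts per Summit year, as listed above.
counts = {2005: 3518, 2006: 2114, 2007: 2991, 2008: 2379, 2009: 8008}

# 2009 returns relative to the 2008 count.
ratio = counts[2009] / counts[2008]
print(f"2009 evals were {ratio:.1%} of the 2008 count")  # 336.6%
```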

While the added evaluations will be of great use to everyone, they created an unanticipated problem: they all had to be entered manually. PASS hired a temp to enter the data, and since we didn't have an accessible, purpose-built database to store it in, we decided to use Zoomerang. The evaluations were entered directly into a Zoomerang survey, and the results were extracted into an Excel file that was delivered to me.

Once we had the session results in hand, Christina went through the rather painful process of cleansing the data and getting it into a usable format. The data was loaded into a SQL Server database, where Tim spent his time building an SSIS package to extract the data and put it into individual Excel spreadsheets that could be emailed to the speakers. Once this was complete, I took a preformatted email Elena had wordsmithed for me and built an additional SSIS package that would read the email addresses from the DB and send out each speaker's spreadsheet as an attachment. This was an excellent opportunity to expand my SQL skills. I don't get to use SSIS in my current position, and I always learn better when I have a real problem that needs solving, so I enjoyed the work.
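The actual pipeline was SSIS with Excel output, but the split-by-speaker idea behind it is simple. Here's a minimal Python sketch, using made-up eval rows and CSV as a stand-in for the Excel attachments:

```python
import csv
import io
from collections import defaultdict

# Hypothetical flattened eval rows: (speaker_email, session, question, score).
# The real process used SQL Server + SSIS; this just shows the same
# one-attachment-per-speaker split.
ROWS = [
    ("a@example.com", "Indexing Deep Dive", "Overall", 5),
    ("a@example.com", "Indexing Deep Dive", "Demos", 4),
    ("b@example.com", "SSIS Patterns", "Overall", 5),
]

def split_by_speaker(rows):
    """Group eval rows by speaker email, one bucket per speaker."""
    per_speaker = defaultdict(list)
    for email, session, question, score in rows:
        per_speaker[email].append((session, question, score))
    return per_speaker

def to_csv(rows):
    """Render one speaker's rows as CSV text (the Excel stand-in here)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["session", "question", "score"])
    writer.writerows(rows)
    return buf.getvalue()

# One attachment body per speaker, keyed by the address it gets mailed to.
attachments = {email: to_csv(rows) for email, rows in split_by_speaker(ROWS).items()}
print(sorted(attachments))  # ['a@example.com', 'b@example.com']
```

From there, the emailing step is just iterating over `attachments` and sending each body to its key, which is what the second SSIS package did with the Excel files.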

All was going perfectly, and I was about to move on to my next task when the emails started to flow in from speakers asking where their results were, since their spreadsheets were blank. This caused me to absolutely PANIC. I immediately started trying to verify where the mixup was; when managing a process with so many moving parts, there's always a chance that it was something in the process. After verifying that the evals weren't in the original dataset, I felt quite a relief: it wasn't something in our process that ate the evals. It was something far more sinister…
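The verification itself boils down to a set difference: which scheduled sessions have no eval rows in the extract at all? A minimal sketch with hypothetical session IDs (the real check was run against the SQL Server tables):

```python
# Hypothetical session IDs: what was on the schedule vs. what actually
# showed up in the Zoomerang extract.
scheduled_sessions = {"S101", "S102", "S103", "S104"}
sessions_in_extract = {"S101", "S103"}

# Sessions with no eval rows at all -- the blank-spreadsheet speakers.
missing = scheduled_sessions - sessions_in_extract
print(sorted(missing))  # ['S102', 'S104']
```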

The case of the missing evals

I contacted HQ about the issue first thing in the morning, and they were obviously thinking the worst, as was I. A few phone calls and emails later, the options were "lost in the Zoomerang DB", "entered incorrectly", or "lost in transit". I wound up getting an email at about 9 PM titled "Crisis Averted". Even though the crisis wasn't any of my doing, you can imagine the relief when I heard that an envelope (or several) containing just over 1,400 evaluations had been found at HQ. They were apparently misplaced during the transit of the hundreds (thousands?) of boxes returning to HQ from the Summit.

Now comes the hard part

About 48 hours later I got a new extract with all of the missing data in it. I can only assume, given how quickly we got those 1,400 evaluations, that every free hand at HQ was working furiously to get them entered. As it turns out, Christina was in Europe and unavailable to recreate what she had originally done, and Tim was busy, so I took on the task of recreating the process from the first run. Luckily, I had the source to Tim's SSIS package, so that wouldn't be too much trouble. After about five more hours of work I had the data loaded into the proper tables and ready to be reported on. The process was updated, everything was re-run, and with that all the speakers got their evals and were happy. Success!

Reporting on the data

I proposed that we generate a page for the Summit 09 site with the top 10 sessions and various other data/metrics, primarily for use by the speakers. In the end it turned out that this info is very valuable to PASS for generating interest in the quality of the educational opportunities at the Summit. Since there is a value-add, we had to work out how to "properly" release this data. Not a big deal, just an aspect some members of the community might not have even thought of. (I know I hadn't.)

The Grand Finale

The link, which I hope will be of some interest to both speakers and potential conference attendees:

There you should find the top 10 sessions overall, the top 5 sessions per track, and all sorts of other data that I extracted from the evaluation database. It's also worth noting that these pages link directly to the presentations (and the recordings, for Summit 09 attendees), so you can relive the best of the best today.

Did I miss something that you think is valuable? Let me know and I'll see about getting it added!

Takeaways

PASS has some very "interesting" processes backing up the front end, and there is definitely room for improvement. The biggest issue: how do you improve a process like this one without spending much (or any) money?

We need to design a database >gasp< to hold the speaker eval information rather than relying on a third party that only exports to Excel.

We've already enacted a change for 2010: the registration group will enter the evaluations from the paper forms immediately after they are collected, which should kill the delay in getting results back to the community. We >may< also go to a split online/paper eval process, but I'm hesitant to mess with a process that just produced such a huge improvement, especially after an earlier online attempt yielded less-than-stellar results.

If we combine these two items, I think it would be outstanding to have a real-time update on the main PASS website during the Summit showing the top 10 sessions so far, and maybe even a "reserved slot" for a repeat of the top session per track?

The scoring system that we used to deliver the results (very poor, poor, average, very good, excellent) did not work well; we will go back to using only the numbers 1-5 next year.
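For anyone post-processing this year's results, mapping the text labels back onto the 1-5 scale is straightforward. A small sketch (the exact label-to-number mapping is my assumption, based on the five labels above):

```python
# Assumed mapping from the 2009 text labels back onto the 1-5 numeric scale.
LABELS = {"very poor": 1, "poor": 2, "average": 3, "very good": 4, "excellent": 5}

def score(label):
    """Convert a text rating to its numeric equivalent."""
    return LABELS[label.strip().lower()]

# Averaging a session's ratings once they're numeric again.
avg = sum(score(l) for l in ["excellent", "very good", "excellent"]) / 3
print(round(avg, 2))  # 4.67
```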

I'd estimate that I spent somewhere between 50 and 60 hours completing this task, and I'll admit that some of that was learning new things in SSIS, but you'd be amazed how many emails it took to put this piece of info out for all to see.

Have you considered doing an online eval form via a website? I'd be willing to bet that most of your attendees brought their own laptops. If your venue has Wi-Fi access you could get the evals done online and be able to chart the data in real time! The few people who don't bring laptops could always use some PCs in a common area set aside for evals between talks.

Jeremy, we have absolutely used an online process before. In 2006 we had nothing but online surveys, which was an absolute failure; we didn't get nearly the return rate we need. In 2007 we had online and paper and got a slightly better rate, but still not what we wanted. So in '09 we changed the method by which the paper surveys were handed out, as well as the method by which they were collected. It seems we have hit on a winner, but, in the never-satisfied realm, I think next year we will do a split online/paper eval system. I've also started investigating what it would take to do SMS voting (thanks for the idea, Andy Warren).
