
SQLSaturday #8 Session Evaluations


Session evaluations are one real benefit that speakers derive from participating in community events - a chance to see how they did and maybe even get some ideas on how to improve. The challenge is that most attendees tend to fill out the eval only if the session was really good or really bad. We tried to fix that this year by asking the questions in a different way, and also by treating each completed eval as a raffle ticket in the end-of-session drawing for a couple of books.

Here's the form we used:

[Image: the session evaluation form]

And thanks to my friend and volunteer Mike Antonovich, we've finally got the compiled results - here's a sample of the output:

[Image: sample of the compiled evaluation results]

Full results can be downloaded directly from http://www.sqlsaturday.com/files/SQLSat8EvaluationResultsBySession.xps. If you look, you'll see one speaker with especially bad results, and it wasn't his fault; we couldn't get the projector to work with his laptop (weird VGA connectors), and an attempt to use a loaner didn't work out. Aside from that, the results are pretty good!

Is it enough feedback? The right feedback? I'm open to discussion but I think this is decent.

One of the things I hope to add for next time is a short video that reviews what we're looking for in evals and why they matter. I've put off adding web support for capturing and reporting on this data until we could lock in some type of standard. Ideally we would collect the forms at the end of each session and then have a couple of volunteers key the data while the event was in progress. Direct capture would be nice, but trying to set up kiosk machines just adds complexity and a potential bottleneck. I've considered some type of cell phone solution, and maybe that's worth a look. I lean towards analog and old-fashioned because it reduces the complexity and hopefully leverages the volunteers.
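
If volunteers do key the forms into a spreadsheet during the event, compiling per-session averages afterwards is a small job. Here's a minimal sketch of how that could look once the data is exported to a CSV - the file name and the SessionTitle/Rating column names are just placeholders for illustration, not part of our actual process or the format Mike used.

```python
import csv
from collections import defaultdict

def summarize(path):
    """Print the average rating and eval count for each session.

    Assumes one keyed-in row per eval form, with a SessionTitle column
    and a numeric Rating column (1-5). Column names are assumptions.
    """
    totals = defaultdict(lambda: [0, 0])  # session -> [sum of ratings, count]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = totals[row["SessionTitle"]]
            s[0] += int(row["Rating"])
            s[1] += 1
    for session, (total, count) in sorted(totals.items()):
        print(f"{session}: {total / count:.2f} average over {count} evals")

if __name__ == "__main__":
    summarize("evals.csv")  # hypothetical export of the keyed-in forms
```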
