http://www.sqlservercentral.com/blogs/andy_warren/2008/01/09/evaluating-speakers-at-events/


Evaluating Speakers at Events

By Andy Warren, 2008/01/09

Speakers tend to live for the evals. Ideally they're validation of work done well after a lot of time preparing, but most speakers would gladly pay for meaningful feedback that would help them improve their game. Event organizers use the evals to see if anyone did extremely well or extremely poorly, wanting to highlight the speakers who resonated and, probably, to politely decline to re-invite those at the bottom of the curve. But attendees are there to learn, not to evaluate, and that's where things suffer. I don't fault them for that; they expect a reasonable level of professionalism from those doing the presenting, so commenting only on things that really differed from their expectations makes a lot of sense.

If you'd like to see some examples of attendee-completed evaluations, take a look at Event and Session Evaluation Results for SQLSaturday Orlando and Session Eval Results for my PASS 2007 Presentation. Imagine doing the work to build a presentation and then presenting it publicly, wanting to improve, and that's all you get? I teach a class on speaking (note that I do so because I want to encourage others to teach, not because I think I'm a world-class speaker!) and I spend some time talking about evaluations and having reasonable expectations about the results. About all you can do as a speaker is ask people to take a couple of minutes to do the eval; sometimes just the reminder will get you a few more responses. The problem is that even if you get a few more results, you're not necessarily getting any more useful results.

My solution/suggestion is that we augment the attendee evals with a more detailed evaluation conducted by one or two lightly trained evaluators. By having someone focus on evaluating rather than learning, I think we might get much more usable results. Below is my first draft of the questions I think would help a speaker and the event organizer understand what worked and what didn't. I'm not sure I'm asking all the right questions or that I've grouped them quite right, but it's a start. I'm hoping you will find this post either really good or really bad and send me a comment or two about whether you think the overall idea will work, and if it will, what I can do to improve the form below.

 

Speaker Evaluation Form

Session Information

Session Number/Name:
Speaker Name:
Date Conducted:
Reviewer Name/ID:

 

Pre-Session (Yes / No / NA)

1. Did the speaker arrive at the room 5-10 minutes prior to the posted session start time?
2. Did the speaker begin setting up as soon as the previous session concluded?
3. Did the speaker use their own laptop?
4. Did the speaker set up in a reasonably organized fashion (found AC power, tested AV, put other gear away) and finish several minutes prior to the scheduled start time?
5. Once setup was complete, did the speaker interact with early arrivals?
6. Did the speaker put up a slide from the presentation containing the session name, speaker name, and level so that those in the room could verify the presentation about to be delivered?
7. Was this a repeat session at the same event?

 

Session Start (Yes / No / NA; item 14 is a count)

8. Did the session begin on time (within 2 minutes of the scheduled start)?
9. Did the speaker announce the session name and level?
10. Did the speaker introduce themselves to the attendees?
11. Was the introduction short and to the point, and not overly commercial?
12. Did the speaker use a microphone if one was available?
13. Did the speaker appear nervous (note symptoms in comments)?
14. How many people attended the session (count at the 5-minute mark)?
15. Was the session format designated at the beginning as interactive or lecture?
16. Were people turned away from the session due to lack of seating?

 

Presentation Slides (Yes / No / NA)

17. If there was a standard template for all presentations, was the template applied correctly?
18. Overall, were the slides easy to read and easy to understand, and did they leave room for the speaker to add value?
19. Did the deck contain references or additional resources as suggested reading?
20. Did the deck contain contact information for the speaker?
21. Did the deck provide a URL or other information for downloading the presentation and any associated files?

 

Questions (Yes / No / NA; items 22-23 are counts)

22. How many questions were asked during the session?
23. How many questions were answered successfully?
24. In cases where a question was not answered successfully, did the speaker remain credible, or did it point to a knowledge gap?
25. In cases where a question was not answered successfully, did the speaker offer possible sources for finding the answer, or commit to looking up an answer after the session?
26. If there were off-topic questions, did the speaker handle them gracefully and keep the session on track?
27. If there were too many questions overall, or too many from one person, did the speaker moderate them politely and confidently?
28. Did the speaker make eye contact with attendees who asked questions?
29. Did the speaker repeat all audience questions?

 

Demos (Yes / No / NA; items 30-33 are counts)

30. How many live but previously planned demos were done?
31. Of those demos, how many were completed successfully?
32. How many unplanned demos were done?
33. Of those, how many were completed successfully?
34. Did the speaker manage the transition from slides to demo and back smoothly (watch for too many alt-tabs)?
35. Did the speaker use written notes or pre-written code to make demos run smoothly without taking away the 'liveness' of the demo?
36. Did the speaker keep the demo readable (large font, good background color)?
37. Did the speaker avoid using the mouse/keyboard to highlight text (which is often hard to read)?

 

Style (Yes / No / NA)

38. If the speaker elected to 'walk and talk', did they use a remote mouse to keep the presentation moving?
39. Was the pace of the presentation consistent (no speeding up at the end to finish)?
40. Was the speaker's body language comfortable and approachable?
41. Did the speaker use any inappropriate humor or remarks?
42. Did the speaker display a negative attitude (impatient, arrogant, dismissive)?
43. Did the speaker control their arm movement without being rigid?
44. Did the speaker engage with the audience effectively?
45. Did the speaker make any self-deprecating remarks, or show behavior that pointed to nervousness, frustration, or a lack of experience or confidence?
46. Did it feel like the speaker had never presented the session before (no practice)?
47. Did the speaker come across as knowledgeable about the topic presented?
48. Did the speaker come across as passionate about the topic presented?
49. Did the speaker appear to have knowledge beyond the boundaries of the topic presented?
50. If there were multiple speakers, did they work smoothly together?
51. Did any techniques stand out as very good (please add a comment if yes)?
52. Did any techniques stand out as very bad (please add a comment if yes)?

 

Session End (Yes / No / NA; items 62-63 ask for a count and a rating)

53. Did the stated level match the presentation level?
54. Did the session end on time (or up to 5 minutes early)?
55. Did the speaker offer attendees a final chance for questions at the end?
56. If the session ended more than 5 minutes early, was it because the speaker moved through the presentation too fast?
57. Did the speaker encourage attendees to complete the session evaluation?
58. Once finished, did the speaker quickly vacate the speaking area so the next speaker could begin setting up?
59. Did the presentation match the session title?
60. Were there any equipment/room problems during the session?
61. If attendees queued to ask follow-up questions after the session, did the speaker treat them politely and give them at least 10 minutes of post-session time?
62. How many people attended the majority of the session?
63. If attendees were rating the overall quality of the session on a scale of 1-5 (5 being highest), what do you think the average rating would be (based on interaction, applause, mood)?

Comments


Copyright © 2002-2014 Simple Talk Publishing. All Rights Reserved.