A Proposed Method for Improving Session Quality at Future Drupalcons

Published May 11, 2010

Much has been written in the past couple of weeks about the recent DrupalCon SF, the vast majority of it overwhelmingly positive. By most accounts, it was a successful event, with the great majority of attendees leaving satisfied. Granted, the accounts I read come from sites whose posts are aggregated via Drupal Planet - an admittedly very pro-Drupal crowd. But in the spirit of continual improvement, one area we should address is providing newer members of the community with better insight into the quality and expectations of individual sessions.

During the conference, I had the opportunity to speak with people who were new to Drupalcons and the larger Drupal community. While most of the newbies I talked to were psyched to be there and getting a lot out of it, there was a common thread: the quality of the sessions was not consistently high. Kieran Lal, the Director of Business Development of the Drupal Association, noted this in a recent blog post:

In order to grow Drupalcon, we need to focus on the quality of the main program. Drupal sessions are still wildly hit or miss, both in session quality and session attendance. As a community, we need to take a hard look in the mirror and raise consistency and quality of every Drupalcon session. With more than 400 sessions submitted, there should be a better way to select quality sessions, and set higher standards in presentation preparation and delivery. With a quality approach, we will attract attendees not just because the conference is about Drupal, but because the sessions are worthy of attending on their own.

As Drupalcons get bigger every year, we're gaining large numbers of people who are not just attending, but getting involved. The makeup of the community is changing - we're diversifying from an informal group of coders to a community of professionals that includes designers, project managers, and executives. We need to make sure we serve new members better than we serve the pioneers. A good way to start doing that, based on many conversations I had with these new Drupalistas, is to do a better job of delivering high-quality Drupalcon sessions. Someone new to our community sitting through multiple mediocre sessions at their first Drupalcon is not the first impression we should be making.

Everybody who presents a session at Drupalcon has the best intentions. But without preparation and practice, best intentions often don't get the job done. We're extremely fortunate to have some stellar speakers and presenters in the community. We need to foster an environment where that group continues to grow.

As a veteran of 5 Drupalcons, I tend to pick sessions to attend based on the speaker, rather than the topic. This type of information is easily transferable to newbies. If we aggregate this type of data and make it part of the session proposal and program presentation, it will not only provide newbies with a measure of the strength of the speaker, but it will also encourage speakers to prepare and practice more in order to garner more positive feedback.

In addition to the obvious approach of recruiting quality speakers, I think we can address the problem from two directions.

Motivated Presenters

Clearly, presenters require some additional motivation to better organize, prepare, and present their session. As an open-source community, we review every single line of code and documentation; why should our Drupalcon sessions receive any less attention? Let's incentivize speakers to practice their presentations at DrupalCamps and meetups by not only taking note of it, but also by actively encouraging people who have seen the presentation to provide thumbs-up (Plus 1) type feedback. The idea is not to slam presenters when they do a poor job, but to encourage them to improve their material and presentation skills.

A positive peer review should then be used as one of the selection factors. I don't think it would be wise to use this information as a hard filter; rather, it should be provided as additional information that conference organizers can use as needed.

Additionally, if a peer-reviewed presentation is selected for Drupalcon, a badge should be displayed next to the title of the session in all conference materials indicating that the session has passed a vetting process. This should act as an incentive for speakers to prepare more effectively.

More Informed Attendees

Those of us who have attended Drupalcons in the past have a pretty good idea of which speakers consistently deliver top-notch sessions. This information should be aggregated and shared with both conference attendees and organizers. By integrating a similar thumbs-up style rating for speakers, we can give conference organizers another factor to consider when selecting sessions.

Just like the peer-reviewed badges for presentations, consistently good Drupalcon speakers should also be noted with a badge of their own. A similar method can be used to indicate speakers who have presented sessions at previous Drupalcons as yet another measure of quality.
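To make the aggregation idea a bit more concrete, here is a minimal sketch (in Python) of how thumbs-up feedback might be rolled up into per-speaker scores and badge flags. Everything in it is hypothetical - the Feedback record, the aggregate_speaker_scores function, and the badge threshold are illustrations only, not part of any existing drupal.org or conference-site feature.

from collections import defaultdict
from dataclasses import dataclass

# Hypothetical data model: one "+1" style peer review from someone who saw
# the talk at a DrupalCamp, meetup, or previous Drupalcon.
@dataclass
class Feedback:
    speaker: str       # speaker's username
    session: str       # session title
    thumbs_up: bool    # True for a "+1" endorsement

def aggregate_speaker_scores(feedback, badge_threshold=5):
    """Count positive reviews per speaker and flag "consistently good" speakers.

    The badge threshold is an arbitrary illustration; organizers would pick
    whatever cutoff (or weighting) they consider meaningful.
    """
    plus_ones = defaultdict(int)
    for item in feedback:
        if item.thumbs_up:
            plus_ones[item.speaker] += 1
    return {
        speaker: {"plus_ones": count, "badge": count >= badge_threshold}
        for speaker, count in plus_ones.items()
    }

# Example with made-up data:
sample = [
    Feedback("alice", "Views for Site Builders", True),
    Feedback("alice", "Views for Site Builders", True),
    Feedback("bob", "Intro to Theming", False),
]
print(aggregate_speaker_scores(sample, badge_threshold=2))
# {'alice': {'plus_ones': 2, 'badge': True}}  (bob has no +1s, so he isn't listed)

Note that the sketch only counts positive endorsements; as described above, the goal is to encourage presenters to improve, not to accumulate negative marks against them.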

Conclusion

Collecting and aggregating both peer reviews and speaker-quality ratings will help all attendees make more informed decisions about which sessions to attend, which will lead to a better Drupalcon experience for everyone, especially those who don't have the benefit of familiarity. It is also a great way to motivate presenters to step up their game when submitting and preparing Drupalcon sessions.

Postscript

One of the Drupalcon newbies I alluded to earlier sent me this link to a great article about how to deliver a great presentation. All future Drupalcon presenters should be encouraged to read it before presenting.

Comments

I've attended DrupalCon Boston, DrupalCon DC and now DrupalCon SF, and I found each one less rewarding than the previous. This is no doubt because I have had less and less to learn, and so have learned less and less. Granted, there are always more things to learn about Drupal, especially with the new features and directions being developed with each release.

But I think it is also because there are more overview sessions now (my impression anyway) and fewer nuts and bolts sessions. And some of the sessions continue to be really weakly prepared, containing just a few minutes of real info and ideas.

DrupalCon is great for beginners and perhaps for developers, but not so much for those in between. I wish it would include more programming like the Do It With Drupal or Design 4 Drupal events, or longer, more in-depth sessions rather than just one-hour sessions.

Excellent points, all.

It should also be noted that this year is the first year that we have implemented session surveys. If you attended sessions at DrupalCon SF, please, please log back in to the DrupalCon web site to complete evaluations for each of the sessions you attended. You can find links to the survey form here:
http://sf2010.drupal.org/conference/schedule
(Again, you must be logged in to fill out the surveys.)

Presenters can already view preliminary results of these surveys on their own session nodes (only their own, not everyone else's). And more complete data will be available to presenters once we close the survey forms.

These session surveys will help future DrupalCon planning committees select sessions and presenters, and it's our hope that it will increase the overall quality of these sessions in the future.

Submitted by Shawn DeArmond (not verified) on Tue, 05/11/2010 - 15:24

Pay the presenters a small stipend. It can be just $100, for example. This sets up a more professional relationship. The conference organizers can then mandate stuff like pre-screening the slides for format and content. And the presenter will feel more obligated to deliver on the sale. If you want to get real fancy, add a bonus for sessions which score highly in survey feedback.

I submitted a session for DrupalCon SF, which I thought was quite compelling (a joint client/developer session on how we built http://progressillinois.com for a very modest budget -- shameless plug, shameless plug!!).

I'm not one to judge the merits of a presentation I never gave, but the selection process was completely opaque (there was a voting component, but then also a selection component, but how it all worked wasn't adequately described on the site or elsewhere), the announcement deadline was blown completely by the organizers, and ultimately, I was *never* contacted to be told my session didn't make the cut.

Could this kind of completely inscrutable and disorganized process be a cause of the varying quality of sessions at DrupalCons in general? Isn't such a system very much open to abuse and cronyism, even if unintentional?

Mike,

This was my first DrupalCon, and it was incredible. Like you, I picked my presentations based on the speakers -- not on the topics.

I've been going to tech conferences for over a decade. Too many times I picked a panel or talk based on a topic and found myself listening to someone who had no public speaking experience or, worse yet, had nothing to say that I didn't already know.

This is where I'd like to put in a thanks for the DrupalEasy podcast. I had the opportunity to become familiar with many of the speakers through your and Lullabot's podcasts. I knew that folks like JoshK, EmmaJane, and others would give compelling talks because I'd heard them speak before. The panel on Drupal Training was excellent, but what made it a no-brainer were the panelists.

Every year I go to SXSW Interactive. This year, the conference broke all attendance records -- over 15K attendees. Many of the panels were garbage.
Was it worth the money? Yes. All the panels/talks I went to were good to great. I've learned what to avoid, and truth is, I didn't go to too many panels. I spent more time in meetings scheduled in advance of the event. Conferences like these are a great opportunity to discuss business. Everyone is gathered in one place.

I didn't go to too many talks at Drupalcon either. However, the discussions I had with colleagues, new and old, were alone enough to justify the cost of attendance.

You are right: Transparent peer review would be a straightforward improvement to the process. Excellent.

Of course we still have the major problem: How do you get people to review? We do this so informally with patches, but it requires quite a bit of social sophistication. You often have to figure out who might be interested and ping them. If this carried over into Drupalcon presentations, it would mean that a core of socially-connected people would always be the key presenters.

At my nonprofit group, we have a different group of people review different types of presentation abstracts for our annual conference. For example, we would have people with expertise in "social justice" evaluate social justice submissions.
I think that a similar solution could easily be applied for future Drupalcons. A group of developers could review developer-focused presentations, project managers could review business/PM presentations, etc. The Drupal Association could put out a call for presentation abstract reviewers in advance of opening up abstract submission, and then those people would be in charge of individually assigning scores for each abstract submission. If it came down to deciding between two equally-scored abstracts, then the conference organizing committee could make the call.
I think that the tricky part of moving toward this type of system for future Drupalcons would be figuring out how to segment presentations. It's hard to know whether it would be more helpful to segment by skill level (newbie, advanced, etc.), by type (case study, demonstration, etc.), or by focus (development, project management, design).

@David Eads

While I had a similar experience, I felt much clearer about the process after our podcast with Josh Koenig was released. He let us know about the huge number of submissions, and the process the programming committee went through to select sessions.

Submitted by admin on Wed, 05/12/2010 - 14:27

I recently blogged about the issue of bad presentations along with some resources and tips on making them better: http://civicactions.com/blog/2010/may/07/weve_met_enemy_and_he_powerpoi…

I would suggest that the session selection committee talk to each presenter who makes the initial "cut" and is being considered, to find out whether they have given the presentation before, whether they have a slide deck (and get a copy to review), etc.

Perhaps more than simple "voting" on sessions there should be an endorsement process, sort of like when patches are reviewed by the community and ready to be committed.

Last thing: as a community we must review the sessions that we attend. Without that feedback it will always be an uphill battle for both organizers and presenters to improve session quality. I reviewed EVERY session I attended at DrupalCon in San Francisco, and, if it is not too late, I suggest that others go back and do the same.

A few questions that are not asked but should be: After having attended this session, would you recommend that a friend or colleague watch the video? Would you yourself go to this session if you could go back and make the choice again?

(thanks, Mollom, for thinking this post, with one link, was spam)
