Building from the Roots Up

As noted in the Strong Roots story, over the span of my career I have found myself working in different fields – ranging from youth development to disability research to immigration – but throughout I have kept a soft spot for cities and, more specifically, their neighbourhoods. Fortunately, I have been able to combine my passion for all things urban with my career through work and volunteer opportunities, starting with my first job in community development and community-based research. The summer after my first year of grad school, the community association for the neighbourhood I lived in received funds from the municipal government to hire summer students to support its work by preparing grants, investigating potential structures for the organization, and assisting with outreach and community research (hmm, sounds a lot like the services I provide today!).

Working with this group of committed volunteers was inspiring in many ways, but three lessons come to mind today:

Seeds for Thought: Volunteer Retention

Volunteers are an important piece of the puzzle for any non-profit organization. Whether they’re contributing to programs and special events, helping out with fundraising and outreach, or providing guidance and leadership as members of the board, good volunteers are indispensable. As these individuals are giving their time and effort without compensation (at least of the financial kind), organizations are increasingly recognizing that they can’t take these superstars for granted.

Along those lines, this week’s entry in the Seeds for Thought category is a case study from the Stanford Social Innovation Review on volunteer retention at Girl Scouts of Northern California (GSNorCal). Like many other nonprofits focused on youth development, GSNorCal relies heavily on volunteers and as a result already uses many best practices in orientation, training, and recognition; however, broader changes within and outside of the organization have made it difficult to keep volunteers coming back. In response, the organization hired a consultant, TCC Group, to “mine its data and pinpoint ways to keep volunteers engaged”. Through a survey of 1,371 current and past volunteers and follow-up focus groups, TCC Group identified factors that predicted volunteer retention and suggested improvements to GSNorCal’s practices.

This example demonstrates the value of using multiple sources of information: in this case, quantitative data from a large survey, qualitative insights from small groups of volunteers, and general principles from scholarly research on the topic. If you don’t have the resources that GSNorCal does (or even if you do) and want to learn more about your volunteers, what can you do?

  • Start by counting. How many volunteers do you currently have, how long have they been volunteering, how many new volunteers have come on board recently, and how many have left? How many hours are they contributing? Are there differences in these numbers based on demographic factors or on what tasks they’re doing for your organization?
  • Use some simple questionnaires with both current and former volunteers. I could spend a full post or three on what a volunteer questionnaire could look like, but at the very least it should include questions around overall satisfaction, support from the organization (or lack thereof), what keeps them volunteering, and what makes them leave. Just remember to use a mix of question types and watch out for potentially misleading numbers.
  • Take a participatory approach. Include volunteers in the discussion, both long-time contributors and those who are new or in a temporary position, such as through a World Cafe: as a bonus, this approach can help improve retention by demonstrating to volunteers that their opinion is valued by the organization. Another idea – have a staff member step into the shoes of a volunteer for a shift to get a firsthand perspective!
  • Partner with organizations that can provide a broader view. Many cities have a volunteer centre (either standalone or part of a larger organization like the United Way) or a professional association of volunteer administrators, such as PAVRO in Ontario, that can link you with resources on volunteering and keep you in the loop about new developments in the field. Volunteerism is also becoming increasingly recognized as a topic of scholarly research, so look into partnerships with universities: programs related to community development, organizational studies, public policy, and even business are good starting points.
  • A bit of self-interest here: consultants can help! If resources are tight, use consulting expertise for specific tasks that may be impractical to do in-house, such as analyzing complex statistical data or acting as a neutral party to collect feedback (current and even former volunteers may be hesitant to provide criticism directly to staff). Volunteer management, especially as it relates to research and evaluation, is one of Strong Roots’ strengths, so drop us a line if you want to have a chat about how to learn more about your volunteers!
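To make the “start by counting” step above concrete, here is a minimal sketch in Python of how basic retention numbers could be tallied from a volunteer roster. The field names and the sample data are entirely hypothetical; real numbers would come from your own records (a spreadsheet export, a volunteer database, etc.).

```python
from datetime import date

# Hypothetical roster; in practice this would come from your own records.
# "end" is None for volunteers who are still active.
roster = [
    {"name": "A", "start": date(2020, 3, 1), "end": None, "role": "events"},
    {"name": "B", "start": date(2022, 6, 15), "end": date(2023, 1, 10), "role": "board"},
    {"name": "C", "start": date(2023, 2, 1), "end": None, "role": "outreach"},
]

today = date(2023, 7, 1)
current = [v for v in roster if v["end"] is None]
departed = [v for v in roster if v["end"] is not None]

def tenure_days(volunteer):
    """Days between start and end (or today, for active volunteers)."""
    return ((volunteer["end"] or today) - volunteer["start"]).days

avg_tenure = sum(tenure_days(v) for v in roster) / len(roster)

print(f"Current volunteers: {len(current)}")
print(f"Departures on record: {len(departed)}")
print(f"Average tenure: {avg_tenure:.0f} days")
```

The same roster could be grouped by role or by demographic fields to surface the differences mentioned above, for example whether board members stay longer than event volunteers.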

Question: What are some strategies that you have seen successfully used to engage volunteers and improve retention?

Summertime Evaluations

Summertime and evalin’ is easy
Surveys are fillin’, and response rates are high
Your dataset’s rich and your graphs are good lookin’
So hush little funder, don’t you cry

(With apologies to the Gershwins and Ella Fitzgerald!)

Despite the song, summertime evaluation has its own challenges. The nicer weather often signals a hiatus in regular programming and an increase in special events such as community BBQs and multi-day festivals, requiring a different approach to engaging participants for their feedback. We also slow down a bit in the summer and limit tasks that seem too heavy – who wants to fill out a long survey when you could be outside having fun?

With that in mind, some thoughts on how to collect useful information when the weather’s nice:

  • Start with the simple metrics, like attendance, ticket sales, or amount of food consumed. They’re easy for stakeholders to understand, but remember that they can be greatly influenced by factors outside your control (especially if your event is rained out); also, they won’t provide much insight if you’re looking for evidence of a greater impact.
  • Hit the pavement! Set up some volunteers with pencils and clipboards and get them talking with participants. Keep the questions to a minimum (3-4 max) so you’re not taking people away from the event for too long, and consider providing a little reward such as a sticker or coupon for providing their two cents.
  • Alternatively, set up a stationary spot for attendees to come by and participate. This method provides the option for longer surveys or more innovative data collection methods such as dot-voting. The main downside is that you need something to encourage people to come to you; if it’s a hot day, a shaded tent and a cup of water may be a strong enough draw, but in any case take a minute to figure out what will appeal to people at your event.
  • Go online! Consider including in your evaluation plan social media statistics such as the number of visitors to the event website, likes on Facebook, and usage of the event hashtag on Twitter. Online conversations through these channels can also provide insights into what’s working and what needs to be changed. Promoting an online survey through social media and at the event itself can help collect data, as long as you remember that participants using these tools may not fully represent everyone who attended the event.
  • Debrief with your team of event organizers, volunteers, staff, and other key partners, using an approach such as the After Action Review. Don’t wait too long to hold it, and remember that your team’s perspectives may not match those of event participants.
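As a quick sketch of how the simple metrics and survey numbers above fit together, the snippet below computes a survey response rate from event attendance. All the figures are invented for illustration, and the 30% threshold is an arbitrary rule of thumb, not a standard; the point is simply to report the rate alongside the results so readers can judge how representative the feedback is.

```python
# Hypothetical event numbers for illustration only.
attendance = 250          # headcount at the gate
surveys_completed = 40    # clipboard and online surveys combined

response_rate = surveys_completed / attendance
print(f"Survey response rate: {response_rate:.0%}")

# A low rate is a reminder that survey-takers may not represent
# everyone who attended, so flag it when reporting.
if response_rate < 0.30:  # arbitrary illustrative threshold
    print("Caution: respondents may not be representative of all attendees.")
```

The same caveat applies to the social media statistics mentioned above: hashtag users and Facebook fans are a self-selected slice of your audience.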

Determining which method or methods to use will depend on a number of factors, including the scale of the event and the resources you have available. The main consideration, though, should be the purpose of the evaluation – what do you want to learn from the process, and what does success look like? If you just want to demonstrate that your event is popular, collecting attendance numbers (with perhaps a quick demographics survey) would be sufficient. In contrast, if you’re hoping to see more of an impact such as increased community awareness of your organization or a change in attitudes or behaviour, more time and effort will need to be spent engaging participants.

Got any tips for evaluating in the summer? Share them below!

Word Counts

In response to my post last week on open-ended questionnaires, Sheila Robinson over at Evaluspheric Perceptions explored some of the risks in interpreting this type of data. Without a systematic approach to analyzing qualitative data, we can fall prey to confirmation bias, which as described in her post, “causes us to remember or focus on that with which we agree, or that which matches our internalized frameworks, understandings, or hypotheses”. Another risk is that we pay too much attention to extreme viewpoints, whether positive or negative, because they are more likely to be remembered. Check out Sheila’s post for more thoughts!

One question I want to address quickly: what can you do if you have already collected data from an open-ended survey and want to avoid these pitfalls, but don’t know where to begin? As with evaluation in general, one of the simplest starting points is counting. Read through all the responses and keep a running tally of how often certain ideas come up. You may already have some categories in mind for sorting responses, which will help, but could leave you open to confirmation bias; take care that you’re not trying to fit a square-shaped response into a round category! If you come across strong or extreme comments, view them in relation to general trends (having complementary numerical data helps here!) to determine how representative each position is. That’s not to say you should ignore a point raised by a small number of people, but as in the example raised by Sheila in her post, you don’t need to rush to make sweeping changes to something that’s working for the vast majority of respondents.
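For those comfortable with a spreadsheet or a little scripting, the running tally described above can be sketched in a few lines of Python. The responses and category keywords here are invented for illustration, and simple keyword matching is only a first pass; a real analysis means reading each response in full and revising the categories as you go.

```python
from collections import Counter

# Invented open-ended responses for illustration.
responses = [
    "I love the friendly staff and the flexible hours",
    "More training would help new volunteers",
    "The staff are great but communication could improve",
    "Better communication about schedule changes, please",
]

# Provisional categories with matching keywords; revisit these as you
# read, so responses aren't forced into categories they don't fit.
categories = {
    "staff": ["staff"],
    "training": ["training"],
    "communication": ["communication"],
}

tally = Counter()
for text in responses:
    lowered = text.lower()
    for category, keywords in categories.items():
        if any(keyword in lowered for keyword in keywords):
            tally[category] += 1

print(tally.most_common())
```

Note that a single response can land in more than one category, which is usually what you want: people often raise several points at once.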

If there’s interest, I can share an extended example from my first experience with qualitative analysis – food for a future post!

What’s in a Question (type)?

While on Facebook earlier today (I was connecting with some colleagues on a work-related issue, honest!), I came across a survey for a local non-profit initiative. As someone who both identifies as a researcher and generally likes filling out surveys, I eagerly clicked the link … and found myself looking at ten open-ended, fill-in-the-blank questions.

Now, I don’t have anything against this style of question; indeed, as I noted in an earlier post, it’s good to provide space for respondents to share their own perspectives and stories without being boxed into a particular set of responses. In my opinion, though, inviting only written responses is a move too far in the other direction. Some respondents may not have the time to write down their thoughts, while others may feel pressured to provide insightful, well-crafted responses to each question and decide to take a pass on the survey as a result. I remember a conversation with a community group where one member personally disliked open-ended questions; this person’s view was perhaps a bit extreme, but it raises the good point that individuals may simply prefer one question type over another. Accessibility is also a potential concern: will people who have low literacy skills or other challenges around writing feel comfortable participating? A final consideration is analyzing this type of data, which takes more time and effort than compiling statistics from multiple-choice or rating questions.

Again, I have nothing against open-ended questions: depending on the intended audience and purpose of the survey, it may even be completely appropriate to use only that type of question. For most general surveys, though, a little variety is probably a good thing.