
Archive for the ‘Working Theories, Methods & Resources’ Category

Mapping training attendees

Clockwise from upper left: Kerry Thomas, Matt Schumwinger, Aaron Bergtrom, Bob Waite and Amy Schlotthauer.

On March 26, 2014, Virginia Carlson of the Public Policy Forum and Matt Schumwinger of Big Lake Data LLC presented an intensive hour-and-a-half training on TileMill here at the IMPACT Planning Council offices. TileMill is a free, downloadable map-design tool that styles maps with rules written in a dialect of Cascading Style Sheets (CSS). If that sentence is confusing to read, you would have felt right at home with the training group, over half of whom had never coded. After an excellent presentation, attendees left the training with a homework assignment: try the tool out using their agency’s data.
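
If you are wondering what “editing CSS” actually looks like in TileMill, here is a minimal sketch. TileMill styles maps with CartoCSS, a map-making dialect of CSS; the layer name (#neighborhoods) and the colors below are made up for illustration, not taken from the training:

    Map {
      background-color: #fff;        /* the canvas behind the map */
    }

    #neighborhoods {                 /* a hypothetical polygon layer */
      polygon-fill: #cde6f0;         /* light blue fill */
      line-color: #666666;           /* gray boundaries */
      line-width: 0.5;
    }

Change a color or a width, save, and TileMill redraws the preview, which is a big part of what makes the tool approachable for people who have never coded.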

Two weeks after the training, attendees returned for a discussion and the opportunity to share maps. Several of the group members had successfully created a map, and most people came back with questions.  The group spent the second meeting discussing the maps that had been created by attendees, sharing frustrations, exploring solutions and celebrating successes.  Virginia and Matt went through step-by-step solutions to problems brought up by participants.

Top lessons learned from the group:

  1. Working with CSS code is tricky.
  2. Working with CSS code means you can do almost ANYTHING you can think of to your map.
  3. Data clean-up usually takes longer than creating a map.
  4. Learning as a group makes it clear that several people hit the same stumbling blocks, and lets them help one another overcome them.

Ticket density map

Above is a map created by Matt Schumwinger of Big Lake Data LLC in TileMill, using a random sample of parking tickets issued in Milwaukee, each plotted in yellow.  The brighter the spot, the more parking tickets were issued at that location.
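
A glow-by-density effect like this is typically built by letting many semi-transparent markers stack on top of one another: each ticket is a faint yellow dot, and places with many tickets add up to a bright spot. Here is a rough CartoCSS sketch of that idea (the layer name #tickets and the exact values are illustrative, not Matt’s actual stylesheet):

    Map {
      background-color: #111111;     /* a dark canvas makes the glow readable */
    }

    #tickets {                       /* hypothetical point layer of tickets */
      marker-width: 4;
      marker-fill: #ffff66;          /* yellow dots */
      marker-line-width: 0;          /* no outline around each dot */
      marker-opacity: 0.15;          /* each dot is faint on its own... */
      marker-allow-overlap: true;    /* ...but overlapping dots stack up brighter */
    }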

After the training, a short survey was sent to attendees. All of the respondents “Strongly Agreed” or “Agreed” that the training:

  • Will help them to create better maps;
  • Expanded their familiarity with data visualization; and
  • Will be useful in expanding their agency’s capacity to produce and share information.

The TileMill training was part of the Evaluation Institute of IMPACT Planning Council, funded by the Faye McBeath Foundation, the Greater Milwaukee Foundation, the Foley Family Foundation, and LISC. The Evaluation Institute builds the capacity of local agencies to use evaluation to strengthen outcomes. Data visualization has been a particular focus, kicked off by Cole Nussbaumer’s Storytelling With Data Workshop last October.

Do you have any TileMill tips to share? We’d love to hear them!


Read Full Post »

The nonprofit sector in the United States contributed $804.8 billion to the country’s economy, according to the Urban Institute.  That’s about 5.5% of the gross domestic product!  The same source reports that on an average day, about six percent of the U.S. population (14.6 million Americans) volunteer.  And the Center for Civil Society Studies at Johns Hopkins University reports that nonprofit employment represents 10.1% of total employment in the U.S. The 2.3 million U.S. nonprofits behind these numbers produce benefits for the community in nearly every area of life, including education, health, the environment, the arts, human services, and religion, to name only the broadest categories.

I feel privileged to work in this sector and to be a part of improving the quality of life for my local community. But I often wonder, given the importance of the sector, why we know so little about what’s working or not working. I was a consultant to nonprofits for eight years before coming to the Planning Council, where I continue to function as an outside expert to local nonprofits.  I have seen many good organizations and people working hard to implement programs that they strongly believed would make a difference.  But very few nonprofits evaluate their work in a systematic way.  Still fewer act on that information to modify their programs.

In an article in the spring 2013 issue of the Stanford Social Innovation Review, the co-directors of the Stanford Center on Philanthropy and Civil Society point out that, “By comparison to medicine or teaching, civil society, the nonprofit sector, and philanthropy . . . lacked cross-cutting standards of practice, assessment, and accountability. The activities and outcomes of nonprofit organizations and foundations are opaque and not readily susceptible to observation.”  The point was echoed in another recent article, this one in the Chronicle of Philanthropy, which noted that foundations all too often are not forthcoming about the impact of the projects they have funded.

Here at the Planning Council, we have found that when nonprofits and funders take the time to step back and look at their offerings with a critical eye, everyone benefits.  Through evaluation, we learn which efforts have the biggest impact on clients, and which have limited or even negative impact.  We learn which strategies need to be enhanced, which modified, and which abandoned.  But that is only the beginning of the task.  By being transparent about the results of our evaluations and holding ourselves accountable to the public for our activities, we become better stewards of the trust the public has placed in our sector.

Are we really too busy acting for the common good to stop and ask whether what we are doing is changing anything?  Why do you think our sector doesn’t evaluate itself more rigorously and share the results more broadly?

Read Full Post »

The message that evaluation is valued is best delivered by the funder. Grantees need to know that funders take results seriously. If grant applications ask about measurement plans, but never discuss results, it’s unlikely that the foundation or grantee will be making the most of the benefits of learning from their work and improving results.

A report[1] on funder initiatives to build evaluation capacity offers a few practical lessons for funders interested in using evaluation to improve results. These lessons are confirmed and supplemented by the Planning Council’s decades of experience in building evaluation capacity.

  • Clarify what evaluation is (and is not): A simple needs assessment can help spur conversation about opportunities to use evaluation to learn from mistakes, adapt to new challenges, and take success to scale.
  • Design the opportunity as opt-out rather than opt-in: Offer evaluation training or consultation as a benefit of receiving the grant, and make it known that participation is expected.
  • Include evaluation activities and outcomes in grant reporting: Require grantees to report on evaluation capacity-building activities and to include high-quality evaluative information in their reporting.
  • Balance responsive versus strategic services: Not all requests for assistance will have an equal payoff, and other needs may come first.
  • Build capacity at all levels of the organization: The commitment and buy-in of leadership is essential, but include mid-level practitioners for best results.
  • Provide timely services when they are of most use to grantees: Be sensitive to the fact that there are “teachable moments.” Make evaluation a genuine opportunity for learning, not another hoop to jump through.
  • Begin with a short survey to spur discussion about the need for evaluation capacity building.
  • Provide basic training: Help establish a common language and level the playing field.
  • Host a technical assistance day: Provide each grantee with a one-hour in-person session with a skilled evaluator, followed by additional work brokered by the funder.
  • Provide a bank of hours of evaluation assistance annually for grantees.
  • Share examples: Highlight and reward agencies that learn from their results.
  • Don’t limit evaluation capacity building to the work of the immediate grant: Think about improving the evaluation capacity of the organization as a whole.

[1] Welsh, Myia, and Johanna Morariu, “Evaluation Capacity Building: Funder Initiatives to Strengthen Grantee Evaluation Capacity and Practice,” Innovation Network, June 2011.

Read Full Post »

Nonprofits are pressured by funders, the media, government, business leaders and board members to demonstrate their impact.  A recent study[1] surveyed 177 nonprofit leaders across the country and found that nonprofits of various sizes and causes want to be able to understand their performance and are taking steps to do so. The survey also found that nonprofits want more help in assessing performance than they are currently receiving from their foundation funders.  Highlights of the study may be found below.

  • 81% of the survey respondents believe that nonprofits should demonstrate the effectiveness of their work by using performance measures.
  • Only 32% believe their foundation funders have been helpful to their ability to assess their progress.
  • More than three-fifths of the nonprofit leaders (62%) would like more help from foundation funders to assess their progress.
  • Nearly three-fourths (71%) of the nonprofits report receiving no foundation support (financial or non-monetary) to advance their organizations’ assessment efforts.
  • When asked to identify the most important step funders could take to help organizations assess progress, the nonprofits suggested providing additional funds designated to support measurement, develop tools, and help with analysis.
  • Nonprofits identified several key areas where more discussion with funders was desired.
    • how to develop the skills of staff to collect and interpret data (71%)
    • how to interpret collected data (58%)
    • results of their performance assessment (58%)
    • what data to collect (57%)
    • what performance targets to set (52%)

[1] Andrea Brock, Ellie Buteau, PhD, and An-Li Herring, “Room for Improvement: Foundations’ Support of Nonprofit Performance Assessment,” a report from the Center for Effective Philanthropy, 2012.

Read Full Post »

Innovation Network Inc. explores nonprofit evaluation with 2012 report

Concluding there is room for improvement in the capacity and practice of evaluation in the nonprofit sector, a recently released study[1] of 546 nonprofits across the country rated the state of evaluation as only “fair.” In fact, the assessment suggests that “more than two-thirds of organizations do not have the promising capacities and behaviors in place” to reap the benefits of evaluation that can improve outcomes and maximize impact. Below are some of the key findings.

  • Organizations with larger budgets (over $5 million) and those that have been around longer (over 20 years) are more likely to evaluate their work than smaller, less established organizations.
  • Only a little more than a quarter of the responding organizations were judged to have capacity or behavior in place to engage in meaningful evaluation.
  • Agencies’ self-assessments of their evaluation capacity appear to overstate actual ability. (For example, only 42% of those that rated themselves as having high or moderate evaluation capacity had developed a logic model or similar document, and even fewer had updated it in the past year.)
  • More than 70% of the organizations spend less than 5% of their budgets on evaluation.[2]
  • Almost a third (32%) of the responding agencies reported receiving support from foundations or philanthropy for evaluation.
  • Outcome evaluation is the type of design used by most (79%) of the responding organizations.
  • 37% of respondents from large organizations reported that most data collected in their organization is not used.
  • Fewer than one-fifth (18%) of the responding organizations had a full-time employee dedicated to evaluation.
  • Knowing where or how to find a professional external evaluator was cited as a challenge by 44% of the responding agencies.
  • Of those organizations working with a professional external evaluator, 69% reported a positive experience.
  • Limited staff time, insufficient financial resources, and limited staff expertise are the greatest barriers to evaluation.
  • 75% indicated they regularly discuss evaluation findings with funders, although only 37% thought funders accept failure as an opportunity for learning.

______________________________

[1] Innovation Network, Inc., State of Evaluation 2012: Evaluation Practice and Capacity in the Nonprofit Sector, October 2012. Researched and written by Johanna Morariu, Katherine Athanasiades, and Ann Emery.

[2] Generally, recommendations regarding the percent of budget to devote to evaluation range from 5-15%, but in this report, 5-10% is the recommended range.

Read Full Post »

When I was a graduate student, I was handed the opportunity of a lifetime: my employer, the YMCA, sent me to work at a partner YMCA in South America as part of a professional exchange program.  I made the most of it; however, many of my colleagues in the same program did not.  They complained about all the problems they were having and how they weren’t getting anything out of the program. Now, many years later, I am the coordinator of internships at a nonprofit think tank, and I see the same pattern repeating itself.  Some interns learn a lot and feel the experience was worth their time and energy, while others languish or flounder, never getting a firm grasp of the possibilities right in front of them.

So how can you be one of those people who have a great intern experience?  Be aware of the three intern pitfalls and how to avoid them.

Intern Pitfall #1: “I’m bored! The people I am interning for never give me anything to do.”

American workers have never been stretched as thinly as they are right now; every business is doing more work with fewer personnel. You would think this would make interns valuable commodities, and they are, but still, your supervisor is juggling so many projects that sometimes finding something for the intern to do gets bumped down the priority list. You can avoid spending your valuable time twiddling your thumbs, though.  One good way to start is to be reliable.  If you frequently miss your scheduled day or are often late, the people who are relying on you for help will stop saving projects for you. Another good habit to get into is to send an email or voice mail a day or two prior to your next visit. “Hi – just reminding you that I will be in tomorrow from noon till 4:00.  I’m looking forward to finishing up that inventory in the first hour so if you could line up a new project for the rest of the day, I’ll be ready for it!” 

No matter what, at some point in your internship you will find yourself without a project.  In that case, do not simply start surfing the Internet, or texting your friends, or doing your homework.  Be proactive and ask your supervisor for a project to do.  If your supervisor isn’t available, go around the workplace until you find someone to ask for a project.  If no one is available, try learning more about the agency or business by reading the annual report or other materials that you can find around the office.  At the very least, the information you learn will help you better understand the mission of the organization and it might point you in the direction of some project you can undertake on your own while you wait for your next assignment.

Intern Pitfall #2: “My supervisor doesn’t answer my questions quickly enough.  I feel like I could be more effective, but I need answers in order to keep working.”

Make it one of your first tasks as an intern to find out from your supervisor how he/she prefers to communicate, then use that channel to keep in touch.  Some people prefer to communicate by email, others by voice mail, still others in writing or even in person. Don’t assume your supervisor will even see that text message you sent with an urgent question unless you have confirmed ahead of time that he/she knows how to receive and send texts. Many will be okay with you coming into their office with an urgent question, but be sure you don’t overstay your welcome by spending the next 10 minutes regaling them with the story of the movie you saw last night or the test you are taking next week.  A short, timely question will leave them appreciating your initiative in getting what you need in order to keep working. And don’t just walk out the door at the end of your shift.  Leave an appropriate message with your supervisor telling them how far you got on the project, where to find your work, and how to reach you if he/she has any follow-up questions.  Then, be sure to answer your supervisor’s questions quickly if any are sent before your next on-site visit.

Intern Pitfall #3: “The work I am doing is nothing like what they told me I’d be doing. It doesn’t match my skills, and half the time it’s just busy work.”

Whether you are being given projects that you feel are beneath you, or that seem unconnected to your particular strengths, it is vitally important that you do your very best on each and every task. Do not fly through assignments you think are simple; too often this results in mistakes that make your work unusable.  Even mundane tasks can be positive career builders if handled the right way.  Before starting your internship, do a skills inventory.  List the different tasks and experiences you would like to have as part of your internship. Be sure to include not only professional skills you’re trying to develop, like designing surveys or conducting focus groups, but also more universal skills like designing PowerPoint presentations or learning Excel.  Rank yourself on each skill: already an expert who can perform the skill independently; intermediate, and would appreciate opportunities to practice and improve; or no experience yet, but interested in learning. A sample intern skills inventory that you can adapt for your own use has been uploaded to our Box.net widget.  Fill it out and share it with your supervisor at the beginning of your internship, if possible, so that he/she is aware of your goals for your internship.

Another way to avoid all these pitfalls is to keep a learning log.  Each day or week of your internship, take some time to reflect in writing on what you have accomplished.  Describe the tasks you were assigned and then list the skills you learned and used to complete the tasks.  Don’t forget to list the people you interacted with in order to complete your assignment and the teamwork skills required.  Finally, write down the lessons you learned and will carry forward to other parts of your life.  You can find a sample learning log in the Box.net widget.  Use it and you might be surprised at what a great intern experience you are having.

Read Full Post »

I wanted to share a recent newsletter (below) from a Canadian site that I have been following for the past few years. It’s called Tamarack, and its focus is on authentic community engagement. The Institute and its website draw lessons and provide tools and examples from across Canada, the U.S., and elsewhere.  (It’s sometimes too easy for us to think we are the only ones wrestling with these issues!)

I was particularly interested in the second feature article, about the White House Council for Community Solutions and the new white paper on community collaboratives. Based on an analysis of 12 “needle-moving” collaboratives, the paper highlights some core principles of collaboratives, the characteristics of success, and the supportive resources collaboratives need to thrive. There are important lessons about the need for sufficient planning time, dedicated staff support, and ownership by community members.  I don’t think the findings are surprising, but it’s nice to see them together in one place!  For example, successful community collaboratives require:

  • Long-term investment in success
  • Cross-sector engagement
  • The use of data to set the agenda over time
  • Community members as partners and producers of impact
  • Dedicated capacity and appropriate structure
  • Sufficient resources

Finally, you may also be interested in this new report from the Bridgespan Group, which reviewed more than 100 collaboratives and drew important lessons, including the promise and the risks associated with this kind of effort.

New Bridgespan Report on Community Revitalization Efforts
http://unca-acf.org/?p=4907

What do you find is needed to collaborate effectively? What lessons have you learned from other collaboratives that have moved the needle to improve community conditions?

Read Full Post »
