Evaluation 101

Page history last edited by Amy Luckey 12 years, 9 months ago

Description: Not completely sold on the need for gathering all that evaluation data? Or do you need help convincing your program partners of the need for evaluation data? Join Amy & LeeAnn for an interactive, hands-on session discussing the what and the why of evaluation. Amy & LeeAnn encourage you to raise concerns, and we encourage you to post those concerns on this wiki page! *Note: this is a two-part session. The session is most useful if you attend both parts.

 

Time: Tuesday 1:30-2:30 and 2:45-3:00

 

Location: Hanna Room

 

Facilitator: Amy Luckey, ZeroDivide & LeeAnn Shreve, WV Broadband Opportunities Program

 

Notes:

The value of evaluation for your program and your staff; today's concentration is on staff.

 

10-minute overview to “level the playing field” in the room. Some words to consider:

 

Organizational learning, reality testing, deliberate, integrated, intentional, clarity, “getting to the nitty-gritty”

 

Driven by external audiences:

  • Contribute to the knowledge of your field
  • Demonstrate your value to the community

Should be driven by improving your own program:

  • Organizational learning
  • “Be the best program that you can be”

Why are we collecting the metrics we are collecting?

                What are we trying to achieve and measure – what is your impact on your community?

                What are you doing to achieve your goals?

                How and why do you think what you are doing is going to give you usable results?

Two tools:

                1. Logic model – Outcomes Chain – Return on Investment – Graphic Tool

  • “Reality testing” – describes how your program works and determines what you need to measure
  • 5 boxes: Resources/Inputs – Activities – Outputs – Outcomes – Impact
  • Collect data for the first 4 boxes; integrate that data to determine box 5
  • Examples on the Wiki

2. Theory of Change – narrative, short and representative of your program

                                Activities ----- Expected Results

“Good evaluation is a useful evaluation” – consider by whom and how the information will be used.

                Funders, staff members, managers, your board, your community, policy makers, your peers

What data do we need to collect? It depends on the audience – both you and the stakeholders above.

 

“Are we doing things right and are we doing the right things?”

 

What words come to mind when you think of evaluation?

 

Entire group:

Tedious, selling, learning, measuring, practical, qualitative, quantitative, gaming (over reporting), quiz, test, survey, accountability, pain in the, burdensome, judgment, questions, monitoring, info gathering

 

Small group reports:

Sell it –

Safe place to talk – be honest

Consider stakeholders

Organization needs to deliver information in a timely manner

Have actionable items behind data

Determine success – evaluate during program and incorporate results

Have clear standards from beginning

Teach staff about evaluation so that staff understand it

Take time to integrate results into the program

Incentivize staff or sub grantees to collect data

How does evaluation tie back to your program?

How is the data going to be reported – individually or aggregated?

Can staff collect and understand their own data?

Report positives, negatives and action steps side by side

 

“Provide a quick easy win” – Look for pieces of data that can be collected easily and reported back to your audience (example: staff or funder) as soon as possible.

 

Evaluation 101, Part II

 

Amy gave each table breakfast cereal and an evaluation worksheet (logic model) to complete.

Both tables enjoyed the exercise.

Table 2 tried to think outside the box (bad joke).

 

Discussion revolved around what standards we should use to measure the constituents. (What should we measure?)

Process driven – continuous improvement of process.

Categories and patterns result in steps to evaluate.

Need to separate parameters – granular evaluation…What about “How does it taste?”

Audience – The evaluation team differed on what we like…raisins? Bran? We are all adults…what about a child’s perspective? Who is your audience?

What are we missing for a marketable product? Cost, process evaluation, packaging, marketing

Bring common ground to the table, expectations.

Large groups can contribute to evaluation but smaller group needs to put it all together, i.e. use the input from your staff to inform evaluation creators.

Schedule time to discuss evaluation at all levels of the organization. This shows that you value input.

Thanks, Amy, for a fun, informative session!

 

 

Here's a basic intro to evaluation:

Evaluation - Where to Begin.doc

And, a laundry list of evaluation resources:

Nonprofit Evaluation Resources.docx

(ZeroDivide's programming is focused on youth media, hence the first few resources listed.)

ohio slides.pptx
