Description: We all recognize the need to get beyond gathering data on a spreadsheet (or paper!), but what are the options? Since there is no out-of-the-box solution, how have programs customized data collection tools? What is their cost, in terms of setup, maintenance, and staffing? Once the tool is chosen, how is the data gathered? This is a hot topic, so please take notes on this wiki to share with others!
Time: Tuesday 11:00-12:00
Location: State Room
Facilitator: Colin Rhinesmith, New America Foundation
Notes:
Questions for participants in the room:
⁃ How many program managers?
⁃ How many trainers?
⁃ How many evaluators?
⁃ How many people are working with single organizations or multiple organizations?
⁃ Has anyone started the evaluation process?
Discussing Evaluation
⁃ What are we evaluating?
- Meaningful use (positive impacts such as getting a job, earning a GED, etc.)
- Quality of life for seniors (connecting with faraway family and friends via email/Skype, accessing health information online)
- Measuring economic impact (in New Mexico)
- Measuring impact on community
- Measuring home-based employment
Challenges with the process
- Limited capacity for collecting data
- IRB (Institutional Review Board) requirements
⁃ What data are we collecting?
- Demographics
- Use
- Self-efficacy 6 months after trainings
⁃ How do we measure success? Different organizations have their own definition including:
- Retention of training - increase in digital literacy
- Chronicling the stages of adoption
Internal feedback - what is the process for input from trainers?
- Too many questions
- The importance of a needs assessment
- One size might not fit all for collecting data - combining high-level across multiple sites/lessons and situational questions
⁃ What tools are we using to evaluate our programs?
- Moodle
- SBA Needs Chart
- Google Forms
- AcuTrack
- Survey Monkey
⁃ What is needed?
- Clarity from NTIA on terms that need to be tracked (such as adoption & subscription)
- Guidance on what to track and how to do it
- How to evaluate trainings that were conducted before the evaluation framework was developed
- Ambiguity is an opportunity - defining success for the program and offering it to the field
Outcomes/Next Steps
⁃ Continuing the discussion
losey@newamerica.net
⁃ Thinking about sustainability
⁃ Sharing tools/metrics (and creating a repository)
- Forms
- Survey templates
⁃ Posting/sharing definitions
Comments (2)
Jason Schroeder said
at 11:15 am on Jun 28, 2011
start by defining success - only then can you evaluate your program
Eva Artschwager said
at 11:32 am on Jun 28, 2011
Is the PPT for this available as a resource?