

SBA PCC Evaluation Working Group July 26, 2011

Page history last edited by Angela Siefer 10 years, 2 months ago

Conference Call Notes



Attendees:

Angela Siefer, Connect Your Community

Kistine Carolan, OTI

Preston Rhea, OTI

Chiehyu Li, OTI

Lynda Goff, Connect Your Community

Judy, Project Endeavor

Cortney Leach, COSLA

Jim, Northern Illinois University

Karen Mossberger, University of IL

Amy Luckey, ZeroDivide

Mike, Partnership for Connected IL

Ashley, Freedom Rings Partnership

Jennifer, Freedom Rings

Juliet, Freedom Rings

Bob, Partnership for Connected IL

Laura Breeden, NTIA

Charles Benton, Benton Foundation

Sarah Chesemore, COSLA


Lynda - 

Where do we ask these questions?

Intake surveys - skill level, demographics, particulars

Post-program - testing of skills

Program evaluation - community investment impact



Charles - 

Goals -- obviously many things to evaluate and different perspectives

Context important - NTIA and its goals.

Recognizing that when Round 2 grants were made, an evaluation organization was hired ($5 million) to ask fundamental questions about the economic and social impact of the programs.

We have an evaluation framework in place with ASR Analytics - evaluating SBA/PCC programs.

Next year the focus will be on infrastructure evaluation.

Need to understand what they are doing, what questions they are asking.

Propose that at the next working group call, Peter A. or someone else from ASR update us on what they are doing and what questions they have formulated. Whatever we do is done in the context of ASR.

ASR is doing case studies on 25% of the projects; 10 to 15 are probably SBA/PCC.


Laura - 

We're talking about outcomes - outcomes at different levels.

May be focused on # of people trained, # of subscribers signed up.

Since all projects have to address underserved populations and needs in these communities, there will be cross-cutting issues.

ASR - program-wide impact study. Rigorous, statistical look at the impact of the funding on social and economic outcomes in the communities where projects take place.

15-17 case studies.

Key word - outcomes. What were the outcomes of the project? Could be skills, could be jobs, could be personal responsibility.

Way out on the far end is your impact; intermediate outcomes are on the way to that impact.

Recognize that each project is different, but every project is interested in outcomes at some level. How to balance those two things?


Angela - 

Intending to come up with questions that we all use. Does it make sense for us to talk with ASR?


Laura -

Yes, absolutely.

Can get someone to talk with this group from ASR.


Angela -

Invite to next call... But:

If ASR is on every call, it may restrict people from speaking freely.



Amy -

Seems this group could go 2 ways (or both):

-      Walk through evaluation design as a learning community. This is a big undertaking and would probably require resources to hire an evaluator to help. Would get people far, though.

-      Less ambitious - assume everyone is doing their own evaluation design, but determine if there are a few fundamental evaluation questions that everyone shares to some extent. For example:

1. Who are we serving, and what was their Internet access prior to involvement in the program?

a. Demographics

b. Internet skills

c. Internet access & use

2. What changes did program participants experience as a result of our program?

a. Internet skills

b. Internet access

c.  Internet use

Then, develop a common set of indicator-level questions that grantees could choose from for use in their own evaluation tools (e.g., intake surveys and exit surveys).


ZeroDivide created a list of questions that others are already asking, such as Pew and NTIA. (Posted to this wiki.)


Karen -

- Saw the table that ZD pulled together; thought it useful.

- Wants to note that some questions in the table come from telephone interviews; the actual wording would need to change depending on the tool. E.g., telephone survey questions would have to be adapted to work well on a paper survey.


Laura –

- Have lots of examples of surveys that have been collected from BTOP grantees.

- Most not proprietary – could be shared.

- Committed to doing TA around evaluation at NTIA.

- Could provide resources, e.g., webinars.

- Interested in helping people learn from each other; how to connect grantees with experts.

- Would like to have input from folks on what they would like to see happen.

- Lots of resources here and material on the table.

- Could have resources to move this forward.

- BTOP collaboration site – could put tools there.


Sarah – Seems there are two potential outcomes/directions:

- Coalescing of what evaluators have already done.

- How to put it out there so that people can pick and choose from what exists.

- How to do that rather quickly, so that those on the other side of evaluation design can access and use it.

- These go hand in hand.


Individual grantees could use a structured pathway - hand-holding in developing their evaluations.



-      Laura, could you take on providing evaluation support through webinars? Experts who would provide guidance on how to create evaluation structures - real support, not "here's how to hire me."



-      Working title is Evaluation 101.

-      Yes... Can commit to evaluation support via webinars. Will ask grantees for suggestions of speakers.


Angela -

- This group can help NTIA help us.


Laura -

- Yes. We get good feedback from people. Would like feedback on evaluation TA needs.


Sarah -

COSLA provides support to state libraries. Received funding from Gates Foundation to support state library BTOP grantees. Intends to have a series of evaluation design workshops for cohorts. Happy to share and collaborate.



- Templates of reports would be useful.

- A sample of what an overall evaluation could look like would be helpful.



Looks like this group is doing 3 things:

-      Helping NTIA help us.

-      Gathering evaluation docs to share. Where to store them hasn't been resolved, so use the CBAIS wiki for now. Help gather eval docs.

-      Maybe - a common set of evaluation questions (detailed survey questions, etc.).



-      Bringing together evaluators from universities working with IL BTOP grantees.

-      Before the 3rd regional meeting in IL, will have a conference with evaluators.



- Next meeting will be 2 weeks from today at the same time.





