SBA and PCC Training Programs Evaluation Designs


Description: Both Sustainable Broadband Adoption programs and Public Computing Center programs have training components. Knowing that evaluation of the training can impact future funding, how are training programs being evaluated? How is your training program being evaluated? Samantha will share a PCC evaluation design and guide the discussion. Don't forget to take notes below!

 

Time: Tuesday 9:45-10:45

 

Location: Allen Room

 

Facilitator: Samantha Becker, University of Washington

 

Notes: What do you want to measure?

Are they satisfied with your training?

Have they acquired the skills they expected at the beginning of training?

How can we measure those metrics?

 

For example…

Can you save or open a file from a file folder?

 

The point:

Can the student eventually go out on their own and use a computer comfortably to complete the tasks they need to in a “novel situation.”

Unless you mention it from the get-go, it's not good to conduct testing on the client.

How do they feel when they have their skill level assessed? Have to think about this.

 

Session is to discuss how we design a program with these things in mind.

Floor is open – issues that are important.

 

First question: Looking for templates to evaluate an SBA's meaningful use.

Posted on the web and on a previous wiki

http://tascha.uw.edu/usimpact (CCN, CTOP, PL studies)

Good to use a consistent document so we can have the same metrics to evaluate across different programs.

 

Francine - What do we hope to accomplish, what do we hope to say, at the end of the program?

 

What topics are covered:

Financial literacy (go to the banks, deals, finance rates)

Basic computing/internet

Workforce development

Continuing Education

 

Financial Literacy metrics

                Want to use it

                Want to generate an Outcome

                                Bounce fewer checks – save money paying bills online

Measurable – pre & post assessment

Good to keep anonymous – students aim to please so may be dishonest if they know you’re watching

                Embarrassment, etc.

Be mindful of confidentiality – not saving passwords and looking, etc. – verbal and written feedback is the best way. Be mindful of verbal communication in a classroom setting.

 

Difficult to document that they’ve attained knowledge

                How do they demonstrate?

                                Observation

                                Self-assessment

                                Test?

Houston:             Basic literacy program – 100 contact hours

                                Class includes exercises with online tutors for support

                                                Must complete exercises/assessments to pass the course

                                                Less lecture – more hands-on

                                End of class – assessment “What did you know?” exit survey

                                70-75% completion rate (course is Pass/Fail)

                Adults, low education level, immigrants

                                Intake session / orientation – introducing concepts & vocab

                16 weeks to grasp the material, with open labs for extra work

What is the right question to ask?

                Do you know how to use a computer? (not specific enough)

                Do you know what a keyboard is? (not in depth enough)

Do you know what a keyboard is meant to do? (good mix of a specific, relevant topic that gauges knowledge)

 

2 weeks after the last class – open labs, 1-on-1 appointments to go over skills

Checklist assessment used (client fills out on their own)

 

Florida – uses point system to measure change from Pre to Post

                Each question is worth certain points based on skill level – all different

Has very specific questions (where can you find the “F4” key)

                Also gives a good idea of instructors’ teaching style, etc. and strengths and weaknesses

                Good, because students may not have the skills but give the instructor high marks anyway
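The weighted point system described above can be sketched in a few lines. This is a hypothetical illustration, not Florida's actual instrument: the question names, point values, and answers are invented, but the mechanics – each question worth points by skill level, learning measured as the change from pre- to post-assessment – follow the notes.

```python
# Hypothetical sketch of a weighted pre/post scoring scheme: each question
# carries its own point value by skill level, and learning is measured as
# the change from pre- to post-assessment. Questions/weights are invented.

QUESTION_POINTS = {
    "find_f4_key": 1,       # basic hardware vocabulary
    "open_file": 2,         # basic file management
    "save_attachment": 3,   # intermediate e-mail skill
}

def score(answers):
    """Sum the points for each correctly answered question."""
    return sum(QUESTION_POINTS[q] for q, correct in answers.items() if correct)

def pre_post_gain(pre_answers, post_answers):
    """Change in weighted score from pre- to post-assessment."""
    return score(post_answers) - score(pre_answers)

pre = {"find_f4_key": True, "open_file": False, "save_attachment": False}
post = {"find_f4_key": True, "open_file": True, "save_attachment": True}
print(pre_post_gain(pre, post))  # prints 5
```

Because every student is scored against the same question weights, gains are comparable across classes – the "same stick" concern raised below.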

Concern with a sliding scale?

                Reliability – must “measure with the same stick” to allow us to assess properly

                                Especially if scores are shared with students – we have to be fair

                Refine tools and assessor, use multiple assessors

Discover assessors that tend to score higher to begin with and those that are "harder" on their subjects
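One simple way to surface lenient or harsh assessors, as a hedged sketch: compare each assessor's mean score against the overall mean across all assessors. The assessor names, scores, and the flagging threshold here are all invented for illustration.

```python
# Hypothetical sketch of spotting assessors who score high or low:
# compare each assessor's mean against the overall mean. Data is invented.

from statistics import mean

scores_by_assessor = {
    "assessor_a": [85, 90, 88, 92],   # tends to score high
    "assessor_b": [70, 72, 68, 74],   # "harder" on subjects
    "assessor_c": [79, 81, 80, 78],
}

overall = mean(s for scores in scores_by_assessor.values() for s in scores)

for name, scores in scores_by_assessor.items():
    bias = mean(scores) - overall
    if abs(bias) > 5:  # arbitrary threshold for this sketch
        print(f"{name}: mean differs from overall by {bias:+.1f} points")
```

Flagged assessors aren't necessarily wrong – but knowing who scores high and who scores low lets you refine the tool and calibrate before scores are shared with students.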

“If you tell someone they failed, that means they didn’t try.”

                Instructors continue to work – chalk it up as misunderstanding or an error in teaching

 

Do we want them to embrace technology, or become skilled?

                We do want some retention through training.

What have they accomplished with this knowledge?

 

Florida - 30 day follow-up afterwards

                Seniors still love to e-mail and stay on top of things, instructors available for questions

 

How diplomatic should we be with our instructors? If we want to impose tests, evaluations, assignments, and know it’s important for the program, it’s something that must be consistent across the board.

 

Ethics with clients, respecting privacy, dealing with special populations: Ask the right questions, in the right way!

 

 

 

Human Subjects Response Letter:

                Is the information we’re collecting enough and relevant enough for an application to IRB?

Difference between research and evaluation

Research – contributing to the general knowledge of the population

Evaluation – take a look at what we've done and how it has affected our population

NTIA does not create the definitions – follow the NIH ones

                TOP – evaluation & research required

                                Award conditional to the “Human Subjects Protection Code”

                                If not doing research – doing OK

                                If doing research – either claim exempt or request IRB review

All of this has to do with “research” conducted by funds from the Federal government

 

A lot of times students come out knowing only a linear method of doing things – the way you showed it to them. When put in a novel situation they tend to struggle because they don't have the specific steps you laid out for them. Need to be able to evaluate and train to prevent this.

 

Control groups – Can we measure the difference the instructor is making?

                Have they done other things on their own that weren't shown in the class?

                Try to survey the community and separate out who has taken the program & hasn’t

                                Matched pair – statistical analysis – comparison groups
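The matched-pair idea above can be sketched as: pair each program participant with a surveyed community member of the same demographics who did not take the program, then compare outcomes within pairs. This is an illustrative sketch only – the records, the matching fields, and the "outcome" scores are all invented, and a real analysis would use larger samples and a significance test.

```python
# Hypothetical matched-pair comparison sketch: match participants to
# non-participants on demographics, then average within-pair outcome
# differences. All records and scores are invented for illustration.

from statistics import mean

participants = [
    {"age_group": "senior", "education": "hs", "outcome": 8},
    {"age_group": "adult", "education": "college", "outcome": 9},
]
non_participants = [
    {"age_group": "senior", "education": "hs", "outcome": 5},
    {"age_group": "adult", "education": "college", "outcome": 7},
]

def match_key(person):
    """Demographic fields used to pair comparable people."""
    return (person["age_group"], person["education"])

pool = {match_key(p): p for p in non_participants}
diffs = [
    p["outcome"] - pool[match_key(p)]["outcome"]
    for p in participants
    if match_key(p) in pool
]
print(mean(diffs))  # prints 2.5 – average within-pair difference
```

The within-pair difference, rather than a raw comparison of group averages, is what lets you argue the instructor – not demographics – made the difference.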

Improving the program – more targeted to generate the highest impact

                Experiment?

                                Evaluate on the household and individual level

Target specific populations – use a controlled environment, in different regions, but the same demographics

Test using inputs and the outcomes of how that input affected the behavior

Discover similarities and differences

Evaluation should be scaled based on the size of the program – When is good enough?

                “It’s better to be partly right than totally ignorant.”

Is our program meeting the needs of our clients?

                Don’t overdo it?

 

Master the storytelling of the contribution. Are they getting it?

Testimonials

Case stories vs. case studies

 

Follow-up – they soon discover what they need to know, after exploring

Connection confirmation e-mail – What did I learn? How did it change my life?

 

Indicator: Time – productivity & cost savings

                How does the change in activity change how much time is spent on activities?