
SBA and PCC Training Program Evaluation Designs

Page history last edited by Pahniti Tom Tosuksri 10 years, 7 months ago

Description: Both Sustainable Broadband Adoption programs and Public Computing Center programs have training components. Knowing that evaluation of the training can impact future funding, how are training programs being evaluated? How is your training program being evaluated? Samantha will share a PCC evaluation design and guide the discussion. Don't forget to take notes below!


Time: Tuesday 9:45-10:45


Location: Allen Room


Facilitator: Samantha Becker, University of Washington


Notes: What do you want to measure?

Are they satisfied with your training?

Have they acquired the skills they expected at the beginning of training?

How can we measure those metrics?


For example…

Can you save or open a file from a file folder?


The point:

Can the student eventually go out on their own and use a computer comfortably to complete the tasks they need to in a “novel situation.”

Unless you mention it from the get-go, it's not a good idea to conduct testing on clients.

How do they feel when they have their skill level assessed? Something to think about.


Session is to discuss how we design a program with these things in mind.

Floor is open – issues that are important.


First question: Looking for templates to evaluate an SBA's meaningful use.

Posted on the web and on a previous WIKI

http://tascha.uw.edu/usimpact (CCN, CTOP, PL studies)

Good to use a consistent document so we can have the same metrics to evaluate across different programs.


Francine - What do we hope to accomplish, what do we hope to say, at the end of the program?


What topics are covered:

Financial literacy (go to the banks, deals, finance rates)

Basic computing/internet

Workforce development

Continuing Education


Financial Literacy metrics

                Want to use it

                Want to generate an Outcome

                                Bounce fewer checks – save money paying bills online

Measurable – pre & post assessment

Good to keep it anonymous – students aim to please, so they may be dishonest if they know you’re watching

                Embarrassment, etc.

Be mindful of confidentiality – not saving passwords and looking, etc. – verbal and written feedback is the best way. Be mindful of verbal communication in a classroom setting.


Difficult to document that they’ve attained knowledge

                How do they demonstrate?




Houston:             Basic literacy program – 100 contact hours

                                Class includes exercises with online tutors for support

                                                Must complete exercises/assessments to pass the course

                                                Less lecture – more hands-on

                                End of class – assessment “What did you know?” exit survey

                                70-75% completion rate (the course is pass/fail)

                Adults, low education level, immigrants

                                Intake session / orientation – introducing concepts & vocab

                16 weeks to grasp the material, with open labs for extra work

What is the right question to ask?

                Do you know how to use a computer (not specific enough)

                Do you know what a keyboard is? (not in depth enough)

Do you know what a keyboard is meant to do? (good mix of a specific, relevant topic that gauges knowledge)


2 weeks after the last class – open labs, 1-on-1 appointments to go over skills

Checklist assessment used (client fills out on their own)


Florida – uses a point system to measure change from pre to post

                Each question is worth certain points based on skill level – all different

Has very specific questions (Where can you find the “F4” key?)

                Also gives a good idea of instructors’ teaching style, etc. and strengths and weaknesses

                Good, because students may not have the skills but give the instructor high marks anyway
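The weighted point system described here can be sketched in a few lines. This is a minimal, hypothetical sketch: the question names, weights, and answers below are invented for illustration and are not from the Florida program itself.

```python
# Hypothetical sketch of a weighted pre/post assessment tally, in the
# spirit of the point system described above. Question names, weights,
# and answers are invented for illustration.

# Each question is worth points reflecting its skill level.
WEIGHTS = {
    "find_f4_key": 1,          # basic hardware knowledge
    "save_file_to_folder": 2,  # basic file management
    "pay_bill_online": 3,      # applied skill
}

def score(answers):
    """Sum the weights of the correctly answered questions."""
    return sum(WEIGHTS[q] for q, correct in answers.items() if correct)

def pre_post_change(pre_answers, post_answers):
    """Change from pre- to post-assessment, measured with the same stick."""
    return score(post_answers) - score(pre_answers)

pre = {"find_f4_key": True, "save_file_to_folder": False, "pay_bill_online": False}
post = {"find_f4_key": True, "save_file_to_folder": True, "pay_bill_online": True}

print(pre_post_change(pre, post))  # gain of 5 points
```

Because every student is scored against the same weights pre and post, the gain is comparable across students and across instructors – the "measure with the same stick" concern raised below.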

Concern with a sliding scale?

                Reliability – must “measure with the same stick” to allow us to assess properly

                                Especially if scores are shared with students – we have to be fair

                Refine tools and assessors; use multiple assessors

Discover assessors that may have a tendency to score higher to begin with, and those that are “harder” on their subjects

“If you tell someone they failed, that means they didn’t try.”

                Instructors continue to work – chalk it up as misunderstanding or an error in teaching


Do we want them to embrace technology, or become skilled?

                We do want some retention through training.

What have they accomplished with this knowledge?


Florida - 30 day follow-up afterwards

                Seniors still love to e-mail and stay on top of things, instructors available for questions


How diplomatic should we be with our instructors? If we want to impose tests, evaluations, assignments, and know it’s important for the program, it’s something that must be consistent across the board.


Ethics with clients, respecting privacy, dealing with special populations: Ask the right questions, in the right way!




Human Subjects Response Letter:

                Is the information we’re collecting enough and relevant enough for an application to IRB?

Difference between research and evaluation

Research – contributing to the general knowledge of the population

Evaluation – take a look at what we’ve done and how it has affected our population

NTIA does not create the definitions – follow the NIH ones

                TOP – evaluation & research required

                                Award conditional to the “Human Subjects Protection Code”

                                If not doing research – doing OK

                                If doing research – either claim exempt or request IRB review

All of this has to do with “research” conducted with funds from the Federal government


A lot of times students come out knowing only a linear method of doing things – the way you showed it to them. When put in a novel situation they tend to struggle because they don’t have the specific steps you laid out for them. We need to evaluate and train in ways that prevent this.


Control groups – Can we measure the difference the instructor is making?

                Have they done other things on their own that weren’t shown in the class?

                Try to survey the community and separate out who has taken the program & hasn’t

                                Matched pairs – statistical analysis – comparison groups

Improving the program – more targeted to generate the highest impact


                                Evaluate on the household and individual level

Target specific populations – use a controlled environment, in different regions, but the same demographics

Test using inputs and outcomes – how did that input affect behavior?

Discover similarities and differences
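The matched-pair idea above can be sketched as: pair each program participant with a non-participant of similar demographics, then compare outcome changes within pairs. All field names and data below are hypothetical, invented purely to illustrate the technique.

```python
# Hypothetical sketch of a matched-pair comparison between program
# participants and non-participants. Field names and data are invented.

def match_pairs(participants, non_participants, keys=("age_group", "region")):
    """Greedily pair each participant with a demographically similar control."""
    pool = list(non_participants)
    pairs = []
    for p in participants:
        for c in pool:
            if all(p[k] == c[k] for k in keys):
                pairs.append((p, c))
                pool.remove(c)  # each control is used at most once
                break
    return pairs

def mean_paired_difference(pairs, outcome="skill_gain"):
    """Average participant-minus-control outcome difference across pairs."""
    diffs = [p[outcome] - c[outcome] for p, c in pairs]
    return sum(diffs) / len(diffs)

participants = [
    {"age_group": "senior", "region": "north", "skill_gain": 4},
    {"age_group": "adult", "region": "south", "skill_gain": 3},
]
controls = [
    {"age_group": "adult", "region": "south", "skill_gain": 1},
    {"age_group": "senior", "region": "north", "skill_gain": 0},
]

pairs = match_pairs(participants, controls)
print(mean_paired_difference(pairs))  # estimated program effect
```

Pairing on demographics before differencing is what separates the instructor's contribution from what people would have picked up on their own – the question the control-group discussion above is trying to answer.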

Evaluation should be scaled based on the size of the program – When is good enough?

                “It’s better to be partly right than totally ignorant.”

Is our program meeting the needs of our clients?

                Don’t overdo it?


Master the storytelling of the contribution. Are they getting it?


Case stories vs. case studies


Follow-up – they soon discover what they need to know, after exploring

Connection confirmation e-mail – What did I learn? How did it change my life?


Indicator: Time – productivity & cost savings

                How does the change in activity change how much time is spent on activities?



Comments (8)

Arnold Redd said

at 10:08 am on Jun 28, 2011

specific clear focused questions are a must

Jemarius Moore said

at 10:22 am on Jun 28, 2011

Not every student has the same desire to learn or achieve. With that said, teachers should not be evaluated by how much information the student retains, especially for someone without a technology background or a computer to practice with at home.

Arnold Redd said

at 10:25 am on Jun 28, 2011

Also, we have a small demographic that shows up to get the kids a computer.

Pahniti Tom Tosuksri said

at 10:31 am on Jun 28, 2011

A teacher's goal should be to impart knowledge to the students. If the students do not have the desire to learn or achieve, I would question why they are in the classroom to begin with. On the other hand, the level of progress should not be held against the instructor because the classroom has students with varying learning levels and speed. But it should be considered a failure if a student with desire to learn has not learned anything.

Arnold Redd said

at 10:37 am on Jun 28, 2011

I agree, it is also my goal as an instructor to figure out how to best provide the instruction in a manner that motivates and allows the students to make

Jemarius Moore said

at 10:39 am on Jun 28, 2011

Failure on the student and not the instructor... A student with the DESIRE to learn will pay attention and ask questions about things they do not understand while in class.

Arnold Redd said

at 10:34 am on Jun 28, 2011

transference of knowledge

Pahniti Tom Tosuksri said

at 10:40 am on Jun 28, 2011

If they're there just for a computer - then we have to open their eyes to a problem they don't know they have yet with the computer. Do you want to deal with viruses? Do you want your child surfing the internet and watching inappropriate material? Do you want them in chatrooms? It's unfortunate that we'd have to use fear, but the children's safety is always something that is close to the parents, especially those trying to get a device that can help their child succeed.
