Dr. Eunice Askov’s Response to Questions from the Workplace Literacy Discussion List

In August 1997, the NIFL Workplace Listserv invited Dr. Eunice (Nickie) Askov to answer questions about workplace literacy program evaluation. Dr. Askov has ten years’ experience evaluating programs funded by the United States Department of Education National Workplace Literacy Program (NWLP). She also serves as an external evaluator of three NWLP projects.

The questions posed by the workplace literacy listserv members focused mostly on evaluation. The Q and A session has been divided into the following categories:

  • General information on workplace literacy programs and evaluation
  • Long-term authentic evaluation
  • Legal aspects of evaluation
  • Evaluating workplace literacy materials
  • Participatory workplace evaluation
  • Qualifications for being an outside evaluator
  • References about workplace literacy program evaluation

Inviting the listserv to participate (Barbara Van Horn)

Dear subscribers,

Our guest speaker, Dr. Nickie Askov, will be available all this week to answer your questions on evaluating workplace literacy/workforce education programs. Don't be shy about asking -- we are looking forward to hearing your questions and concerns about program evaluation!

Dr. Askov is considering the questions and will reply throughout the week.

Response (Dr. Eunice Askov)

Dear NIFL-Workplace Listserv:

Thanks for your terrific and challenging questions! I am really glad that you all are thinking in depth and showing lots of experience in the issues around evaluation of workplace literacy programs. My hope is that you all will contribute to the answers to these questions. I will take a stab at answering them, but please chime in with your thoughts. My answers are based on experience over the past ten years of evaluating programs funded by the US Department of Education's National Workplace Literacy Program. I currently serve as external evaluator of 3 NWLP projects--Community College Board of Colorado ("front range" community colleges), Wisconsin's Technical and Adult Education Board (including many of the 2-year postsecondary institutions in Wis.), and College of Lake County (north of Chicago).

On developing a comprehensive workplace evaluation plan

1. I would appreciate comments re: evaluating my workplace (preparation) program. I administer (and teach) "workplace know-how skills" running the full gamut from reading to math to computer literacy, writing, listening, problem solving, etc. -- preparing adults with disabilities and/or disadvantaging conditions for employment. I would like to form an advisory council to assist in this process, as well as in program promotion, fund seeking, etc.

Response (Dr. Eunice Askov)

Thanks for your question about how to evaluate the "full gamut" of your programs. My question to you is: Who needs the information and why? That will shape your evaluation. If you need evaluation data to demonstrate program effectiveness to funding agents, then you need to involve them in determining the types of data that would be most useful. If you are working with a company delivering workplace literacy, then obviously the company becomes an important stakeholder. An advisory council for your program would be very helpful, especially in being responsive to community needs as well as in developing multiple funding sources. We rely heavily on stakeholder groups (which is really an advisory council) in designing training evaluation for the workplace. (See my response to Paul Jurmo for the name and ordering info. of our manual.) If you would like to ask a more specific evaluation question, go ahead. Thanks. (August 5, 1997 answer to J.M.L. Fulton)

2. Could you provide us with a general overview of your approach to workplace literacy evaluation, then provide answers to the specific questions proffered?

Response (Dr. Eunice Askov)

First, let me describe my approach to workplace literacy evaluation that I am using in the 3 NWLP projects where I serve as external evaluator. I follow Kirkpatrick's four levels of evaluation because they seem to work and get at essential information for summative evaluation. (Reference to follow.) I also do formative evaluation by asking the same questions in interviews over the 3-year time period so I can provide useful feedback for program improvement.

The first level of Kirkpatrick's "hierarchy" is satisfaction on the part of all stakeholders--workers, supervisors, training director, management, union, educational provider, instructor, etc. Usually this is measured with a brief questionnaire by the provider or company. I also do in-depth structured interviews to get an objective look at satisfaction as well as suggestions for improvement.

The second Kirkpatrick level is mastery of the training content. The provider needs to demonstrate that the workers have learned the content that has been taught. To me, this seems to be the point of the program! (If you can't demonstrate gains in relevant basic skills in the work context, you are in trouble!)

The third level is transfer to the workplace. Here you look for carryover of instruction into the daily jobs. This can be measured by supervisor and worker interviews (which I do as external evaluator) or questionnaires. One of my evaluation sites--Community College Board of Colorado--has done a lot with this by having workers write down on a card what they have learned at the end of each class. They are then instructed to note on the same card when they apply the skill in their jobs. The cards are then brought back to the instructor, who collects them as part of project evaluation. This approach encourages workers to think about application, which encourages transfer; it also demonstrates that the classroom instruction is being applied in the workplace. (Mary Gershwin has presented this approach at several workplace literacy conferences.)

The fourth level is impact on the organization--bottom-line results. The company needs to identify what their concern is--why they want a workplace literacy program. That's the clue as to what they should measure. Note: They have to gather this type of data. Providers simply don't have access to this type of info. It is best to use data that they already collect--you are more likely to get access to that than if you ask them to collect something new.
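To make the four levels concrete, here is a minimal sketch of an evaluation plan organized the way the response above describes. It is illustrative only: the level names follow Kirkpatrick, but the instruments and data sources listed are example entries drawn from the discussion, not a prescribed set, and the Python structure is just one convenient way to lay the plan out.

    # A minimal sketch of a Kirkpatrick-style data-collection plan.
    # Level names follow the response above; the instrument lists are
    # illustrative examples, not a required or complete set.
    KIRKPATRICK_PLAN = {
        1: {"level": "Reaction (stakeholder satisfaction)",
            "source": "workers, supervisors, management, union, provider",
            "instruments": ["brief questionnaire", "structured interviews"]},
        2: {"level": "Learning (mastery of training content)",
            "source": "workers",
            "instruments": ["pre/post competency-based assessments"]},
        3: {"level": "Transfer to the workplace",
            "source": "workers and supervisors",
            "instruments": ["interviews", "questionnaires",
                            "application cards collected by the instructor"]},
        4: {"level": "Organizational impact (bottom line)",
            "source": "company records (only the company holds these)",
            "instruments": ["data the company already collects, e.g. "
                            "scrap rates, downtime, safety incidents"]},
    }

    for num in sorted(KIRKPATRICK_PLAN):
        spec = KIRKPATRICK_PLAN[num]
        print(f"Level {num}: {spec['level']}")
        print(f"  data from:   {spec['source']}")
        print(f"  instruments: {', '.join(spec['instruments'])}")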

3. What range of evaluative methods are currently being used for workplace programs across the country?

Response (Dr. Askov)

I really don’t know. I do know that there is a tremendous range of approaches being used in the NWLP projects. Almost any text on evaluation will outline methods ranging from highly prescribed to participatory evaluations. The approach you use may depend on your philosophy. I lean toward the eclectic with a participatory bent!

4. With a variety of businesses interested in workplace literacy/training programs, is there general baseline data that should be collected and monitored for literacy program evaluation?

Response (Dr. Askov)

I find Kirkpatrick’s levels of evaluation very useful (See above). You, of course, must collect information to demonstrate basic skills gains. What is transferred to the workplace and bottom-line measures will depend on the needs of the specific workplace. Find out where they “hurt” and measure that.

5. What types of pre- and post-testing are businesses interested in giving as part of a program (i.e., competency-based assessments, standardized tests, other)?

Response (Dr. Askov)

Businesses don’t usually know how to assess workplace literacy skills pre- and post-; they rely on educational providers to do that. My preference is for competency-based assessments that measure basic skills in the specific workplace context. That usually means that you have to get into test development. Using standardized tests pre- and post- usually doesn’t work since they are not sensitive to the curriculum you are delivering. You want to be able to show mastery and gains in basic skills. You might be interested in our discussions of assessment issues in our most recent publication from the Institute for the Study of Adult Literacy, which gets into the pros and cons of various types of assessments. Ordering information is available from Kitty Long at kpl1@psu.edu.

6. What does a workplace literacy program evaluation involve?

Response (Dr. Askov)

See above for my approach.

7. What types of data should be collected for literacy programs delivered in: manufacturing, service, and government settings?

Response (Dr. Askov)

I assume you mean bottom-line data. Again, finding out what the company wants in almost any setting (whether it’s the number of widgets, hamburgers, or customer calls received) is a major issue along with product quality. Safety is particularly important in manufacturing since it affects bottom-line results as well as their workers’ well-being.

8. What procedures would you use to analyze the data?

Response (Dr. Askov)

I'm not sure there is a generic answer to this question. If you have the capability to perform statistical tests to measure pre- and post-test gains with t-tests, good. If not, perhaps you can work with a graduate student from a university to get help with statistics. However, the assumptions behind most statistical tests are difficult to meet in a workplace program. The optimum is having a randomly selected control group or comparison group which receives no training/treatment or the usual training. If you can set up that situation, then measure both groups pre- and post- with tests that will reflect mastery of the training.
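As an illustration of the statistics mentioned above, the sketch below runs a paired t-test on pre/post scores and an independent-samples t-test against a comparison group's gains. All the score data are hypothetical placeholders and scipy is assumed to be available; with real program data you would still need to check the tests' assumptions (normality, independence), which, as noted, are hard to meet in a workplace program.

    # Illustrative pre/post gains analysis; all scores are hypothetical.
    from scipy import stats

    pre  = [42, 55, 38, 61, 47, 50, 44, 58]   # workers' pre-training scores
    post = [51, 60, 45, 66, 52, 58, 47, 64]   # same workers, post-training

    # Paired t-test: did the trained group gain significantly?
    t, p = stats.ttest_rel(post, pre)
    print(f"paired t = {t:.2f}, p = {p:.4f}")

    # With a comparison group that received no training, compare gain scores
    # with an independent-samples t-test instead.
    gains_trained = [b - a for a, b in zip(pre, post)]
    gains_control = [2, -1, 3, 0, 1, 2, -2, 1]  # hypothetical control gains
    t2, p2 = stats.ttest_ind(gains_trained, gains_control)
    print(f"independent t = {t2:.2f}, p = {p2:.4f}")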

On long-term authentic workplace evaluation

1. I work with the Workplace Learning Resource Center, part of San Diego Community College District. We contract with companies to provide basic skills instruction in the workplace. I would like to find hard data re: how basic skills improvement affects the bottom line, to assist us in marketing these classes to industry. We do a good job of assessing learner progress, collecting anecdotal evidence of positive changes in the workplace, and demonstrating immediate gains. We haven't been as successful in documenting more long-term, concrete changes, things like "fewer defective parts" or "less down-time of machinery" or "more effective participation in team meetings", mostly because we aren't there any more after the course is over to collect the data, follow up, etc. We are focused on the next project, and the last one gets lost, and the company as well has moved on to the next thing. Suggestions? Experience? Assessment tools? Examples of data?

Response (Dr. Askov)

Marian raises an excellent issue that all of us who operate on "soft" money face. How can you take time to do an evaluation which is not funded as part of a grant? On the other hand, how can you NOT take time to do an evaluation which is going to provide evidence of effectiveness for future contracts? She is absolutely right in pointing out that the best way to "market" services is to demonstrate a past record of accomplishments.

Tom Sticht pointed out in a paper on workplace literacy evaluation written for the US Department of Education in the late 80's that educational programs should not be accountable for producing "bottom-line" gains. He stated that the purpose of workplace literacy programs is to improve basic skills. If the program leads to better functioning in the workplace, and to bottom-line improvements, then great! (We need to remember that perspective! We can't be remedies for all bottom-line difficulties!) However, companies that are engaging providers may be looking for a more direct payoff. My experience is that companies enter into a contract with an educational provider for the purpose of improving their so-called bottom line. However, once they see the effectiveness of the program demonstrated through their workers (through improved morale, self-esteem, attitudes, work behaviors, etc.), they often stop asking for bottom-line proof. They are more likely to point to changes in individual workers as evidence of program effectiveness. However, top managers on up the line, usually at other locations, do question the cost effectiveness of workplace literacy programs, so the question remains a good one.

The only way that an educational provider can gather bottom-line evaluation data on effectiveness is with the cooperation of the company. We do not have access to their records! Somehow, you need to get company buy-in on data collection so that you can show impact. In fact, they need to do the data collection. (Find out where they "hurt"--where their bottom-line problems are. Make that the focus of the evaluation.) Because it is difficult to get companies to gather data for you (they may be "too busy with production" or unwilling to share confidential info.), most providers rely on testimonials of CEOs or training directors. This approach is often effective if you have a convert who is willing to speak out at business roundtables, etc. What are the rest of you doing? The NWLP projects offer experience in funded evaluation efforts. (I'll describe my approach in another message. This is long enough!) Please share ideas. Thanks.

On legal aspects of workplace evaluation

1. What legal issues have been encountered when evaluating job-literacy requirements? FYI: we evaluated a particular category of workers for reading skills, and those who passed at a certain level demanded (and won) a higher grade of pay -- they were not HIRED to read at a specific level. (We're union.) Can a person be fired due to lack of literacy skills? In a complex manufacturing setting, what do you feel realistic base reading grade levels should be? Can you recommend any studies that show concrete proof that higher reading levels equate to a better (more productive) employee?

Response (Dr. Eunice Askov)

Peg's questions take us in a slightly different direction from program evaluation. My suggestion is that you hunt up this reference for guidance on legal issues: Arthur, Diane (1994). "An Employer's Guide to Policies and Practices: Workplace Testing". New York: Amacom. It is available from the American Management Assn., 135 West 50th Street, New York, NY 10020. Try your library first--it's not cheap! As to your question about reading grade levels for advanced manufacturing jobs, I avoid using grade levels since they are meaningless for adults, especially in a workplace where content knowledge and experiential background are so important. Any answer I would give would really not be accurate or meaningful. I also can't provide "concrete proof" that higher reading levels translate into better worker productivity. It makes sense that higher reading capabilities would enable a worker to deal with the literacy demands of the job, but there are many other factors as well, such as attitudes, work habits, technical knowledge and experience, etc. We do know that higher education levels translate into better salaries, but I'm not sure that is what you are asking. Any thoughts from others? Thanks.

On workplace education material evaluation

1. Do you recommend evaluating the reading ease or difficulty of on-the-job printed materials in every workplace? Telling people, "You'll have to learn to read well enough to understand your on-the-job materials" doesn't seem to be the only answer. I think we need to look at the difficulty of the materials and help the writers learn some new skills, too. Several HR materials I've analyzed were written at graduate-school level, poorly organized, and badly in need of some graphic design and layout changes. How do you recommend evaluating the materials and their characteristics? Then what do you recommend be done about making them more reader-friendly?

Response (Dr. Eunice Askov)

Here are my thoughts on Audrey's concerns. She is absolutely right about workplace materials. They are often obscurely written by engineers (or someone far away from the frontline workers). You might do as she suggests--apply a readability formula (Fry Graph or whatever) to a sample of materials just to show the company that the workers will have difficulty reading the required materials. (You need to remember, however, that workers who are familiar with the company can read materials at levels higher than predicted by a readability formula because they are familiar with the content.) Often workplace literacy programs become involved in writing a "translation" of technical manuals as part of the contracted training services. The key to success in rewriting materials is working closely with the training director so that your version of the materials is not only readable but accurate! Don't forget to pilot test the rewritten materials on new and experienced workers to be sure that they do work! Because we lack the technical background, we may omit or distort important material that workers need to know. We also need to identify the essential workplace vocabulary (including abbreviations) and teach those words as part of the rewritten materials rather than avoid terms just because they may elevate a readability formula. So, while it is useful to use a readability formula to demonstrate need for revised materials and training for a company, don't rely on a formula too heavily for your own rewriting of materials. You may also be able to identify some audiovisual training materials that could replace some of the company's written training materials. The point to keep in mind is this: Workers really need to be able to interact with training materials in some way, such as by computer-assisted instruction, role play, workbook activities, etc. Simply reading materials (whether in their original or rewritten forms) is usually not effective for learning. Audrey, please share your experiences. Others also. Thanks.
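The Fry Graph mentioned above is read off a chart, so as a formula-based stand-in, the sketch below estimates a Flesch-Kincaid grade level, another widely used readability measure. The syllable counter is a rough vowel-group heuristic and the sample sentence is invented, so treat the output as a ballpark figure for demonstrating difficulty to a company, not as a precise score.

    import re

    def count_syllables(word):
        # Rough heuristic: count groups of consecutive vowels (min. 1).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        # Flesch-Kincaid grade level:
        # 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        n = max(1, len(words))
        return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

    # Invented sample in the style of a technical manual sentence.
    sample = ("Lockout procedures must be verified by the designated "
              "operator before maintenance personnel commence disassembly.")
    print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")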

On participatory workplace evaluation

1. I'm hoping it's not too late to add a question for Dr. Askov... If not, it would be great to hear her thoughts about what *participatory* evaluation has looked like or could look like in workplace literacy evaluation. Are there any classic examples in circulation or in process?

Response (Don Cichon)

I'm just winding up a participatory evaluation of a 3-year workplace ed program across 7 companies in NY State. We'll have draft reports ready in early September. If you send me your mailing address, I'd be glad to get you copies, then "talk" if you're interested.

Response (Paul Jurmo)

In 1993 and 1994, the National Institute for Literacy funded a one-year project in which a team of researchers developed a "participatory" (a.k.a. "collaborative" or "team-based") approach to workplace education evaluation. The results and methodology were summarized in a series of case studies from the seven workplace education program sites which participated in the study, along with a guidebook which outlined the evaluation process used.

These documents (authored by Laura Sperazi and Paul Jurmo) are available from Learning Partnerships at the address below. (A third document by Paul Jurmo -- an overview of issues and new directions for workplace education evaluation, based on a literature review and interviews -- is also available.)

This collaborative approach to evaluation grew out of an earlier evaluation project funded by the Mass. Workplace Education Initiative. This evaluation approach was later built on in a series of guidebooks issued by ABC CANADA (416-442-2292) which laid out principles and practices of a collaborative approach to workplace needs assessment, evaluation, and curriculum.

Underlying this approach to workplace education and evaluation (whether called "participatory," "collaborative," or "team") is the notion that key program stakeholders can build a deeper understanding of the potential and outcomes of workplace education if they take the time to reflect on program goals and monitor what the education effort is achieving. By working in a planning and evaluation team, stakeholders can also build collaborative planning mechanisms consistent with a more participatory workplace.

This model has been tried out more recently in the 3-year federal workplace education program in NY State which Don Cichon refers to above. This and other experience using this model (in the US and Canada) show that, although a collaborative approach to workplace education has significant potential, it also requires time, common values, and expertise, assets which busy stakeholders often find hard to come by.

Response (Dr. Eunice Askov)

Paul, thanks for chiming in. I was going to reference your materials--I just had to get the correct citations with me at the computer! You did a great job of describing the participatory approach. We have followed a similar approach--in which the stakeholder group really takes charge of the evaluation--in a manual created as part of the National Workforce Assistance Collaborative (NWAC), funded by the US Department of Labor to the National Alliance of Business (NAB). [Askov, Eunice N., Hoops, John, & Alamprese, Judith (1997). "Assessing the Value of Workforce Training". Washington, DC: National Alliance of Business.] You can order the guide for about $7 from NAB at 1201 New York Ave, NW, Suite 700, Washington, DC 20005, or call 1-800-787-2848. You will also be able to download it from NWAC's web site (http://www.ed.psu.edu/nwac/index.html). Don't look for it there yet, but you might find other helpful reports on workforce and workplace training and literacy which are already in place. Thanks.

On outside evaluator qualifications for workplace evaluation

1. This is a slightly different type of question concerning evaluation. If you want to hire an external evaluator to assist in evaluating a workplace literacy program, how would you identify a qualified evaluator (i.e., what qualifications should the individual possess) and how do you determine a fair amount to pay the evaluator (and/or what should the evaluator be expected to do as part of the evaluation)?

Response (Dr. Eunice Askov)

This is a good question. Of course, you want to consider a person's formal training and experience. Experience in and knowledge of the literacy field are essential. (Some evaluators say that this is not necessary, but I don't agree. An evaluator needs a sense of the adult literacy delivery system, how instruction is carried out, the nature of the students, etc.) It is important to interview your potential evaluator to determine his/her philosophy toward evaluation. My recommendation is to choose someone who believes in a participatory approach so that all the stakeholders are involved. It is also important to determine the evaluator's philosophy toward assessment. For example, sometimes an evaluator believes in using only standardized tests. The program decision-makers have to decide whether or not they agree with that point of view. Then, you also have to reach agreement on the extent of involvement in the program. The evaluator should be involved upfront and throughout the project in both a formative capacity as well as a summative evaluation capacity. You need to decide what level of depth is necessary for your purposes. Finally, you need to decide why you need an external evaluator--why you can't do it yourself. If an outside point of view is essential, then go ahead. If you think you can perform the evaluation function yourself, then check out our book from the National Alliance of Business, produced as part of the National Workforce Assistance Collaborative ("Assessing the Value of Workforce Training"), which tells you how to do your own evaluation in a workplace setting. Good luck.

On references about workplace literacy program evaluation

1. What publications, journals and books are foundational to understanding workplace literacy program analysis and evaluation?

Response (Dr. Eunice Askov)

I promised you the reference that I use in my evaluation plan as an external evaluator for NWLP projects. It is Kirkpatrick, Donald L. (1994), "Evaluating Training Programs: The Four Levels". San Francisco: Berrett-Koehler Publishers. (Address is 155 Montgomery St., San Francisco, CA 94104-4109. Telephone: 415-288-0260; fax: 415-362-2512.) I like Kirkpatrick's four levels because they make sense to people. While the levels are not strictly a hierarchy, they do represent a natural progression in evaluation.

To continue the listserv chat on program evaluation, it is now popular to try to figure the ROI, or Return on Investment. Jack Phillips wrote a series of articles on the subject in "Training" last spring, complete with formulas for figuring impact. Ed Gordon is also working on ROI formulas to be published shortly. It appears that this publication (see below) may be similar. If you are going that route, you may also want to take a look at Mikulecky, Larry, & Lloyd, Paul (1992), "Evaluating the Impact of Workplace Literacy Programs", available from the National Center on Adult Literacy, University of Pennsylvania, Philadelphia, PA.
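For readers heading down the ROI route, the arithmetic itself is simple; the sketch below shows the basic return-on-investment calculation of the kind popularized in Phillips's articles, with hypothetical dollar figures. The hard part in practice is isolating the program's effect and putting a credible dollar value on its benefits, not the formula.

    # Basic training ROI; both dollar figures are hypothetical placeholders.
    def roi_percent(program_benefits, program_costs):
        net_benefits = program_benefits - program_costs
        return net_benefits / program_costs * 100

    benefits = 120_000.0  # e.g., valued reductions in scrap and rework
    costs = 45_000.0      # instruction, materials, wages for class time
    print(f"ROI = {roi_percent(benefits, costs):.0f}%")  # -> ROI = 167%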


On the value of the guest speaker forum

(Carolyn Talarr) To everyone who responded to my question: Thanks so much for the wealth of references and leads on participatory workplace literacy evaluation. I'm going to take you up on your offers and *will indeed* be contacting those of you who indicated that you'd be willing to share materials. And generally I'd like to thank Barb Van Horn and Dr. Askov for this week of rich, focused information sharing. Makes me want to suggest similar events on other lists...

(Barbara Van Horn) Carolyn Talarr said that she wanted to suggest similar events (discussion with a guest speaker) on other lists. I would like to encourage subscribers who enjoyed this week's exchange to send me suggestions for other topics of interest and/or suggestions for guest speakers. I would like to do this more often!

(Josephine Kramer) Yes, Nickie and Barb, thanks for the week's exchange. It was very enlightening. It stimulated a lot of good thoughts and interaction. How about Larry Mikulecky sometime? Or Mary Gershwin (Colorado)?

(Dr. Eunice Askov) Thanks for inviting me to be the guest resource on the NIFL-Workplace listserv last week. I enjoyed thinking about and responding to the challenging questions. Your suggestion of Mary Gershwin, who is a practitioner on the front line, is a good one. The Community College Board of Colorado NWLP project (which she directs) has done some really interesting and innovative things in their project. Enjoy!