Initial Assessment and Instructional Planning
What Does Initial Assessment Look Like?
You will need a practical system for deciding who needs which assessments and what tools to use. The system below was introduced in Chapter 3 and is described in greater detail here. This model is offered as one example of how to get started with component assessment.
Once again, we advocate a thoughtful, one-step-at-a-time approach using professional wisdom to adopt or adapt this plan as your needs require and your resources allow. Then, after you have worked with it a while, evaluate your experience and decide how to expand and improve upon it.
A start-up plan for component assessment: two paths to instruction
The first three steps in the process provide important information for instructional planning. (Find details on the process further on in this chapter.)
Step 1: (For all learners)
Interview each learner at enrollment to set individual reading goals and to learn about specific reading difficulties, past educational experiences (including any special reading help), job skills, other abilities, and interests.
Step 2: (For all learners)
Administer a standardized reading comprehension test (you're probably already doing this) to get a measure of silent reading comprehension and establish a baseline for progress and outcomes measurement.
Step 3: (For all learners)
Ask each learner to read a short passage aloud as rapidly as possible, with accuracy, and count the number of words read in one minute. This sample of oral reading is a quick measure of the speed aspect of fluency. The difficulty level of the passage depends on the learner's reading ability. See the tables in this chapter for details.
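The one-minute count generalizes to readings of any length: rate equals words read correctly divided by elapsed seconds, times 60. As a minimal sketch of that arithmetic (the function name is ours, not from any published instrument):

```python
def oral_reading_rate(words_read_correctly: int, seconds: float) -> float:
    """Convert an oral reading sample to words correct per minute."""
    return words_read_correctly / seconds * 60

# A learner who reads 95 words correctly in exactly one minute:
print(oral_reading_rate(95, 60))   # 95.0 words per minute
# The same count in 45 seconds yields a faster rate (about 126.7)
print(oral_reading_rate(95, 45))
```
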
The other important purpose for assessment at this point is to identify those learners whose reading component weaknesses are severe enough to suggest the need for further assessment. You don't have time to waste (and neither do they!), so you don't want to give a complete battery of tests to those who don't need it. These first steps (described further below) act as a screening process to identify those who need further assessment.
Decision point: The results of Steps 2 and 3 identify those who should have further assessment. The cut-off scores are 8 GE on the standardized test and 125 words per minute on the oral reading sample.
You may assume that those who score at or above the cut-off on both measures should be able to participate in group-based instruction focused primarily on meaning: building vocabulary and improving comprehension. Further assessment may be required as instruction proceeds, but is not indicated at this point.
For those who score below 8 GE or read more slowly than 125 words per minute, you should get more information to identify the specific causes for the comprehension and/or fluency problem. Is it a decoding problem, a limited vocabulary, or some combination of the two? Remember to keep all assessment data completely confidential. Oral reading and interviewing should be done in private, and all paperwork should be immediately filed in a secure place. Adult learners need to know you respect their privacy.
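The decision point above reduces to a simple rule: a learner falling below either cutoff is screened in for Steps 4 and 5. A sketch of that rule (the cutoffs come from the text; the function and parameter names are illustrative only):

```python
def needs_further_assessment(ge_score: float, words_per_minute: float,
                             ge_cutoff: float = 8.0,
                             wpm_cutoff: float = 125.0) -> bool:
    """Return True if the learner should go on to Steps 4 and 5.

    A learner is screened in by falling below EITHER cutoff:
    the 8 GE comprehension score or the 125 wpm oral reading rate.
    """
    return ge_score < ge_cutoff or words_per_minute < wpm_cutoff

# Below the comprehension cutoff despite an adequate reading rate:
print(needs_further_assessment(ge_score=6.5, words_per_minute=130))  # True
# At or above both cutoffs: proceed to meaning-focused group instruction
print(needs_further_assessment(ge_score=9.0, words_per_minute=140))  # False
```
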
Step 4: (For those who need further assessment)
Administer a decoding and/or a word-recognition test to identify problems at the word level: decoding skills and recognizing high-frequency words on sight.
Step 5: (For those who need further assessment)
Administer an oral vocabulary test to see if (or to what extent) the problem is related to limited knowledge of word meanings and to get a measure of vocabulary not influenced by reading ability--a sense of reading potential.
For details on the process and examples of tests, see the following tables.
Initial Assessment: A Start-up Plan (First steps for all learners)
Step 1. Interview
If you currently collect only basic demographic information and educational goals, expand your interview to include information on native language and level of native-language literacy/education, as well as information that will help you to target reading instruction, such as presence of vision or hearing problems, history of reading problems or special help in school, and self-assessment of specific reading difficulties. You may download a questionnaire from the website Assessment Strategies and Reading Profiles: Adult Reading Components Study (ARCS) (National Institute for Literacy) at www.nifl.gov/readingprofiles/PF_Learner_Questionnaire.htm
Step 2. Standardized test
Follow test administration directions. Explain the purpose of the test, time limits, and recommendations about guessing. Give the practice test or provide practice items. Administer the appropriate test level, based on a "locator" test or publisher recommendations.
Step 3. Oral reading sample
For detailed directions on timed oral reading assessment see "Calculating Oral Reading Rate" on the website, "Assessment Strategies and Reading Profiles: The Adult Reading Components Study (ARCS)" at www.nifl.gov/readingprofiles/FT_Match_Intro.htm
A fluency test is another option; see the tables in this chapter for examples.
Initial Assessment: A Start-up Plan (Next steps for those who require further assessment)
Step 4. Decoding or word-recognition test
Spelling test (optional)
Spelling ability is closely related to word-reading ability because both require phonemic awareness, decoding skills, and visual memory (National Institute for Literacy, ARCS website, www.nifl.gov/readingprofiles/FT_Match_Intro.htm). Practically speaking, you may find it easy to introduce a spelling test because many adults include improved spelling among their learning goals.
Step 5. Oral vocabulary test
You will need training to properly administer any of these tests.
Diagnostic Assessments of Reading. Another option for initial assessment is to give one or more of the subtests of the Diagnostic Assessments of Reading (DAR) (Roswell & Chall, 1992). This comprehensive instrument includes tests of word analysis (decoding), word recognition, spelling, oral passage reading, comprehension, and oral vocabulary.
Informal reading inventories (IRIs). IRIs also provide component assessment: word identification, oral passage reading, and comprehension, and some assess decoding and spelling, too. These instruments may yield good information for instruction, but unlike more formal tests, many have not been validated.
You might find the form below useful in keeping a record of the initial assessment process for each adult learner.
Initial Assessment Record
How Would Initial Assessment Work in Programs?
Following are examples of the process for three fictional students.
Stephanie and Rob
On a Monday in early September, Stephanie and Rob walked into Old Town Learning Center to enroll in the adult education program. They filled out their enrollment forms and waited with the other new students to be called for an interview. The three adult education teachers who work at the Center conducted the individual interviews, each of which took 15-20 minutes.
Stephanie's interview. Stephanie told the teacher she liked the kids and teachers but didn't do well at school and had to repeat two grades. She also mentioned being ill frequently as a child and missing a lot of school. She gave up and dropped out before finishing the seventh grade. She explained that she had always been employed and enjoyed being a waitress because she liked talking to people. She was also interested in helping people and worked for a while at a nursing home. She said she mainly cleaned the rooms, but often talked to the residents and tried to be helpful. She eventually quit that job because she needed more money and, including tips, she earned more at the restaurant. Now she has learned that the community college has a certificate program for nursing assistants, and she would like to give it a try. She said she has always liked medical shows on TV, and thinks she would enjoy the hospital atmosphere. When asked about her other interests she mentioned baby-sitting, working in her parents' garden, singing in the church choir, and getting together with friends. She admitted she sometimes can't read the words in the church music as quickly as the others around her, so she always takes the music home, "works out" the lyrics, and then memorizes them.
Rob's interview. Rob talked about his early struggles with reading. He explained that he had special reading help in elementary school, and as a result, he was "pretty good at sounding out words." In spite of the extra help, he was always behind, had trouble getting his homework done, and never did well on tests. When asked about his reading problem, he said he wasn't sure that he still had a problem because he could read "most words." Rob talked about his job at a grocery store where he had worked for several years, stocking shelves. His supervisor thinks he's a good worker and has suggested that he could become a cashier. For that job, though, he would have to have a GED. When Rob started learning a little about the cash registers, he discovered he was fast and accurate and remembered he had always liked math much better than reading. He said his wife and parents have been encouraging him to think about a future beyond the grocery business, but he's not sure he's ready for a big change. He does want to get a GED, though, so he can at least move up in his current job. He told the teacher he is interested in playing cards, watching sports on TV, doing yard work, and playing with his 18-month-old son. He also said he is interested in history and especially likes the History Channel, although he never could learn much from the history books at school.
First steps in assessment. At the end of their interviews, the teachers explained to Stephanie and Rob that at the beginning of class on Wednesday evening, everyone would take the first of three tests: reading, math, and language. The teachers explained that the tests would identify what they knew and what they needed to learn. They reassured Stephanie and Rob that no one in the class but they and their teachers would see their test scores. The other tests were scheduled for the following week. The teachers also explained that they would be scheduling individual meetings with each new student during the first few class periods to listen to them read aloud. Rob had some questions about this. He said he could read silently but got nervous if he had to read aloud. The teacher reassured him that he would only have to read a short passage and they would be alone in the room.
Soon after assessment was completed, the teacher met with each student briefly to explain the results. She had done an analysis of the errors on their standardized tests and so was able to talk in some detail about specific strengths and weaknesses that the tests identified. For example, she told Stephanie she had done well on several items that required her to make an inference or draw a conclusion and also did a good job with the figurative language questions. (Stephanie wasn't sure what that was, so the teacher explained.) But, she said, Stephanie had some trouble with the main idea questions. She told Rob that his oral reading showed he was a careful but slow reader, and noted that he didn't make any mistakes. Although she shared the test item analysis with the students, the teacher did not give them their scores as grade equivalents. Stephanie and Rob's initial assessment results are in the table below.
As you can see, Stephanie and Rob did not meet either of the cutoffs, so they both needed further assessment (steps 4 and 5). This testing was done individually by the teacher at the center who has that responsibility. She scheduled Stephanie's testing before class on Monday of the third week of the semester. Rob's appointment was also that week during the class's individual study time.
Stephanie's next steps in assessment. Stephanie's score on the decoding test was very high, but her vocabulary was somewhat limited especially in light of her goal. The teacher suggested that Stephanie could work on science and health vocabulary to prepare for the nursing assistant program. She explained that Stephanie would probably find she could sound out a lot of the difficult science terms more easily than most people, but she did have to learn what they all meant.
Rob's next steps in assessment. Rob's decoding skills were pretty strong (as he had said they were), but his oral vocabulary score was low. The teacher told him his word knowledge might be limited because he hadn't read very much. She explained that people learn lots of new vocabulary by reading. She suggested that he should work on building fluency (especially speed) and vocabulary. She also thought he might start his study with history materials since he probably has more knowledge and a broader vocabulary in this area of interest.
Carrie
Carrie found out about the family literacy program from her social worker. She decided it would be a good way to work on her GED and get her daughter into a preschool program at the same time. She didn't want to put off enrolling in the program and was glad to learn that she could start right away even though it was November.
Carrie dropped out of school in the 11th grade when she got pregnant and always intended to go back and finish her education. Although she had a job, she wanted one that paid better and included health insurance, and she hoped a GED would qualify her for it. She could attend the program only in the mornings because she had to be at work at 1:30, but she wanted to give it a try. Carrie called the number the social worker gave her and was told to come in on Tuesday at 10:00 A.M. The family literacy program meets on Mondays, Wednesdays, and Thursdays, and intake and testing are scheduled on Tuesday mornings.
Carrie's neighbor agreed to watch her daughter so Carrie could keep her appointment at the program. She caught a bus and was only a few minutes late. After an interview that included a discussion of Carrie's high school experience, the teacher felt that Carrie was comfortable enough with standardized testing procedures to be able to begin the assessment without a lot of preparation. She administered a short test to determine which level test to give and put Carrie to work on the reading portion of the assessment battery.
The teacher suggested that Carrie should take the math test on another day, but Carrie didn't want to put it off. However, by the time she finished the reading and math tests, she felt tired and was concerned that she hadn't done as well as she might have on the math problem-solving test. She was glad she would be taking the language and oral reading tests the following Tuesday. (The program's policy is to spread out the testing to ensure that learners are able to do their best work on each test.)
The teacher explained to Carrie that she could bring her daughter and join the class the next day. She said she would assign a buddy to help Carrie get familiar with the program routines. She also said that after Carrie finished her testing she would find a few minutes during the group's independent study time to sit down with her and start working on a learning plan based on her goals and test scores.
Carrie didn't need any further assessment, so her teacher created a draft learning plan and they met to discuss her test scores and agree on the main goals of the plan.
How Do We Decide Which Assessments to Use?
Even if you decide to try a simple initial assessment process like the one described above, you will have decisions to make about tests and other assessments. And of course, you may want to think about expanding your system later on. That means you'll want to know what kind of information you need and what types of assessments will provide that information. The tables on page 113 (adapted from J. Sabatini, personal communication, July, 2004) describe the formats and types of tasks typically used to assess the reading component skills. Think carefully about the descriptions, so you will understand the logic of the assessment tasks. You can use that understanding to make decisions about choosing tests. Understanding what a test requires of a learner also helps you to make a reasonable interpretation of the results.
Other (local) factors in decision making
When you can talk about what you are trying to measure and have an idea of the types of instruments that exist, you're about halfway to the point of taking action. But you still need to consider other important factors before making or recommending a decision about tests.
(Source: adapted from personal communication, J. Sabatini, 2004)
In the fictional example below, Phyllis T., lead teacher at the Old Town Learning Center explains how they made decisions to expand their assessment system:
"The district decided to buy the TABE 9 & 10, so we were committed to the TABE for the next several years. We thought it would work well for accountability purposes, and we all knew how to use the diagnostic feature to get information for individual planning. Since the TABE now has a vocabulary subtest, we would get a vocabulary score in addition to silent reading comprehension. (We hope to find a way to add an oral vocabulary test next year.)
"We needed to expand our intake procedure, so we could get more information about prior educational experience and possible reading problems. We asked a group of teachers to research and present the possibilities for expanding the intake interviews. We now have a more informative interview process.
"We had never assessed fluency, so we needed to find a way to add oral reading to our assessment process. We decided as a first step in our new "reading profile system," we would use reading samples from one of the older textbook series to assess fluency. We had to train the teachers and be sure everyone was using the same passages and the same procedure, but at least we didn't have an initial outlay of cash for a test and formal training. We had to get the teachers together to discuss how to fit in this additional assessment. Since the process would only take a few minutes, we suggested that the teachers schedule three or four oral reading assessments during each class period while the rest of the learners were doing independent study or working in small groups. (Of course, if we want to track progress, we'll have to schedule ongoing assessments. How can we do that? We're not sure yet.)
"Another thing we didn't have was a way to assess decoding. We never thought we needed a test like that because we had been referring our low readers to the literacy council so they could have free private tutors. We had always just taken the adults' word for it if they said they couldn't read--or we looked at the TABE Locator Test score and referred them if they needed the Literacy-level test. We've always assumed we couldn't meet their needs in the classroom. But last year we started talking with the literacy council staff about working more closely with us, and we tried out a new collaboration this year.
"The council agreed to find three volunteers to work at our center to assist us in working with small groups and one-to-one with some of the weaker readers. In return, we provided training in the [fictional] ABC Reading System for several of our teachers and about ten of the council's volunteers. The ABC system is a structured curriculum for basic reading instruction, and we bought the training videotapes and brought in a facilitator to do a workshop on the system.
"We weren't used to working with volunteers, so it took a while to figure out how to provide the supervision and support they needed. Some of the teachers were concerned about turning over responsibility for the learners to people who might not be educated as teachers. We learned that it worked well to place the volunteers in classrooms where the teachers also had ABC training. That way, they had some knowledge in common and seemed to find it easier to work together. Then we tried (when possible) to steer the low-level readers to those classes.
"However, we had learned that decoding is an issue not only for beginning readers. We knew that some people who score in the higher ranges on the TABE might also have decoding weaknesses that could prevent them from making good reading progress.
"That meant we needed a way to assess decoding. We found a simple word analysis inventory online that had clear and detailed directions. (It's not standardized, so it wouldn't have been our first choice, but it's free and we had already committed to the costs of ABC training). Of course, we had to study it and do some training with all the teachers. It turned out that a couple of those who were most interested in reading took the lead on that.
"I think we should also investigate the standardized instruments. We would all feel more confident about the scores, and some of them assess more than one reading component, so it might be worth the investment for that reason, as well. Maybe we can put a new test in next year's budget.
"It looks like we'll have to wait till next year for oral vocabulary assessments, but maybe we could at least do a couple of "pilot" assessments this year--maybe a picture vocabulary or an expressive vocabulary test--if the district would lend us a speech-language therapist. Maybe we could refer a few students based on teacher recommendations and we could see what we learn from an oral vocabulary score.
"We are also going to check with the district office about getting a reading specialist and a speech-language therapist to talk with us about reading and vocabulary measures and making instructional decisions."
Important assessment issues
Although your setting and resources will be important factors in your decision making, remember that practical considerations alone should not drive your program plans. You want to make the best possible use of the resources you have. Most important, you should choose measures that are recognized as valid and reliable because you are going to base important decisions on the results.
Evaluating assessment instruments. Validity and reliability data may be available from publishers, but these are complicated, technical concepts, and you might want an objective opinion. Consult the Mental Measurements Yearbooks for detailed evaluations of many tests, no doubt including some of those you are considering. (Many editions of the yearbook have been published by the Buros Institute of Mental Measurements, University of Nebraska-Lincoln. To find reviews of a test, locate the appropriate edition at a college or university library or go online at www.unl.edu/buros/.) Another resource to guide your analysis is the short article "Questions to Ask When Evaluating Tests" (Rudner, 1994), available online at www.pareonline.net/getvn.asp?v=4&n=2. A third option for evaluating instruments is the "National ALLD Report Card on Screening Instruments" developed for the Bridges to Practice learning disabilities project. (The report card is specific to learning disabilities screening instruments, but many of the assessment concepts are broadly applicable. It is in Guidebook Two of the Bridges to Practice materials, available online at www.nifl.gov.)
Examining your practice. There's more to valid measurement than choosing good tests. What you and your program do with these instruments is equally important.
You must use instruments appropriately--for the purposes for which they were designed. And to get reliable and valid results, all those who administer assessments must follow the same procedures in scoring and interpreting scores. Teachers/test administrators must be trained in the proper use of each of the assessments they will administer or interpret. They should clearly understand what each instrument or procedure measures--what the scores mean and what they don't mean.
GE scores may be misleading. An adult with a test score of 6 GE, for example, does not necessarily have skills similar to a typical sixth-grader. And of course, an adult has much more knowledge and experience than a child. The advantage of grade-equivalent scores is that they allow you to compare abilities across components. This is important because different tests often have different kinds of scores and are hard to compare.
Standardized tests. Finally, you should consider for which purposes you need a standardized instrument. For outcomes measurement, especially when results will be reported to external stakeholders, you will need scores that have meaning to outsiders (scale scores or grade equivalents, for example) and may be used to make comparisons across classrooms and programs. Standardized measures that yield grade-equivalent scores also are useful in developing profiles, because they allow you to make fairly precise comparisons across the reading components, identifying relative strengths and weaknesses.
Consider all your assessment purposes as you make decisions. Be sure that your assessment system includes valid measures to address all your instructional planning and accountability needs.
Alternative assessments. And finally, remember that assessment is more than testing. You have many options for learning about adults' goals, interests, strengths, and needs. Informal measures may help to "round out" learner profiles and suggest ways to individualize instruction. The intake interview is your first opportunity to get to know the learner. Check-ins or meetings to revisit and revise goals, journal writing for self-evaluation, and of course, your observations of attention, participation, and attitudes may provide important insights for ongoing monitoring of learning.
How Does Initial Assessment Inform the Individual Planning Process?
If one of the primary purposes of initial assessment is to allow you to provide appropriate individualized instruction, you need to think next about how to use assessment results in planning. The first steps below cover the assessment-to-instruction process from initial assessment through development of learning plans. The next steps--trial lessons and revised or expanded learning plans--are described later in the chapter.
First steps in planning
In an ideal world, you would next consult the cookbook of reading instruction and find just the right recipe. Once again, though, that is not the world we live in: no such cookbook exists. You will have to analyze assessment data and use what you know about reading and learning to develop learning plans, while reserving the right to amend and adjust the plans as needed. You may consult a reading specialist in your organization or school district if you have access to such a resource. You may do a little research on your own to assess available learning opportunities (see Chapter 1, p. 5). But in the end, you will have to use professional wisdom to develop a plan and see how it works.
From profiles to planning
For examples of profiles, consult the website, Assessment Strategies and Reading Profiles: Adult Reading Components Study (ARCS) at www.nifl.gov/readingprofiles/. The ARCS study identified 11 profile clusters based on 569 ABE students (from the original study sample of 955 adult learners). The site provides details on the characteristics of these clusters or types of learners. The clusters may help you to understand that the profiles of individuals in your program represent common patterns of strengths and weaknesses. The general instructional recommendations may help you to understand how to make individual learning plans.
The profiles featured below are the top three of the 11 profiles: those with the highest silent reading comprehension scores. General instructional suggestions for two of the profiles are adapted from the ARCS website and included here as examples--not recipes!
ARCS profile samples. The GED-level learners in the ARCS study include three distinct profile clusters. Although the average silent reading comprehension score for these adults was around 11 GE, their other component skills showed great variation. We'll briefly discuss the three clusters below.
Word Meaning (Vocabulary) Enrichment:
Next steps for Stephanie & Rob
Completed profiles and learning plans for Stephanie and Rob are below. The reading profile form is adapted from one suggested in The Reading Components Approach (Strucker, 1997a). All tests and textbooks in these examples are fictional.
The sample learning plan form here is one option. Other formats for learning plans (as well as orientation and goal-setting tools) may be found in The Comprehensive Adult Education Planner (Mellard & Scanlon, 1998).
How Do Ongoing Assessment and Planning Inform Instruction?
Of course, the assessment-to-instruction process involves more than initial assessment. Initial assessment can't tell you everything you need to know about what and how to teach each individual learner. Trial lessons can give you more--and more specific--information.
Next steps in planning for individuals
What are trial lessons and how do they work? Trial lessons may serve two purposes: (1) to get specific information about learning needs that the initial assessment instrument/process often doesn't provide, and (2) to find out what kind of instruction and what types of materials are most appropriate for each individual (J. Strucker, personal communication, December 9, 2003). The examples below show how the process works. For other suggestions about trial lessons, consult publications such as Diagnostic Assessments of Reading and Trial Teaching Strategies (DARTTS) (Roswell & Chall, 1992).
Vocabulary instruction. Even though a vocabulary test may indicate that a learner needs work in this area, it doesn't tell you which words or types of words she/he doesn't understand. In trial lessons you could use the content word lists in publications such as the Reading Teacher's Book of Lists (Fry, Kress & Fountoukidis, 2000) to get a sense of vocabulary and general background knowledge in math, science, or social studies (J. Strucker, personal communication, December 9, 2003). By teaching words from these lists you may get answers to important questions:
Comprehension-strategy instruction. To improve comprehension you can use trial lessons to identify subjects the learner finds interesting or knows a lot about. Knowing about interests and background knowledge will help you to select materials for instruction. You also might have the learner think aloud during oral reading of a high-interest passage to get an idea of where the comprehension problem lies (see page 80 for more on think-alouds).
According to Strucker (personal communication, December 9, 2003), typical problems related to or contributing to poor comprehension are lack of fluency and lack of background knowledge or knowledge of word meanings. You may see evidence of these difficulties during oral reading or a think-aloud. Another common problem is imprecise understanding of the functions of "signal words" (Fry et al., 2000). Common signal words and phrases, like therefore, however, consequently, in contrast, and in other words, provide important clues to readers about the way information in one part of a text relates to another part. Introduce words from the signal words list in publications such as the Reading Teacher's Book of Lists (Fry et al.) to check on this aspect of vocabulary. In working with English language learners, you could also use trial lessons to assess their knowledge of basic English grammar and syntax (sentence structure and word order). If they don't understand the meaning of verb endings, for instance, they will have trouble with comprehension.
Instructional approaches. The examples above illustrate the first purpose of trial lessons--identifying specific gaps in skills and knowledge. You also need to know what instructional strategies to use (the second purpose), and you can use trial lessons to experiment with different approaches. For instance, in working with vocabulary, what does it take to establish the word meanings in long-term memory? Is it helpful to discuss the meanings and do a word map? Is it better to have the learner compose and write sentences using the words? Does he know how to use a dictionary? You can find out by comparing the response to different approaches--how quickly words are learned, how comfortably and accurately they are used, and how well they are later recalled--and also by asking the learner what works best.
This second purpose for trial lessons may be especially important for struggling beginners who may have learning disabilities or other special learning needs. These adults may need multi-sensory techniques and lots of practice and review. It helps to find out up front so you can plan for the additional time that will be required and help adult learners to be realistic in goal setting.
The trial-lessons concept is hard to implement in a classroom setting. If you have important questions about a particular learner, you might be able to design a trial lesson or two and have an aide or volunteer work with the learner. Or, if individual trial lessons as described above are not possible in your classroom, you might be able to adapt the concept in short trial activities or mini-lessons with small groups or pairs.
The Assessment-to-Instruction Cycle
In summary, the process described above consists of four main steps: (1) initial assessment, (2) development of an individual learning plan, (3) trial lessons, and (4) revision or expansion of the learning plan to guide instruction.
In fact, of course, it doesn't end with step 4. It's a continuing cycle in which both formal and informal assessments guide your instructional decision making. If the learner is struggling, you try to figure out where the problem lies. Have you assumed too much? Do you need to back up and work on prerequisite skills? Might it make a difference to just slow the pace, and provide more coaching and review? Or should you try another approach? On the other hand, if you and the learner can see that skills are improving and knowledge is growing, you are not inclined to "fix something that isn't broken."
Of course, to make the best use of your observations and other assessment data, you must continue your own development to add to your repertoire of instructional options and build knowledge that enriches your professional wisdom. Adjusting instruction and trying new approaches requires knowledge, skill, and creativity. But it also takes management skills to meet the varying needs of individuals in a multi-level setting.
Meeting individual needs
The process described above is based on individualized instruction, which may be difficult or impossible to provide in your classroom. In the ideal situation you would be able to devote at least part of your instructional time to this kind of one-to-one work with learners. Especially for those with serious reading needs, providing anything less--as in large-group work with adults with varied needs or the individual workbook-study format--amounts to a sacrifice of the limited time adults can give to learning (Strucker, 1997b). But if that's your situation, you can only make the best of it for the time being.
How can you meet individual needs in the multi-level group setting? There are no simple solutions, but the suggestions in the next chapter may be helpful.