From May 21 - 29, 2007, Elizabeth Greenberg was a guest
facilitator on the Poverty, Race, Women & Literacy listserv.
The focus of her discussion was the findings of the National
Assessment of Adult Literacy (NAAL) as they pertained to
gender, race, and socioeconomic status (SES).
Elizabeth Greenberg is a principal research analyst at the
American Institutes for Research (AIR), and is AIR's Project
Director for the 2008 National Assessment of Adult Literacy
(NAAL) Special Studies contract. She was also AIR's Deputy
Project Director for the 2003 NAAL Design, Analysis, and
Reporting contract. In her role as Deputy Project Director for
the 2003 NAAL, she led the development of the NAAL
background questionnaire and assessment items. She is a lead
author or co-author of several reports based on the 2003 NAAL, including A First Look at the Literacy of America's Adults in the 21st Century, The Health Literacy of America's Adults, Literacy in Everyday Life, Literacy Behind Bars, and the 2003 NAAL Public-Use Data File User's Guide. Elizabeth is also an author or co-author of several reports and articles based upon the 1992 adult literacy data, including English Literacy and Language Minorities in the United States.
Thanks to Ryan Hall, a graduate student at Georgia State University, the following represents a compilation of the various topics discussed by listserv members while Elizabeth Greenberg facilitated the discussion. Each topic contains one or more discussion threads arranged by questions and answers. All of Elizabeth's questions and comments are labeled with her name, while questions and comments from listserv members are labeled with first and last initials. Most of the postings were copied and pasted verbatim, with a few words edited here and there to facilitate reading. For complete postings, along with author information, go to the Poverty, Race, Women & Literacy Archives and look at postings between May 21 - 29, 2007.
Elizabeth: The 2003 National Assessment of Adult Literacy (NAAL) assessed the English literacy of adults in the United States for the first time since the 1992 National Adult Literacy Survey (NALS). The assessment was administered to more than 19,000 adults (ages 16 and older) in households and prisons. The assessment measured English literacy directly through tasks completed by the participants. These tasks represent a range of literacy activities that adults are likely to face in their daily lives.
Three types of literacy were measured by the assessment:
Prose Literacy. The knowledge and skills needed to search, comprehend, and use information from continuous texts. Prose examples include magazine articles, news stories, and informational brochures.
Document Literacy. The knowledge and skills needed to search, comprehend, and use information from noncontinuous texts, such as tables, graphs, and maps. Document examples include job applications, payroll forms, transportation schedules, maps, and food and drug labels.
Quantitative Literacy. The knowledge and skills needed to identify and perform computations using numbers that are embedded in printed materials. Examples include figuring out a tip, filling out an order form, and comparing costs of employee benefits.
The 2003 NAAL also measured health literacy. The health literacy tasks were organized around three domains of health and health care information and services: clinical, prevention, and navigation of the health system. All the health tasks were also prose, document, or quantitative tasks.
If you are interested, you may want to look at the following 2003 NAAL reports:
NAAL is planning to assess vocabulary as part of a special study in 2009, but there are not any plans to also measure oral reading fluency as part of that special study. Therefore, I don't think it will be possible to directly link oral reading fluency with vocabulary knowledge.
Elizabeth: Some of the older adult literacy items that were originally developed for the 1985 Young Adult Literacy Survey or for the 1992 National Adult Literacy Survey have been released and are available on the NAAL website (nces.ed.gov/naal). NCES generally releases older items that have been used in at least two assessments because they are dated and cannot be used in future assessments. They do not release most of the newer items because they try to use items in at least two assessments in order to be able to measure trend (changes in the population) between assessments. If all the items were new each time, it would not be possible to link results from one assessment to the results from the previous assessment.
JBS: The NAAL Test Questions Tool provides access to 109 questions and answers from the 1985 and 1992 assessments. We recently added a few health-related questions that you can access by entering the term "Health" in the database at http://nces.ed.gov/NAAL/SampleItems.asp?PageId=138. The 2003 NAAL questions are not available on the web. However, people who are interested in seeing them can make an appointment and review them in our office in DC.
- Between 1992 and 2003, women's average document and quantitative literacy scores increased. During the same time period, men's average document literacy score decreased and there was no statistically significant change in average quantitative literacy for men.
- Between 1992 and 2003, women's average prose literacy score stayed the same, while men's average prose literacy score decreased.
- In 2003, women had higher average prose and document literacy than men, and men had higher average quantitative literacy than women. In 1992, there was no statistically significant difference between men and women in their average prose literacy, but men had higher average document and quantitative literacy than women.
- Between 1992 and 2003, average prose, document, and quantitative literacy increased for Black adults.
- Between 1992 and 2003, average prose and document literacy decreased for Hispanic adults. Average quantitative literacy did not change for Hispanic adults. The percentage of the adult population (age 16 and older) that identified themselves as Hispanic increased from 8 percent in 1992 to 12 percent in 2003.
- Between 1992 and 2003, average prose literacy increased for Asian/Pacific Islander adults and there was no statistically significant change in average document and quantitative literacy for this group.
- Between 1992 and 2003, there was no statistically significant change in average prose and document literacy for white adults, but there was an increase in quantitative literacy.
- Among adults with Below Basic prose literacy, 26 percent lived in households with average incomes of less than $10,000 and only 7 percent lived in households with average incomes of $60,000 or greater. Among adults with Proficient prose literacy, 2 percent lived in households with average incomes of less than $10,000 and 65 percent lived in households with average incomes of $60,000 or greater.
- Higher percentages of adults with higher literacy levels than adults with lower literacy levels were employed full-time, and lower percentages were out of the labor force. Sixty-four percent of adults with Proficient prose literacy were employed full-time, compared with 29 percent of adults with Below Basic prose literacy. Eighteen percent of adults with Proficient prose literacy were not in the labor force, compared with 57 percent of adults with Below Basic prose literacy.
- The occupational groups with the highest average prose, document, and quantitative literacy scores were Professional and related and Management, Business, and Financial. The occupational groups with the lowest average prose, document, and quantitative literacy scores were Service; Farming, Fishing, and Forestry; Transportation and Material Moving; Production; and Construction and Extraction.
Adults with learning disabilities had lower average prose, document, and quantitative literacy than adults who had never been diagnosed with a learning disability. To me, some of the most striking findings are among adults with Below Basic literacy. Among adults who reported they had learning disabilities, 24 percent had Below Basic prose and document literacy and 38 percent had Below Basic quantitative literacy. Among adults who did not report they had been diagnosed or identified as having a learning disability, 13 percent had Below Basic prose literacy, 12 percent had Below Basic document literacy, and 20 percent had Below Basic quantitative literacy. The overall results and results by level are summarized in figures 2-15 and 2-16 (page 30) of the Literacy in Everyday Life report.
To focus on the systemic issue rather than on NAAL, the continued use of LD "self-report" is a disservice to people with LD. I am reminded of Ellison's Invisible Man. In this case, people with LD are invisible because they are not definitively a group.
It reminds me of the race politics that Native Americans (as well as other groups) went through. In one decade they are "prisoners of war", in the next decade they are "citizens", then "foreign nationals", then "wards of the government", then "Indians if they have the right blood quantum", then "Indians if they belong to a recognized tribe", the point being if the government can blur the group affiliation, the tribes too become invisible.
To stay with Indians a bit longer, I have seen studies that didn't report on Indians because the sample was too small. While I understand the statistical issue, I worry that this too can be complicit with efforts to obfuscate real issues. While the "n" for people with LD will always be large enough to be included in studies, policy-makers may ignore the findings if the group definition is squishy.
These questions about disabilities are included on the NAAL public-use data file, which is also available on the NAAL website, so it should be relatively easy for a secondary user to explore the relationship between these questions and literacy in more depth if anyone wants to.
It also occurred to me that some of the questions are ones found on several of the LD screening tools. One way to get around the "LD self-report" issue next time, if there is a next time, would be to select questions from one or more LD screeners that are consistent indicators of a likely LD and use them as a group as an indicator of LD. All this would do is increase the probability that the respondent might have an LD; it wouldn't be a proxy measure.
Finally, I have seen some research that indicates that adults are very accurate and honest in assessing their educational levels, so that may hold true as well for self-report about disabilities. I do have some concerns that the stigma of disabilities might affect the willingness of some students to self-report. Has anyone seen research about the self-report of disabilities?
Finally, the idea that a low test score goes hand-in-hand with LD was concerning. It might be true for a person with LD who has not been able to figure out how to work around an LD (self-accommodate), who has not been exposed to evidence-based approaches like strategy instruction, or who has not learned to compensate by using a technology.
I don't think it is accurate (or helpful) to think of LD students as bottom-tier students, as I am sure you know. The LD self-report issue rears its head here. What we may be seeing in the 6% of respondents who self-report LD are people who have been unsuccessful in school for several reasons (poor instruction, frequent absence, social promotion, etc.), one of which may have been a learning disability that was either not identified or not accommodated.
While we've talked about LD a bit, do you have any data on physical or sensory disabilities? I'd hypothesize that blind/low-vision respondents might show a very low score in document literacy even when they could use a screen reader. I haven't looked at the definition of document literacy for a while, but I'm thinking it would include the kinds of documents one encounters in real-life, like forms, which can be problematic for people using screen readers.
Did the interviewers track how many people used a specific modification? This could give some insight about how many people with LD were involved in the study. If 6% self-report LD, but if 20% ask the interviewer if they could dictate responses, it might tease out a more reliable number. It would, of course, have to be disentangled from those who asked for a modification because it was a preference or because it suited them better as a respondent, and those who can't write, and those with disabilities other than LDs that interfere with writing.
What percentage of people with various levels of literacy proficiency get some health information from the internet?
- Basic literacy skills: 42% get health info from the internet
- Intermediate literacy skills: 67% get health info from the internet
- Proficient literacy skills: 85% get health info from the internet
Does this surprise you? Do you have any questions about this for our guest facilitator, Elizabeth?
Here are two resources that World Education has developed that encourage this:
The Health & Literacy Special Collection
Family Health and Literacy
Ms. Hansen is the Lead Librarian in the South Texas Independent School District and Biblioteca Las Américas in Mercedes, Texas. Among other programs, her library sponsors the ¡VIVA! Peer Tutor Program, a student-centered project to improve community health literacy by using MedlinePlus. http://bla.stisd.net/viva.htm
Available at least partially in Spanish, Medline Plus is a service of the U.S. National Library of Medicine and the National Institutes of Health, bringing together authoritative information such as medical journal articles, extensive information about drugs, an illustrated medical encyclopedia, interactive patient tutorials, and latest health news. http://medlineplus.gov/
Although the ¡VIVA! Peer Tutor Program is targeted at high school students in this border community, the community outreach and peer tutoring aspects of this award-winning program would lend themselves to partnership with adult literacy programs.
Below Basic: indicates no more than the most basic and concrete literacy skills.
Basic: indicates skills to perform simple and everyday literacy activities.
Intermediate: indicates skills to perform moderately challenging literacy activities.
Proficient: indicates skills to perform more challenging and complex literacy activities.
I equate Below Basic to the reading and math skills of someone in the third grade. Basic would be someone with reading and math skills below seventh grade. Intermediate would be someone with the skills of a high school student. Proficient would describe the skills of a high school graduate. Note that this is not a scientific comparison but my own estimation based on my years in the education field.
Results of the health NAAL (percentage of each group at the Below Basic health literacy level):
- All adults: 14%
- Men: 16%
- Women: 12%
- Adults over 65 years of age: 29%
Note that women scored higher than men. This is not surprising since women are usually the health care providers for the entire family.
Also note that close to 60% of seniors have very limited understanding of health related print. This limitation may be related to limited vision. So, seniors may require extensive verbal support. Also, many seniors have hearing loss. This would require that health providers speak slowly, clearly and in a loud voice. Finally, seniors have declining memory and decreasing cognitive skills. Considering that the average time that doctors spend with patients nationally is seven minutes, it is doubtful that seniors get the needed support.
Percentage of each group at the Below Basic health literacy level, by race/ethnicity:
- Whites: 9%
- Blacks: 24%
- Hispanics: 41%
Note the very high percentage of Hispanics at Below Basic. This is likely because many are not native English speakers.
Based on this assessment, one third to one half of all adults do not understand written health information well, or at all. Another third understands the information better. Only 14% of all adults can understand health-related information well.
For more info go to: http://nces.ed.gov/naal/
While reports tend to present "just the facts," the questions they seek to address and the ways in which they address them are never neutral. How can, or does, anyone consider a broader analysis of, say, the instructional implications, if any, of this report?
I also wonder what measures may have been considered to ensure that there were no biases against minority groups in the items used. For example, have there been any analyses to see if there are gender differences in how people respond to items?
For my own part, I'm curious about how information gleaned from the internet is understood. Who has access to online technology? Anyone who has explored a health-related topic online has probably experienced the overwhelming number of hits and the difficulty of making sense of them. How does education, and access to language, help or hinder, and who are the people who can reasonably expect to have questions answered via online resources? How do we learn to figure these questions out and, within and around them, look at how race and class play a part in it all?
You asked about measures to ensure that there are no biases against minority groups or women/men in the items used. The NAAL staff was very concerned about this issue. As the items were developed, outside reviewers with a broad range of backgrounds (including reviewers who were personally members of minority groups as well as reviewers who professionally worked with diverse populations) were brought in to review the items and also to suggest sources for new items. All the NAAL items are based upon "real" reading materials (none of them were written just for this assessment) and the reading materials come from a wide variety of sources, including magazines and publications whose target audience is minority groups.
After the items were field tested, the NAAL technical staff looked at each item for differential item functioning (DIF). The object of DIF analysis is to identify any items on which members of a target group perform worse (or better) than their performance on the other items would suggest they should do. DIF analysis was done for racial/ethnic groups (Black, Hispanic), gender, and age (60+). Based upon the DIF analysis of the field test data, a few items were identified as unfairly advantaging or disadvantaging one or more of those groups and those items were eliminated from the final item pool.
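For readers curious about the mechanics, the Mantel-Haenszel procedure is one widely used way to run this kind of DIF screen. The sketch below is illustrative only, using made-up response records; it is not the NAAL's actual implementation, and the function and field names are hypothetical:

```python
from collections import defaultdict

def mantel_haenszel_dif(responses, item):
    """Screen one item for DIF using the Mantel-Haenszel common odds ratio.

    responses: list of dicts with keys 'group' ('ref' or 'focal'),
    'total' (the ability stratum, e.g. a total score), and an item key
    mapping to 1 (correct) or 0 (incorrect).
    Returns the common odds ratio; values far from 1.0 flag potential DIF.
    """
    num = den = 0.0
    strata = defaultdict(list)
    for r in responses:                 # condition on overall ability
        strata[r['total']].append(r)
    for people in strata.values():
        # Build the 2x2 table (group x correct/incorrect) for this stratum.
        a = sum(1 for p in people if p['group'] == 'ref' and p[item] == 1)
        b = sum(1 for p in people if p['group'] == 'ref' and p[item] == 0)
        c = sum(1 for p in people if p['group'] == 'focal' and p[item] == 1)
        d = sum(1 for p in people if p['group'] == 'focal' and p[item] == 0)
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    return num / den if den else float('nan')
```

An item whose common odds ratio sits far from 1.0 after conditioning on total score performs differently for the focal group than their overall ability would predict, and would be flagged for review, much as the field-test items described above were eliminated.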
April 9, 2007
International Consultant in Adult Education
"The NAAL, the first assessment of adult English reading and writing ability in the U.S. since 1992, estimated that 30 million people over age 16 are barely able to read and write."
Robert Wedgeworth, president and CEO of ProLiteracy Worldwide
This statement by the head of the largest adult basic education and literacy organization in the world occurs in the midst of a longer message calling for increased funding for the Adult Education and Literacy System (AELS) of the U.S. so it can serve more than the 3 million or so adults it presently serves. Yet in the past, when similar pleas have been made for increases in adult literacy funding in the wake of surveys showing 30 to 40 million adults with low literacy skills, the President and the Congress have responded with little or no increase in adult literacy education resources.
Why is this so? Is it possible that neither government officials nor anyone else, for that matter, actually believes what the National Assessment of Adult Literacy (NAAL) or its predecessor, the National Adult Literacy Survey (NALS), reports about the literacy skills of America's adults?
The latest report on adult literacy in the United States, entitled Literacy in Everyday Life (LEL) (Kutner et al., 2007), once again reports the litany of social problems that have been shown to be correlated with low cognitive skills, including low literacy, for almost a century. Adults with lower literacy skills tend to be in the lower socioeconomic classes, to be members of minority ethnic groups, and to not be native English speakers; they are more likely to be unemployed or to hold low-wage jobs, less likely to vote or participate in other civic and community activities, less likely to read to their children a lot, and more likely to be on welfare or other forms of public assistance (see Sticht & Armstrong, 1994, for a historical review of adult literacy assessments from 1917 to the present).
The LEL report presents data from the 2003 National Assessment of Adult Literacy (NAAL), which measured literacy using three scales: Prose, Document, and Quantitative, and reported results in four major skill categories: Below Basic (the lowest level), Basic, Intermediate, and Proficient. Another group, adults not proficient in English or Spanish, could not take the test; they made up about 2 percent (4 million) of adults. Some 14 percent of adults were in the Below Basic category on the Prose scale and 12 percent on the Document scale. The Quantitative scale had about 22 percent in the Below Basic category.
But does being in the lowest level of literacy, the Below Basic level, indicate that these adults can "barely read and write"? According to the LEL report, being placed in the Below Basic level "indicates no more than the most simple and concrete literacy skills." Adults at the Below Basic level range from being nonliterate in English to having the abilities listed below:
- locating easily identifiable information in short, commonplace prose texts [Prose scale]
- locating easily identifiable information and following written instructions in simple documents (e.g., charts or forms) [Document scale]
- locating numbers and using them to perform simple quantitative operations (primarily addition) when the mathematical information is very concrete and familiar." [Quantitative scale]
The foregoing indicates that adults in the Below Basic level of literacy may, in fact, be able to read and write with some real degree of skill above the ability to "barely read and write." One suggestion that adults in the Below Basic literacy level have some degree of functional literacy was the finding that only around a third (34-35 percent) of adults with Below Basic Prose and Document skills thought their reading skills limited their job opportunities "a lot." Another third (33-35 percent) thought that their reading skills limited them "some" or "a little," and a final third (32-33 percent) reported that their reading skills limited their job opportunities "not at all." So two-thirds of adults categorized as possessing Below Basic literacy skills did not seem to perceive themselves as very limited in their job opportunities due to their poor reading skills.
Why don't these adults in the Below Basic category of Prose and Document literacy, the lowest level, perceive themselves as limited in their job opportunities? There is no information explicitly given in the LEL report about this. However, there are some data in the report that may be relevant to addressing this issue.
The LEL report states that the correlations among the Prose, Document, and Quantitative scales were between +.86 and +.89, out of a perfect correlation of +1.00. This indicates that all three scales placed people in roughly the same rank order. Those who scored poorly on the Prose scale were likely to score low on the Document and Quantitative scales, those in the middle range of the Prose scale would be in the middle ranges of the Document and Quantitative scales, and those with higher Prose scores would also tend to have higher Document and Quantitative scores.
This suggests that in estimating people's literacy skills we should consider the sum of their Prose, Document, and Quantitative skills. Presumably all adults have some of each sort of literacy, yet the skills are discussed separately, one scale at a time. Persons at the Below Basic level on Prose literacy presumably also have Below Basic skills on the Document and Quantitative scales. Without adding up skills across the three scales, however, it is not certain how to characterize these people in terms of what they can actually do in the real world of literacy. They would seem to have some ability in all three domains, but exactly how these combine into an overall estimate of what Below Basic adults can do with their Prose, Document, and Quantitative literacy is not clear.
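A toy simulation (hypothetical data, not NAAL results) can illustrate why correlations near +.87 imply similar rank orders, and what a simple composite of the three scales might look like. The shared-factor model and the averaging composite here are assumptions for illustration only:

```python
import random

random.seed(0)

# A shared latent "literacy" factor plus scale-specific noise produces
# three highly correlated scale scores, much like the +.86 to +.89
# correlations the LEL report describes.
latent = [random.gauss(0, 1) for _ in range(1000)]
prose = [t + random.gauss(0, 0.4) for t in latent]
doc   = [t + random.gauss(0, 0.4) for t in latent]
quant = [t + random.gauss(0, 0.4) for t in latent]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# With this setup the pairwise correlations land near +.86, so the three
# scales rank people almost identically.
r_prose_doc = pearson(prose, doc)

# One possible overall estimate: the mean of the three scale scores.
composite = [(p + d + q) / 3 for p, d, q in zip(prose, doc, quant)]
```

The composite here is simply an average; whether an average, a sum, or some weighted combination best reflects real-world literacy demands is exactly the open question the paragraph above raises.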
It may be that the combined literacy skills across the three NAAL scales render people more capable at getting, keeping, and progressing in a job than a discussion of just one scale would suggest. In turn, this might influence people's judgments about how little or how much their literacy skills limit their job opportunities.
For policymakers in federal or state governments, low unemployment rates, below 5 percent, may render the results of the NAAL and the claim that 30 million or so adults are functionally illiterate less than critical. In this case, the policymakers seem to be of the same mind as the adults in the Below Basic literacy level themselves. In both groups, adult literacy may seem to be somewhat of a problem, but not one serious enough to require a major increase in funds for adult literacy education.
Who believes America has an adult literacy problem? If most of the adults with Below Basic literacy skills themselves do not think they have much of a literacy problem, why should their governmental representatives or anyone else think so?
Kutner, M., Greenberg, E., Jin, Y., Boyle, B., Hsu, Y., & Dunleavy, E. (2007). Literacy in
Everyday Life: Results From the 2003 National Assessment of Adult Literacy (NCES 2007-480). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
Sticht, T. G. (2001). The International Adult Literacy Survey: How Well Does It Represent the
Literacy of Adults? The Canadian Journal for the Study of Adult Education, 15, 19-36.
Sticht, T. & Armstrong, W. (1994, February). Adult Literacy in the United States: A
Compendium of Quantitative Data and Interpretative Comments. Washington, DC: National Institute for Literacy.
Wedgeworth, R. (2007, April). ProLiteracy Worldwide's President Reacts to the NAAL Comprehensive Report. Retrieved April 9, 2007, from http://www.proliteracy.org/news/index.asp?aid=235
My knee-jerk reaction to this is if you have a low literacy level, perhaps you don't fully understand the kinds of jobs and advancement available to those with higher literacy levels. Or perhaps you are resigned to your current socio-economic position. Or perhaps you are content. What I am trying to say is that there could be dozens of reasons why adults with lower literacy don't believe they are being held back because of it, but that doesn't mean governmental representatives should look at the issue on such a superficial level. Even if an adult is low performing in a single area, that deficit is reflected in career choice and advancement potential.
DB: Here's a link that speaks to this issue.
The Santa Ana Chamber of Commerce has taken it upon themselves to encourage the populace to avail themselves of the opportunities to learn English that will help people improve their employment opportunities.
Most of the populace can get along without learning English because Spanish is spoken and understood by so many, but their job opportunities are limited. An interesting approach by the Chamber of Commerce, and one that deserves emulation.
DR: Two other possible explanations:
1. Perhaps adults thought to have low literacy skills don't feel that they have a problem because they have developed successful workarounds such as:
- A spouse who reads and writes
- A group of friends or buddies who learn and teach each other in an oral culture
- Avoidance of situations (e.g. jobs) where reading, writing or numeracy are required
- Lies or excuses such as "I left my glasses at home"
- Getting information from non-print media such as radio and TV
- Having sharpened their memory skills
These strategies, however, don't always work. Workarounds are sometimes fragile, and when they break down, it can be an emotional challenge or a crisis in a person's life.
2. There is an emotional charge around words like "literate," "low-literate" and of course "illiterate." People who have difficulty reading often do not identify themselves with these words.
First, as I noted, the adults may actually have higher literacy skills than the tests portray due to summing across the three scales. You didn't address this point. Also, adults at the Below Basic level can actually be expected to perform some tasks at the Basic, Intermediate, and Proficient levels, though not with the same probability (.67) with which they can perform typical tasks at their assigned Below Basic level. So people with Below Basic skills can actually be expected to perform some literacy tasks at higher levels. This may influence their judgments of their skills.
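The point about probability can be sketched with a simple Rasch model (an illustration under assumed parameters, not the NAAL's actual scaling). If tasks are mapped to the scale at the point where a respondent has a .67 chance of success, a respondent still has a real chance of completing tasks mapped above his or her level:

```python
import math

def p_success(theta, b):
    """Rasch-model probability that a person of ability theta
    completes a task of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A task is mapped "at" a person's level when P = .67, i.e. when
# theta - b = ln(.67/.33), about 0.71 logits.
offset = math.log(0.67 / 0.33)

theta = 0.0                          # a hypothetical respondent's ability
task_at_level = theta - offset       # a task mapped at this person's level
harder_task = task_at_level + 1.5    # a task mapped well above it

at_level_p = p_success(theta, task_at_level)   # exactly .67 by construction
harder_p = p_success(theta, harder_task)       # smaller, but well above zero
```

Under these assumed numbers the respondent succeeds about two-thirds of the time on at-level tasks but still roughly three times in ten on the clearly harder task, which is the sense in which Below Basic adults "can be expected to perform some literacy tasks at higher levels."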
Second, large percentages of the adults with Below Basic skills are older adults for whom the complex information processing tasks of the NAAL with their heavy emphasis upon overloading working memory, particularly at the higher levels, are not valid indicators of their ability to read and understand novels and informational books or reports drawing upon their general and specific areas of knowledge. The NAAL may not be valid across the lifespan for representing what adults know and can do with their literacy skills.
Third, high levels of employment, over 95%, may lead adults to think they possess literacy skills sufficient for working and getting by on a daily basis. This is supported by the earlier ALL (Adult Literacy and Lifeskills) survey which measured the literacy skills of the workforce-aged population (16-65). That report indicated that 80 percent of the workforce had literacy skills that matched or exceeded the demands for these skills on their jobs. Interestingly, 20 percent were estimated to be working in jobs for which their literacy skills were too low. But still they were working and earning a living.
Whatever the reasons that adults with Below Basic scores on the NAAL may have for thinking that their literacy skills are better than their NAAL level suggests, it is clear that the NAAL has not influenced the Bush administration to ask for increases in the adult education budget, and the Congress has flat-lined funding for several years, too. So there does not seem to be much urgency about adult literacy among federal policymakers.
Also, philanthropic organizations do not appear to have rushed in to make up for the lack of federal efforts to help poorly literate adults achieve well and advance in the workforce or to seek higher education.
Given all this, I don't think that declaring 30, 40, 50 or more million adults to be at Below Basic or just Basic literacy has had much of an impact upon federal policymakers, other funders, or even the general adult population itself. There certainly are not some 30 million adults presenting themselves for instruction in adult education programs each year. And of the 2.7 or so million adults in the Adult Education and Literacy System in program year 2001-02, fewer than one in six said they were there to improve their employment situation, and of these only some 42 percent actually reported improving their employment situation after enrolling in a program.
So, as I said, if adults don't think they have a problem, and if, with high levels of employment, policymakers and others don't think of adult literacy as a national problem, how will the field attract adult learners into the system and acquire sufficient funding to provide adequate education for those who do show up?
Perhaps it is time for new thinking about how to establish the scale of need for adult literacy education.
- They are too ashamed to admit to a stranger that they have difficulty reading (research has indicated that the shame factor associated with difficulty in reading is VERY high).
- They do not know how deficient they are in the specific literacy skills assessed in the NAAL. For example, it is not uncommon for a person at the second-grade reading level to think that [s/he] can get a GED in a few months. This makes sense to me. We often do not know what we do not know until we are deeply into it. For example, I am clueless about how deficient my knowledge of advanced calculus is.
- They are completely proficient in their literacy NEEDS. This also makes sense to me. For example, I am perfectly comfortable not knowing advanced calculus. If someone were to ask me to rank my mathematical proficiency, I might be inclined to rank it very highly because I know what I need to know.
In Alabama, with its rating of third worst (tied with Florida and South Carolina) in the country for Level 1 readers, it's hard to get some people interested in improving their reading, especially in the rural areas. Many people have "made do" with what they had, are proud of it, and, if they have no more responsibilities, are content to live out their lives as they are. There is also the fact that many don't know how much better they could be doing. The coordinator of the welfare-to-work class that was part of our program from 1991 to 2002 took the class (most were single mothers) to one of our shopping malls, and only one or two had ever been there before. I have a young male student right now who is trying to talk me out of his placement in Laubach Book 4, but the more he tries to do the next level's work, the deeper into trouble he gets.
I don't have enough expertise on adult literacy programs to feel entirely comfortable giving an opinion on how those programs should assess student progress. However, it's my impression that if you want to test components of reading (such as fluency or vocabulary) to figure out what aspects of reading a student needs to work on, you may need to move away from authentic materials. That's why the NAAL oral reading fluency assessment included things such as lists of words and pseudo-words, which I don't think anyone would consider to be "authentic" materials. While the goal is for everyone to be able to read, understand, and use authentic materials, a test based only on authentic materials may not provide the diagnostic information you need to help an individual student.
However, as I said, I am far from an expert in this area, so I would love to hear what other people have to say about this.
AW: I've used the NAAL in my literacy program as part of our advocacy efforts. I'd been using the information from the NIFL publication of state and local results of the 1993 NALS in my pitches to the local population to raise funds and recruit volunteers for our tutoring. I read the NAAL reports up to the point where I found the similarity of results in prose literacy between the two studies (actually a loss on the high end), added that chart to my presentation, and informed people that our new study showed no progress made in 10 years. I used some points raised by ProLiteracy in their 2006 literacy status report about lack of funding as one of the primary reasons for the lack of progress (most of our funding is locally raised, except for some grants).
My affiliation with ProLiteracy has helped, since they've taken a pretty proactive approach to the NAAL. In January 2006 they announced that they/we were concerned with both the Below Basic and the Basic category people (after a little more reading, I think we have to look at some of the Intermediate people, too). In the web presentation by the NIFL on the Below Basic and Basic categories, the Harvard professor said the Basic level reads at the seventh-grade level. ProLiteracy has used that information in some of their advocacy statements, and I think I've seen it on one of the other listservs I'm following.
DG: Do you find the NAAL confusing?
I have heard from many people that they wish that the information from the NAAL report were more simply presented.
- People do not necessarily think of literacy in terms of document, prose and quantitative literacy. There is specific confusion between document and prose literacy.
- There is also confusion about what the various levels mean: below basic, basic, proficient, etc. I have heard people say that too much reading needs to be done to figure out how each level translates into levels which are more commonly used.
- Finally, people have shared that it would be helpful to have findings reported both in terms of percentages and numbers.
Because of these three issues, I know that at least some adult literacy practitioners do not relate to the report. It is neither written nor presented in language that is useful to them. Off list, some people shared with me that that is why they didn't get involved in a discussion about the meaning of the report in terms of gender, race, and socioeconomic status. They were having a difficult enough time trying to figure out what the report means in simple English, and could not focus on implications.
AW: Concerning percentages and numbers, I think I found them in the initial NCES documents, but not in the same place. The percentages are available in the 2003 vs. 1993 comparison charts and the numbers are listed in one of the category descriptions. Putting them together for prose literacy (the only one that matters for me), I think we have:
- below basic - 14 percent, 30 million people
- basic - 29 percent, 63 million people
- intermediate - 44 percent, 95 million people
- proficient - 13 percent, 29 million people
Right now, I'm uncertain if I've adequately considered the "not literate" group, but I felt OK with them when I came up with the figures.
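AW's figures can be cross-checked with a few lines of arithmetic. This sketch simply re-totals the four prose literacy categories quoted above (percentages as published, population counts in millions, both rounded in the source):

```python
# Prose literacy distribution from the 2003 NAAL, as quoted above.
# Each entry is (percent of adults, population in millions).
prose_levels = {
    "below basic":  (14, 30),
    "basic":        (29, 63),
    "intermediate": (44, 95),
    "proficient":   (13, 29),
}

# Totals across all four levels
pct_total = sum(pct for pct, _ in prose_levels.values())
millions_total = sum(m for _, m in prose_levels.values())

# Adults at the two lowest levels, the group discussed in the posting above
low_pct = prose_levels["below basic"][0] + prose_levels["basic"][0]
low_millions = prose_levels["below basic"][1] + prose_levels["basic"][1]

print(f"all levels: {pct_total}%, {millions_total} million adults")
print(f"below basic + basic: {low_pct}%, {low_millions} million adults")
```

The percentages do sum to 100, and the counts total about 217 million adults, with roughly 93 million (43 percent) at the Below Basic or Basic level, which is consistent with the figures quoted in the list above.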
JBS: Percentages and numbers from the NAAL data are available on the NAAL web site under Key Findings.
JM: Confusing, you ask? To me the most confusing thing is to figure out which levels are considered adequate for living a good life. Is "basic" good enough? How about "intermediate"? The last survey came back with a figure of "90 million Americans" who have trouble functioning due to literacy constraints--what is the comparable figure for this assessment?
AW: I agree that the NAAL report was a pretty wordy document and difficult to stay with, and I must admit that I haven't tried. I've downloaded and printed every report that's come out, but they're in a growing stack that will go with me on my next vacation. I feel I've gotten what I need from it for my program right now, and a lot of the information is of little use to me other than just to know. I do look forward to further reports and further discussions such as this. It was good to see that others shared my ideas and concerns.
And what has been the outcome? It may be too early to tell for the 2003 NAAL, but we can look at the results from the 1992 NALS. Another angle is evaluating the impact of the reports on building public awareness and recruiting resources. As a community, we need to evaluate whether we have been successful in using the NAAL (or NALS) data as a powerful tool to our benefit...
BM: We used NALS findings to defend literacy programs in federal prisons at a time when other programs (college programs, wellness programs) were being stripped away (with the loss of Pell Grants in the mid-90s, for example). Literacy seemed to be the one program that even the most get-tough-on-crime legislators championed. Perhaps to a fault: it's also the era when many prison literacy programs were mandated (not only in policy, but also in law). I think NALS did have an impact on prison literacy policy, maybe more so than in the community...
JM: In terms of policy makers' use of the report, it seems to me that the "90 million Americans" figure from the 1993 report had a pretty big impact on the health field and has been used to garner support for some important initiatives in the area of health literacy. I think that figure was striking enough that it opened a lot of eyes. I would still like to know what the comparable figure is from the 2003 report.
JBS: [DG and TS] have raised valid questions regarding significance of the NAAL for policymakers and other funding sources. [DG] wrote: "It doesn't appear to me that policy makers use the report all that much-but I could be wrong...but I have never heard a loud appeal for a significant raise in funding for literacy as a result of this report".
[TS] points out that "it is clear that the NAAL has not influenced the Bush administration to ask for increases in the adult education budget, and the Congress has flat-lined funding for several years, too. So there does not seem to be much of an urgency about adult literacy among federal policymakers".
This discussion made me look back at NIFL's approach to the NALS (National Adult Literacy Survey) and share a short history that may help us in considering our approach to the NAAL. Under the leadership of Andy Hartman (NIFL Director, 1994-2000), and through the hard work of Carolyn Staley (NIFL Deputy Director, 1994-2002) and Alice Johnson (Policy Analyst, 1994-2002), in 1996 NIFL initiated and funded a Public Awareness Campaign based on the NALS data. As an important component of the campaign, NIFL published the State of Literacy in America report (prepared by Stephen Reder), which repackaged the NALS data with color-coded maps of literacy levels in every state, county, and municipality. There was a policymaker component, directed at both federal and state policymakers, as well as business and general-public components. The campaign resulted in:
- Increase of federal funding for adult education: The report and the campaign definitely caught the attention of some Congressional offices; NIFL got positive feedback from the Hill, and the campaign actually did help in getting additional federal funding for literacy. In addition, armed with these books and this information, local and state programs were able to call on members of Congress and provide them with statistical information on their constituencies. This made a huge impact in building awareness and support from Congress.
- Increase of corporate support in general for literacy, as awareness increased: Verizon stepped into the literacy field, and Wal-Mart began a campaign for literacy and established a call center, including a CD about reading; financial support for local literacy organizations from local Wal-Mart stores also increased. In addition, at this time Dollar General initiated its literacy campaign, and Faith Hill (the country singer) launched a PSA for literacy.
- Change in legislation: As HHS changed regulations requiring "work first" and curtailing education as counting for work credits (TANF), many states stepped up and included adult education and literacy courses as eligible "work" activities so that low-income adults could continue literacy training as they moved toward work.
- Others: The issue of literacy became much more widely understood and supported across the country. Many literacy organizations reported to NIFL a marked increase in local business and corporate support as a result of the campaign. The professionally created materials provided to local literacy groups, including broadcast-quality TV and radio tapes/CDs and print-ready materials, made it much easier for local and state literacy programs to have PSAs running with their local program information included as well.
The overall lesson learned was that the way you present information to policymakers and funding sources makes a big difference in how it is received, and that folks on the Hill like to have data broken down by state and especially by Congressional district.
So, how do we get a repeat of the activities of 1996 to happen again? Sounds like we need a well-crafted campaign using the 2003 data!
for Adult Literacy Education Should be Forsaken
The National Assessment of Adult Literacy (NAAL) report released in 2006 indicated that almost half the nation's adults were so poorly literate as to be unable to function well in our contemporary society. The problem was even more pronounced for Hispanic and Black Americans, whose literacy skills fell well below those of White adults. All this led some to suggest that our nation's international competitiveness is at risk because of the lack of functionality of our workforce. But is this all true? Some background and additional research information provides a basis for questioning these results and the inferences drawn from the NAAL.
The Young Adult Literacy Survey (YALS) of 1985 was the forerunner to the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), Adult Literacy and Life Skills (ALL) survey, and the National Assessment of Adult Literacy (NAAL) of 2003. The YALS provided the basic methodology for all these assessments in scaling literacy on Prose, Document and Quantitative scales. These assessments all used Item Response Theory (IRT) to scale the literacy abilities of adults and the difficulty levels of the test items. This produces some interesting comparisons.
For instance, on the 1985 YALS Document literacy assessment, 73 percent of the tasks demanded skills at the 300 level or lower, while 57.2 percent of young adults possessed skills at the 300 level (about the middle of Level 3 on the later NALS) or higher. Thus, the Document tasks tended to be skewed toward the easy end of literacy task difficulty. Overall, the average percent correct for Document literacy tasks was 83.3. Whites scored 85.9, Hispanics 77.6, and Blacks 71.8 percent correct, on average, on Document tasks. Thus, using percentage of items correct, all ethnic groups appeared fairly capable, even though there were clear ethnic differences.
However, while 65.4 percent of Whites scored at the 300 skill level or higher, only 37.0 percent of Hispanics and 19.8 percent of Blacks scored at the 300 skill level or higher. Note that, if one focuses on the fact that only one in five Blacks were at the 300 skill level or above on the Document scale, one might infer a very low performance level for Blacks on Document tasks. Yet, overall, Blacks performed over 70 percent of the Document tasks correctly. This apparent contradiction results from the fact that to be at the 300 level of skill requires that people possess an 80% probability of being able to perform tasks that are at that level of difficulty. But people with lower levels of skill have a greater than zero probability of being able to correctly perform 300 level tasks. When the latter are taken into consideration, as in calculating the overall average percent correct, then a much greater percentage of the population may be seen to be able to perform Document tasks across the full range of difficulty levels, from easy to hard, than are able to perform tasks at the 300 level of difficulty or above.
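The gap described above between "percent at or above the 300 level" and "average percent correct" can be illustrated with a toy IRT calculation. Everything in this sketch is hypothetical: the logistic item response curve, the discrimination value, and the skill distribution (normal, mean 270, standard deviation 50) are illustrative assumptions, not the actual YALS or NALS parameters. The only feature taken from the discussion is the convention that a person whose scale score equals an item's difficulty has an 80 percent probability of answering it correctly.

```python
import math

RP = 0.8     # response-probability convention described above (80%)
A = 0.02     # hypothetical discrimination, in scale-score units

def p_correct(theta, difficulty):
    """Logistic item response curve, calibrated so that a person whose
    scale score equals the item difficulty succeeds with probability RP."""
    shift = math.log(RP / (1 - RP))  # ln 4 for RP = 0.8
    return 1.0 / (1.0 + math.exp(-(A * (theta - difficulty) + shift)))

# Hypothetical population skill distribution (illustrative only)
MEAN, SD = 270.0, 50.0

def normal_pdf(x):
    return math.exp(-((x - MEAN) / SD) ** 2 / 2) / (SD * math.sqrt(2 * math.pi))

# Numerical integration over a grid of scale scores (mean +/- 4 SD)
grid = [MEAN - 4 * SD + i for i in range(int(8 * SD) + 1)]
weights = [normal_pdf(t) for t in grid]
total = sum(weights)

# Share of the population at the 300 skill level or above
share_at_300 = sum(w for t, w in zip(grid, weights) if t >= 300) / total

# Average probability of correctly answering a 300-difficulty item,
# counting the partial success chances of people below the 300 level
avg_p = sum(w * p_correct(t, 300) for t, w in zip(grid, weights)) / total

print(f"at/above 300: {share_at_300:.0%}; average % correct on a 300-level item: {avg_p:.0%}")
```

Under these made-up parameters, well under a third of the population sits at the 300 level or above, yet the average probability of answering a 300-difficulty item correctly is roughly two thirds, because people below the cut point still succeed some of the time. That is exactly the mechanism behind the apparent contradiction in the Black and Hispanic Document-scale figures quoted above.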
In the construction of the YALS assessment it was thought that the materials and tasks selected were representative of "real world" tasks that adults would encounter. If that were so, and if Hispanics and Blacks scored over 70 percent correct, then one might think that overall these young adults were fairly capable in their abilities to perform representative "real world" tasks. But the scaling procedure, that is the IRT scaling methodology, provided a different perspective.
How "real world" the tasks were on the YALS, or are on the assessments that came later, is questionable. For instance, on the YALS and the NALS, one Prose literacy item presented the following poem: "The pedigree of honey does not concern the bee--a clover, any time, to him is aristocracy." The task was then to answer the question, "What is the poet trying to express in this poem?"
As it turns out, this was one of the most difficult of all questions, with a difficulty level of 387 placing it in NALS level 5. But the question is, how "real world" is this task for most adults?
My concern about how valid such assessments are for telling us how well people might succeed in the "real world" was aroused when, during the Vietnam War, over 350,000 young men were inducted whose literacy scores averaged below the 6th-grade level. They had all previously been excluded as unfit for service due to low cognitive ability. However, research showed that some 85 percent later performed their jobs well and completed their military service. In one study, even though Black soldiers scored 20 percentile points lower than Whites on the cognitive/literacy tests, they performed about as well as White soldiers on job knowledge and job performance tests.
In some studies Blacks actually outperformed whites.
Later, as a member of the National Commission on Testing and Public Policy, I argued strongly for positions that were included in the recommendations of the Commission, including "#1. Testing policies and practices must be reoriented to promote the development of all human talent. We must reevaluate how we judge the quality of tests, the names we give them, the ways we report results, and the ways we use them. No testing program should be tolerated that classifies people as unable to learn; potentially negative classification in school or the workplace should be accompanied by learning opportunities."
And "#4. The more test scores disproportionately deny opportunities to minorities, the greater the need to show that the tests measure characteristics relevant to the opportunities being allocated."
Clearly, today the results of the NAAL paint a disproportionately negative picture of the literacy skills of Hispanics and Blacks. They are classified as being unable to adequately perform literacy tasks needed for education and work. But even though this denigrates the skills of these minority groups, designating them as inferior in our educational and workforce activities, there have been no major commitments to providing accessible, sustainable, life-relevant educational opportunities for them in the Adult Education and Literacy System (AELS) of the United States.
Indeed, from fiscal year 2000 to fiscal year 2004 there was a decline of 12.8 percent (78,986) for Blacks enrolled in the AELS.
In my view, it is entirely possible that our adult literacy assessments, and their classification of some half of adults, and even more among Hispanics and Blacks, as able to meet only the lowest demands for literacy for learning, working, and earning, may be doing more harm to our international competitiveness than the actual skills of our workforce. How many international corporations will want to work in a country with half the workforce declared unfit for productive work in contemporary times?
Degradation by classification, without adequate opportunities for education, should not be undertaken.
If you have any additional questions about NAAL in the future, please feel free to contact me directly. My e-mail address is firstname.lastname@example.org. If I can't answer your question, I'll try to identify someone who can.
Poverty, Race, Women & Literacy List Facilitator