Full Discussion - Gender, Race, Socioeconomic Status (SES), and Adult Literacy: What Does the National Assessment of Adult Literacy (NAAL) Tell Us? May 21 - 29, 2007



From May 21 - 29, 2007, Elizabeth Greenberg was a guest
facilitator on the Poverty, Race, Women & Literacy listserv.
The focus of her discussion was the findings of the National
Assessment of Adult Literacy (NAAL) as they pertained to
gender, race, and socioeconomic status (SES).

Elizabeth Greenberg is a principal research analyst at the
American Institutes for Research (AIR), and is AIR's Project
Director for the 2008 National Assessment of Adult Literacy
(NAAL) Special Studies contract. She was also AIR's Deputy
Project Director for the 2003 NAAL Design, Analysis, and
Reporting contract. In her role as Deputy Project Director for
the 2003 NAAL, she led the development of the NAAL
background questionnaire and assessment items. She is a lead
author or co-author of several reports based on the 2003 NAAL, including A First Look at the Literacy of America's Adults in the 21st Century, The Health Literacy of America's Adults, Literacy in Everyday Life, Literacy Behind Bars, and the 2003 NAAL Public-Use Data File User's Guide. Elizabeth is also an author or co-author of several reports and articles based upon the 1992 adult literacy data, including English Literacy and Language Minorities in the United States.

Thanks to Ryan Hall, a graduate student at Georgia State University, the following represents a compilation of the various topics discussed by listserv members while Elizabeth Greenberg facilitated the discussion. Each topic contains one or more discussion threads arranged by questions and answers. All of Elizabeth's questions and comments are labeled with her name, while questions and comments from listserv members are labeled with first and last initials. Most of the postings were copied and pasted verbatim, with a few words edited here and there to facilitate reading. For complete postings, along with author information, go to the Poverty, Race, Women & Literacy Archives and look at postings between May 21 - 29, 2007.

1. Introduction

Elizabeth: The 2003 National Assessment of Adult Literacy (NAAL) assessed the English literacy of adults in the United States for the first time since the 1992 National Adult Literacy Survey (NALS). The assessment was administered to more than 19,000 adults (ages 16 and older) in households and prisons. The assessment measured English literacy directly through tasks completed by the participants. These tasks represent a range of literacy activities that adults are likely to face in their daily lives.

Three types of literacy were measured by the assessment:

Prose Literacy. The knowledge and skills needed to search, comprehend, and use information from continuous texts. Prose examples include magazine articles, news stories, and informational brochures.

Document Literacy. The knowledge and skills needed to search, comprehend, and use information from noncontinuous texts, such as tables, graphs, and maps. Document examples include job applications, payroll forms, transportation schedules, maps, and food and drug labels.

Quantitative Literacy. The knowledge and skills needed to identify and perform computations using numbers that are embedded in printed materials. Examples include figuring out a tip, filling out an order form, and comparing costs of employee benefits.

The 2003 NAAL also measured health literacy. The health literacy tasks were organized around three domains of health and health care information and services: clinical, prevention, and navigation of the health system. All the health tasks were also prose, document, or quantitative tasks.

If you are interested, you may want to look at the following 2003 NAAL reports:

A First Look at the Literacy of America's Adults in the 21st Century

The Health Literacy of America's Adults

Literacy in Everyday Life

Literacy Behind Bars

AR: I understand oral reading fluency was measured in this assessment for the first time ever. I am interested in the relationship between oral fluency and the development of cognitive complexity. Were there any items that could be used as markers for cognitive complexity? I'm also interested in fluency and vocabulary development. Is there a way to connect the fluency scores to anything that would reflect vocabulary knowledge?
Elizabeth: With regard to oral reading fluency, it is my understanding that NCES is planning to post results from the 2003 oral reading fluency assessment on the Internet, but those results have not yet been released. I don't think NCES is planning to link oral reading fluency to results on individual NAAL items, just to scores on the scales. However, the oral reading fluency data will be released as part of the restricted-use data file, so a secondary analyst could do additional analyses with that data.


NAAL is planning to assess vocabulary as part of a special study in 2009, but there are not any plans to also measure oral reading fluency as part of that special study. Therefore, I don't think it will be possible to directly link oral reading fluency with vocabulary knowledge.
KMG: Is it possible to see the actual NAAL test that was administered?
HS: You'll find descriptions and sample questions from the 2003 NAAL here


Elizabeth: Some of the older adult literacy items that were originally developed for the 1985 Young Adult Literacy Survey or for the 1992 National Adult Literacy Survey have been released and are available on the NAAL website (nces.ed.gov/naal). NCES generally releases older items that have been used in at least two assessments because they are dated and cannot be used in future assessments. They do not release most of the newer items because they try to use items in at least two assessments in order to be able to measure trend (changes in the population) between assessments. If all the items were new each time, it would not be possible to link results from one assessment to the results from the previous assessment.


JBS: The NAAL Test Questions Tool provides access to 109 questions and answers from the 1985 and 1992 assessments. We recently added a few health-related questions that you can access by entering the term "Health" in the database at http://nces.ed.gov/NAAL/SampleItems.asp?PageId=138. The 2003 NAAL questions are not available on the web. However, people who are interested in seeing them can make an appointment to review them at our office in DC.

2. Some Specific Findings Regarding Gender, Race, and SES

Gender

  • Between 1992 and 2003, women's average document and quantitative literacy scores increased. During the same time period, men's average document literacy score decreased and there was no statistically significant change in average quantitative literacy for men.
  • Between 1992 and 2003, women's average prose literacy score stayed the same, while men's average prose literacy score decreased.
  • In 2003, women had higher average prose and document literacy than men, and men had higher average quantitative literacy than women. In 1992, there was no statistically significant difference between men and women in their average prose literacy, but men had higher average document and quantitative literacy than women.

Race

  • Between 1992 and 2003, average prose, document, and quantitative literacy increased for Black adults.
  • Between 1992 and 2003, average prose and document literacy decreased for Hispanic adults. Average quantitative literacy did not change for Hispanic adults. The percentage of the adult population (age 16 and older) that identified themselves as Hispanic increased from 8 percent in 1992 to 12 percent in 2003.
  • Between 1992 and 2003, average prose literacy increased for Asian/Pacific Islander adults and there was no statistically significant change in average document and quantitative literacy for this group.
  • Between 1992 and 2003, there was no statistically significant change in average prose and document literacy for white adults, but there was an increase in quantitative literacy.

SES

  • Among adults with Below Basic prose literacy, 26 percent lived in households with average incomes of less than $10,000 and only 7 percent lived in households with average incomes of $60,000 or greater. Among adults with Proficient prose literacy, 2 percent lived in households with average incomes of less than $10,000 and 65 percent lived in households with average incomes of $60,000 or greater.
  • Higher percentages of adults with higher literacy levels than adults with lower literacy levels were employed full-time, and lower percentages were out of the labor force. Sixty-four percent of adults with Proficient prose literacy were employed full-time, compared with 29 percent of adults with Below Basic prose literacy. Eighteen percent of adults with Proficient prose literacy were not in the labor force, compared with 57 percent of adults with Below Basic prose literacy.
  • The occupational groups with the highest average prose, document, and quantitative literacy scores were Professional and related and Management, Business, and Financial. The occupational groups with the lowest average prose, document, and quantitative literacy scores were Service; Farming, Fishing, and Forestry; Transportation and Material Moving; Production; and Construction and Extraction.

3. What about Learning Disabilities?

MT: Is there a plan for looking at the performance of students with disabilities?
Elizabeth: The Literacy in Everyday Life report did report results separately for adults with learning disabilities. Because this is a survey/assessment of adults, the NAAL does not have any school data, so we have to rely upon individuals' self-reports. To try to get away from self-diagnosis, the background questionnaire asked participants if they had ever been "diagnosed or identified" as having a learning disability (rather than just asking them if they had a learning disability), and 6 percent of adults said they had. (Note that the NAAL does not have an upper age limit for participation, so there may be older participants who had learning disabilities but were not tested for them at the time they were in school.)


Adults with learning disabilities had lower average prose, document, and quantitative literacy than adults who had never been diagnosed with a learning disability. To me, some of the most striking findings are among adults with Below Basic literacy. Among adults who reported they had learning disabilities, 24 percent had Below Basic prose and document literacy and 38 percent had Below Basic quantitative literacy. Among adults who did not report they had been diagnosed or identified as having a learning disability, 13 percent had Below Basic prose literacy, 12 percent had Below Basic document literacy, and 20 percent had Below Basic quantitative literacy. The overall results and results by level are summarized in figures 2-15 and 2-16 (page 30) of the Literacy in Everyday Life report.

MT: My initial reaction is about systematic use of LD "self-report". It's clear you made the best choices in dealing with this issue, though the choices available to you will not solve the problem.



To focus on the systemic issue rather than on NAAL, the continued use of LD "self-report" is a disservice to people with LD. I am reminded of Ellison's Invisible Man. In this case, people with LD are invisible because they are not definitively a group.



It reminds me of the race politics that Native Americans (as well as other groups) went through. In one decade they are "prisoners of war", in the next decade they are "citizens", then "foreign nationals", then "wards of the government", then "Indians if they have the right blood quantum", then "Indians if they belong to a recognized tribe", the point being if the government can blur the group affiliation, the tribes too become invisible.



To stay with Indians a bit longer, I have seen studies that didn't report on Indians because the sample was too small. While I understand the statistical issue, I worry that this too can be complicit with efforts to obfuscate real issues. While the "n" for people with LD will always be large enough to be included in studies, policy-makers may ignore the findings if the group definition is squishy.
Elizabeth: The learning disabilities question is the only disability question for which detailed literacy results have been reported in any of the NAAL reports. However, the first NAAL report (A First Look at the Literacy of America's Adults in the 21st Century) did examine the percentage of adults in the total NAAL population and in the Below Basic prose literacy level who reported having any of these disabilities. That report is also available on the NAAL website. You want to look at Table 2 on page 5 of this report. As reported in this table, almost half (46 percent) of the Below Basic population reported having one or more disabilities (uncorrected vision, uncorrected hearing, learning disability, or other disability), while 30 percent of the total adult population reported having one or more disabilities.


These questions about disabilities are included on the NAAL public-use data file, which is also available on the NAAL website, so it should be relatively easy for a secondary user to explore the relationship between these questions and literacy in more depth if anyone wants to.

MT: The LD self-report issue could be addressed by asking follow-on questions to see if the respondent has any documentation such as an IEP available about their LD.



It also occurred to me that some of the questions are questions one would find on several of the LD screening tools. I was thinking that one way to get around the "LD self-report" issue next time, if there is a next time, would be to select questions from one or more LD screeners that are consistent indicators of a likely LD, and use them as a group as an indicator of LD. All this would do is increase the probability that the respondent might have an LD; it wouldn't be a proxy measure.



Finally, I have seen some research that indicates that adults are very accurate and honest in assessing their educational levels, so that may hold true as well for self-report about disabilities. I do have some concerns that the stigma of disabilities might affect the willingness of some students to self-report. Has anyone seen research about the self-report of disabilities?
MT: The other idea I had was to do a comparison of scores on the three tests to see if there are any significant discrepancies. The theory here is that people with LD typically have patterns of performance that show unexpectedly low scores in some areas while other scores are in the normal or exceptional ranges.


Finally, the idea that a low test score goes hand-in-hand with LD was concerning. It might be true for a person with LD who has not been able to figure out how to work around an LD (self-accommodate), or who has not gotten exposure to evidence-based approaches like strategy instruction, or who has not learned to compensate by using a technology.



I don't think it is accurate (or helpful) to think of LD students as bottom-tier students, as I am sure you know. The LD self-report issue rears its head here. What we may be seeing in the 6% self-report LD respondents are people who have been unsuccessful in school for several reasons (poor instruction, frequent absence, social promotion, etc.), one of which may have been having a learning disability that was either not identified or not accommodated.




While we've talked about LD a bit, do you have any data on physical or sensory disabilities? I'd hypothesize that blind/low-vision respondents might show a very low score in document literacy even when they could use a screen reader. I haven't looked at the definition of document literacy for a while, but I'm thinking it would include the kinds of documents one encounters in real-life, like forms, which can be problematic for people using screen readers.

Elizabeth: The NAAL background questionnaire asked people about a series of disabilities including vision problems that were not corrected by glasses or contact lenses, hearing problems that were not corrected by hearing aids, learning disabilities, and any other health problems that interfered with activities of daily life (housework, school, etc.). If you want the exact wording of the questions, the household and prison background questionnaires have been posted on the NCES website (toward the bottom of the page on this link): http://nces.ed.gov/NAAL/index.asp?file=DesignDevelop/SInstruments/BackQuestion.asp&PageID=116

4. What about Test Accommodations?

MT: I would also like to hear any perceptions about the impact of test accommodations on the participation of the students who took an accommodated test form, and on validity and reliability.
Elizabeth: Many things that would require accommodations for people with disabilities in other assessments did not require special accommodations in the NAAL because of the way the NAAL was administered: one-on-one in respondents' homes. The assessment was not timed for anyone, although the interviewers were instructed to try to gently move respondents along to the next question or block of questions if they seemed to be struggling and not making any progress. Respondents were told that they could use whatever they normally use when reading and writing (a ruler, magnifying glass, etc.), and since they were in their own homes those things should have been available to them. The only special accommodation that was made was for respondents who were unable to write their responses: they were able to dictate a response to the interviewer, who recorded it for them. There were no special forms for respondents with disabilities. However, respondents whose responses to the first seven questions on the assessment (the core questions) indicated that they were unlikely to be able to answer most of the questions on the remainder of the assessment were given an alternative assessment (the Adult Literacy Supplemental Assessment, or ALSA). All adults with very low literacy were administered the ALSA, not just adults with learning disabilities. I believe that NCES is working on a report based upon the ALSA data, but that report has not been released yet.
MT: I think the way the interviewers went about providing modifications and adjustments for people deserves an award. This was a very humane approach (and practical and beneficial to the study).


Did the interviewers track how many people used a specific modification? This could give some insight about how many people with LD were involved in the study. If 6% self-report LD, but if 20% ask the interviewer if they could dictate responses, it might tease out a more reliable number. It would, of course, have to be disentangled from those who asked for a modification because it was a preference or because it suited them better as a respondent, and those who can't write, and those with disabilities other than LDs that interfere with writing.

Elizabeth: The interviewers were asked to note if they wrote responses for any of the respondents. None of the other "accommodations" (such as use of a magnifying glass or a ruler to read) were considered special accommodations since they were things that are available to people in their everyday life and so they were not tracked. I will see if I can find any information on how many respondents dictated their answers, but I cannot lay my hands on that information right now. My memory is that it was a small number. The interviewers did not volunteer that they could write the responses. They only offered to do it if a respondent said that he or she could not participate in the assessment because s/he could not write a response.

5. What about Health Literacy?

AR: How did so many more people score in the Intermediate level on health literacy than in regular prose literacy? I expected the opposite. Were the scoring mechanisms different?
Elizabeth: With regard to why a higher percentage of people scored in the Intermediate level on the health literacy scale (53 percent) than on the prose literacy scale (44 percent), I do not have an explanation. That's just the way the data came out. Note that the health literacy scale is composed of questions from the prose, document, and quantitative scales. Of the 28 questions on the health literacy scale, 12 are prose items, 12 are document items, and 4 are quantitative items.
DG: More and more people are getting health information from the internet. Here are the stats from the 2003 NAAL report. You can find these at: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2006483 (page 35)




What percentage of people with various levels of literacy proficiency get some health information from the internet?

Below Basic Literacy Skills ---- 19% get health info from internet

Basic Literacy Skills ---- 42% get health info from internet

Intermediate Literacy Skills ---- 67% get health info from internet

Proficient Literacy Skills ----- 85% get health info from internet

Does this surprise you? Do you have any questions about this for our guest facilitator, Elizabeth?

Elizabeth: I think [this] topic is very important. More and more health information is available on the Internet, but the people with the lowest literacy -- who, on average, report relatively poor levels of health -- are the least likely to get information from the Internet. Do people have any ideas on how to address this?
JM: I thought it was interesting that, while those with the lowest literacy levels did get less information from the internet, they seemed to get less health information from all sources as compared to those with higher levels of literacy (even non-print sources like friends and family: see Fig. 3-7). It seems as if they simply do not seek out health info as much. Any ideas why this is? I think it is a slippery slope in today's environment to trust all the information we get from the internet, so in a way I don't think it is necessarily bad that those with low literacy levels do not spend a lot of time surfing for health info. How to encourage it? I think there needs to be guidance in how to evaluate health (and indeed all) information from the internet. World Education has been working a lot with using a collaboration between adult literacy programs and health agencies as a way to help improve health literacy. For example, when a class of literacy students explore a health issue online with the support of their teacher and classmates, they can learn to evaluate the information, find appropriate sites, and process the health information that they find. I am a big supporter of this method.



Here are two resources that World Education has developed that encourage this:
The Health & Literacy Special Collection
Family Health and Literacy
AM: Health literacy seems to be one area where limited literacy skills translate directly into limits on people's ability to perform tasks: limited literacy skills appear to have an impact on people's ability to carry out health-related tasks.
HVS: As part of the College of DuPage's Library Learning Network, a "Library Challenges & Opportunities" teleconference on March 23 featured guest Lucy Hansen.
http://www.dupagepress.com/COD/index.php?id=1250


Ms. Hansen is the Lead Librarian in the South Texas Independent School District and Biblioteca Las Américas in Mercedes, Texas. Among other programs, her library sponsors the ¡VIVA! Peer Tutor Program, a student-centered project to improve community health literacy by using MedlinePlus. http://bla.stisd.net/viva.htm



Available at least partially in Spanish, Medline Plus is a service of the U.S. National Library of Medicine and the National Institutes of Health, bringing together authoritative information such as medical journal articles, extensive information about drugs, an illustrated medical encyclopedia, interactive patient tutorials, and latest health news. http://medlineplus.gov/




Although the ¡VIVA! Peer Tutor Program is targeted at high school students in this border community, the community outreach and peer tutoring aspects of this award-winning program would lend themselves to partnership with adult literacy programs.

AM: Here is a brief summary that I made of the health NAAL.




National Assessment of Adult Literacy (NAAL) Health Literacy of America's Adults




Four levels:

Below Basic: indicates no more than the most basic and concrete literacy skills.

Basic: indicates skills to perform simple and everyday literacy activities.

Intermediate: indicates skills to perform moderately challenging literacy activities.

Proficient: indicates skills to perform more challenging and complex literacy activities.



I equate Below Basic to the reading and math skills of someone in the third grade. Basic would be someone with reading and math skills below seventh grade. Intermediate would be someone with the skills of a high school student. Proficient would define the skills of a high school graduate. Note that this is not a scientific comparison but my own estimation based on my years in the education field.


Result of the health NAAL:

Percentage of adults in each health literacy level, overall, by gender, and for adults over 65 years of age:

                 All adults   Males   Females   Adults over 65
Below Basic          14%       16%      12%          29%
Basic                22%       22%      21%          30%
Intermediate         53%       51%      55%          38%
Proficient           12%       11%      12%           3%




Note that women scored higher than men. This is not surprising, since women are usually the health providers for the entire family.


Also note that close to 60% of seniors have very limited understanding of health related print. This limitation may be related to limited vision. So, seniors may require extensive verbal support. Also, many seniors have hearing loss. This would require that health providers speak slowly, clearly and in a loud voice. Finally, seniors have declining memory and decreasing cognitive skills. Considering that the average time that doctors spend with patients nationally is seven minutes, it is doubtful that seniors get the needed support.



Percentage of White, Black, and Hispanic adults in each health literacy level:

                 Whites   Blacks   Hispanics
Below Basic         9%      24%       41%
Basic              19%      34%       25%
Intermediate       58%      41%       31%
Proficient         14%       2%        4%




Note the very high numbers of Hispanics below basic. This is likely because many are not native English speakers.



Based on this assessment, one third to one half of all adults do not understand written information related to health well or at all. There is a third that understands the information better. Only 14% of all adults can understand health-related information well.



For more info go to: http://nces.ed.gov/naal/

6. Interpreting the Findings

Elizabeth: One of the first things I noticed in the 2003 NAAL results was the increase in literacy among Black adults between 1992 and 2003. A lot of people have focused on the decrease in literacy among Hispanics (much of which is probably the result of demographic changes), but I haven't seen much discussion anywhere about the increase in literacy among Blacks. I was hoping that this group might have some ideas as to why the literacy increase occurred among Blacks. I have my own ideas about the cause of this (I think that increasing educational opportunities for Black children probably played a role), but I'm curious about what other people think may have caused this change and whether you think this trend is likely to continue.
JI: These latest threads about race and learning and health/health literacy also make me wonder about the broader implications of the report.



While reports tend to produce "just the facts," the questions they seek to address and the ways in which they address them are never neutral. How/can/does anyone consider a broader analysis of, say, the instructional implications, if any, of this report?



I also wonder what measures may have been considered to ensure that there were no biases against minority groups in the items used. For example, have there been any analyses to see if there are gender differences in how people respond to items?



For my own part, I'm curious about how information gleaned from the internet is understood. Who has access to online technology? Anyone who has explored a health-related topic online has probably experienced the overwhelming number of hits and the difficulty of making sense of them. How/does education and access to language help or hinder, and who are the people who can reasonably expect to have questions answered via online resources? How/do we learn how to figure these questions out, and, within/around them, look at how race and class play a part in it all?
Elizabeth: The U.S. Department of Education reports definitely produce "just the facts," so it is up to groups such as this one to put some interpretation and explanation around those facts.




You asked about measures to ensure that there are no biases against minority groups or women/men in the items used. The NAAL staff was very concerned about this issue. As the items were developed, outside reviewers with a broad range of backgrounds (including reviewers who were personally members of minority groups as well as reviewers who professionally worked with diverse populations) were brought in to review the items and also to suggest sources for new items. All the NAAL items are based upon "real" reading materials (none of them were written just for this assessment) and the reading materials come from a wide variety of sources, including magazines and publications whose target audience is minority groups.




After the items were field tested, the NAAL technical staff looked at each item for differential item functioning (DIF). The object of DIF analysis is to identify any items on which members of a target group perform worse (or better) than their performance on the other items would suggest they should do. DIF analysis was done for racial/ethnic groups (Black, Hispanic), gender, and age (60+). Based upon the DIF analysis of the field test data, a few items were identified as unfairly advantaging or disadvantaging one or more of those groups and those items were eliminated from the final item pool.
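For readers curious about the mechanics behind such a screen, the following is a minimal sketch of one widely used DIF method, the Mantel-Haenszel procedure, written in Python. This is not the NAAL's actual analysis; the toy data, the group labels, and the flagging threshold mentioned in the comments are illustrative assumptions only.

# Minimal Mantel-Haenszel DIF sketch (illustrative only; not the NAAL's actual procedure).
# Examinees are matched on total score; within each score stratum we compare how often
# the focal group and the reference group answer the studied item correctly.
from collections import defaultdict
import math

def mantel_haenszel_dif(item_correct, total_scores, groups, focal="focal"):
    """Return the Mantel-Haenszel common odds ratio and the ETS delta for one item.

    item_correct : 0/1 responses to the studied item
    total_scores : total test scores, used as the matching strata
    groups       : "focal" or "reference" label for each examinee
    """
    # Build a 2x2 table (group x incorrect/correct) within each total-score stratum.
    strata = defaultdict(lambda: [[0, 0], [0, 0]])  # strata[score][group_row][correct]
    for y, score, g in zip(item_correct, total_scores, groups):
        row = 0 if g == focal else 1
        strata[score][row][y] += 1

    num = den = 0.0
    for (f_wrong, f_right), (r_wrong, r_right) in strata.values():
        n = f_wrong + f_right + r_wrong + r_right
        num += r_right * f_wrong / n  # reference correct * focal incorrect
        den += r_wrong * f_right / n  # reference incorrect * focal correct
    alpha = num / den if den else float("inf")  # common odds ratio across strata
    delta = -2.35 * math.log(alpha)             # ETS delta scale; a large |delta| flags the item for review
    return alpha, delta

# Toy usage with made-up responses: a delta far from zero suggests the item behaves
# differently for the two groups even after matching on overall ability.
item = [1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0]
total = [4, 4, 4, 4, 4, 4, 4, 2, 2, 2, 2, 2, 2]
grp = ["focal"] * 3 + ["reference"] * 4 + ["focal"] * 3 + ["reference"] * 3
print(mantel_haenszel_dif(item, total, grp))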

JI: Seems the next question would be around how the groups surveyed responded to the results of the survey (something along the lines of back-translation, where one translator works on a text from language A to language B, and then another person translates that language B text back to see how closely it matches the original). In other words, how do people make sense of the data about them? How/do they have access to it?

7. Perceptions

TS: The following note raises questions about the validity of the NAAL performance assessments in relation to adults' self-perceptions of their literacy abilities. It also raises questions about the use of the NAAL in policymaking. It may be of interest to the ongoing discussion of the NAAL.




April 9, 2007

Who Believes America Has An Adult Literacy Problem?

Tom Sticht

International Consultant in Adult Education




"The NAAL, the first assessment of adult English reading and writing ability in the U.S. since 1992, estimated that 30 million people over age 16 are barely able to read and write."
Robert Wedgeworth, president and CEO of ProLiteracy Worldwide



This statement by the head of the largest organization of adult basic education and literacy in the world occurs in the midst of a longer message calling for increased funding for the Adult Education and Literacy System (AELS) of the U.S. so it can serve more than the 3 million or so adults that it presently serves. Yet in the past, when similar pleas have been made for increases in adult literacy funding, in the wake of surveys showing 30 to 40 million adults with low literacy skills, the President and the Congress have responded with either no increase or only very small increases in adult literacy education resources.



Why is this so? Is it possible that neither government officials nor anyone else, for that matter, actually believes what the National Assessment of Adult Literacy (NAAL) or its predecessor, the National Adult Literacy Survey (NALS), reports about the literacy skills of America's adults?



The latest report on adult literacy in the United States, entitled Literacy in Everyday Life (LEL) (Kutner et al., 2007), once again reports the litany of social problems that have been shown to be correlated with low cognitive skills, including low literacy, for almost a century. Adults with lower literacy skills tend to be in the lower socioeconomic classes, to be members of minority ethnic groups, and to not be native English speakers; they are more likely to be unemployed or to hold low-wage jobs, less likely to vote or participate in other civic and community activities, less likely to read to their children a lot, and more likely to be on welfare or other forms of public assistance (see Sticht & Armstrong, 1994, for an historical review of adult literacy assessments from 1917 to the present).




The LEL report presents data from the 2003 National Assessment of Adult Literacy (NAAL), which measured literacy using three literacy scales (Prose, Document, and Quantitative) and reported results in four major categories of skills: Below Basic (the lowest level), Basic, Intermediate, and Proficient. Another group, adults who were not proficient enough in English or Spanish to take the test, made up about 2 percent (4 million) of adults. Some 14 percent of adults were in the Below Basic category for Prose literacy and 12 percent for the Document literacy scale. The Quantitative scale had about 22 percent in the Below Basic category.




But does being in the lowest level of literacy, the Below Basic level, indicate that these adults can "barely read and write"? According to the LEL report, being placed in the Below Basic level "indicates no more than the most simple and concrete literacy skills. Adults at the Below Basic level range from being nonliterate in English to having the abilities listed below:

  • locating easily identifiable information in short, commonplace prose texts [Prose scale]
  • locating easily identifiable information and following written instructions in simple documents (e.g., charts or forms) [Document scale]
  • locating numbers and using them to perform simple quantitative operations (primarily addition) when the mathematical information is very concrete and familiar." [Quantitative scale]

The foregoing indicates that adults in the Below Basic level of literacy may, in fact, be able to read and write with some real degree of skill above the ability to "barely read and write." One suggestion that adults in the Below Basic literacy level have some degree of functional literacy was the finding that only around a third (34-35 percent) of adults with Below Basic Prose and Document skills thought their reading skills limited their job opportunities "a lot." Another third (33-35 percent) thought that their reading skills limited them "some" or "a little," and a final third (32-33 percent) of these adults reported that their reading skills limited their job opportunities "not at all." So two-thirds of adults categorized as possessing Below Basic literacy skills did not seem to perceive themselves as very limited in their job opportunities due to their poor reading skills.

Why don't these adults in the Below Basic category of Prose and Document literacy, the lowest level, perceive themselves as limited in their job opportunities? There is no information explicitly given in the LEL report about this. However, there are some data in the report that may be relevant to addressing this issue.

The LEL report states that the correlations among the Prose, Document, and Quantitative scales were between +.86 and +.89, out of a perfect correlation of +1.00. This indicates that all three scales placed people in roughly the same rank order. This means that those who scored poorly on the Prose scale were likely to be low on both the Document and Quantitative scales, those in the middle range of the Prose scale would be in the middle of the ranges of the Document and Quantitative scales, and those with the higher Prose scores would also tend to have the higher Document and Quantitative scores.

This suggests that in estimating people's literacy skills we should consider the sum of the Prose, Document, and Quantitative skills for any given person. Presumably all adults have some of each sort of literacy, yet when the skills are discussed, they are discussed separately, scale by scale. Persons at the Below Basic level on Prose literacy presumably also have Below Basic skills on the Document and Quantitative scales. Without adding up skills across the three scales, however, it is not certain how to characterize these people in terms of what they can actually do in the real world of literacy. They would seem to have some ability in all three domains, but exactly how these abilities add together to form an overall estimate of what Below Basic adults can do with their Prose, Document, and Quantitative literacy combined is not clear.

It may be that the combined literacy skills across the three NAAL scales render people more capable at getting, keeping, and progressing in a job than a discussion of just one scale would suggest. In turn, this might influence people's judgments about how little or how much their literacy skills limit their job opportunities.

For policymakers in federal or state governments, low unemployment rates, below 5 percent, may render the results of the NAAL and the claim that 30 million or so adults are functionally illiterate less than critical. In this case, the policymakers seem to be of the same mind as the adults in the Below Basic literacy level themselves. In both groups, adult literacy may seem to be somewhat of a problem, but not one serious enough to require a major increase in funds for adult literacy education.

Who believes America has an adult literacy problem? If most of the adults with Below Basic literacy skills themselves do not think they have much of a literacy problem, why should their governmental representatives or anyone else think so?

References

Kutner, M., Greenberg, E., Jin, Y., Boyle, B., Hsu, Y., & Dunleavy, E. (2007). Literacy in Everyday Life: Results From the 2003 National Assessment of Adult Literacy (NCES 2007-480). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Sticht, T. G. (2001). The International Adult Literacy Survey: How Well Does It Represent the
Literacy of Adults? The Canadian Journal for the Study of Adult Education, 15, 19-36.

Sticht, T. & Armstrong, W. (1994, February). Adult Literacy in the United States: A
Compendium of Quantitative Data and Interpretative Comments. Washington, DC: National Institute for Literacy.

Wedgeworth, R. (2007, April). ProLiteracy Worldwide's President Reacts to the NAAL Comprehensive Report. Downloaded April 9, 2007 at http://www.proliteracy.org/news/index.asp?aid=235

KMG: "If most of the adults with Below Basic literacy skills themselves do not think they have much of a literacy problem, why should their governmental representatives or anyone else think so?"



My knee-jerk reaction to this is if you have a low literacy level, perhaps you don't fully understand the kinds of jobs and advancement available to those with higher literacy levels. Or perhaps you are resigned to your current socio-economic position. Or perhaps you are content. What I am trying to say is that there could be dozens of reasons why adults with lower literacy don't believe they are being held back because of it, but that doesn't mean governmental representatives should look at the issue on such a superficial level. Even if an adult is low performing in a single area, that deficit is reflected in career choice and advancement potential.


DB: Here's a link that speaks to this issue.

The Santa Ana Chamber of Commerce has taken it upon themselves to encourage the populace to avail themselves of the opportunities to learn English that will help people improve their employment opportunities.



Most of the populace can get along without learning English because Spanish is spoken and understood by so many, but their job opportunities are limited. An interesting approach by the Chamber of Commerce, and one that deserves emulation.



DR: Two other possible explanations:

1. Perhaps adults thought to have low literacy skills don't feel that they have a problem because they have developed successful workarounds such as:

  • A spouse who reads and writes
  • A group of friends or buddies who learn and teach each other in an oral culture
  • Avoidance of situations (e.g. jobs) where reading, writing or numeracy are required
  • Lies or excuses such as "I left my glasses at home"
  • Getting information from non-print media such as radio and TV
  • Having sharpened their memory skills

These strategies, however, don't always work. Workarounds are sometimes fragile, and when they break down, it can be an emotional challenge or a crisis in a person's life.



2. There is an emotional charge around words like "literate," "low-literate" and of course "illiterate." People who have difficulty reading often do not identify themselves with these words.

TS: While the reasons you [all] give for why adults may judge their literacy skills as higher than their assigned test scores would suggest are all possibilities, there are some other things to consider, too.




First, as I noted, the adults may actually have higher literacy skills than the tests portray due to summing across the three scales. You didn't address this point. Also, adults at the Below Basic level can actually be expected to perform some tasks at the Basic, Intermediate, and Proficient levels, just not with the same degree of probability (.67) with which they can perform average tasks at their assigned Below Basic level. So people with Below Basic skills can actually be expected to perform some literacy tasks at higher levels. This may influence their judgments of their skills.



Second, large percentages of the adults with Below Basic skills are older adults, for whom the complex information processing tasks of the NAAL, with their heavy emphasis upon overloading working memory, particularly at the higher levels, are not valid indicators of their ability to read and understand novels and informational books or reports drawing upon their general and specific areas of knowledge. The NAAL may not be valid across the lifespan for representing what adults know and can do with their literacy skills.



Third, high levels of employment, over 95%, may lead adults to think they possess literacy skills sufficient for working and getting by on a daily basis. This is supported by the earlier ALL (Adult Literacy and Lifeskills) survey which measured the literacy skills of the workforce-aged population (16-65). That report indicated that 80 percent of the workforce had literacy skills that matched or exceeded the demands for these skills on their jobs. Interestingly, 20 percent were estimated to be working in jobs for which their literacy skills were too low. But still they were working and earning a living.



Whatever the reasons that adults with Below Basic scores on the NAAL may have for thinking that their literacy skills are better than suggested by their NAAL level, it is clear that the NAAL has not influenced the Bush administration to ask for increases in the adult education budget, and the Congress has flat-lined funding for several years, too. So there does not seem to be much of an urgency about adult literacy among federal policymakers.



Also, philanthropic organizations do not appear to have rushed in to make up for the lack of federal efforts to help poorly literate adults achieve well and advance in the workforce or to seek higher education.



Given all this, I don't think that declaring 30, 40, 50 or more million adults to be at Below Basic or just Basic literacy has had much of an impact upon federal policymakers, other funders, or even the general adult population itself. There certainly are not some 30 million adults presenting themselves for instruction in adult education programs each year. And of the 2.7 or so million adults in the Adult Education and Literacy System in program year 2001-02, fewer than one in six said they were there to improve their employment situation, and of these only some 42 percent actually reported improving their employment situation after enrolling in a program.



So like I said, if adults don't think they have a problem, and with high levels of employment policymakers and others don't think of adult literacy as a national problem, how will the field attract adult learners into the system and acquire sufficient funding to provide adequate education for those who do show up?



Perhaps it is time for new thinking about how to establish the scale of need for adult literacy education.

DG: It is interesting that adults with low literacy skills do not report themselves as having low literacy skills, even though their test scores indicate that they do. I have three hypotheses to explain this:
  1. They are too ashamed to admit to a stranger that they have difficulty reading (research has indicated that the shame factor associated with difficulty in reading is VERY high).
  2. They do not know how deficient they are in the specific literacy skills assessed in the NAAL. For example, it is not uncommon for a person on the second grade reading level to think that [s/he] can get a GED in a few months. This makes sense to me. We often do not know what we do not know until we are deeply into it. For example, I am clueless at how deficient my knowledge is of advanced calculus.
  3. They are completely proficient in their literacy NEEDS. This also makes sense to me. For example, I am perfectly comfortable not knowing advanced calculus. If someone were to ask me to rank my mathematical proficiency, I might be inclined to rank it very highly because I know what I need to know.
AW: Concerning the learners' self-estimates of their reading skills, I've never met one who underestimated his/her reading ability. Often, the realization that they aren't as good as they thought leads to withdrawal and in some cases animosity toward the tutor. Like DG, I've had learners who were two or three lessons into Laubach levels two or three ask me how many months until they could get their GED. My stock answer is that it's too soon to tell, so we'll set a goal for finishing this level and see. Unfortunately, I've never had [one] who would or could stick it out for that long a haul.



In Alabama, with its rating of third worst (tied with Florida and South Carolina) in the country for Level one readers, it's hard to get some people interested in improving their reading, especially in the rural areas. Many people have "made do" with what they had, are proud of it, and if they have no more responsibilities, are content to live out their lives as they are. There is also the fact that many don't know how much better they could be doing. The coordinator of the welfare-to-work class that was part of our program from 1991 to 2002 took the class (most were single mothers) to one of our shopping malls, and only one or two had ever been there before. I have a young male student right now who is trying to talk me out of his placement in Laubach Book four, but the more he tries to do the next level's work, the deeper into trouble he gets.

8. Implications/Uses of the NAAL

DG: Do you find the NAAL report findings useful/helpful to the adult literacy field?
KMG: I find data driven reports like this useful in terms of numbers of needs and a wide-lens view. However, the numbers without the faces (as we see in more qualitative research) do not often translate directly into the adult education classroom via practice tips and recommendations. It is difficult to look at a report like this and say, "Okay. Now I can see more clearly how I can help my students" (other than avoiding having them look up health information on the Internet, which can be confusing for users at any literacy level). Can someone help put it into the classroom perspective?
DG: You ask an interesting question. Elizabeth, correct me if I am wrong, but doesn't the NAAL use items that are from authentic sources? If that is the case, I find that very interesting. Currently, adult literacy programs use tests like the TABE, which are very inauthentic, in order to assess the progress of students. However, the country's literacy levels are being assessed on authentic materials. Is this a mismatch? Should adult literacy programs be using tasks similar to those used on the NAAL to assess progress? Or is the assumption that if one improves on a test like the TABE, one will improve on NAAL-like tests? And which kind of test is less susceptible to biases around gender, race, and socioeconomic factors: the TABE or the NAAL?
Elizabeth: Those are good questions. All the NAAL prose, document, quantitative, and health items are from authentic sources and the NAAL questions try to simulate the types of activities adults would do with those items, such as reading a medicine label to figure out when to next take the medicine. The NAAL oral reading fluency assessment, which was administered in 2003, was based on lists of words and pseudo-words, as well as passages from various sources, most of which were shortened and some of which were edited to make them easier to read aloud.


I don't have enough expertise on adult literacy programs to feel entirely comfortable giving an opinion on how those programs should assess student progress. However, it's my impression that if you want to test components of reading (such as fluency or vocabulary) to figure out what aspects of reading a student needs to work on, you may need to move away from authentic materials. That's why the NAAL oral reading fluency assessment included things such as lists of words and pseudo-words, which I don't think anyone would consider to be "authentic" materials. While the goal is for everyone to be able to read, understand, and use authentic materials, a test based only on authentic materials may not provide the diagnostic information you need to help an individual student.


However, as I said, I am far from an expert in this area, so I would love to hear what other people have to say about this.

JM: ...I think it can be helpful to the field by helping us to know where to focus our efforts. For example, the data shows that Hispanics have significantly lower health literacy skills than anyone else. (I am curious why this is because Spanish is the most common form of non-English health information available, but I suppose it may have to do with literacy skills in general. I'd love to hear thoughts on this!) This is a flag for us to do more work with the Hispanic population in regard to health literacy.


AW: I've used the NAAL in my literacy program as part of our advocacy efforts. I'd been using the information from the NIFL publication of state and local results of the 1993 NALS in my pitches to the local population to raise funds and recruit volunteers for our tutoring. I read the NAAL reports to the point that I found the similarity of results in prose literacy between the two studies (actually a loss on the high end), added that chart to my presentation, and informed my audiences that our new study showed no progress made in 10 years. I used some points raised by ProLiteracy in their 2006 literacy status report about lack of funding as one of the primary reasons for the lack of progress (most of our funding is locally raised, except for some grants).



My affiliation with ProLiteracy has helped since they've taken a pretty proactive approach to the NAAL. In January, 2006 they announced that they/we were concerned with both the below basic and the basic category people (After a little more reading, I think we have to look at some of the intermediate people, too). In the web presentation by the NIFL on the below basic and basic categories, the Harvard professor said the basic level reads at the seventh grade level. ProLiteracy has used that information in some of their advocacy statements and I think I've seen it on one of the other listservs I'm following.

DG: Do you find the NAAL confusing?

I have heard from many people that they wish that the information from the NAAL report were more simply presented.

  1. People do not necessarily think of literacy in terms of document, prose and quantitative literacy. There is specific confusion between document and prose literacy.
  2. There is also confusion about what the various levels mean: below basic, basic, proficient, etc. I have heard people say that too much reading needs to be done to figure out how each level translates into levels which are more commonly used.
  3. Finally, people have shared that it would be helpful to have findings reported both in terms of percentages and numbers.

Because of these three issues, I know that at least some adult literacy practitioners do not relate to the report. It is neither written nor presented in language that is useful to them. Off list, some people shared with me that that is why they didn't get involved with a discussion about the meaning of the report in terms of gender, race, and socioeconomic status. They were having a difficult enough time trying to figure out what the report means in simple English, and could not focus on implications.

AW: Concerning percentages and numbers, I think I found them in the initial NCES documents, but not in the same place. The percentages are available in the 2003 vs. 1993 comparison charts and the numbers are listed in one of the category descriptions. Putting them together for prose literacy (the only one that matters for me), I think we have:

  • below basic - 14 percent, 30 million people
  • basic - 29 percent, 63 million people
  • intermediate - 44 percent, 95 million people
  • proficient - 13 percent, 29 million people

Right now, I'm uncertain if I've adequately considered the "not literate" group, but I felt OK with them when I came up with the figures.
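Putting the percentages and head counts side by side is simple arithmetic. The short Python sketch below does the conversion, assuming a total adult household population of roughly 216 million; that population figure is an illustrative assumption consistent with the counts above, not a number taken directly from the NAAL reports.

# Minimal sketch: turn prose-literacy level percentages into rough head counts.
# The ~216 million adult population is an illustrative assumption, not a figure
# quoted from the NAAL reports; the percentages are the ones listed above.
ADULT_POPULATION = 216_000_000

prose_levels = {
    "Below Basic": 0.14,
    "Basic": 0.29,
    "Intermediate": 0.44,
    "Proficient": 0.13,
}

for level, share in prose_levels.items():
    count_millions = share * ADULT_POPULATION / 1e6
    print(f"{level:<13} {share:>4.0%}  ~{count_millions:.0f} million adults")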


JBS: Percentages and numbers of the NAAL data are available on the NAAL website under Key Findings at: http://www.nces.ed.gov/NAAL/index.asp?file=KeyFindings/Demographics/Overall.asp&PgeId=16



JM: Confusing, you ask? To me the most confusing thing is to figure out which levels are considered adequate for living a good life. Is "basic" good enough? How about "intermediate"? The last survey came back with a figure of "90 million Americans" (who have trouble functioning due to literacy constraints). What is the comparable figure for this assessment?



AW: I agree that the NAAL report was a pretty wordy document and difficult to stay with and I must admit that I haven't tried. I've downloaded and printed every report that's come out, but it's in a growing stack that will go with me on my next vacation. I feel I've gotten what I need from it for my program right now and that a lot of information is of little need to me other than just to know. I do look forward to further reports and further discussions such as this. It was good to see that others shared my ideas and concerns.

DG: Researchers often use the information in the report as justification for receiving money to conduct research in the field of adult literacy. They often use the information in the introduction of their research articles. IES is going to be conducting a training seminar this August for researchers who want access to the NAAL data for further analysis purposes.
JBS: Your questions are excellent. We actually need to have a better understanding of the real impact of the 2003 NAAL (National Assessment of Adult Literacy) and the 1992 NALS (National Adult Literacy Survey) on the field from different angles. We need to know how the literacy field is using these reports and in what ways they find them effective. For example, as you mentioned, we need to know the policy implications of these reports and whether they have resulted in effective policies and systematic increases of funds and resources at the local, state, and national levels, and if so, to what extent. Also, from the angle of research, it is important to know whether these reports are encouraging more in-depth analysis of the issues that are of concern to the field. How much are researchers relying on the data from both assessments in their work?



And what has been the outcome? It may be too early to tell for the 2003 NAAL, but we can look at the results from the 1992 NALS. Another angle is evaluating the impact of the reports on building public awareness and recruiting resources. As a community, we need to evaluate whether we have been successful in using the NAAL (or NALS) data as a powerful tool to our benefit...
DG: I have never heard of teachers using the report to make classroom-based decisions. However, I have heard of administrators using the state/county/district report of 1993 (and I know that a report of the 2003 data will be out in the future) as a way to try to get funding from corporations for local literacy efforts. Are teachers/administrators using the report to make decisions about their classrooms?
AW: I haven't seen a use for the NAAL in my tutoring, but I hope that further insights could provide some information that would be of use. Recently, I used it to tell our adult learners that their reading problems don't make them unique, because they have a lot of company (30 or 63 million more), but that they are in a small group in that they're getting help.
DG: Do you think that the NAAL report can have useful policy implications? Are policy makers using the report to make decisions regarding allocation of resources? It doesn't appear to me that policy makers use the report all that much, but I could be wrong. It would seem to me that the data speak for themselves. The NAAL clearly shows, decade after decade, that we have significant literacy issues in our country, but I have never heard a loud appeal for a significant raise in funding for literacy as a result of this report. And if there is a loud appeal that I have missed, for some reason it has not succeeded in getting government to care enough to adequately fund literacy efforts.
AW: I'm pretty convinced that policymakers won't use the report unless the people they represent (and who put them there) make an issue of it. Again, ProLiteracy has provided some excellent guidance on this, but I believe any further discussion on this would need to be done on a different list. I think I'll bring it up if someone doesn't because it needs to be pursued.


BM: We used NALS findings to defend literacy programs in federal prisons at a time when other programs (college programs, wellness programs) were being stripped away (with the loss of Pell Grants in the mid-90's, for example). Literacy seemed to be the one program that even the most get-tough-on-crime legislators championed. Perhaps to a fault: it's also the era when many prison literacy programs were mandated (not only in policy, but also in law). I think NALS did have an impact on prison literacy policy, maybe more so than in the community...



JM: In terms of policy makers' use of the report, it seems to me that the "90 million Americans" figure from the 1993 report had a pretty big impact on the health field and has been used to garner support for some important initiatives in the area of health literacy. I think that figure was striking enough that it opened a lot of eyes. I would still like to know what the comparable figure is from the 2003 report.


JBS: [DG and TS] have raised valid questions regarding the significance of the NAAL for policymakers and other funding sources. [DG] wrote: "It doesn't appear to me that policy makers use the report all that much, but I could be wrong...but I have never heard a loud appeal for a significant raise in funding for literacy as a result of this report".


[TS] points out that "it is clear that the NAAL has not influenced the Bush administration to ask for increases in the adult education budget, and the Congress has flat-lined funding for several years, too. So there does not seem to be much of an urgency about adult literacy among federal policymakers".



This discussion made me look back at NIFL's approach to the NALS (National Adult Literacy Survey) and share a short history that may help us in considering our approach to the NAAL. Under the leadership of Andy Hartman (NIFL Director, 1994-2000), and through the hard work of Carolyn Staley (NIFL Deputy Director, 1994-2002) and Alice Johnson (Policy Analyst, 1994-2002), NIFL initiated and funded a Public Awareness Campaign in 1996 that was based on the NALS data. As an important component of the campaign, NIFL published the State of Literacy in America report (prepared by Stephen Reder), which repackaged the NALS data with color-coded maps of literacy levels in every state, county, and municipality. The campaign had a policymaker component, directed at both federal and state policymakers, as well as business and general public components. The campaign resulted in:

  1. Increased federal funding for adult education: The report and the campaign definitely caught the attention of some Congressional offices; NIFL received positive feedback from the Hill, and the campaign helped secure additional federal funding for literacy. In addition, armed with these books and this information, local and state programs were able to call on members of Congress and provide them with statistical information on their constituencies. This made a huge impact in building awareness and support in Congress.
  2. Increased corporate support for literacy as awareness grew: Verizon stepped into the literacy field; Wal-Mart began a literacy campaign, established a call center, produced a CD about reading, and increased financial support for local literacy organizations through local Wal-Mart stores. Around the same time, Dollar General initiated its literacy campaign, and country singer Faith Hill launched a PSA for literacy.
  3. Changes in legislation: As HHS changed regulations to require "work first" and to curtail education as counting toward work credits under TANF, many states stepped up and included adult education and literacy courses as eligible "work" activities, so that low-income adults could continue literacy training as they moved toward work.
  4. Other outcomes: The issue of literacy became much more widely understood and supported across the country. Many literacy organizations reported to NIFL a marked increase in local business and corporate support as a result of the campaign. The campaign's professionally created materials provided to local literacy groups, including broadcast-quality TV and radio tapes/CDs and print-ready materials, made it much easier for local and state literacy programs to run PSAs that included their local program information.

The overall lesson learned was that the way you present information to policymakers and funding sources makes a big difference in how it is received, and that folks on the Hill like to have data broken down by state and especially by Congressional district.

DG: Thanks for sharing what was done with the previous report in 1996. It is good to have a historical perspective! It does seem that the campaign had many benefits, and I agree with your closing statement that "the way you present information to policymakers and funding sources makes a big difference in how it is received..."
So, how do we get a repeat of the 1996 activities? It sounds like we need a well-crafted campaign using the 2003 data!

9. Commentary from Tom Sticht

The Assessment of Adult Literacy Without Adequate Opportunity
for Adult Literacy Education Should be Forsaken

Tom Sticht

The National Assessment of Adult Literacy (NAAL) report released in 2006 indicated that almost half the nation's adults were so poorly literate as to be unable to function well in our contemporary society. The problem was even more pronounced for Hispanic and Black Americans, who fell well below White adults in their literacy skills. All this led some to suggest that our nation's international competitiveness is at risk because of the lack of functionality of our workforce. But is this all true? Some background and additional research information provides a basis for questioning these results and the inferences drawn from the NAAL.

Background

The Young Adult Literacy Survey (YALS) of 1985 was the forerunner to the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), Adult Literacy and Life Skills (ALL) survey, and the National Assessment of Adult Literacy (NAAL) of 2003. The YALS provided the basic methodology for all these assessments in scaling literacy on Prose, Document and Quantitative scales. These assessments all used Item Response Theory (IRT) to scale the literacy abilities of adults and the difficulty levels of the test items. This produces some interesting comparisons.

For instance, on the 1985 YALS, in the Document literacy assessment, 73 percent of the tasks demanded 300 level skills or lower, while 57.2 percent of young adults possessed 300 level skills (about the middle of Level 3 on the later NALS) or higher. Thus, the Document tasks tended to be skewed toward the easy end of literacy task difficulty. Overall, the average percent correct for Document literacy tasks was 83.3. Whites scored 85.9, Hispanics 77.6, and Blacks 71.8 percent correct on average for Document tasks. Thus, using percentage of items correct, all ethnic groups appeared fairly capable, even though there were clear ethnic differences.

However, while 65.4 percent of Whites scored at the 300 skill level or higher, only 37.0 percent of Hispanics and 19.8 percent of Blacks scored at the 300 skill level or higher. Note that, if one focuses on the fact that only one in five Blacks were at the 300 skill level or above on the Document scale, one might infer a very low performance level for Blacks on Document tasks. Yet, overall, Blacks performed over 70 percent of the Document tasks correctly. This apparent contradiction results from the fact that to be at the 300 level of skill requires that people possess an 80% probability of being able to perform tasks that are at that level of difficulty. But people with lower levels of skill have a greater than zero probability of being able to correctly perform 300 level tasks. When the latter are taken into consideration, as in calculating the overall average percent correct, then a much greater percentage of the population may be seen to be able to perform Document tasks across the full range of difficulty levels, from easy to hard, than are able to perform tasks at the 300 level of difficulty or above.
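
To make that distinction concrete, here is a minimal numerical sketch in Python. It assumes a simple logistic (Rasch-type) response model in which an item's scale value is the proficiency at which a person has an 80% chance of success; the slope and the ability values are invented for illustration and are not actual YALS or NAAL parameters.

    import math

    SLOPE = 0.02   # assumed discrimination; illustrative only
    RP = 0.80      # response-probability criterion used to map items onto the scale

    def p_correct(ability, item_level):
        """Chance of success on an item under the assumed logistic model.
        The item's scale value is defined so that someone whose ability equals
        item_level succeeds with probability RP (80%)."""
        offset = math.log(RP / (1 - RP)) / SLOPE   # shift so p(item_level) == RP
        return 1 / (1 + math.exp(-SLOPE * (ability - item_level + offset)))

    # A hypothetical group in which only 1 person in 5 is at or above the 300 level
    abilities = [230, 250, 270, 290, 310]
    share_at_300 = sum(a >= 300 for a in abilities) / len(abilities)
    avg_success = sum(p_correct(a, 300) for a in abilities) / len(abilities)

    print(f"share of group at or above the 300 level: {share_at_300:.0%}")     # 20%
    print(f"average chance of success on a 300-level task: {avg_success:.0%}") # roughly two thirds

In this made-up group only 20 percent are "at" the 300 level, yet the group as a whole answers roughly two thirds of 300-level tasks correctly; with the easier items that dominate an actual assessment included, the average percent correct climbs higher still, which is the pattern described above.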

In the construction of the YALS assessment, it was thought that the materials and tasks selected were representative of "real world" tasks that adults would encounter. If that were so, and if Hispanics and Blacks scored over 70 percent correct, then one might think that overall these young adults were fairly capable in their abilities to perform representative "real world" tasks. But the scaling procedure, that is, the IRT scaling methodology, provided a different perspective.

How "real world" the tasks were on the YALS, or are on the other assessments that came later is questionable. For instance, on the YALS and the NALS one Prose literacy item was as follows: The following poem was read: "The pedigree of honey does not concern the bee-a clover, any time, to him is aristocracy." The task was then to answer the question "What is the poet trying to express in this poem?"

As it turns out, this was one of the most difficult of all the questions, with a difficulty level of 387, placing it in NALS Level 5. But the question is, how "real world" is this task for most adults?

Some Research

My concern about how valid such assessments are for telling us how well people might succeed in the "real world" was aroused when, during the Vietnam War, over 350,000 young men were inducted who had literacy scores on average below the 6th grade level. They had previously been excluded as unfit for service due to low cognitive ability. However, research showed that some 85 percent later performed their jobs well and completed their military service. In one study, even though Black soldiers scored 20 percentile points lower than White soldiers on the cognitive/literacy tests, they performed about as well as White soldiers on job knowledge and job performance tests.

In some studies, Black soldiers actually outperformed White soldiers.

Later, as a member of the National Commission on Testing and Public Policy, I argued strongly for positions that were included in the recommendations of the Commission, including "#1. Testing policies and practices must be reoriented to promote the development of all human talent. We must reevaluate how we judge the quality of tests, the names we give them, the ways we report results, and the ways we use them. No testing program should be tolerated that classifies people as unable to learn; potentially negative classification in school or the workplace should be accompanied by learning opportunities."

And "#4. The more test scores disproportionately deny opportunities to minorities, the greater the need to show that the tests measure characteristics relevant to the opportunities being allocated."

Clearly, today the results of the NAAL paint a disproportionately negative picture of the literacy skills of Hispanics and Blacks. They are classified as being unable to adequately perform literacy tasks needed for education and work. But even though this denigrates the skills of these minority groups, designating them as inferior in our educational and workforce activities, there have been no major commitments to providing accessible, sustainable, life-relevant educational opportunities for them in the Adult Education and Literacy System (AELS) of the United States.
Indeed, from fiscal year 2000 to fiscal year 2004 there was a decline of 12.8 percent (78,986 adults) in Black enrollment in the AELS.

In my view, it is entirely possible that our adult literacy assessments, with their classification of some half of all adults, and even larger shares of Hispanics and Blacks, as able to meet only the lowest literacy demands for learning, working, and earning, may be doing more harm to our international competitiveness than the actual skills of our workforce do. How many international corporations will want to operate in a country where half the workforce has been declared unfit for productive work in contemporary times?

Degradation by classification, without adequate opportunities for education, should not be undertaken.

10. Closing Remarks from Elizabeth Greenberg

I want to thank everyone who participated in last week's discussion about the National Assessment of Adult Literacy (NAAL). Your comments provided those of us who work on the project with much to think about.



If you have any additional questions about NAAL in the future, please feel free to contact me directly. My e-mail address is egreenberg@air.org. If I can't answer your question, I'll try to identify someone who can.



Elizabeth Greenberg

11. Thank-you from Daphne

I would like to thank Elizabeth for taking the time out of her busy schedule to spend some time with us on the list as we tried to make sense of the meaning of the NAAL 2003 data. I thank all of the list contributors to the topic, all of you who read, but didn't send a post, and all of you who emailed me your thoughts regarding the topic off line.



Daphne Greenberg

PovertyRaceWomen and Literacy List Facilitator



