
NIFL-ASSESSMENT 2005: [NIFL-ASSESSMENT:921] RE: Voice in writing

Archived Content Disclaimer

This page contains archived content from a LINCS email discussion list that closed in 2012. This content is not updated as part of LINCS’ ongoing website maintenance, and hyperlinks may be broken.

From: George Demetrion
Date: Thu Feb 17 2005 - 15:30:03 EST


The CASAS writing assessment is valuable in assessing independent
writing skills.  I would question its value in evaluating "voice" in
that the writing prompts are highly selective in asking students to
respond to one of several descriptive scenarios.

In measuring accuracy of response based on the 4-5 rubric categories,
it's not particularly supportive of process approaches to writing, which
oftentimes provide the idiosyncratic format wherein "voice" might
emerge.

This is not to take away from what CASAS does measure--accuracy and
fullness of response to a specific prompt--and there is much merit to
that kind of measurement. Voice, in my view, requires a different sort
of measurement.  For example, one might get at that by evaluating a
collection of student writing in a given program according to the
literary quality of the expression.  

I'm not sure a rubric would be the best form of measurement for that,
though I would not rule that out.  Also, on the CASAS writing
assessment, the resulting essay might be viewed as a manifestation of
authorial voice, but that's not what I would be primarily looking for in
such an "artificially" constructed essay.

While there may be (and ideally should be) convergences in the
pedagogical assumptions undergirding the type of writing fostered by the
CASAS writing prompts and the more free-flowing "existential" narrative
fostered in process-writing schools of thought, the differences may be
even more critically important.

That said, I believe a worthy discussion could ensue here on the
multiple purposes of a writing program in adult literacy education below
the GED level--a discussion that could be stimulated by reflecting on
the differences in the types of writing that CASAS prompts and process
writing orientations stimulate.

What also would be of interest are the ways in which the REEP rubric
relates to the two types of writing.

George Demetrion

-----Original Message-----
From: [] On Behalf Of Dianne Glass
Sent: Thursday, February 17, 2005 2:55 PM
To: Multiple recipients of list
Subject: [NIFL-ASSESSMENT:920] RE: Voice in writing

Kansas has used the CASAS Functional Writing Assessment (FWA) for almost
10 years.  While it requires an enormous commitment of time and energy
to ensure that the scoring of a performance-based assessment is
standardized, Kansas adult educators have responded positively to the
lengthy process of being "certified" to use the FWA and to maintaining
certification.  They report that the process has helped them become much
better teachers of writing.  

Dianne S. Glass
Director of Adult Education
Kansas Board of Regents
1000 SW  Jackson Street, Suite 520
Topeka, KS  66612-1368
Phone:  785.296.7159
FAX:  785.296.0983

>>> 2/17/2005 12:18:12 PM >>>

Howard has articulated the main reason that the CASAS rubric is for
both ABE and ESL learners. He said, "We don't hold learners to different
standards. Our instructors see 'good writing' as 'good writing' whoever
is doing the writing."

We would add that employers and others on the receiving end of our
writing don't have different standards, either. 

We would recommend placing ESL and ABE students in different classes for
instruction, and the kinds of strengths and errors will be very
different for the two groups, but the general characteristics of writing
for both groups can be described within a single rubric. We have been
working with this rubric for nearly ten years and have become very
comfortable with scoring both types of learners on the same rubric,
though it is often necessary to be careful not to over-reward ESL
learners for "trying" when they haven't quite succeeded in writing at a
certain level.

In answer to your earlier questions about writing prompts, I can respond
with respect to the CASAS Functional Writing Assessment Picture Task,
which is currently being used for accountability reporting in Kansas,
Iowa, Connecticut, Oregon, Indiana, Vermont and New York Even Start.
Prompts for this task are line drawings showing a scene with a central
critical incident as well as a number of other things happening in the
picture. This type of prompt can be answered by students from beginning
to advanced levels in ABE, ASE and ESL programs.

It takes a long time to develop a viable prompt, with many rounds of
revisions based on field-testing input from teachers and students and
back-and-forth work with an artist. They are written by a small team of
developers who have extensive experience as adult ed. teachers. Topics
for the prompts come from needs assessments from adult ed. programs and
workplace surveys. We currently have seven prompts: four on general life
skills topics (a car accident scene, a grocery store scene, a park
scene, and a department store scene) and three with a workplace focus,
including a restaurant kitchen scene and a warehouse scene.

Like the REEP, these prompts are scored with an analytic rubric, but
with slightly different categories: Content; Organization; Word Choice
and Sentence Structure; and Spelling, Capitalization and Punctuation.
The categories are weighted, with more importance given to the first
three categories to emphasize the importance of communication of ideas
in writing. We have recently completed a study to convert the rubric
scores to an IRT scale, which provides a more accurate means of
reporting results across prompts. We have also just completed a cut
score study to refine the relationship of the CASAS Picture Task writing
scores to the NRS levels.
With all of the work that goes into developing and standardizing a
prompt, it is not made available for classroom practice. However, we
have found several published materials that contain similar types of
prompts that can be used for classroom practice.

We encourage programs to share the rubric with students for instruction,
in addition to using it to communicate test results to teachers and
students. Many teachers tell us that completing the training for the
writing assessment, which focuses on the scoring rubric, has given them
a better understanding of how to approach the teaching of writing. The
analytic rubric provides clear diagnostic information about students'
strengths and weaknesses in the different rubric categories.

I am very pleased that some states are choosing to include writing in
the mix of assessments that can be reported for accountability purposes.
It is more work to include performance assessment in a state's
accountability system, due to the additional training and scoring
demands, but the states that are doing it have found it to be worth the
extra effort.

Linda Taylor, CASAS
(800) 255-1036, ext. 186

-----Original Message-----
From: [] On Behalf Of Marie Cora
Sent: Thursday, February 17, 2005 9:54 AM
To: Multiple recipients of list
Subject: [NIFL-ASSESSMENT:914] RE: Voice in writing

Hi Bonnie, thanks for this.

Yes, I think that it would have been real tricky for me to have a rubric
that didn't distinguish between ESOL/ABE students, unless they are
transitioning from ESOL to ABE perhaps. It's tricky enough, as you
note, to adhere to rubric anchors and so forth, so adding that you are
working with different populations with the assessment would add a layer
that I would also find difficult.

CASAS folks:  can you tell us why the writing rubric is not separate?
What's the rationale there?  It seems like the needs, especially at the
lower levels, would be very different.

REEP folks:  what do you think about that?  Perhaps that was never a
consideration for you though, since REEP serves the ESOL population (is
that right?).


-----Original Message-----
From: [] On Behalf Of
Sent: Tuesday, February 15, 2005 1:21 PM
To: Multiple recipients of list
Subject: [NIFL-ASSESSMENT:908] RE: Voice in writing

I, too, have been intrigued by the idea of "voice" in the rubric, and
while I intuitively "know" what it means, I'm interested as an emerging
specialist as to what elements would constitute voice, beyond more
traditional "academic" ways of "measuring" it. I think of the clarity
and persuasiveness of a point of view supported with meaningful
examples, the personal voice in a narrator struggling with complex
questions, emotion strikingly articulated with imagery or other means,
an attempt at critical thinking, or "learning to learn,"
self-reflectiveness... I'd be interested in hearing from others.
Another point I encountered when I was involved with CT's work with the
CASAS writing assessments: the rubric was not meant to distinguish
between ABE and ESL students. As an evaluator, I as an ESL specialist
was at a disadvantage: having attained a certain level of skill in
interpreting English learners' language as meaningful utterances, I'd
inevitably bring that to my evaluation. It was extremely difficult to
adhere to the rubric controls and anchors and not want to commend the
ESL learner, for attempting with limited language ability to voice
something difficult to articulate in another language, as having
communicated more than in fact they did.
Bonnie Odiorne, Ph.D.
Writing Center, English Language Institute
Post University, Waterbury, CT

Original Message:
From: Marie Cora 
Date: Tue, 15 Feb 2005 12:02:28 -0500 (EST)
Subject: [NIFL-ASSESSMENT:906] RE: Voice in writing

Hi everyone,

A couple of observations:  

First, please do note that this assessment is a fine example of a
performance-based assessment that has been standardized.  So if anyone
still thinks that standardized assessments all look like TABE, consider
your myth debunked.

I think that capturing voice in writing is quite important, and I'm glad
that the REEP rubric includes this area.  If not for voice, the rest of
the examination of the writing is based on the 'academics' of the
writing - and I feel like that leaves out the writer's (emerging)
personality.  I note, in looking around a little bit, that not many
other writing assessments take voice into account (the GED does not,
for example).  I also think that because voice is a dimension of the
rubric, students will pay more attention to that area and view it as
equally important as the other dimensions.  (A bit of "what counts gets
counted" there.)

What do others think about voice and the other dimensions?



This archive was generated by hypermail 2b30 : Mon Oct 31 2005 - 09:48:46 EST