[Assessment 1523] Re: Getting staff used to using data

Archived Content Disclaimer

This page contains archived content from a LINCS email discussion list that closed in 2012. This content is not updated as part of LINCS’ ongoing website maintenance, and hyperlinks may be broken.

gdemetrion at msn.com gdemetrion at msn.com
Fri Dec 12 08:47:32 EST 2008



Good morning all,

One of the dilemmas of taking on such studies is that they create another layer of work, and therefore of time allocation, which in turn requires justification, and therefore "evidence" that such time expenditure is a worthy fiscal investment, often precisely when time and money are scarce commodities.

On the question at hand, I'm wondering to what extent such correlations can be made when (a) there are so many intervening variables, and (b) the relationships between professional development and direct program improvement are typically longer term and perhaps more typically reflected in changes in the "softer" organizational climate of a learning organization. This is not to deny that there can be some direct impacts, which in principle can be measurable at least in some instances, though I'd be wary of justifying PD on such terms (cost-benefit metaphors, and let's not forget the metaphorical dimension of the language in play here). I think broader arguments can be made for professional development, drawing on research both in our own field (NCSALL has done some work here, as have other institutes) and in other fields. One needs to consider as well the quality, content, and context in which PD takes place, how it is internalized within the thinking and practice of practitioners, and its role within broader adult education programs, including the organizational and pedagogical development of such organizations on a system-wide basis. These are matters of major consequence, a discussion of which perhaps better belongs on the PD forum.

On assessment: if one is working year after year with students at a broadly similar level and range, and if year after year the standardized test scores (pre and post) are broadly the same, then perhaps the issue is the purpose, or purposes, of adult literacy education, where "growth" (a Deweyan metaphor of intriguing consequence) does take place over time, typically in ways that are subtle but not so effectively "measured" by the various "instruments" available to document such impact.

Pragmatically, we need to come up with workable solutions that at least move toward viable mid-level fixes; but until broader definitions of the meaning and purpose of adult literacy education are more clearly and realistically built into policy formulations, we are going to remain stymied in the most fundamental sense, even as some progress is made in sharpening our understanding of assessment tools and fitting them into our programs. That latter work is essential, though I think it's important that we not allow various mystifications that we are actually measuring literacy growth (when we haven't defined the terms) to seep in, such as, for example, the conflation of reading with literacy.


Best,

George Demetrion



From: nfaux at vcu.edu
To: assessment at nifl.gov
Date: Thu, 11 Dec 2008 18:07:48 -0500
Subject: [Assessment 1520] Re: Getting staff used to using data
Hi Barry,

Could you please explain how you are correlating student attendance and retention with teacher participation in professional development, or the workshops that you offer? We are exploring ways of doing this, also.

Nancy
*********************************************************
Nancy R. Faux
ESOL Specialist
Virginia Adult Learning Resource Center
Virginia Commonwealth University
3600 W. Broad Street, Suite 669
Richmond, VA 23230-4930
nfaux at vcu.edu
http://www.valrc.org
1-800-237-0178
-----assessment-bounces at nifl.gov wrote: -----
To: "The Assessment Discussion List" <assessment at nifl.gov>
From: "Bakin, Barry" <barry.bakin at lausd.net>
Sent by: assessment-bounces at nifl.gov
Date: 12/11/2008 12:06PM
Subject: [Assessment 1510] Re: Getting staff used to using data

Data is not just for classroom instructional staff to analyze. Our staff meeting yesterday (of teacher trainers responsible for staff development) focused on using attendance and ADA statistics collected since 1999 as a way to determine whether or not our team's staff development efforts over the last several years have resulted in increases in attendance and retention by students whose teachers have taken staff development workshops. We have an immediate and pressing interest in doing so, as expected district-wide budget shortfalls of millions of dollars are leading some at the district level to advocate for the elimination of staff-development programs in the coming year. We obviously feel that teachers who improve their skills will retain students better than those who don't, but we'd like to be able to point to data that demonstrates that.

Barry Bakin
ESL Teacher Adviser
Division of Adult and Career Education
Los Angeles Unified School District