[Assessment 1538] Re: Getting staff used to using data


Archived Content Disclaimer

This page contains archived content from a LINCS email discussion list that closed in 2012. This content is not updated as part of LINCS’ ongoing website maintenance, and hyperlinks may be broken.

George Demetrion gdemetrion at msn.com
Wed Dec 17 08:43:14 EST 2008

Hello David, Barry and all,

David's point speaks to the importance of small comparative case-study analyses, which can then provide a basis for a more extended research project. While a single case study may be dismissed as anecdotal (a characterization that is often unjustly pejorative), ten such studies point to broader trends.

One of the best places to access such studies is the National Adult Literacy Database, particularly its research base. One core challenge is to take a close look at 10-12 such studies on a given topic, critically identify key variables, and evaluate broad trends. In general, the field of adult literacy research has not yet risen to a level of sophistication that would encourage such work. These are good topics for PhD programs and for people who can dedicate a lot of unhurried, quality time to such work and to making the basic insights gleaned publicly accessible.

One of the key problems is the limitation of our research designs. To the extent that experimental design is referred to as "the gold standard," other intellectual frameworks that can and do guide quality research get subsumed within an underlying, and often unconscious, positivist mindset. There is much good work, both theoretical and empirical, that bespeaks a broader vision of research and of the many values of adult literacy education. The lack of legitimacy granted to such frameworks is a principal stumbling block, as is the need to greatly enrich both the content and the methodological framework underlying so many reports and research projects in adult literacy. Schools of thought that build primarily on the "thick description" of ethnography, as well as the theoretical impetus of critical pedagogy, help, but they need to be cross-evaluated from perspectives other than their own.

I do think the ongoing flow of work coming out of the new literacy studies holds a great deal of potential. Yet unless the theoretical constructs that underpin such work get greater play in broad-based policy circles, it will tend to remain isolated in various academic and practitioner-based enclaves.

Here are a few resources on the new literacy studies:

In my work I refer to U.S. and UK versions of the new literacy studies. I believe the EFF project is an example of the former, though given Juliet Merrifield's UK heritage, it may be more of a blending in intent. Still, in focusing primarily on literacy "practices" rather than on the social and cultural contexts in which such practices are embedded, the EFF project draws, and rightly so, more on the US than the UK version.

Clearly there is a great deal of room for cross-fertilization, as well as plenty of opportunity to incorporate a diversity of research methodologies, particularly when methodology is viewed as a tool rather than as the source of insight and knowledge itself.

No doubt there's a lot of work that needs to be done.


George Demetrion
PS for the computer specialists: How does one do spell check in hotmail?

From: DJRosen at theworld.com
To: assessment at nifl.gov
Date: Tue, 16 Dec 2008 22:59:55 -0500
Subject: [Assessment 1537] Re: Getting staff used to using data

Hi Barry and others,

Of course, teachers who sign up for (voluntary?) PD might be the kind of teachers who actively seek solutions to teaching/learning problems. It may be that you are measuring the relationship between learners' retention and their teachers' motivation to solve classroom problems, not increased (or decreased) retention as a result of the professional development itself.

It is difficult to isolate variables in adult ed. One of the hardest to isolate is teacher training. I think the most promising way to measure the impact of teacher training on learner outcomes is in a content or skills area where the control group has a teacher with little knowledge and no training in the area being measured, and the experimental group has a teacher who gets specific training (including training on content and skills for herself) in that area. Learning outcomes in that area are then measured and compared for both groups of learners.

Not perfect, of course, but it has potential. Numeracy and certain computer skills might be two such learning/teaching areas where a control group teacher does not have skills or knowledge and where an experimental group teacher gets specific training related to teaching numeracy or computer skills (such as student web page design, or using a classroom wiki or a blog to promote writing).

Has anyone tried an experiment like this?

David J. Rosen
djrosen at theworld.com

----- Original Message -----
From: Bakin, Barry
To: The Assessment Discussion List
Sent: Monday, December 15, 2008 7:20 PM
Subject: [Assessment 1533] Re: Getting staff used to using data

The attempt to correlate student attendance and retention with teacher participation in professional development is still in a very preliminary stage, and a professional statistician might say that we're going about it in the wrong way. One aspect of what is being discussed is a "retention" score derived by dividing the actual hours all students enrolled in a class attended during a certain time period by the total number of hours those students could potentially have attended (if every student had attended every hour from the time they enrolled until the time they left the course, or until the specified time period was reached). So let's say that some 42 students could have attended a maximum total of 3000 hours of class time during the time period being examined, and the actual attendance of those students during that time period was 1500 hours. 1500 total actual hours divided by 3000 total possible hours gives a figure of 50 percent. By doing the same calculation for every class offered, a division-wide "average retention" figure can be established for a particular type of class.
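Barry's calculation can be sketched in a few lines of Python. This is only an illustration: the function name `retention_rate` and the per-class figures other than the 1500/3000 example are made up, and the message leaves open whether the division-wide average pools hours across classes or averages the per-class rates; the sketch below pools hours.

```python
def retention_rate(actual_hours, possible_hours):
    """Class retention: actual attendance hours divided by potential hours."""
    if possible_hours <= 0:
        raise ValueError("possible_hours must be positive")
    return actual_hours / possible_hours

# The example from the message: 1500 actual hours out of 3000 possible.
print(f"{retention_rate(1500, 3000):.0%}")  # prints 50%

# A division-wide figure pooled across classes (hypothetical data):
classes = [(1500, 3000), (880, 1100), (600, 1500)]  # (actual, possible) per class
division = sum(a for a, p in classes) / sum(p for a, p in classes)
print(f"{division:.1%}")
```

Averaging the per-class rates instead of pooling hours would weight a small class the same as a large one, so the two readings of "average retention" can differ.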
The idea is that by identifying teachers who have taken staff development courses and then looking at their individual average retention figures "pre" and "post" training, the effect of the training on an individual teacher's retention might be demonstrated, and in turn its effect on all teachers who have attended trainings as a group. I'm not sure what variables other than the training are being considered. Again, these ideas are all preliminary and experimental, so they're not for wider dissemination. It would obviously be preferable to have a controlled double-blind study, but that seems to be out of reach at the moment…


From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Nancy R Faux/AC/VCU
Sent: Thursday, December 11, 2008 11:23 AM
To: The Assessment Discussion List
Subject: [Assessment 1521] Re: Getting staff used to using data

Hi Barry, Could you please explain how you are correlating student attendance and retention with teacher participation in professional development, or the workshops that you offer? We are exploring ways of doing this, also. Nancy

*********************************************************
Nancy R. Faux
ESOL Specialist
Virginia Adult Learning Resource Center
Virginia Commonwealth University
3600 W. Broad Street, Suite 669
Richmond, VA 23230-4930
nfaux at vcu.edu
http://www.valrc.org
1-800-237-0178

"Bakin, Barry" <barry.bakin at lausd.net> Sent by: assessment-bounces at nifl.gov
12/11/2008 12:21 PM

Please respond to The Assessment Discussion List <assessment at nifl.gov>

To: "The Assessment Discussion List" <assessment at nifl.gov>

Subject: [Assessment 1510] Re: Getting staff used to using data

Data is not just for classroom instructional staff to analyze. Our staff meeting yesterday (of teacher trainers responsible for staff development) focused on using attendance and ADA statistics collected since 1999 as a way to determine whether or not our team's staff development efforts over the last several years have resulted in increases in student attendance and retention by students whose teachers have taken staff development workshops. We have an immediate and pressing interest in doing so, as expected district-wide budget shortfalls of millions of dollars are leading some at the district level to advocate for the elimination of staff-development programs in the coming year. We obviously feel that teachers who improve their skills will retain students better than those who don't, but we'd like to be able to point to data that demonstrates that.

Barry Bakin
ESL Teacher Adviser
Division of Adult and Career Education
Los Angeles Unified School District

-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment
Email delivered to nfaux at vcu.edu
