[Assessment 1513] Re: [BULK] Getting staff used to using data

Archived Content Disclaimer

This page contains archived content from a LINCS email discussion list that closed in 2012. This content is not updated as part of LINCS’ ongoing website maintenance, and hyperlinks may be broken.

Michael Tate mtate at sbctc.edu
Thu Dec 11 12:26:36 EST 2008


Hafa Adai, Barbara! Washington State's assessment policy requires a minimum post-test ratio of 50%, and the state average is 59%.



We had 6 (out of 50-some) programs below 50%: 2 programs in the mid-40% range, 1 in the high-30% range, 2 in the low-30% range, and 1 in the high-20% range. These programs will be developing action plans to improve their results on this data element.



Often, their success is dragged down by poor data processes (not just post-testing processes) at outreach sites and by not limiting enrollment at county jail sites to inmates with longer sentences.



Programs can receive performance funding for gains of 3-5 points on the CASAS test, as well as for other student successes.



I hope all is well with you.



Michael Tate



From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Barbara Jacala
Sent: Wednesday, December 10, 2008 4:00 PM
To: cathayreta at sbcglobal.net; 'The Assessment Discussion List'
Subject: [Assessment 1505] Re: [BULK] Getting staff used to using data



Hello. I am a latecomer to this discussion, and I hope I am not too late to get feedback from experienced practitioners. I am the program specialist for adult education at Guam Community College. The difficulty we encounter is in raising our rate of paired tests (pre- and post-test); I am getting only about 60%. Is this average for this population? How are others encouraging students to take the post-test before leaving, or making sure that everyone is post-tested? What strategies have you put in place?



Barbara Jacala

Guam Community College

Adult Education Program Specialist

POB 23069 GMF, GU 96931

671-735-5584

barbara.jacala at guamcc.edu



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Cathay Reta
Sent: Thursday, December 11, 2008 8:29 AM
To: The Assessment Discussion List
Subject: [Assessment 1503] Re: [BULK] Getting staff used to using data



Hello,



As someone previously mentioned, I think it is helpful to see that data is more than numbers. It also includes information drawn from surveys, focus groups, interviews, etc. I am reminded of Project Learn in Akron, Ohio. Their reports showed that



"Students functioning between the 5.0 and 8.9 grade level equivalent attended for an average of 35 hours for the program year while literacy students averaged 55 hours, GED students averaged 42 hours, and ESOL students averaged 65 hours."



So they gathered more data by talking with students at the pre-GED level and found that those students did not feel comfortable going from their orientation into a classroom where the instructor and students already knew each other. Based on that, they rearranged class schedules to give instructors a half hour to meet with new students before the start of class. That made a difference, as evidenced by the next year's report: the average hours of attendance for those students increased to 52 hours.



I think examples like this are what make me excited about "data."



I'm wondering if anyone has other examples to share, or questions about what type of data to review to address specific concerns. Anyone??



Cathay





Cathay O. Reta
Cornerstone Concepts
6670 Southside Drive
Los Angeles, CA 90022
Ph: (323) 728-4302
cathayreta at sbcglobal.net

--- On Wed, 12/10/08, Vivian Copsey <copsey at allencc.edu> wrote:

From: Vivian Copsey <copsey at allencc.edu>
Subject: [Assessment 1501] Re: [BULK] Getting staff used to using data
To: "The Assessment Discussion List" <assessment at nifl.gov>
Date: Wednesday, December 10, 2008, 11:45 AM

I agree wholeheartedly with your approach to reviewing data. In Kansas, we are required to meet certain indicators of a quality adult education program in order to receive funding. Our instructional staff realizes our data reflects our progress toward these measures. We have an excellent database used by all AEFLA-funded programs across the state. The database is available for review by the staff at the Kansas Board of Regents and by any member of our staff. A representative for the adult education program makes a yearly on-site visit to review files for accuracy.

I am the only one who enters data, so any incorrect data becomes my responsibility. At our monthly staff meetings, all data is compared to the students' files for accuracy. Because we are a very small, rural program (118 participants with 12 hours or more in fiscal year 2008), I am able to review data on a weekly basis. Reports are printed out weekly and compared with the goals we projected, building from previous years.

Our staff meetings consist of two full-time instructors, one part-time instructor, one part-time administrative staff member, and one full-time administrative staff member. Because we work very closely as a team, we look at the data for the whole program. If there are deficits in any area, we begin to look at ways to improve. Staff members help one another by offering suggestions.

Overall, when our staff members receive a report, they are very interested in our outcomes. Sometimes there is no awareness of progress, or the lack of it, until it is seen on paper. The results don't carry any negative connotations, only a desire for improvement.

Vivian Copsey
Coordinator Adult Education
Allen Community College
1801 N. Cottonwood
Iola, KS 66749
620.365.5116 x250

At 08:35 AM 12/10/2008 -0500, you wrote:

Getting staff to buy in to looking at and using data can be a challenge, but if programs make the effort to make this part of what they do, it will become institutionalized over time.

One way to begin is to bring some data to a staff meeting. Do an activity where everyone gets a chance to look at the data and see what stands out to them. Then go around the room and let each person say what stands out. (Include ALL staff in this: teachers, administrators, support personnel, volunteers.) Make a list of the things that stand out and then have a discussion about them. It could be that people notice something really good, or maybe an area needing improvement.

We do this with our year-end data every year. We look at enrollment, retention, gains, goals achieved, demographics, numbers of ABE/GED/ESL students, inquiries, and more. We usually find a few things to focus on for the next program year from this activity. Some of our best ideas for program improvement have come from our receptionist and office manager, as they were looking at the data from a different perspective.

We also started adding data review to all of our meetings. We choose an area to look at and examine (this is done in our monthly program improvement meetings, teacher meetings, and other staff sub-group meetings). We always try to look at data before making program changes (if the data is available). Teachers eventually have gotten used to looking at the data and seeing how it can inform their practice. They are more aware of their students' attendance and testing information and can do more to help the agency meet its standards.

If your staff are not used to looking at data, start small but keep it
consistent. Make it a habit and eventually it will be part of what they do.

We do have great systems in PA, where using data is required at the state level. Our program improvement plans, which are required for state funding, must be built on data analysis. We have many teachers and administrators who do practitioner research. However, this doesn't mean that using data for program improvement is always easy. It is time-consuming, but without it we are making decisions based on intuition or hunches, which can have disastrous results. Change is sometimes uncomfortable, but all staff can see the value of using data once they learn how to look at it and how it can better inform their practice.

********************************************
Lori Keefer
Program Director
Greater Pittsburgh Literacy Council
100 Sheridan Square 4th floor
Pittsburgh PA 15206
412.661.7323 ext 131
fax: 412.661.3040
lkeefer at gplc.org
www.gplc.org

This email is intended solely for the use of the named addressees and is not
meant for general distribution. If you are not the intended recipient,
please report the error to the originator and delete the contents.

-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of David Rosen
Sent: Wednesday, December 10, 2008 6:57 AM
To: The Assessment Discussion List
Subject: [Assessment 1494] Re: Using Data

Drucie and others,

Earlier in this discussion I asked for specific examples (narratives)
of teachers systematically using program data to answer their
questions. In the SEQUEL Monographs that you suggested we look at,
found at

http://www.pafamilyliteracy.org/pafamilyliteracy/cwp/view.asp?a=223&Q=145708&PM=1

I see several good examples. Thank you for calling them to our attention. I would like to mention one in particular: the Seneca Highlands Intermediate Unit 9 monograph, "Cooperative Learning in Adult Education to Improve Attitudes and Skills in Math."

One of the biggest challenges our field faces is that very, very few (I think under 4%) of those in adult secondary education who say they want to go to college actually complete a degree. There are many reasons for this, but one of the biggest is that they cannot pass (usually required) college algebra. This is because they did not get (positive) exposure to algebra either in school or in an adult literacy education program. It is also because -- even if algebra is offered in their ASE program -- many have negative attitudes about, or fear of, learning algebra. This study, carried out by program practitioners, looks at the use of cooperative learning as a strategy to help students overcome negative attitudes and increase their knowledge of algebra during an eight-week program. The monograph is short, well-written, easy to read, and has some findings worth getting excited about. It would be great if other programs whose teachers care about this problem could replicate it. I wonder if any programs in Pennsylvania have already done so.

It would be terrific if there were a U.S. national adult literacy
research institute (such as NCSALL was) that would make funds
available to support programs replicating important studies such as
this, to help build a body of professional wisdom on the use of
cooperative learning in adult numeracy and mathematics. This might
provide a sufficient base of evidence to see if it is worthwhile
later to do "gold standard" experimental research.

Thanks, Drucie, and other leaders at all levels in Pennsylvania, who
have for many years now supported programs using data for program
decision-making. It looks like this may be paying off for
Pennsylvania practitioners, as they learn what does and doesn't work
for their students, and it is contributing to a literature of
professional wisdom* so necessary in our field.

David J. Rosen
djrosen at theworld.com

* For a dialogue about professional wisdom (including a definition) with John Comings, former Director of the U.S. National Center for the Study of Adult Learning and Literacy, see: http://wiki.literacytent.org/index.php/Professional_Wisdom




On Dec 9, 2008, at 11:23 AM, Drucilla Weirauch wrote:

> In Pennsylvania we have a statewide program improvement initiative
> that uses a specific Practitioner Action Research (PAR) model. Each
> program chooses its own area of inquiry, based on its data. These
> data may be hard data (scores, hours, enrollment numbers, etc.) or
> other data, based on our Indicators of Program Quality (for example,
> the quality of the adult education classroom environment or depth of
> partnerships). Last year, there were 61 projects conducted by PA
> Family Literacy sites. Topics ranged from increasing enrollment or
> retention hours, implementing scientifically-based reading research
> in the adult classroom, improving children's oral receptive
> vocabulary, to increasing referrals from partners. In the spring we
> hosted regional poster shows where programs showcased their projects
> and results. Each program also submitted a monograph that detailed
> their question and background to it (the data), the interventions,
> data sources, results, reflections, and implications for the field.
>
> Monographs can be found at our website, www.pafamilyliteracy.org.
> On the left side, click on the SEQUAL project, then Monographs.
>
> The website also includes the PAR handbook that helped the programs
> identify a problem based on data, design an intervention, choose the
> best data sources, etc.
>
> I evaluated the process to ascertain practitioners' perceptions of
> the inaugural year of the intentional, systematic PAR process. While
> it added a layer of work, most felt that it empowered them as
> practitioners and gave their program "teeth." What was also important
> is that this allowed them to show highlights of their program and
> program improvement that mere data (e.g., data reported to the state
> and the feds) do not always capture. Programs used data to inform their
> question and chart their success. Analyzing and reflecting on the
> data made it more than mere numbers. This evaluation report is also
> on the website. It includes a summary of the outcomes from the
> projects and the perceptions of the participants on the research
> process.
>
> Drucie Weirauch
> Penn State University
> Goodling Institute for Research in Family Literacy
>