[Assessment 1522] Managed Enrollment vs Open Enrollment

Archived Content Disclaimer

This page contains archived content from a LINCS email discussion list that closed in 2012. This content is not updated as part of LINCS’ ongoing website maintenance, and hyperlinks may be broken.

Schneider, Jim jschneider at eicc.edu
Fri Dec 12 00:45:42 EST 2008


<The consequences of such a move completely affected important items like student testing, recruitment and retention, even staff retention, in very positive ways. It’s a layer of accountability that is the responsibility first of the student and therefore can change the dynamic of the program.>

Herein lies one example of the schizophrenic nature of Adult Literacy Education. Hardly anyone with operational synapses doubts that managed enrollment provides a multitude of advantages over open enrollment. However, the funding of programs is often structured in a manner that prevents such an operational decision.

We do not have sane guidelines like those of the Student Support Services TRIO program, where the cost of adequately serving students is known and mandated so that funding isn't diluted by serving too many people. Rather, in Adult Literacy Education the funding is so limited that the available funds fall far short of meeting the demand.

Add to this insanity states that fund Adult Literacy Education based on heads and hours, and soon anything remotely resembling managed enrollment is tossed aside in order to stretch the limited funds across as many learners as possible.

I cannot begin to express the envy I have for programs located in states more interested in quality service than in maximizing heads/hours. The luxury of choosing between open and managed enrollment is one I hope to have before my career is over.

Jim Schneider




-----Original Message-----
From: assessment-bounces at nifl.gov on behalf of Marie Cora
Sent: Thu 12/11/2008 11:13 AM
To: 'The Assessment Discussion List'
Subject: [Assessment 1511] Re: Testing and Managed Enrollment

Hi all,



This thread is interesting to me. If you look at the focus areas in the
first module of the initiative (see Recommended Preparations for the
Discussion at http://www.nifl.gov/lincs/discussions/assessment/08data.html)
you will see that a number of these areas speak either directly or
indirectly to the exchanges below by Barbara, Phil, and Nancy. They focus on
things that would affect pre- and post-testing, student follow-up,
achievement of goals, and the like. Managed enrollment also came up during
our work with Module 2.



One of the things that Cathay and I found in identifying programs that
employed successful practices was a deliberate move away from
open enrollment toward managed enrollment. The consequences of such a move
completely affected important items like student testing, recruitment and
retention, even staff retention, in very positive ways. It's a layer of
accountability that is the responsibility first of the student and therefore
can change the dynamic of the program.



Cathay - do you have any comments about managed enrollment and its positive
effects on the program? Vivian, Donielle, and Lori - any thoughts here on
open versus managed enrollment?



Everyone - your thoughts?



Marie







-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Nancy R Faux/AC/VCU
Sent: Thursday, December 11, 2008 11:33 AM
To: The Assessment Discussion List
Subject: [Assessment 1509] Re: [BULK] Getting staff used to using data




Phil,

Your statistics on Managed Enrollment (ME) are very encouraging. I am going
to save your email to use later when speaking with programs that are
resistant to ME.

I also find it interesting that your courses are only 7-8 weeks long. About
how many hours of instruction per week would there be? Do the students have
enough hours to qualify for a post-test?

Nancy

*********************************************************
Nancy R. Faux
ESOL Specialist
Virginia Adult Learning Resource Center
Virginia Commonwealth University
3600 W. Broad Street, Suite 669
Richmond, VA 23230-4930
nfaux at vcu.edu
http://www.valrc.org
1-800-237-0178




"Anderson, Philip" <Philip.Anderson at fldoe.org>
Sent by: assessment-bounces at nifl.gov

12/11/2008 10:31 AM


Please respond to
The Assessment Discussion List <assessment at nifl.gov>


To

<barbara.jacala at guamcc.edu>, "The Assessment Discussion List"
<assessment at nifl.gov>


cc




Subject

[Assessment 1508] Re: [BULK] Getting staff used to using data











Barbara,

When you say you get 60%, do you mean that you get 60% of the students to
take a post-test, or that you get 60% of the students to pass to a higher
level when they post-test?

Either way, compared to the numbers we see in Florida, that is great! We
see fewer than 50% of students take a post-test, and, depending on the
level, only about 25-30% of all enrolled students pass to a higher
level. However, when students do take a post-test, about 50% of them
actually pass to a higher level.
Programs in Florida use primarily the CASAS Life and Work series for ESOL
students. The other tests that our programs use are the BEST Plus and BEST
Literacy.

The message we are trying to get out is that programs need to find ways to
get students to stay in class long enough to post-test. If even a few more
percentage points of students would "persist" until they post-test, the
data would show much higher rates of students passing to higher levels.
Programs will show much stronger results if only they can home in on
"retaining" students!

Florida has had several local programs begin to implement managed enrollment
in their ESOL programs, and the results are astounding! Managed
Enrollment (ME) programs consistently see 80-90% of students stay long
enough to post-test. And of those who post-test, 70-80% pass to a higher
level.
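
To make the arithmetic concrete, here is a quick back-of-the-envelope
sketch in Python (the 0.85 and 0.75 inputs are illustrative midpoints of
the ranges reported above, not figures from the Florida data):

    # The share of ALL enrolled students who advance a level is the product
    # of the post-test rate and the pass rate among those who post-test.
    def overall_advance_rate(post_test_rate, pass_rate_if_tested):
        return post_test_rate * pass_rate_if_tested

    open_enrollment = overall_advance_rate(0.50, 0.50)  # = 0.25
    managed         = overall_advance_rate(0.85, 0.75)  # = 0.6375

    print(f"Open enrollment:    {open_enrollment:.0%}")  # 25%
    print(f"Managed enrollment: {managed:.0%}")          # 64%

This is consistent with the 25-30% figure cited above for open enrollment,
and it shows why retention drives the overall advancement rate.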

The Miami-Dade school district's adult ESOL program piloted ME at 7 sites
and showed these kinds of numbers. They called the ME classes
"Intensive English Academies." Although the curriculum was the same, the
length of the courses was shortened to 7-8 weeks, and after the first week
the classes were closed to new students. The teachers and students
found they were free from the chaos of students constantly coming and going,
and were better able to build on previous lessons. For more information
about the ESOL Academies, visit www.floridaadultesol.org, or write to
Dr. Beatriz Diaz, Adult ESOL Coordinator, at bdiaz at dadeschools.net.

Phil
(850) 245-9450



_____


From: Barbara Jacala [mailto:barbara.jacala at guamcc.edu]
Sent: Wednesday, December 10, 2008 7:00 PM
To: cathayreta at sbcglobal.net; 'The Assessment Discussion List'
Subject: [Assessment 1505] Re: [BULK] Getting staff used to using data

Hello. I am a latecomer to this discussion and I hope not too late to get
feedback from experienced practitioners. I am the program specialist for
adult education at Guam Community College. The difficulty we encounter is in
raising the rate of our paired tests (pre- and post-). I am getting only
about 60%. Is this average for this population? How are others encouraging
students to take the post-test before leaving? Or how are they making sure
that everyone is post-tested? What strategies have you put in place?

Barbara Jacala
Guam Community College
Adult Education Program Specialist
POB 23069 GMF, GU 96931
671-735-5584
barbara.jacala at guamcc.edu




_____


From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Cathay Reta
Sent: Thursday, December 11, 2008 8:29 AM
To: The Assessment Discussion List
Subject: [Assessment 1503] Re: [BULK] Getting staff used to using data



Hello,

As someone previously mentioned, I think it is helpful to see that data is
more than numbers. It also includes information drawn from surveys, focus
groups, interviews, etc. I am reminded of Project Learn in Akron, Ohio.
Their reports showed that

"Students functioning between the 5.0 and 8.9 grade level equivalent
attended for an average of 35 hours for the program year while literacy
students averaged 55 hours, GED students averaged 42 hours, and ESOL
students averaged 65 hours."

So they gathered more data by talking with students at the pre-GED level and
found that they did not feel comfortable going from their student
orientation into a classroom where the instructor and students already knew
each other. Based on that, they re-arranged class schedules to give
instructors a half hour to meet with new students before the start of class.
That made a difference, as evidenced by the next year's report -- the average
hours of attendance for those students increased to 52 hours.

I think examples like this are what make me excited about "data."

I'm wondering if anyone has other examples to share, or questions about what
type of data to review to address specific concerns. Anyone??

Cathay



Cathay O. Reta
Cornerstone Concepts
6670 Southside Drive
Los Angeles, CA 90022
Ph: (323) 728-4302
cathayreta at sbcglobal.net

--- On Wed, 12/10/08, Vivian Copsey <copsey at allencc.edu> wrote:
From: Vivian Copsey <copsey at allencc.edu>
Subject: [Assessment 1501] Re: [BULK] Getting staff used to using data
To: "The Assessment Discussion List" <assessment at nifl.gov>
Date: Wednesday, December 10, 2008, 11:45 AM
I agree wholeheartedly with your approach to reviewing data. In Kansas, we
are required to meet certain indicators of a quality adult education program
in order to receive funding. Our instructional staff realizes our data
reflects our progress toward these measures. We have an excellent database
used by all AEFLA-funded programs across the state. The database is
available for review by the staff at the Kansas Board of Regents and by any
member of our staff. A representative for the adult education program does a
yearly on-site visit to review files for accuracy.

I am the only one who enters data. Any incorrect data becomes my
responsibility. At our monthly staff meetings, all data is compared to the
student's file for accuracy. Because we are a very small, rural program (118
participants with 12 hours or more in fiscal year 2008), I am able to
review data weekly. Reports are printed out each week
and compared with the goals we projected, building on previous years.

Our staff meetings consist of two full time instructors, one part time
instructor, one part time administrative staff and one full time
administrative staff. Because we work very closely as a team, we look at the
data for the whole program. If there are deficits in any area, we begin to
look at ways to improve. Staff members are helpful to one another in
giving suggestions.

Overall, when our staff members receive a report, they are very interested
in our outcomes. Sometimes there is no awareness of progress or lack of
progress until it is seen on paper. The results don't carry any negative
connotations but a desire for improvement.

Vivian Copsey
Coordinator Adult Education
Allen Community College
1801 N. Cottonwood
Iola, KS 66749
620.365.5116 x250

At 08:35 AM 12/10/2008 -0500, you wrote:
Getting staff to buy in to looking at and using data can be a challenge, but
if programs make the effort to make this a part of what they do, it will
become institutionalized over time.
One way to begin to look at data is to bring some data to a staff meeting.
Do an activity where everyone gets a chance to look at the data and see what
stands out to them. Then go around the room and let each person say what
stands out. (Include ALL staff in this -- teachers, administrators, support
personnel, volunteers.) Make a list of the things that stand out and then
have a discussion about the different things. It could be that people
notice something really good or maybe an area of needed improvement.
We do this with our year end data every year. We look at enrollment,
retention, gains, goals achieved, demographics, numbers of ABE/GED/ESL
students, inquiries, and more. We usually find a few things to focus on for
the next program year from this activity. Some of our best ideas for
program improvement have come from our receptionist and office manager, as
they were looking at the data from a different perspective.

We have also added data review to all of our meetings. We choose an
area to look at and examine (this is done in our monthly program improvement
meetings, teacher meetings, and other staff sub-group meetings).
We always try to look at data before making program changes (if the data is
available). Teachers have gradually gotten used to looking at the data and
seeing how it can inform their practice. They are more aware of their
students' attendance and testing information and can do more to help the
agency meet its standards.

If your staff are not used to looking at data, start small but keep it
consistent. Make it a habit and eventually it will be part of what they do.

We do have great systems in PA, where using data is required at the state
level. Our program improvement plans, which are required for state funding,
must be built on data analysis. We have many teachers and administrators
who do practitioner research. However, this
doesn't mean that using data for program improvement is always easy. It is
time consuming but without it we are making decisions based on intuition or
hunches, which can have disastrous results. Change is sometimes
uncomfortable but all staff can see the value of using data once they learn
how to look at it and how it can better inform their practice.

********************************************
Lori Keefer
Program Director
Greater Pittsburgh Literacy Council
100 Sheridan Square 4th floor
Pittsburgh PA 15206
412.661.7323 ext 131
fax: 412.661.3040
lkeefer at gplc.org
www.gplc.org


-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of David Rosen
Sent: Wednesday, December 10, 2008 6:57 AM
To: The Assessment Discussion List
Subject: [Assessment 1494] Re: Using Data

Drucie and others,

Earlier in this discussion I asked for specific examples (narratives)
of teachers systematically using program data to answer their
questions. In the SEQUEL Monographs that you suggested we look at,
found at

http://www.pafamilyliteracy.org/pafamilyliteracy/cwp/view.asp?a=223&Q=145708&PM=1

I see several good examples. Thank you for calling them to our attention.
I would like to mention one in particular: the Seneca Highlands Intermediate
Unit 9 monograph, "Cooperative Learning in Adult Education to Improve
Attitudes and Skills in Math".

One of the biggest challenges our field faces is that very, very few
(I think under 4%) of those in adult secondary education who say they
want to go to college actually complete a degree. There are many
reasons for this, but one of the biggest is that they cannot pass
(usually required) college algebra. This is because they did not get
(positive) exposure to algebra either in school or in an adult
literacy education program. It is also because -- even if algebra is
offered in their ASE program -- many have negative attitudes about,
or fear of, learning algebra. This study, carried out by program
practitioners, looks at the use of cooperative learning as a strategy
to help students overcome negative attitudes and increase knowledge
of algebra during an eight-week program. The monograph is short, well-
written, easy to read, and has some findings worth getting excited
about. It would be great if other programs whose teachers care about this
problem could replicate it. I wonder if any programs in Pennsylvania have
already done so.

It would be terrific if there were a U.S. national adult literacy
research institute (such as NCSALL was) that would make funds
available to support programs replicating important studies such as
this, to help build a body of professional wisdom on the use of
cooperative learning in adult numeracy and mathematics. This might
provide a sufficient base of evidence to see if it is worthwhile
later to do "gold standard" experimental research.

Thanks, Drucie, and other leaders at all levels in Pennsylvania, who
have for many years now supported programs using data for program
decision-making. It looks like this may be paying off for
Pennsylvania practitioners, as they learn what does and doesn't work
for their students, and it is contributing to a literature of
professional wisdom* so necessary in our field.

David J. Rosen
djrosen at theworld.com

* For a dialogue about professional wisdom (including a definition)
with John Comings, former Director of the U.S. National Center for
the Study of Adult Learning and Literacy, see:
http://wiki.literacytent.org/index.php/Professional_Wisdom




On Dec 9, 2008, at 11:23 AM, Drucilla Weirauch wrote:


> In Pennsylvania we have a statewide program improvement initiative
> that uses a specific Practitioner Action Research (PAR) model. Each
> program chooses its own area of inquiry, based on its data. These
> data may be hard data (scores, hours, enrollment numbers, etc.) or
> other data, based on our Indicators of Program Quality (for example,
> the quality of the adult education classroom environment or the depth
> of partnerships). Last year, there were 61 projects conducted by PA
> Family Literacy sites. Topics ranged from increasing enrollment or
> retention hours, implementing scientifically based reading research
> in the adult classroom, and improving children's oral receptive
> vocabulary, to increasing referrals from partners. In the spring we
> hosted regional poster shows where programs showcased their projects
> and results. Each program also submitted a monograph that detailed
> their question and the background to it (the data), the interventions,
> data sources, results, reflections, and implications for the field.
>
> Monographs can be found at our website, www.pafamilyliteracy.org. On
> the left side, click on the SEQUAL project, then Monographs.
>
> The website also includes the PAR handbook that helped the programs
> identify a problem based on data, choose an intervention, choose the
> best data sources, etc.
>
> I evaluated the process to ascertain practitioners' perceptions of
> the inaugural year of the intentional, systematic PAR process. While
> it added a layer of work, most felt that it empowered them as
> practitioners and gave their program "teeth." What was also important
> is that it allowed them to show highlights of their program and of
> program improvement that mere data (e.g., the data reported to the
> state and feds) do not always capture. Programs used data to inform
> their question and chart their success. Analyzing and reflecting on
> the data made it more than mere numbers. This evaluation report is
> also on the website. It includes a summary of the outcomes from the
> projects and the perceptions of the participants on the research
> process.
>
> Drucie Weirauch
> Penn State University
> Goodling Institute for Research in Family Literacy




















