Making Connections Across Indicators to Improve Post-School Outcomes: Early State Efforts

I. Data Collection on the Four Transition Indicators

B. State Examples of Data Collection on Transition Indicators 13 and 14

Alabama (AL), Indiana (IN), New York (NY), Texas (TX), Washington (WA), and Wisconsin (WI) have been collecting data on the four transition indicators for a number of years. The ways these states collect graduation and dropout data for Indicators 1 and 2 are relatively straightforward and follow the general SPP/APR guidelines (see http://www.rrfcnetwork.org/content/view/490/47/). Each state extracts these data from its education database and calculates rates in the same manner used for its general education students.

Section Contents

Data collection on Indicators 13 and 14 is more involved and varied across states. To provide assistance to other states, this section describes:

  • Ways that the six states are collecting data
  • How these states are linking collection on the two indicators
  • Importance of LEA training
  • Advice to states on data collection

Indicator 13

Contacted states have primarily developed protocols for examining IEPs based on the NSTTAC Indicator 13 Checklist (http://www.nsttac.org/pdf/checklista.pdf). NSTTAC has put together a PowerPoint presentation on using the checklist for IEP reviews that outlines the NSTTAC recommended list, what each element entails, and examples of each (http://www.nsttac.org/nsttac_presentations/indicator%2013.ppt).

Indiana, New York, Texas, and Wisconsin have developed online protocols for IEP reviews for LEAs to complete. Indiana's new online Transition IEP protocol includes thought-provoking questions that pop up for each requirement, examples and nonexamples of what meets each requirement, and links to resources when an IEP does not meet a requirement. Wisconsin's online protocol, used by districts to review sample IEPs, has a similar feature: when an IEP does not meet a particular requirement, a pull-down menu offers further information on the potential issues and possible remedies.


1 The following staff provided graphics included in this section: Karen Rabren and George Hall at Auburn University in Alabama (AL); Doris Jamison, Joanne LaCrosse, Wendy Latimer, and Cynthia Wilson with the New York State Department of Education (NY); Cinda Johnson and Lisa Scheib at Seattle University in Washington State (WA); and Cathy Hammond and Loujeania Bost at the National Dropout Prevention Center for Students with Disabilities. Graphics included are derived from actual state data or, if state data were not available, from prototypes developed for the purposes of this guide. The following staff from the other three states also made contributions to this section, as noted in parentheses after relevant text by state abbreviation: Teresa Grossi, Indiana University, and Adam Bauserman, Ball State University, in Indiana (IN); Debby Norris and Carla Johnson with the Texas Department of Education (TX); and Mary Kampa, Linda Maiterjean, and Lynese Gulczynski with the Wisconsin Department of Public Instruction (WI).

State Online IEP Review Protocol Examples

Indicator 14

Each of the six states contacted about data collection efforts has been conducting post-school outcome surveys for a number of years (7 to 17 years). They have collected these data through longitudinal studies of individual class cohorts or through statewide or district-based annual surveys. All have modified one or more aspects of their procedures, such as sample composition or survey instrument(s), to meet OSEP SPP/APR requirements.

State Post-School Outcome Survey Examples

One of the main decisions a state must make in developing its survey procedures is the means by which the survey will be administered. Will the surveys be mailed to school leavers (exiters)? Will local teachers make phone calls to conduct surveys with school leavers (exiters)? Will the state contract with a university or professional research organization or call center to conduct surveys?

As summarized in Table B-1, the six states contacted use several different methods for administering their post-school outcomes surveys. One state conducts its post-school outcomes survey by mail and had a 25% response rate in 2007; this state has implemented an online survey format to try to increase the response rate (TX). Two states use university or professional call centers to administer their surveys, with New York achieving a 65% response rate through this method in 2007. In Alabama, Indiana, and Washington, local teachers and other school staff conduct survey phone interviews; Washington achieved a 78% response rate using this method in the 2005-06 school year. Three of the states are currently conducting exit surveys with students prior to graduation, and one state hopes to reinstate its exit survey after it was dropped due to budget cuts.

Table B-1

Survey Administration Methods of Contacted States

  • Conduct exit survey: three states; Alabama*
  • Method of collection for post-school outcomes survey:
    • Mail: TX
    • Phone by staff at call center: NY, WI
    • Phone by teachers/local staff: AL, IN, WA
    • Online: four states, including TX

*Alabama conducted an exit survey until last year, when it was discontinued due to budget cuts. The state hopes to have it reinstated next year (2008-09).

To assist states in weighing options for conducting surveys, contacted states described some benefits and drawbacks of the methods they were using, which are summarized in Table B-2. A brief sketch following the table illustrates the response rate and sample representativeness issue noted for mail surveys.

Table B-2

Survey Administration Method Benefits and Drawbacks
Mail Survey

  Benefits:
    • Low cost (TX)

  Drawbacks:
    • Low response rate, even with follow-up mailings and phone calls (TX)
    • Because of the low response rate, difficulty achieving a representative student sample, which limits the generalizability of results and leaves an inadequate sample size for subgroup comparisons (TX)

Phone Survey by Call Center

  Benefits:
    • Higher response rate than a mail survey
    • Trained survey experts supervise those doing the phone interviews (NY)
    • Professional assistance to districts with sampling (NY)
    • Trained staff assist LEAs with student sample selection and collection of contact information (NY)
    • Interviewers can provide some preset resources and contact information for adult service agencies during the phone survey (NY)
    • More consistent data collection across surveys (NY)
    • Does not burden teachers (NY)
    • Less subjective responses to interviewers, since they have not taught the youth (NY)
    • Quicker process (NY)

  Drawbacks:
    • Higher cost (AL, NY)
    • Suitable facilities are not available in all states (NY)
    • Lower teacher and administrator buy-in, since they are not directly involved in data collection and may be less likely to use the results (NY)
    • Response rate is not high relative to the cost

Phone Survey by Teacher/Local Staff

  Benefits:
    • Highest response rate of the three methods
    • Individualized assistance can be provided to former students when they ask for help during the phone interview (AL)
    • Greater teacher buy-in on using the data (WA)
    • Informative for local people, because they hear directly about local issues (WA)
    • Potential increase in the likelihood that former students respond to the survey (WA)
    • Former students might be more apt to report the "truth" to a person they know rather than to someone from a survey company that has no contact with the student, family, or community (WA)

  Drawbacks:
    • Inconsistent response rates and data quality (IN)
    • Risk of a "glow" effect, in which former students report only good news to please a former teacher (NY)
    • Can overburden teachers (WA)
    • Highly dependent on the quality of teacher training (WA)
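
As a rough illustration of the mail-survey drawback above, the sketch below computes a response rate and compares the composition of respondents with the full group of school leavers. It is a minimal sketch only; the data, disability categories, and field names are hypothetical and are not drawn from any of the contacted states.

    import pandas as pd

    # Hypothetical leaver records: one row per school leaver in the target group,
    # with a flag indicating whether a follow-up survey response was obtained.
    leavers = pd.DataFrame({
        "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
        "disability": ["LD", "LD", "ED", "ID", "LD", "ED", "ID", "LD"],
        "responded":  [1, 1, 0, 1, 0, 1, 0, 1],
    })

    # Overall response rate for the survey.
    response_rate = leavers["responded"].mean()
    print(f"Response rate: {response_rate:.0%}")

    # Compare the composition of respondents with the full leaver group to see
    # whether some groups are under-represented among those who answered.
    target_share = leavers["disability"].value_counts(normalize=True)
    respondent_share = (
        leavers.loc[leavers["responded"] == 1, "disability"]
        .value_counts(normalize=True)
    )
    comparison = pd.DataFrame({"target group": target_share,
                               "respondents": respondent_share}).fillna(0)
    print(comparison)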

State Survey Administration Method Examples

Positive Feedback on Conducting Surveys

Several states reported receiving positive feedback on post-school outcome surveys from parents and teachers.

  • Families and former students often welcome the follow-up, whether through teachers or call centers, and appreciate the opportunities it gives them to receive advice and referrals to community services (AL, NY, WI).
  • Parents in New York have let school boards know how much they appreciate the follow-up on their children.
  • Many LEAs have welcomed the opportunity to receive this type of information and some LEAs collect these data more often than required (TX, WI).
  • Many of the teachers in Washington State who conducted the survey phone interviews liked finding out what had happened to their students after they left school (WA).

Linking Indicators Through Data Collection

It is possible to link data collection on each of the four transition indicators, making it easier to analyze interrelationships among them. Examples from contacted states are described below, followed by a brief sketch of how records might be linked through a unique student identifier.

Links Between Indicators 1 and 2 and Indicator 14:

  • Include demographic and exit status data on follow-up survey form collected from student or student database prior to survey: NY, WA
  • Match student database information on demographics and exit status before survey analysis through unique student identifier on survey: WI
  • Collect demographic and exit status from student on exit or post-school outcomes survey: AL, TX, WA

Links Between Indicator 13 and Indicator 14:

  • Tie data from Indicator 13 directly to Indicator 14 in an online database, either currently or eventually: AL, IN, WA, WI
  • Coordinate samples for aggregate match between class's IEPs and follow-up survey responses by sampling IEPs in certain districts one year and school leavers (exiters) from those same districts the following year: NY
  • Match student exit and follow-up survey responses through unique student identifier: AL, TX
  • Collect information on IEP goals:
    • During exit conference: IN
    • From final IEP in preparation for follow-up interview: WA
    • During follow-up interview: AL, WI
  • Collect information on plans for living, working, and school after leaving high school on exit survey, which can be linked to follow-up survey: TX
  • Collect information on helpfulness of transition services on follow-up survey used in longitudinal study of two class cohorts: NY
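
The sketch below illustrates, in very simplified form, the kind of record linkage several of these strategies describe: joining exit records, Indicator 13 IEP review results, and Indicator 14 follow-up survey responses on a unique student identifier. The file names and field names are hypothetical placeholders, not the data layouts used by any of the contacted states.

    import pandas as pd

    # Hypothetical exit records (Indicators 1 and 2): demographics and exit status.
    exit_data = pd.read_csv("student_exit_data.csv")      # student_id, gender, race, disability, exit_status

    # Hypothetical Indicator 13 review results: whether each sampled IEP met the requirements.
    iep_reviews = pd.read_csv("indicator13_reviews.csv")  # student_id, met_requirements

    # Hypothetical Indicator 14 follow-up survey responses collected after exit;
    # outcome fields are assumed to be coded 0/1.
    survey = pd.read_csv("indicator14_survey.csv")        # student_id, enrolled_in_school, employed

    # Join survey responses onto exit records by the unique student identifier so
    # outcomes can be broken out by demographics and exit status (Indicators 1/2 and 14).
    linked = exit_data.merge(survey, on="student_id", how="left")

    # Add the IEP review result so post-school outcomes can be compared for youth
    # whose IEPs did and did not meet the requirements (Indicators 13 and 14).
    linked = linked.merge(iep_reviews, on="student_id", how="left")

    # Example: employment rate among survey respondents, by IEP review result.
    respondents = linked.dropna(subset=["employed"])
    print(respondents.groupby("met_requirements")["employed"].mean())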

For information on how other states are carrying out data collection on Indicator 14, the NPSO Center Web site includes profiles of state efforts on this indicator (http://www.psocenter.org/state_profiles.html).

Importance of LEA Training on Data Collection

Regardless of the method used for collecting data for Indicators 13 and 14, states emphasized the need to provide training for LEAs in data collection. The State of Washington has developed online training modules for its survey and data collection system. Wisconsin has carried out webcasts and provided training materials online. Indiana invites all LEAs and other constituents to initial trainings on data collection.

State LEA Training Material Examples

Advice for Other States on Data Collection

When making plans for and implementing data collection, contacted states recommended that other states consider:

  • Getting buy-in from state education leaders (NY)
  • Linking data collection to the school improvement and accountability process in your state (NY)
  • Involving LEAs and other stakeholders as much as possible in data collection (IN, TX, WA)
  • Making data collection meaningful for LEAs (WA, WI)
  • Providing training for LEAs in data collection and in the interpretation and use of data (NY, TX, WA)
  • Identifying a facilitator and establishing strong leadership in all districts to oversee data collection (WA)
  • Collecting contact information on students for follow-up surveys early (in 9th grade, or by 11th grade) to ensure contact information is available for students who later drop out (AL)
  • Updating guidance counselors, principals, and other affected groups at all stages of data collection (IN)
