Making Connections Across Indicators to Improve Post-School Outcomes:


Early State Efforts

II. Making Connections Across Indicators to Understand Trends

A. Making Connections Across Indicators in Analysis and Reporting

Section Contents

The first step in analyzing transition indicator data is to identify trends in graduation, dropout, IEPs and transition services, and post-school outcomes. This section helps states get started on these analyses by providing:

  • Guiding questions for analysis of all indicators
  • Guiding questions specific to Indicator 14
  • Examples of ways that Alabama (AL), Texas (TX), Washington (WA), and Wisconsin (WI) have assisted local schools and districts in data analysis
  • Links to additional data analysis sections
  • Guiding questions on reporting
  • Policy and practice considerations

Guiding Questions for Analysis

To give states general guidance on analyzing these trends in post-school outcomes, guiding questions addressing all four indicators, along with questions directed solely at Indicator 14, are outlined below.

1 Unless otherwise noted, detailed information on what the six contacted states are doing in relation to these four transition indicators was collected during phone conversations and follow-up emails with the following staff: Alabama (AL): Karen Rabren, August 21, 2007, with assistance from George Hall; Indiana (IN): Teresa Grossi, August 27, 2007, with follow-up with Adam Bauserman; New York (NY): Doris Jamison, Joanne LaCrosse, Wendy Latimer, and Cynthia Wilson, August 8, 2007; Texas (TX): Debby Norris and Carla Johnson, August 8, 2007; Washington (WA): Cinda Johnson, August 7, 2007, with assistance from Lisa Scheib; and Wisconsin (WI): Mary Kampa and Linda Maiterjean, August 22, 2007, with assistance from Lynese Gulczynski. Throughout the document, contributions of each of the six states are noted in parentheses after relevant text using the state abbreviations listed above. Graphics included are derived from actual state data or, if state data were not available, prototypes were developed for the purposes of this document.

Guiding Questions for All Indicators
  • Are data sufficient to answer questions? (Bost, Falls, Klare, & Test, 2007) If not, have areas lacking data or the types of data needed been identified and plans made to collect these data?
  • Do these data meet the state’s APR targets? Are there sufficient data to explain progress or slippage compared to the target rate? (Bost et al., 2007) If not, are there plans in place to collect the necessary data?
  • What are the outcomes for students with disabilities?
  • How do the outcomes differ for youth with varying demographic characteristics?
  • What are the trends in these indicators over time?
  • How do rates vary across LEAs?
  • How do the rates of special education students compare to those of general education students? (Bost et al., 2007)
  • Have you gotten input from LEAs/stakeholder groups on the review and summary of data trends? (IN, TX, WA)
  • Based on these data, which strategies appear to be working and which appear to need improvement? (Reder, 2006)
  • Do data indicate what types of activities may be needed to address areas needing improvement? (Bost et al., 2007)
Guiding Questions for Indicator 14
  • How similar in characteristics are the school leavers (exiters) who completed surveys to those in the sample who did not?
  • Are the school leavers (exiters) completing surveys representative of the total population of students in the class under study in terms of disability categories, race/ethnicity, and gender?
  • Is the response rate sufficient to make all desired subgroup comparisons?
  • What are students with IEPs doing after leaving school:
    • What percentage is competitively employed?
    • What percentage is enrolled in postsecondary school?
    • What percentage is doing both?
    • What percentage is doing neither?
  • What are the trends in these indicators over time?
  • How do rates vary across LEAs?
  • How do the rates of special education students compare to those of general education students? (Bost et al., 2007)
  • Have you gotten input from LEAs/stakeholder groups on the review and summary of data trends? (IN, TX, WA)
  • Based on these data, which strategies appear to be working and which appear to need improvement? (Reder, 2006)
  • Do data indicate what types of activities may be needed to address areas needing improvement? (Bost et al., 2007)
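To make the Indicator 14 questions above concrete, the sketch below shows one way to tabulate response rates by subgroup (to check representativeness) and the four post-school outcome percentages among respondents. The record layout, category labels, and counts are hypothetical, invented purely for illustration; actual state data systems will differ.

```python
from collections import Counter

# Hypothetical exiter records: (disability_category, responded, employed, enrolled)
exiters = [
    ("LD", True,  True,  False),
    ("LD", True,  False, True),
    ("LD", False, None,  None),
    ("ED", True,  True,  True),
    ("ED", False, None,  None),
    ("ID", True,  False, False),
]

# Response rate by disability category: are respondents representative
# of the total exiter population?
total = Counter(cat for cat, *_ in exiters)
responded = Counter(cat for cat, r, *_ in exiters if r)
for cat in total:
    rate = responded.get(cat, 0) / total[cat]
    print(f"{cat}: {responded.get(cat, 0)}/{total[cat]} responded ({rate:.0%})")

# Outcome percentages among respondents (the four Indicator 14 categories)
resp = [(emp, enr) for _, r, emp, enr in exiters if r]
n = len(resp)
pct = lambda cond: sum(1 for x in resp if cond(x)) / n
print(f"Employed only: {pct(lambda x: x[0] and not x[1]):.0%}")
print(f"Enrolled only: {pct(lambda x: x[1] and not x[0]):.0%}")
print(f"Both:          {pct(lambda x: x[0] and x[1]):.0%}")
print(f"Neither:       {pct(lambda x: not x[0] and not x[1]):.0%}")
```

A large gap between a subgroup's share of responses and its share of all exiters would flag the representativeness concern raised in the first two questions before any outcome percentages are reported.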

Important Reminder About Data Quality

Definitive answers to any of the guiding questions included in this or other sections of the guide require representative samples, rigorous research designs, and the use of appropriate statistics. In the absence of these, only tentative conclusions can be drawn and areas requiring further research identified. These explorations, however, can serve as a starting point and provide useful information that can be combined with data from other sources to help guide improvement activities.

Sample Data Summary Tables and Retreats to Aid Local Analysis

Even with guiding questions, some LEAs and schools may find it difficult to know where to begin analyzing data collected for the four transition indicators. Contacted states have developed various ways to assist school and district staff with analysis of post-school outcome data, such as online automated data summary tables and data retreats. Several examples are described below.

  • The Center for Change in Transition Services at Seattle University has developed a protocol for LEAs in the state of Washington to assist with post-school outcome data analysis (http://www.seattleu.edu/ccts/docs/Examiningthedata.pdf). The document includes an overview of the Washington survey process and strategies for examining these data. It was designed to be used in conjunction with a three-page post-school data report compiled for each LEA that summarizes survey findings from students leaving their schools and compares these results to those of school leavers (exiters) statewide.
  • Texas has developed an online survey reporting system that produces individual LEA reports. These reports are posted online for LEA staff to retrieve (http://www.texaseffectiveness.net). Reports are available for responses to the Grade-12 Exit Survey and the Post-School Survey and include results for both special education and general education students, with breakdowns of responses by demographic characteristics.
  • The Alabama online Post-School Outcomes Data System produces reports for LEAs that include results from their Post-School Transition Survey (https://fp.auburn.edu/institute/PODS/PODS_demo). These reports include summaries of survey results in the following sections (AL):
    • Demographics of survey respondents from LEA and statewide
    • Quality of high school preparation for post-school experiences for LEA and statewide
    • Level of independence of school leavers (exiters) for LEA and statewide
    • Employment and education outcomes for LEA and statewide
  • The state of Wisconsin has developed a statewide data retreat process to help districts with improvement planning. Special education retreats on transition outcomes are based on a standardized data review process and provide LEAs with automated survey response reports that summarize results for their district by gender, ethnicity, disability, exit status, school within the district, and survey year. Educational consultants work with LEA teams over a 2-day period to analyze their data, outline observations, make connections, and develop district improvement plans based on their review and discussion. LEAs are encouraged to use the survey Web site (http://www.posthighsurvey.org) to track improvement efforts and resources.

State Analyses of Post-School Outcome Data

The key to understanding trends in the four transition indicators is to analyze the interrelationships among them. All six contacted states have been co-analyzing data from these indicators, and their efforts generally fall into three main categories:

  1. Comparisons in outcomes across demographic groups,
  2. Comparisons in outcomes between graduates and nongraduates, and
  3. Comparisons of outcomes by quality of IEP and transition services.

Brief discussions of each of these types of analysis efforts, along with examples of how states are using results from these analyses, appear in separate sections. Click on the links below to go to the desired section:

  • Making Connections Between Demographic Characteristics and Post-School Outcomes
  • Making Connections Between Graduation, Dropping Out, and Post-School Outcomes
  • Making Connections Between Transition Planning and Post-School Outcomes

Guiding Questions on Reporting
After analysis is completed on data from the four transition indicators, decisions need to be made on public reporting of findings. Factors to consider while developing reporting plans are outlined in the following guiding questions.
WHY? Identify and resolve the purposes and goals for reporting. (Learning Points Associates (LPA), 1995, as cited in Bost, 2006)
  • What data do you have that you need to report? (Bost, 2006)
  • Is reporting for accountability, program development, and/or program evaluation purposes? (Bost, 2006)
  • What do LEAs/stakeholder groups see as the goals and purposes for reporting? (TX, WA, IN)
WHO? Decide the audience(s) to be addressed. (LPA, 1995, as cited in Bost, 2006)
  • Who should you share these data with? (WA)
  • Why do you need to share these data with them? (WA)
WHAT? Determine the key messages you want/need to share. (LPA, 1995, as cited in Bost, 2006)
  • How will you tailor the message to each particular audience so that they care about the findings and use the data to help students? (NY, TX, WA, WI)
  • What "story" do you want to tell with the data? (Reder, 2006, p. 7)
  • What are the six or seven factors that were found to make the most difference? (NY)
  • What is the good and bad news that you should include in the report? (TX, WA, NY) What are your strengths as well as your weaknesses? (TX)
  • What interpretation of data trends do you want to include in your reports and summaries? (NY)
  • How can you keep a focus on student outcomes, not just compliance issues, in your data reports? (TX)
HOW? Determine procedures for reporting findings you are sharing. (LPA, 1995, as cited in Bost, 2006)
  • What are the best formats/media for conveying the information to your various audiences? (Bost, 2006) Should it be in print? Online on your state's Web site? Public presentations? LEA training?
  • To ensure that data are presented in the best way, ask whether data are presented:*
    • In a visually appealing fashion?
    • In an unambiguous and clear manner (i.e., using keys/legends and clear labels)?
    • Using clear citations of data sources?
    • Efficiently by conveying the ideas in tables, charts, or graphs?
    • In user-friendly, alternative formats?
  • "What is our state's rule regarding the masking/suppressing of data based on small cell sizes for purposes of confidentiality and what are the implications of this rule for reporting our post-school outcomes data . . . .?" (Brauen, 2006, p. 5).
  • How can you involve LEAs/stakeholder groups in decisions about data reporting? (TX, WA, IN)
WHEN? Resolve/Decide the schedule for reporting (LPA, 1995, as cited in Bost, 2006).
  • When is the most beneficial time during the school year to report these data to selected audiences? What time of day is best for each audience?
  • What setting is best for reporting to each audience? A retreat? Teacher in-service days? School board meetings?
  • How much time will reporting take for each audience, and how often will you need to do it for each group?

*Adapted from Office of Special Education, State Performance Plan Summer Institute, 2005, as cited in "Sharing the Findings," by L. Bost, PowerPoint presentation by the National Dropout Prevention Center for Students with Disabilities at the National Secondary Transition Technical Assistance Center's "Making the Connection" Forum, Denver, CO, September 2006.
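The masking/suppression question quoted above (Brauen, 2006) can be operationalized with a simple rule applied before any public report is generated. The sketch below assumes a hypothetical minimum cell size of 10; actual thresholds vary by state, and the group labels and counts are invented for illustration.

```python
# Hypothetical suppression rule: mask any cell with fewer than MIN_N
# respondents before public reporting. Thresholds vary by state rule.
MIN_N = 10

def mask_small_cells(counts):
    """Replace counts below MIN_N with '*' to protect confidentiality."""
    return {group: (n if n >= MIN_N else "*") for group, n in counts.items()}

# Example: respondent counts by disability category for one LEA
by_disability = {"LD": 42, "ED": 7, "ID": 15, "OHI": 3}
print(mask_small_cells(by_disability))
# LD and ID are reported; ED and OHI fall below the threshold and are masked
```

One implication for reporting: when a masked cell can be back-calculated from a published total (e.g., one suppressed subgroup plus an overall count), complementary suppression of an additional cell may also be required, which is part of what the state's rule needs to spell out.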

The Impact of Findings on Policy and Practice

Once reasons for outcome differences are understood, states and LEAs can examine what they can do to improve post-school outcomes. States or LEAs could decide to:

  • Provide professional development and TA for LEAs
  • Change or increase monitoring
  • Improve data collection
  • Change state and/or local policy
  • Change state and/or local practices
  • Realign or reallocate resources
  • Seek further data, information, or technical assistance

III. Additional Resources

IV. References