Letter From the United States Department of Education, Office of Special Education and Rehabilitative Services
Date stamped October 29, 2003
Dr. David P. Driscoll
Commissioner of Education
Massachusetts Department of Elementary and Secondary Education
75 Pleasant Street
Malden, Massachusetts 02148-4906
Dear Commissioner Driscoll:
The purpose of this letter is to inform you of the results of the Office of Special Education Programs' (OSEP's) recent verification visit to Massachusetts. As indicated in my letter to you of June 18, 2003, OSEP is conducting verification visits to a number of States as part of our Continuous Improvement and Focused Monitoring System (CIFMS) for ensuring compliance and improving performance with Parts B and C of the Individuals with Disabilities Education Act. We conducted our visit to Massachusetts during the week of July 14, 2003.
The purpose of our verification reviews of States is to determine how they use their general supervision, State-reported data collection, and State-wide assessment systems to assess and improve State performance; and to protect child and family rights. The purposes of the verification visits are to: (1) understand how the systems work at the State level; (2) determine how the State collects and uses data to make monitoring decisions; and (3) determine the extent to which the State's systems are designed to identify and correct noncompliance.
As part of the verification visit to the Massachusetts Department of Elementary and Secondary Education (ESE), OSEP staff met with Ms. Marcia Mittnacht, the State's Director of Special Education, and members of ESE's staff who are responsible for the State's general supervision activities (including monitoring, mediation, complaint resolution, and impartial due process hearings), the collection and analysis of State-reported data, and State-wide assessment. Prior to the visit, OSEP staff reviewed a number of documents, including the following: (1) Massachusetts' Part B State Improvement Plan and progress reports; (2) ESE's Coordinated Program Review Procedures; (3) OSEP's 2000 Monitoring Report of ESE; (4) the Massachusetts Comprehensive Assessment System Manual; (5) 2003 Eligibility Documents; and (6) other information from the State's website. In addition, we conducted a conference call on June 24, 2003, with a member of Massachusetts' State Advisory Panel on Special Education, to hear that member's perspective on the strengths and weaknesses of the State's systems for general supervision, data collection, and, for Part B, State-wide assessment. Ms. Mittnacht also participated in the call.
During the visit, OSEP staff also reviewed a number of State documents, including: (1) selected ESE monitoring files for school districts, including monitoring reports and corrective action documents; (2) a tracking log of special education complaints sent to ESE for resolution; and (3) a log and selected files regarding requests for due process hearings.
The information that Ms. Mittnacht and her staff provided during the OSEP visit, together with all of the information that OSEP staff reviewed in preparation for the visit, greatly enhanced our understanding of ESE's systems for general supervision, data collection and reporting, and State-wide assessment.
General Supervision - Monitoring
In reviewing the State's general supervision system, OSEP collected information regarding a number of elements, including whether the State: (1) has identified any barriers, (e.g., limitations on authority, insufficient staff or other resources, etc.) that impede the State's ability to identify and correct noncompliance; (2) has systemic, data-based, and reasonable approaches to identifying and correcting noncompliance; (3) utilizes guidance, technical assistance, follow-up, and-if necessary-sanctions, to ensure timely correction of noncompliance; (4) has dispute resolution systems that ensure the timely resolution of complaints and due process hearings; and (5) has mechanisms in place to compile and integrate data across systems (e.g., 618 State-reported data, due process hearings, complaints, mediation, large-scale assessments, previous monitoring results, etc.) to identify systemic issues and problems.
OSEP believes that ESE's systems for general supervision, with the exception noted below, constitute a reasonable approach to the identification and correction of noncompliance; however, OSEP cannot, without also collecting data at the local level, determine whether they are fully effective in identifying and correcting noncompliance.
As documented in ESE's monitoring procedures, OSEP found that local-level compliance is determined through ESE's Coordinated Program Review (CPR) system, administered by the Program Quality Assurance division (PQA). A comprehensive review of each public agency occurs every six years. First, ESE reviews the local special education plan and policies and procedures. At a pre-site conference, the ESE Chairperson and a local education agency representative select a sample of student records for review by the ESE monitoring team. The record review helps drive interviews with teachers and support personnel. Interviews are also conducted with administrators and the Parent Advisory Council for each local educational agency, and with members of the general public upon request. Telephone interviews, public hearings, and surveys are used to gather public feedback. ESE also observes classrooms and reviews facilities, and issues a formal written monitoring report within 60 business days.
ESE staff informed OSEP that a written response to any required corrective actions is due after the written report is issued. When criteria are not "fully implemented," the district or school is required to propose corrective actions that must be approved by ESE. Corrective actions and associated documentation are reviewed by the assigned liaison in PQA until the plan is completed. The process includes the following steps to ensure that noncompliance is corrected: 1) ESE works with local educational agencies to approve (or impose) an effective corrective action plan; 2) the local education agency demonstrates in writing that it has implemented the corrective action plan; and 3) ESE follows up to ensure the effectiveness of the corrective action plan, either at a 3-year follow-up review or earlier, if necessary. Because ESE has included a mid-cycle review in its process, it is now able to validate the effectiveness of each local educational agency's corrective action plan at least every 3 years.
If ESE is unable to obtain compliance, ESE administrators indicated that it has a number of enforcement options that it can use and has used. These include: 1) threatening to withhold or delay Federal funds; 2) calling a meeting to explain ESE's concerns to the appropriate parties; 3) escalating the issues to the level of the superintendent or school committee; 4) appointing an outside "consultant"/special master with authority to make needed changes; or 5) withholding funds. ESE also has available a fund of $410,000 that it may use, if needed, to assist local education agencies in addressing noncompliance.
As evidenced through interviews and ESE's monitoring schedule, the biggest change to the system has been in the number of local education agencies that ESE is monitoring and the time between its visits. ESE is now conducting approximately 100 visits per year to monitor approximately 370 districts and charter schools. To accomplish this task, ESE has significantly increased its monitoring staff, from the 12 to 13 people it employed at the time of OSEP's last visit to 32 positions with 6 supervisors. ESE informed OSEP that it has also started including local directors on monitoring teams, prior to their own districts' visits, and has supplemented teams with retired special education directors and teachers. All districts have now had a comprehensive review, which was not the case when OSEP visited in 1999. This is a positive change in that ESE is now conducting regular visits to local programs at least every 3 years, through either a comprehensive review or a 3-year follow-up visit.
ESE is increasingly using the data it collects to leverage change. OSEP was informed that ESE is now able to begin evaluating trends. For example, ESE was able to provide OSEP with data on the areas demonstrating the highest levels of noncompliance over the past 2 monitoring cycles. These data have guided the training that ESE has provided to the field, shaped the goals set forth in ESE's State Improvement Plan, and been used to report on special education to the State legislature. In addition, ESE is examining the performance of children by disability area to determine whether performance is better in one educational placement than in others, and is comparing the results to the performance of nondisabled children. OSEP believes that ESE's use of data in such a manner is an exemplary practice.
From examining the monitoring process, reviewing samples of ESE's monitoring files for local educational agencies, and interviewing ESE staff regarding its monitoring system, it was apparent that: (1) ESE has designed a monitoring system with the clear goal of ensuring both the identification and the correction of noncompliance; (2) ESE has initiated changes in its monitoring system that appear to ensure that local agencies are reviewed frequently, including follow-up visits every 3 years, and that correction of noncompliance is enforced; and (3) ESE is collecting and using data from several different sources, including monitoring trend data, to identify issues and determine the effectiveness of its special education programs.
General Supervision - Complaint Management
As set forth at 34 CFR §300.661, each State Education Agency shall include in its complaint procedures a time limit of 60 days after a complaint is filed to issue a written decision to the complainant, unless an extension of time is permitted because exceptional circumstances exist with respect to a particular complaint. OSEP examined ESE's log of special education complaints for FY 2003. Of the 242 special education complaints that ESE logged, 82% received a written decision or letter of closure within 60 days of ESE's receipt of the complaint or by the specified extension date. Fourteen complaints (6%) were not answered within the timeline and had no extensions noted; these were a week or less overdue. OSEP asks that ESE keep OSEP informed concerning its progress in ensuring compliance with the 60-day timeline.
General Supervision - Due Process Hearings
As set forth at 34 CFR §300.511, the public agency shall ensure that, not later than 45 days after the receipt of a request for a hearing, a final decision is reached in the hearing and a copy is mailed to each of the parties, unless the hearing officer grants a specific extension of time at the request of either party. OSEP reviewed ESE's log of due process hearing requests for Fiscal Year 2003. Although ESE maintains a log to track timelines, OSEP was not always able to determine when hearing decisions were reached and mailed to the parties because of how extensions were calculated. OSEP requested to review the complete files for 4 hearings that produced a written decision. In reviewing the records, OSEP found that all 4 hearing decisions were mailed to the parties beyond the 45-day timeline (47 days, 57 days, 70 days, and 125 days beyond the 45-day or extended timelines), even when specific extensions of time were taken into account. As a result, OSEP found that ESE was not in compliance with 34 CFR §300.511 and requests that ESE submit a plan to correct this noncompliance within 60 days from the date of this letter.
State-wide Assessment
In looking at the State's system for State-wide assessment, OSEP collected information regarding a number of elements, including whether the State: (1) establishes procedures for State-wide assessment that meet the participation, alternate assessment, and reporting requirements of Part B, including ensuring the participation of all students, including students with disabilities, and the provision of appropriate accommodations; (2) provides clear guidance and training to public agencies regarding those procedures and requirements; (3) monitors local implementation of those procedures and requirements; and (4) reports on the performance of children with disabilities on those assessments, in a manner consistent with requirements.
OSEP has determined, through its review of the State's written procedures for State-wide assessments and the State's reports to the public and the Secretary on the participation and performance of children with disabilities on such assessments, that those procedures, as written, and those reports are consistent with Part B requirements. OSEP cannot, however, without also collecting data at the local level, determine whether all public agencies in the State implement the State's procedures in a manner that is consistent with Part B.
The Massachusetts Comprehensive Assessment System (MCAS) was implemented in response to the Education Reform Law of 1993. OSEP was informed that all students are required to participate in the MCAS. If a student does not participate in the regular MCAS, then the student must take the MCAS Alternate Assessment. ESE is able to track participation through its centralized system: every student is assigned a unique number and must be accounted for through testing. Participation rates are monitored when local education agencies report student performance, and IEP decisions are monitored through the Coordinated Program Review system.
Both the general and the alternate assessments are aligned with, and measure proficiency according to, Massachusetts' curriculum standards. The alternate assessment is based on a portfolio developed to demonstrate student proficiency. Approximately 1% (or 5,000 students) of all students with disabilities take the alternate assessment.
As evidenced through the MCAS written procedures, high school students must pass the 10th grade test in order to graduate. A diploma may be earned through successful performance on either the general or the alternate assessment. Students have at least 3 opportunities a year to pass the test. For the general assessment, the scoring levels are "warning/failing," "needs improvement," "proficient," and "advanced." For the alternate assessment, the scoring levels are "awareness," "progressing," "emerging," "needs improvement," "proficient," and "advanced."
Participation rates are high (94-96%) for all students, including special education students. No student is exempt from taking the MCAS; only students who are absent do not take the test. Schools and districts earn a "0" for students who do not participate. Schools are given credit for students who demonstrate progress on the alternate assessment, so there is less incentive for principals to move those students out of their schools in order to improve school performance.
When the MCAS was first initiated and reviewed by OSEP during its 1999 monitoring visit, there was a great deal of public concern over the failure rate of students taking the test. At that time, 93% to 96% of students with disabilities in 10th grade failed to meet the "proficient" level of performance on at least one section of the test, and 56% to 74% of regular education students in 10th grade failed to meet the "proficient" level of performance on at least one section. Many parents were concerned about whether their children would graduate once the "high stakes" requirement for the test was phased in. Since then, the trend in scores has been upward, and a significant majority of both special and regular education students pass the MCAS to demonstrate learning competence.
In conclusion, OSEP found that ESE has a State-wide assessment system that ensures the participation of students with disabilities and has developed an exemplary system of accountability for improving the performance of children with disabilities. ESE has developed an alternate assessment that permits students with disabilities who take it to earn a general education diploma.
Collection of Data Under Section 618 of the IDEA
In looking at the State's system for data collection and reporting, OSEP collected information regarding a number of elements, including whether the State: (1) provides clear guidance and ongoing training to local programs/public agencies regarding requirements and procedures for reporting data under section 618 of the IDEA; (2) implements procedures to determine whether the individuals who enter and report data at the local and/or regional level do so accurately and in a manner that is consistent with the State's procedures, OSEP guidance, and section 618; and (3) implements procedures for identifying anomalies in data that are reported, and correcting any inaccuracies.
Local educational agencies input the data, maintain its security, and are expected to validate their data before submission. After the data are submitted, ESE conducts 150 different checks to ensure accuracy and correct any errors. The data are then aggregated, and further checks are done to ensure internal consistency. Trend data are also examined to identify any inconsistencies. These procedures are designed to ensure that data are accurately reported and to correct inaccuracies when they are discovered.
OSEP was informed that six State-wide data trainings are provided to data personnel, using data primarily from the December 1 child count and the Student Information Management System (SIMS), in order to ensure that individuals who report data do so accurately and in a manner consistent with ESE procedures. The training is not required but is strongly encouraged. For local educational agencies that do not attend, field specialists provide follow-up assistance, with one-on-one training if needed. Massachusetts also provides local educational agencies with data assistance through its data hotline.
As noted above, ESE is moving toward collecting most of the needed data through the SIMS system. Currently, personnel and discipline data are collected in paper form, and ESE staff acknowledge that there has been much more room for error with these data. For example, the personnel data are collected by the SEA and keypunched into the system. Discipline data come from a paper report that is a special-education-only data collection. ESE stated that there have been problems with local educational agencies understanding the information that they are being asked to report.
ESE is comfortable with the accuracy of the child count data. The SEA Director indicated, however, that the educational environments data, one of the factors that identified Massachusetts for the OSEP verification visit, have not been accurate. Massachusetts reported its 2000 State data according to State prototypes, and there are inconsistencies between the State's prototypes and OSEP's definitions of educational environments. For example, the OSEP category of "less than 21% outside the regular class" actually corresponded to an ESE prototype that required the student to spend 100% of the day in the regular classroom. This caused the State's data to show fewer students with disabilities in regular classrooms than were really served there. ESE changed its forms last year to align with the Federal setting categories, so the data should be accurate for the December 1, 2003 child count.
The other area that resulted in Massachusetts being selected for a verification visit was the higher percentage (7.2%) of children with disabilities served in private/public separate school placements, as compared to the national average (3%). ESE officials explained that the data might accurately reflect a higher number of children served in these environments, acknowledging that there are numerous private schools in Massachusetts with high enrollments of students receiving special education. Alternatively, there might have been some double counting in this area before the SIMS system, with its unique student numbers, was implemented, because of confusion among local educational agencies about which agency should include the students on its child count.
In conclusion, OSEP found that at least one of the section 618 data indicators that OSEP used for ranking Massachusetts' performance, the least restrictive environment indicator, is not accurate because of the use of differing definitions. ESE has made changes to its section 618 data collection that will result in more reliable and accurate data. Because of the acknowledged problems in collecting personnel and discipline data, and the issues regarding the data on private/public separate school special education placements, OSEP requests that ESE provide an analysis of these data in its Annual Performance Report.
We appreciate the cooperation and assistance provided by your staff during our visit. We look forward to receiving, within 60 days of your receipt of this letter, your plan for addressing noncompliance with the due process hearing timeline requirements under 34 CFR §300.511. In addition, we will look for updates and an analysis of timelines for your complaint system, as well as the section 618 data reporting requirements for discipline, personnel, and the placement of children with disabilities in private/public separate school placements, in your Annual Performance Report. We look forward to our continued collaboration with Massachusetts to support your work to improve results for children with disabilities and their families.
Stephanie Smith Lee
Office of Special Education Programs
cc: Ms. Marcia Mittnacht