Annual Administrative Report on edTPA Data Shows Continued Growth and Support for the First Nationally Available Assessment of Teacher Candidates
WASHINGTON – Oct. 24, 2016 – A newly released public report on the second full year of edTPA implementation provides the most detailed picture to date of edTPA’s continued expansion and support as the first nationally available performance-based assessment and support system for teacher licensure, program completion and accreditation. More than 27,000 candidate portfolios are included in the findings, and the report’s analyses reaffirm the reliability and consistency of scoring, examine evidence of validity and document trends in candidate performance.
edTPA, exclusively owned by Stanford University and developed by educators for educators, has been used operationally to assess teacher candidates since fall 2013; it is now used by educator preparation programs in 38 states. Membership in the edTPA online community, first launched in 2011, has grown to about 9,100 faculty from more than 700 educator preparation programs, the report notes. The extensive support infrastructure available to edTPA members includes more than 165 resources that have been downloaded more than 670,000 times.
Educative Assessment and Meaningful Support: 2015 edTPA Administrative Report presents analyses of the 27,172 edTPA portfolios from 27 states that were scored in 2015. Of those, 21,452 came from candidates in states where edTPA is required for licensure, certification or program completion/program approval, including California, Georgia, Iowa, Illinois, Minnesota, New York, Tennessee, Washington and Wisconsin.
The annual report reviews scoring patterns to assess edTPA’s consistency and reliability. Multiple analyses show that edTPA meets professional standards for validity and reliability (AERA, APA & NCME, 2014). In other words, edTPA effectively assesses the three job-related tasks for which it is designed – planning, instruction and assessment of student learning.
“We are encouraged by the positive validity evidence supporting edTPA and remain committed to supporting IHE and state research initiatives to examine the impact of edTPA on program renewal and student learning,” said Raymond L. Pecheone, Executive Director of the Stanford Center for Assessment, Learning and Equity (SCALE), which led the development of edTPA with educators nationwide beginning in 2009. Pecheone notes that an additional 40,000 portfolios are expected to be scored in 2016.
More than 2,500 teachers and teacher educators have been certified as official edTPA trainers, scoring supervisors or scorers. In 2015, about half of scorers were teacher educators and half were classroom teachers; 32% of the practicing classroom teachers and 20% of the qualified scoring pool were National Board Certified Teachers.
“edTPA enables us to act on the principle of assessment in support of learning,” said Sharon P. Robinson, President and CEO, American Association of Colleges for Teacher Education. “Because its growth is rooted in the unified professional community, edTPA represents a learning tool for educators at all levels.”
There are five rubrics for each of edTPA’s three core areas – planning, instruction and assessment – with each rubric scored on a five-point scale. Total scores can range from 15 to 75 points. The edTPA national recommended professional performance standard is 42, although states are free to set their own cut scores; today, state-set cut scores range from 35 to 41. The report finds that in 2015 the average candidate score was 44.2.
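The scoring arithmetic described above can be sketched in a few lines of code. This is a minimal illustration only: the rubric values in the hypothetical portfolio below are invented for the example and do not come from the report.

```python
# Illustrative sketch of the edTPA scoring structure:
# 3 tasks x 5 rubrics each, every rubric scored 1-5, so totals span 15-75.

TASKS = ("planning", "instruction", "assessment")
RUBRICS_PER_TASK = 5
MIN_RUBRIC, MAX_RUBRIC = 1, 5
NATIONAL_RECOMMENDED_STANDARD = 42

min_total = len(TASKS) * RUBRICS_PER_TASK * MIN_RUBRIC  # 15
max_total = len(TASKS) * RUBRICS_PER_TASK * MAX_RUBRIC  # 75

# A hypothetical candidate portfolio: five rubric scores per task
# (values are made up for illustration).
portfolio = {
    "planning":    [3, 3, 3, 4, 3],
    "instruction": [3, 3, 3, 3, 3],
    "assessment":  [3, 2, 3, 2, 3],
}
total = sum(sum(scores) for scores in portfolio.values())

print(min_total, max_total)                    # 15 75
print(total)                                   # 44
print(total >= NATIONAL_RECOMMENDED_STANDARD)  # True
```

Under this structure, a total of 44 sits within the 15–75 range and above the recommended standard of 42, consistent with the 2015 average of 44.2 reported above.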
On average, candidates did best in lesson planning and instruction, with slightly lower average scores on how well they assess and give feedback to their students. Candidates scored higher in states where edTPA is required for licensure, certification or program completion: the average score in these states was 44.5, as compared to 43.1 in states without consequential policy.
Differences by demographic group were small; women generally scored higher than men, and candidates in suburban placements on average scored higher than those in other teaching contexts. Performance differences were found between African American and White candidates, with a gap in mean performance of about half a standard deviation. White and Hispanic candidates performed comparably, as did those indicating Other for ethnicity and those who declined to answer. Taken together, demographic variables such as gender, ethnicity, teaching placement context, education level and primary language explained approximately 4% of the total variance in edTPA scores. In other words, a candidate’s demographic characteristics alone are a poor predictor of edTPA performance or readiness to teach. While some statistically significant differences exist between subgroups, roughly 96% of the variance in scores is attributable to non-demographic factors.
“As programs strive to increase the diversity of their candidate pools, it is gratifying to note that candidates’ racial and cultural differences are not a significant factor in edTPA performance,” said Robinson. “We will remain vigilant in monitoring the assessment’s impact across various groups and remain optimistic that this trend will persist.”
As is the case with National Board assessments, educative use of a performance-based assessment is more than a testing exercise completed by a candidate. edTPA’s emphasis on support for implementation mirrors the National Board for Professional Teaching Standards’ use of professional networks of experienced educators for professional learning. The educative use of edTPA provides educator preparation program faculty and their P–12 partners opportunities to engage in professional learning to improve student learning outcomes.