Every two years, the North Carolina Department of Public Instruction and the North Carolina State Board of Education conduct the Teacher Working Conditions Survey, collecting teacher insights that inform how districts improve their schools.

Ahead of the 2024 survey period, Jeni Corn (’97 B.A., ’02 M.Ed.), then-DPI Director of Research and Evaluation, was presented with a task — overhaul it.
The survey is “as high stakes as it gets for teacher working conditions,” said Corn, who now serves as Social Sciences Research Director for the North Carolina Collaboratory, and it had some pressing problems.
The Department and Board ask every teacher and school staff member in the state to take the survey, but it had ballooned to 200 questions over the years, and the time commitment was a burden. The question format was inconsistent in places, and some questions were redundant, outdated, or simply confusing.
It was time for a strategic refresh, and Corn knew she needed help, so she made some calls. She wasn’t looking for a content expert on teaching; she needed a rigorous survey methodologist, a psychometrician. That’s how she found out about Peter Halpin, Ph.D., an associate professor at the UNC School of Education who specializes in psychometrics and educational measurement.
“I cold-emailed Peter and said, ‘I need to overhaul this statewide survey that 100,000 teachers respond to,’” Corn recalled.
Saying yes was “a no-brainer,” Halpin said, a chance to use his expertise to help improve a significant measurement tool and to support North Carolina public schools.
Much of Halpin’s academic work involves developing novel ways to use psychometrics to solve new or emerging problems. This was an existing problem where he could truly make a difference.
“After I talked to Jeni, I talked to some of my teacher colleagues, and they confirmed that this survey was, in fact, a huge burden,” he said. “This is exactly the kind of thing that I can help with. There were things we could do to make the survey more straightforward and clearer, to improve the experience for teachers, and to ensure that the survey reflects what’s actually going on at schools, so that school leaders can have reliable information to make decisions.”
The main goals were shortening the survey to reduce the response burden and improving the quality of information received from teachers. Previous surveys had collected a massive amount of qualitative data. Were those kinds of responses still needed? Corn and Halpin put together a working group to go through each item and gather input from stakeholders about which kinds of measurements mattered most.
Corn saw the survey overhaul as one of the most important projects of her career to date, and big questions loomed as she tried to make meaningful changes that could move the needle for teachers and schools. As a result of their work, the survey was cut in half and streamlined, with 99 questions retooled to enhance the quality of data collected, making it easier for leaders to make decisions about how to improve schools across 12 measured domains: areas like school leadership, student well-being, facilities and resources, equity, retention, and more.
Among the changes to those domains, the team:

- Reordered the domains so Retention and School Leadership appear first
- Renamed School Safety as Safety & Wellbeing to encompass today’s most relevant needs
- Refocused Facilities & Resources more clearly on whether a school’s physical environment is sufficient for quality instruction
- Revised Professional Learning to ask about teachers’ top three areas of need for professional development
- Revised Instructional Supports & Practices to ask about teachers’ top three needs for support and the NCEES (NC Educator Evaluation System) process
- Aligned Teacher Leadership domain questions to NC Professional Teaching Standard 1 (Teachers Leading in the Classroom and School)

One additional significant update to the survey: Because responses are anonymous to encourage honest answers, the survey had never collected demographic data such as race and ethnicity. But teacher responses could certainly vary by race and ethnicity, or by years of experience, Corn said.
At these kinds of crossroads, Corn turned to Halpin, who she said “had the background and chops” she could trust for expert feedback and input. If they added the demographics question, but kept it optional, would that impact the reliability and validity of the survey? Halpin told her that they’d need a minimum of 20 percent of the 100,000 respondents answering the question to consider it valid from a methodological standpoint.
They put those demographic questions in, and of the 102,082 people who responded, 95 percent answered them.
“We were able to give that data to DPI and the Board for the first time: responses by race, content area, and years of experience, which was new data for this survey,” Corn said. “They had given me this task to better benefit teachers, and to have a psychometrician like Peter by my side gave me a lot of confidence that we could make changes without sacrificing reliability and survey validity.”
During the State Board of Education’s Bi-Annual Planning and Work Session in May 2024, Corn facilitated a data session using the results from the revised survey. Board members worked in small groups to examine survey results by race/ethnicity and years of experience to identify areas of success and growth based on feedback from the teachers in schools in their region. At this meeting, they were also able to discuss priorities for serving these teachers better, Corn said.

The team also leveraged the expertise of Cole Smith, a UNC-Chapel Hill doctoral student in education policy and graduate research assistant, who created an interactive dashboard to aggregate the data collected from the survey. By merging the survey data with other school characteristics, the dashboard helped show relationships between survey items and school-level growth and displayed results in ways schools and districts can more readily use to make improvements.
“This was the first time that we released data back to districts in a more disaggregated way,” said Smith in an interview earlier this year. “(Before) what they were not able to do was really dissect across the district or within a school.”
Every school in North Carolina is required to assemble a school improvement team — that includes leadership, staff members, and parents — charged with using the survey results to create improvement plans.
“School improvement teams are mandated to use this survey,” said Halpin. “Cole was able to create some new tools for them to use that had never been available before.”
Halpin found another way to realize the impact of the work when the elementary school his children attend was looking for parent representatives to join the school improvement team — the one required to use the survey to develop a school improvement plan. He decided to join.
“It was actually a really great way to see this information used in practice,” he said. “It was an interesting little circle of experience to work on redesigning the survey and then be in the room with the principal, the teachers, and other parents, reading the results of the survey and trying to make decisions about how to improve outcomes for the kids at our local school.”
It’s part of the reason he’d said yes to the partnership to begin with.
“Psychometrics may seem like an esoteric skill set,” he said. “But I was excited to use it to do something useful for North Carolina schools. It was a rewarding experience to contribute to something that’s going to impact real people’s lives.”