Improving the Passing Score on the National Physical Therapy Examination
Walter Erikson, moderator
Physical Therapy Department, Eastern Washington University
Forum, Volume 17, Number 1
Passing scores rise and fall, and the reasons for sometimes startling year-to-year differences vary. Here are the methods five schools used to reverse declining scores.
School One Emphasized faculty item-writing skills
We were ahead of the curve in an unfortunate way when our scores dropped in 2001 while the national average was going up. Initially, there were comments that the student pool had changed; in fact, the maximum variation was only 3%. Our faculty then developed a strategy to make sure scores came back up, and in 2003 they rebounded to the mid- to upper 90s.
The faculty looked at the curriculum relative to the NPTE blueprint; we thought one problem area might be content. It wasn't. We also contacted students who had done poorly on the exam and asked if they would share their concerns with us. A theme that quickly emerged was that the types of questions asked on the NPTE were not comparable to the types of questions asked on exams in the program. We then focused our attention there.
This required intervention with the faculty, as not everyone was using multiple-choice formats, computerized grading, and item analysis. We worked to develop the faculty's ability to write test items by bringing in a consultant for an item-writing workshop. Those faculty members who were comfortable using item analysis also discussed their work with other faculty members. Faculty members were encouraged to visit the FSBPT website for insight.
We also renewed the emphasis on the patient-scenario format. We felt that we had moved away from an emphasis on skills and toward teaching theories and principles, with students doing problem solving and critical decision-making based on those theories. There was simply not enough practice in applying those theories to patient scenarios.
Finally, we stressed the importance of textbooks with students, emphasizing textbook study over class notes in preparing for the exam. We noted that the exam is based on current practice, rather than on classroom teaching of what we would like practice to be.
School Two Recommitted to multiple-choice items
Lower scores first got our attention in 2002. The faculty initially looked at the curriculum because that was in our control, and we felt we were doing a good job in terms of the blueprint and content outline. However, in the early 2000s, we had begun moving from a purely theoretical approach to a case-based one and inadvertently abandoned multiple-choice questions. Philosophically, we felt that other assessment tools were more in keeping with our learning activities and instructional methods.
But students who had once been good test takers told us they had been losing that skill while in our curriculum. As a result, we decided to make a philosophical shift: recommit to multiple choice and become much better item writers. Three faculty members are item writers for the NPTE; we asked them to help the rest of the faculty appreciate and improve their test-writing skills.
We continued to encourage students to do a lot of self-assessment after exams. Students were also encouraged to consider taking an examination preparation course. In the past, virtually none of our students used a commercial course unless they had failed the first time. The class embraced the idea and organized a test preparation course at the end of the curriculum; the students have picked up most of the expense.
We have also tried to shift to computerized testing. We were concerned about cheating, as we thought a few students had visited websites during the test. We are now waiting for the technology to improve before we proceed further with this type of testing.
In the last semester, we put much more emphasis on NPTE expectations. We told students what to expect in terms of studying and brought former students in to provide feedback. We tried to set a different tone about the difficulty and consequences of the exam, pointing out that passing is by no means a given.
School Three Focused on patient-scenario questions
As the lone full-time faculty member at the school, I was very troubled by the sudden drop in scores in 2000-2001. One of the first issues we looked at was test design, specifically how we were writing multiple-choice questions. Because we were relying on adjuncts, it was very difficult to exercise much control over the exams.
We looked at every one of our exams and realized we were lacking patient-scenario-type questions. We have since started identifying sample questions and putting them on the exams to let students know the level of thinking that will be required. We do, however, throw a question out if more than half of the class gets it wrong.
We revisited questions and content concerning ethics, law, and the scope of practice for the physical therapist assistant, and we are currently trying to rewrite all the questions as case scenarios. We also encourage students to attend a review course.
Before the last clinical, we have two seminar courses that do not require exams but address management issues and quality assurance. We also administer a test that students believe will be graded; instead, we give it back to them as a self-assessment tool. Finally, we give four different versions of a sample exam to help students identify and strengthen areas in which they are weak.
School Four Emphasized studying textbooks, not just curriculum
I believe there are multiple variables affecting improvement of passing scores. For instance, at the same time we finally received funding for more students, we had a dramatic reduction in the number of applications. If you go from 650 to 150 applicants while increasing from 24 to 40 available openings, you have some new dynamics.
Seven years ago, we initiated a written comprehensive exam in a multiple-choice format, and the faculty was educated on how to write a good question. We felt it was important to have case studies on the exam, so we included about eight or nine. About 140 questions are asked over three to four hours. I should note this is still a pencil-and-paper exam; we are not comfortable with computers.
We use this comprehensive exam to assess the curriculum and to learn who these students are: how they perform relative to the curriculum, how they perform in the clinic, and, finally, how they fare on the NPTE. When the test is completed, we emphasize self-assessment.
We also emphasize consequences, but these take the form of corrective action. We bring low-scoring students in and pose questions to them in an open-book environment; they then have to go find the answers.
We still have some concerns that students may be using only the curriculum to prepare for a test and ignoring textbook assignments. We are continually telling students that they need to read those chapters to prepare for the exam.
School Five Allowed students to come back for a week of review
We have always had a good pass rate, although this year our pass rate dropped. We knew that our applicant pool had shrunk, yet oddly enough, the students who had difficulty on the NPTE were not the students who had problems in their clinicals.
At one time we had a comprehensive exam, but we found that its correlation with the national exam wasn't very good. Now we have comprehensive practical exams that encompass everything covered up to that point. Students like them better than one more multiple-choice exam. By the way, we have always shared the test content outline and its wording with students.
Something that has also worked for us is having students come back, at their option, for a week of review. We pull all their exams, have them retake those exams by looking up the answers, and then have them visit with a faculty member. So far, it has worked. Test-taking ability, I think, depends on content knowledge, inherent test-taking skills, preparation, and the depth of reading skills required.