Researcher: HsingChi von Bergmann
Position: Associate Professor, Education Research
Year level: All years
Number of students: 230
Professional schools like medicine, law, pharmacy and dentistry have a mandate to establish competency-based education. But in reality, it’s not clear how that competency actually gets assessed. Identifying how one actually studies competency-based education, or how to assess competencies, has been a challenge for 20 or 30 years.
Progress Testing involves designing one exam containing 150 to 200 multiple-choice questions. The test is designed to map all the cognitive knowledge domains expected of a competent graduate, and is given to all students in the program, regardless of the year they’re currently in, to benchmark their progress.
To answer questions regarding student progress and competencies, researchers have been looking into students’ performance to see how it changes from year to year. Further research will examine how student anxiety levels change with the implementation of Progress Testing.
Findings show how students have progressed from year one to year four. The number of knowledge domains mastered by students in each year increases over the course of the degree. In addition, the variance of scores within each class has decreased as students progressed through the years, with a smaller gap between the highest scoring and lowest scoring students.
Can you give some background on the research?
HsingChi von Bergmann: In professional schools — such as medicine, law, pharmacy or dentistry — we have a mandate to establish competency-based education. But in reality, when each school’s dean signs this competency document to graduate a particular student, it’s not clear how that competency actually gets assessed. Identifying how one actually studies competency-based education, or how to assess competencies, has been a challenge for 20 or 30 years.
In medical schools in the Netherlands, they have been working on particular innovations, one of which is called Progress Testing. Progress Testing is in essence designing one exam containing 150 to 200 multiple-choice questions. The test is designed to map the cognitive knowledge domains expected of a competent medical doctor, and it is given to all students in the program, regardless of the year they’re currently in, to benchmark their progress. We decided to try it out in Dentistry.
What was the research question?
HCvB: As the Progress Testing literature suggested, we designed our system based on a testing blueprint to map out the cognitive knowledge domains expected of an entry-level dentist. The first research question of this project is: “How can progress testing inform curricular designers and module coordinators to reflect on their curriculum design and implementation of the curriculum?”
The second research question is more about the immediacy of feedback. Using another survey to evaluate student perceptions of Progress Testing, we want to understand how this tool might help students identify whether they are struggling in a specific area of their learning. Can Progress Testing provide timely feedback on learning? Can and will students apply a variety of help strategies to improve and enhance their achievement of knowledge competencies?
Lastly, all students in our faculty are expected to take a high-stakes national dental board exam as they reach their graduation year. The board exam questions are very similar in format to progress testing questions. Would the experience of progress testing better prepare students for their dental board examination? Specifically, will the experience help to alleviate test anxiety?
How was the experiment set up?
HCvB: The biggest challenge at the time was that there was no existing test blueprint for dentistry. We wanted the progress tests to be competency-based instead of curriculum-sensitive. We wanted to map all four years’ knowledge in dental school. We started with TLEF funding in our first year to develop a test blueprint together with content experts in the faculty. In the second year we developed our first test using all course modules and administered the test twice.
To test all items, we use three different test booklets. Of the 200 questions in each booklet, 86 items are common to all three. This arrangement allows us to (1) test all item properties, (2) benchmark students using the common items, and (3) equate the booklets.
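The common-item equating step described above can be sketched in code. This is a minimal illustration of mean-sigma linear equating on the anchor items; the score data and the specific equating method are assumptions for illustration, not the faculty's actual procedure.

```python
# Minimal sketch of mean-sigma linear equating across two booklets,
# using scores on the common (anchor) items. All numbers are invented.
from statistics import mean, pstdev

def mean_sigma_equate(anchor_a, anchor_b):
    """Return (slope, intercept) mapping booklet-B scores onto
    booklet-A's scale, based on the anchor-item score distributions."""
    slope = pstdev(anchor_a) / pstdev(anchor_b)
    intercept = mean(anchor_a) - slope * mean(anchor_b)
    return slope, intercept

# Hypothetical anchor-item scores (out of 86 common items) for the
# groups that took booklet A and booklet B.
anchor_a = [60, 64, 58, 70, 66]
anchor_b = [55, 59, 53, 65, 61]

slope, intercept = mean_sigma_equate(anchor_a, anchor_b)
# Place booklet-B scores on booklet-A's scale.
equated = [slope * s + intercept for s in anchor_b]
```

Real progress-testing programs use more elaborate equating designs, but the idea is the same: the shared 86 items anchor the three booklets to a common scale.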
One critical feature of our dentistry progress testing system is that in the progress test we give students the option to answer “I do not know” for each question. This is intentional, given that the purpose of the tests was not to frustrate students but to help them self-benchmark their progression throughout the program.
What did you find?
HCvB: We can see that over the four years, the knowledge domains mastered have increased, and the variances among students within a class have decreased as students progressed through the years. The latter part is what we expect of our students — the highest and lowest scoring students’ results converge as they progress through the degree.
In the two tests we have administered this year, one in the first term and one in the second term, the number of “I do not know” responses within a specific domain has decreased and the number of correct answers has increased.
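The term-over-term comparison described here can be illustrated with a short sketch that tallies per-domain “I do not know” and correct-answer rates. The domain name and response counts below are hypothetical, not the study's data.

```python
# Hypothetical sketch: per-domain "I do not know" and correct rates,
# compared across two test administrations. Counts are invented.
from collections import Counter

def domain_rates(responses):
    """responses: list of (domain, outcome) pairs, where outcome is
    'correct', 'incorrect', or 'idk'. Returns per-domain rates."""
    totals = Counter(d for d, _ in responses)
    correct = Counter(d for d, o in responses if o == "correct")
    idk = Counter(d for d, o in responses if o == "idk")
    return {d: {"correct": correct[d] / totals[d],
                "idk": idk[d] / totals[d]} for d in totals}

# Ten responses per term in one illustrative domain.
term1 = [("endodontics", "idk")] * 6 + [("endodontics", "correct")] * 4
term2 = [("endodontics", "idk")] * 3 + [("endodontics", "correct")] * 7

r1, r2 = domain_rates(term1), domain_rates(term2)
# Fewer "I do not know" responses and more correct answers in term 2.
```

Aggregating responses this way per domain is what lets the “I do not know” option function as a self-benchmarking signal rather than a penalty.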
How did you evaluate your findings?
HCvB: This is part of the new assessment system for the new DMD curriculum. As a new component, we have just been looking into students’ performance to see how it improved from year to year. But to begin to answer our research questions about student anxiety levels, student perceptions of the immediacy of feedback, and the types of help strategies sought after Progress Testing, we need ethics approval. We’re still waiting for it, so we don’t really have any data to answer those questions yet.
How will this study impact teaching and learning?
HCvB: One of the possible outcomes that we are hoping to obtain via the progress testing system is to stimulate conversations among curricular designers: “Our students over a two-year period have been quite weak in domain nine, for example — why is that?” We’re hoping Progress Testing in aggregated form can continually feed the intended feedback loop to program designers and module coordinators so they can think: “OK, this is the domain that students have not been doing well in. Perhaps we need to rethink how we teach or how we prepare the lessons to enhance student learning.”
How will this study impact future research?
HCvB: In the field of Dentistry, we are one of the first two dental faculties (the other is in the UK) to implement Progress Testing. For dental education researchers to even start to think about how to design a test blueprint, a large-scale assessment, etc., we have a lot that we can contribute.
As a science education researcher by training, I also hope that such an innovation can be transferred to other disciplines within and outside UBC. That would be a very interesting future research direction.