Exam results, whether GCSEs or A levels, can bring mixed emotions. Students put a great deal of effort into revising for and sitting their exams, so it's no surprise that they react strongly when results come back, whether the grades are higher or lower than they expected.
Fortunately, there's much more to a student's abilities than their exam grades, which is why it's important to look beyond an individual result and assess how well a student has performed in a particular testing environment.
Item Analysis is a useful way of doing this. A basic Item Analysis shows how difficult each question was (the percentage of candidates who answered it correctly) and how well each question discriminates between stronger and weaker candidates (how strongly performance on the item correlates with overall performance, often measured with a Pearson Product Moment Correlation), along with other useful information.
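To make that concrete, here is a minimal sketch (not NCARB's actual scoring code) of how those two basic statistics can be computed from a scored response matrix. The `item_analysis` function and the sample data are illustrative assumptions.

```python
import numpy as np

def item_analysis(responses: np.ndarray) -> dict:
    """Basic item analysis for a scored (0/1) response matrix.

    responses: 2-D array with one row per candidate and one column per item,
    where 1 = correct and 0 = incorrect.
    """
    n_candidates, n_items = responses.shape
    total_scores = responses.sum(axis=1)

    # Item difficulty: proportion of candidates answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Item discrimination: Pearson product-moment correlation between each
    # item score and the candidate's total score (a point-biserial correlation,
    # since the item score is dichotomous).
    discrimination = np.array([
        np.corrcoef(responses[:, j], total_scores)[0, 1]
        for j in range(n_items)
    ])

    return {"difficulty": difficulty, "discrimination": discrimination}

# Example: 5 candidates, 3 items (made-up data).
scored = np.array([
    [1, 0, 1],
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 0],
    [0, 0, 0],
])
results = item_analysis(scored)
print("Difficulty:", results["difficulty"])
print("Discrimination:", results["discrimination"])
```

In practice, the discrimination index is often "corrected" by excluding the item from the total score before correlating, but the simple item-total correlation above illustrates the idea.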
An Item Analysis is also a good tool for assessing how a student's performance on a specific section of the test compares with that of other candidates around the world. This is helpful when deciding how marks are allocated, or when identifying trends in performance.
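As a rough illustration of that kind of comparison (the function name and reference scores below are hypothetical, not taken from any actual exam report), a candidate's section score can be expressed as a percentile rank against a pool of other candidates:

```python
import numpy as np

def section_percentile(candidate_score: float, reference_scores: np.ndarray) -> float:
    """Percentage of the reference group scoring at or below the candidate's
    section score (a simple percentile rank)."""
    return 100.0 * np.mean(reference_scores <= candidate_score)

# Hypothetical section scores from a pool of other candidates.
reference = np.array([12, 15, 18, 20, 22, 25, 25, 27, 30, 31])
print(section_percentile(24, reference))  # 50.0 -> at or above half of the group
```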
Before an exam can be scored, NCARB's test development facilitators and Item Writers must ensure that each item does not discriminate on the basis of gender or race, and every question going through the pretesting process is evaluated for potential bias. All of this is done to ensure that exam scores are fair for all candidates.