Prometric has extensive experience in transitioning programs from paper-based delivery to computer-based delivery. The test development department is staffed with psychometricians who have directly participated in the research and application of methodologies to help ensure minimal impact on candidate performance during such a transition.

Research

As computer-based testing (CBT) has proliferated over the last 10-15 years, the body of industry research on the comparability of paper-based and computer-based exams has grown considerably. As would be expected, research findings from the initial introduction of computers into testing are no longer representative of the candidate populations entering the testing environment today. As candidates' exposure, access, and familiarity with computers have increased, the effects of introducing computer-based exams have diminished substantially. This holds across candidate segments, as access to computers has expanded across all population subgroups.

In summarizing the findings of the multiple studies conducted in the last 5-10 years (the most recent in 1997), the main points identified in the research are as follows:

  1. Differences in exam performance between paper-based and computer-based testing are generally statistically nonsignificant (p > 0.05) and the associated effect sizes are small, suggesting no substantial difference between the two delivery modes; a minimal sketch of such a comparability check follows this list.
  2. Differential Item Functioning (DIF) analyses of items delivered in both paper-based and computer-based formats typically indicate that, when item types are conducive to both delivery methods, item performance is comparable across the two channels.
  3. Clear communication to the candidate pool regarding computer-based exam delivery can further decrease the chance of the delivery mode impacting final candidate performance.
  4. Preparation and practice in a simulated testing environment (through a practice exam and/or a tutorial prior to live testing) can decrease candidate performance anxiety and further contribute to an equitable testing result.
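
To make point 1 concrete, the following is a minimal sketch of such a comparability check in Python. The score samples are hypothetical placeholders, and the use of Welch's t-test with Cohen's d is one common approach, not a specific Prometric procedure.

```python
# A minimal sketch of the comparability check described in point 1.
# The score arrays are illustrative; a real analysis would use matched
# or randomly equivalent candidate groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
paper_scores = rng.normal(72.0, 9.0, size=400)  # hypothetical paper-based scores
cbt_scores = rng.normal(71.5, 9.2, size=400)    # hypothetical computer-based scores

# Welch's t-test: does mean performance differ between delivery modes?
t_stat, p_value = stats.ttest_ind(paper_scores, cbt_scores, equal_var=False)

# Cohen's d as a standardized effect size, using the pooled standard deviation.
pooled_sd = np.sqrt((paper_scores.var(ddof=1) + cbt_scores.var(ddof=1)) / 2)
cohens_d = (paper_scores.mean() - cbt_scores.mean()) / pooled_sd

print(f"t = {t_stat:.3f}, p = {p_value:.3f}, d = {cohens_d:.3f}")
# Comparability is supported when p > 0.05 and |d| is small (e.g., < 0.2).
```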

In addition to the theoretical research available within the testing industry, Prometric also has practical experience in the evaluation and comparison of candidate performance across multiple delivery channels. We have successfully migrated a large number of clients from one delivery mode to another and have consistently evaluated candidate performance to ensure that all candidates are given a fair opportunity to demonstrate their knowledge, independent of the channel used. In addition, Prometric has a number of clients that use a dual delivery model (both paper-based and computer-based), which allows for continual monitoring of the comparability of exam performance and individual item performance across testing environments. Prometric's findings consistently mirror those noted above from the general testing community, and we have been able to develop and refine approaches that assure the desired result.

Practice

In migrating an examination program, several steps should be considered. Although there are many individual tasks, the major activities include the following:

  1. Evaluation of the current item bank. An evaluation of the current paper-based item bank is typically performed at the beginning of the project to ensure its applicability to a computer-based environment and to gather baseline statistics for future evaluation.
  2. Modification of item presentation. If necessary, Prometric recommends individual item modifications to increase the likelihood of equitable performance on the overall exam in a computer-based environment.
  3. Preparation of a communication effort to candidates. Prometric works with clients to ensure that an appropriate communication campaign is developed for the candidate population prior to the introduction of a new delivery model.
  4. Preparation of sample exams and/or tutorials. Prometric prepares practice exams and/or tutorials that are provided to candidates prior to a live exam to increase the candidate population's comfort with the test delivery model.
  5. Exam and item analysis. Using the baseline data collected in step #1, Prometric evaluates exam- and item-level performance to ensure comparability of candidate performance; a sketch of one such item-level check appears after this list.
  6. Continual monitoring and maintenance. Evaluation of the delivery model does not stop after initial confirmation of comparability. Prometric psychometricians continually monitor the performance of the USPS items and exams to ensure that they perform within acceptable statistical parameters and to forecast future test development needs; a simple drift check is sketched below.
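
As an illustration of the item-level analysis in step 5, the sketch below applies the Mantel-Haenszel DIF procedure, a standard technique in the research cited above. The data, function name, and stratification scheme are hypothetical and do not represent Prometric's actual implementation.

```python
# A minimal sketch of a Mantel-Haenszel DIF check for one item. Candidates
# are stratified by total score; delivery mode (paper vs. computer) plays
# the role of reference vs. focal group. All arrays are placeholders.
import numpy as np

def mantel_haenszel_dif(item_correct, group, total_score, n_strata=5):
    """Return the MH common odds ratio and the ETS delta-scale statistic.

    item_correct : 0/1 responses to the studied item
    group        : 0 = reference (paper), 1 = focal (computer)
    total_score  : matching criterion, usually the total test score
    """
    # Stratify on total score using equal-frequency bins.
    edges = np.quantile(total_score, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, total_score, side="right") - 1,
                     0, n_strata - 1)

    num, den = 0.0, 0.0
    for s in range(n_strata):
        in_s = strata == s
        a = np.sum(in_s & (group == 0) & (item_correct == 1))  # ref correct
        b = np.sum(in_s & (group == 0) & (item_correct == 0))  # ref incorrect
        c = np.sum(in_s & (group == 1) & (item_correct == 1))  # focal correct
        d = np.sum(in_s & (group == 1) & (item_correct == 0))  # focal incorrect
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den                 # common odds ratio
    mh_d_dif = -2.35 * np.log(alpha_mh)  # ETS delta metric
    return alpha_mh, mh_d_dif

# Illustrative data: 1,000 candidates, roughly half in each delivery mode.
rng = np.random.default_rng(1)
group = rng.integers(0, 2, size=1000)
total_score = rng.normal(70, 10, size=1000)
item_correct = (rng.random(1000) < 0.6).astype(int)

alpha, delta = mantel_haenszel_dif(item_correct, group, total_score)
print(f"MH odds ratio = {alpha:.2f}, MH D-DIF = {delta:.2f}")
# Under common ETS conventions, |MH D-DIF| < 1.0 is typically negligible DIF.
```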

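The following is a minimal sketch of the kind of drift check implied by step 6, assuming hypothetical baseline and current item difficulty values; the tolerance threshold is illustrative only.

```python
# A minimal sketch of the continual monitoring in step 6: items whose
# classical difficulty (proportion correct) drifts beyond a chosen
# tolerance are flagged for psychometric review. Values are hypothetical.
import numpy as np

baseline_p = np.array([0.62, 0.71, 0.55, 0.80, 0.48])  # paper-based era difficulty
current_p = np.array([0.60, 0.64, 0.56, 0.79, 0.39])   # computer-based difficulty
tolerance = 0.05                                        # illustrative threshold

drift = np.abs(current_p - baseline_p)
for item, (b, c, d) in enumerate(zip(baseline_p, current_p, drift), start=1):
    flag = "REVIEW" if d > tolerance else "ok"
    print(f"Item {item}: baseline={b:.2f} current={c:.2f} drift={d:.2f} [{flag}]")
```
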
Prometric embeds the methodologies above within our test development and delivery processes to ensure a fair, valid, and legally defensible process for all candidates. Our experience with successfully implementing computer-based exams is matched by our proven processes for ensuring quality of product and performance throughout the lifecycle of the exams.
