
Generated Reports

Below is a list of the types of reports generated by OMR.
  1. Order of Tests Read
    • Lists the original order in which the tests were read into the scanner, giving you quick access to any test in the stack
    • Useful for checking individual answer sheets for accuracy and/or errors
    • All other reports are sorted by student name or ID number, as determined by Question #4 on the INFO sheet
  2. Instructor Information
    • Lists the Scoring Options chosen for this OMR job and the answer key and weights associated with each question
  3. Test Analysis and Distribution of Scores
    • Lists the number of questions, the number of exams graded, the subjective points possible, the total points possible, the mean, median, mode, standard deviation, and the Kuder-Richardson reliability scores (KR-20 and KR-21)
    • Overall class performance is illustrated with a histogram showing frequencies and percentiles for each score obtained on the exam
  4. Frequencies of Responses by Question
    • Lists simple counts of how many students selected each choice, as well as the number of invalid and omitted responses for each question
    • When the "AND" or "OR" options are used, the number of students answering combinations is displayed, although the actual combinations are not
  5. Item Analysis
    • An analysis of the individual items on the exam
    • The class is divided into thirds based on students' overall test scores
    • For each question, the percentage of students in the upper third selecting each choice is compared with the percentage of those in the lower third who picked that choice
    • The correct answer is denoted by the letters CR for single correct answers and by the correct combination for combination answers
  6. Individual Scoring
    • Detailed information on each student taking the test
    • Displays the student's name, identification number, weighted percentage score, weighted correct score, number of omitted questions, and individual subjective points obtained, along with the student's answer to each question on the test
    • A $ symbol is printed for each correct answer; if an answer is incorrect, the response chosen is printed
    • NOTE: An asterisk (*) appears for each answer that could not be read by the scanner on both 5-choice and 10-choice single-correct-answer exams. This is usually due to more than one answer being chosen on a single-response test or to stray marks on the page
  7. Student Summary
    • May be used for posting exam results publicly
    • Student ID numbers, sorted in ascending order, are printed on the left side of the page
    • Student names are printed on the right side of the page, or you may choose to have them omitted
    • The unweighted numbers of questions answered right, wrong, and omitted, the subjective points obtained, the weighted points, and the weighted percentage are displayed to the right of each Student ID number
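The Kuder-Richardson reliability scores in the Test Analysis report follow standard formulas. As a minimal sketch (not the OMR software's own implementation), KR-20 can be computed from a matrix of 0/1 item scores; the function name and sample data below are illustrative only:

```python
# Sketch of the KR-20 reliability formula (illustrative; not OMR's actual code).
# rows = students, columns = items; 1 = correct, 0 = incorrect.
def kr20(scores):
    k = len(scores[0])                      # number of items
    n = len(scores)                         # number of students
    totals = [sum(row) for row in scores]   # each student's total score
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n  # population variance
    # p = proportion answering each item correctly; p * (1 - p) summed over items
    pq_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in scores) / n
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / variance)

# Hypothetical data: 4 students, 3 items
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(kr20(scores))  # 0.75
```

Values closer to 1 indicate more internally consistent exams; KR-21 is a simpler approximation that uses only the mean, variance, and number of items.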
A well-written exam item should draw responses to all or nearly all of the possible choices. If no student in the class picks a choice, it is not a good distractor; the choice has the same effect as if it were not there. You may also want to check that students in the upper third do not flock to any one wrong choice. This often indicates a poorly worded choice, or one in which the distinction between the right and wrong answer is not totally clear; even your "good" students were confused on the point covered.
Items should discriminate between the students in the upper and lower thirds of the class. If all items cover the same general topic, then the number of students in the upper third who correctly answer a question should be greater than or equal to the number in the lower third who correctly answer the same question.
The item discrimination index shows this relationship using the following formula: (U - L) / N, where U is the number of students answering correctly in the upper third, L is the number of students answering correctly in the lower third, and N is the number of students in a third. A negative index means students who did poorly on the test did better on this item than those who did well; this item probably needs to be rewritten. An index below .20 suggests low discriminating power, and the question may need examination. Ideally, you should have a variety of indices covering the .2 to .8 range.
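The formula above can be sketched in a few lines; the function name and example counts are illustrative, not taken from the OMR reports:

```python
# Item discrimination index: (U - L) / N, where U and L are the counts of
# correct answers in the upper and lower thirds and N is the size of a third.
def discrimination_index(upper_correct, lower_correct, third_size):
    return (upper_correct - lower_correct) / third_size

# Example: in a class of 30 (thirds of 10), 9 top-third students and
# 3 bottom-third students answered the item correctly.
print(discrimination_index(9, 3, 10))  # 0.6
```

An index of 0.6 falls in the desirable .2 to .8 range; a negative value would flag the item for rewriting.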

Keywords: optical mark reader, OMR, LAIC test scoring
Doc ID: 101947
Owner: Sharley K.
Group: IT Knowledge Base
Created: 2020-05-11 13:48:01
Updated: 2024-01-25 16:47:37
Sites: IT Knowledge Base