Support for descriptive-type questions (Feature #147)


Added by shreikant kv almost 12 years ago. Updated almost 4 years ago.


Status: WontFix
Priority: Normal
Assignee: -
Category: -
Target version: -
Start date: 02/24/2013
Due date:
% Done: 0%

Description

Many thanks to all the developers of AMC. It is a dream come true!

What I am requesting, in brief:
Can AMC provide some additional utilities for grading descriptive-type
questions?

I know what I am requesting is, strictly speaking, outside the scope
of multiple-choice grading. However, if this feature were added, AMC
would become even more attractive than it already is.

1. After we have conducted the exam, we ask AMC to process the
multiple-choice questions in the usual way.

2. Next, there should be a tab in AMC for "Mark Descriptive/Long-Answer
Questions". Clicking this tab, AMC asks for:
(i) The number of instructors who will grade/evaluate the answers.
(Suppose you say 5: it means 5 different faculty members will
evaluate different questions.)
(ii) The names of these instructors:
1 --> Alex
2 --> Bob
3 --> Cunningham
etc.
(iii) An association of questions with instructors, for example:
Question 1 --> Instructor 1 (Alex)
Question 2 --> Instructor 3 (Cunningham)
Question 1 --> Instructor 2 (Bob) {We might allow the same question to
be evaluated by two different instructors!}
Question 4 --> etc.
(A rough sketch of this association appears after step 3 below.)

3. After this exercise, AMC will generate one bundle for each instructor who is
evaluating the long answers. These bundles are handed over by the chief
instructor to the respective instructors.
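
As a rough illustration only (the file names, directory layout, and data
structures below are invented for the example, not anything AMC provides),
the association in step 2(iii) and the bundling in step 3 could look
something like this:

    # Hypothetical sketch: map open questions to instructors and group
    # the extracted per-student answer images into one bundle per
    # instructor. Paths and naming scheme are made up for illustration.
    import shutil
    from pathlib import Path

    # Question -> list of instructors (a question may get two evaluators).
    assignments = {
        "Q1": ["Alex", "Bob"],
        "Q2": ["Cunningham"],
        "Q4": ["Bob"],
    }

    def build_bundles(extracted_dir: Path, bundles_dir: Path) -> None:
        """Copy each student's answer image for question Q into the bundle
        of every instructor assigned to Q (files named 'student01-Q1.png')."""
        for image in sorted(extracted_dir.glob("*-Q*.png")):
            question = image.stem.split("-")[-1]      # e.g. "Q1"
            for instructor in assignments.get(question, []):
                target = bundles_dir / instructor
                target.mkdir(parents=True, exist_ok=True)
                shutil.copy(image, target / image.name)

    # Example usage with hypothetical directories:
    build_bundles(Path("extracted"), Path("bundles"))

Copying (rather than moving) the images is what allows the same question to
be sent to two different evaluators.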

4. An image-manipulation front end (using GIMP/Inkscape, etc.) should be
developed wherein Instructor 1 gets the 1st question of the 1st student,
with suggestions for partial marks and the ability to add annotations
(like "wrong", "irrelevant", "correct", "incomplete", etc.). Instructor 1
clicks, say, 2 marks (out of 5 or whatever). Clicking "Next", Instructor 1
gets the 1st question of the 2nd student, and so on.

5. After completion, Instructor 1 hands over his corrected bundle to the
chief instructor; likewise the other instructors.

6. AMC looks for mismatches (if more than one instructor has evaluated the
same question) and generates a mismatch file. The chief instructor then
gives these mismatches back to the instructors concerned, and so on...
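
Again just as an illustration (the data structure is made up), the mismatch
check could be as simple as comparing the marks returned by each evaluator
for every (student, question) pair:

    # Hypothetical sketch of step 6: flag (student, question) pairs
    # where two evaluators disagree on the mark.
    marks = {
        # (student, question) -> {instructor: mark}
        ("student01", "Q1"): {"Alex": 2.0, "Bob": 2.0},
        ("student02", "Q1"): {"Alex": 3.0, "Bob": 1.5},
    }

    mismatches = {
        key: by_instructor
        for key, by_instructor in marks.items()
        if len(set(by_instructor.values())) > 1
    }

    for (student, question), by_instructor in mismatches.items():
        print(f"mismatch on {student} {question}: {by_instructor}")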

7. When there are no mismatches, AMC grabs the partial marks assigned by
the evaluators, compiles the totals, sends the mails, etc.
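
And once every (student, question) pair has a single agreed mark, compiling
the descriptive-question total per student (to be merged with the
multiple-choice score) is just a sum; a hypothetical sketch:

    # Hypothetical sketch of step 7: sum the agreed open-question marks
    # per student. The 'agreed_marks' structure is invented for
    # illustration.
    from collections import defaultdict

    agreed_marks = {
        ("student01", "Q1"): 2.0,
        ("student01", "Q2"): 4.0,
        ("student02", "Q1"): 1.5,
    }

    totals: dict[str, float] = defaultdict(float)
    for (student, _question), mark in agreed_marks.items():
        totals[student] += mark

    for student, total in sorted(totals.items()):
        print(f"{student}: {total} marks on descriptive questions")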

Am I asking for too much?


History

Updated by Alexis Bienvenüe almost 12 years ago

Adding some AMC commands to extract open questions' answers from scans, and to integrate scoring for them, can be considered; but I'm afraid the rest is not, at present, within the scope of AMC. This could be developed as a web application, for example.

Updated by Alexis Bienvenüe almost 4 years ago

  • Status changed from New to WontFix
