Summary
This project adds the ability for faculty and other users to import an assessment or question pool from Respondus files into Tests & Quizzes (Samigo).
This Jira issue is tracked at UC Davis as SAK-208 and in the Sakai collab as SAK-1891.
The project is broken into several parts:
- Ability to successfully import all Respondus question types into an assessment or question pool.
- Ability to import into Sakai versions 2.1.x (UC Davis and others) through the current 2.3 (backporting).
- Ability to import a Respondus QTI 1.1 XML file or content package.
- Ability to import from Respondus 3.5 (the current version) and earlier.
- Ability to migrate Respondus files into Tests & Quizzes via Common Cartridge.
To successfully complete this project, we will identify the scope of each of these parts. For example, can all Respondus question types be translated into Samigo? What are the version limitations for backporting for both Sakai and Respondus?
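Because the importer must accept both a bare QTI 1.1 XML file and a zipped content package, one early scoping question is how uploads get classified. The sketch below is illustrative only, not the actual Samigo import code; the `questestinterop` root element and the `imsmanifest.xml` entry are standard QTI 1.x / IMS Content Packaging markers.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def classify_upload(data: bytes) -> str:
    """Classify an uploaded Respondus export as a content package,
    a bare QTI 1.1 XML file, or an unknown format (illustrative only)."""
    # A content package is a ZIP archive; check the ZIP magic bytes first.
    if data[:4] == b"PK\x03\x04":
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            # IMS content packages carry a manifest at the archive root.
            if "imsmanifest.xml" in zf.namelist():
                return "content-package"
        return "zip-without-manifest"
    try:
        root = ET.fromstring(data)
    except ET.ParseError:
        return "unknown"
    # QTI 1.x documents use <questestinterop> as the root element.
    if root.tag.endswith("questestinterop"):
        return "qti-1.1"
    return "other-xml"
```

This lets a single upload form route both file shapes to the right parser.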
Agile StoryCards
1. Spec out GUI design for import of Respondus quizzes and pools with Stanford - SAK-241
2. Gap analysis between Samigo and Respondus question types and file formats - SAK-243
3. Code design for import and Common Cartridge migration of multiple translation types - SAK-244
Testing Results
Import Results for 10/3/06
Initial assessment-import testing of a Respondus v3.5 XML file with 8 questions, one of each available type. This testing was performed prior to any code changes or additions. The test environment was a local Sakai 2-2-x instance with Samigo Dev v2.3.
| Original Type | Translated Type | Question Text | Points | Answers | Feedback |
|---|---|---|---|---|---|
| Multiple Choice / multiple choice single choice (mcsc) | Defaulted to Essay type | Text imported correctly | Value correct | Defaulted to individual Model Short Answer records; only the last displayed | Feedback was not saved |
| True-False (TF) | Correctly typed | Text imported correctly | Value correct | True and false answers displayed; correct answer not selected | Feedback was not saved |
| Essay | Correctly typed | Text imported correctly | Value incorrect; original of 5.0 not saved | n/a to translation | Feedback was not saved |
| Matching | Correctly typed | Text imported correctly | Value correct | Pairs did not translate | Feedback was not saved |
| Fill-in-Blank | Correctly typed | Text imported twice on question list; not at all in Modify Question | Value correct | No; question text not visible in Modify Question | Feedback was not saved |
| Multiple Response (Grading Method: Right Less Wrong) | Correctly typed | Text displayed correctly in list, but not in Modify, where it generated UI problems | Value correct | Answers displayed correctly in list, but correct answers not indicated; UI problems in Modify | Feedback was not saved |
| Multiple Response (Grading Method: All Points or None) | Defaulted to Essay type | Text imported correctly | Value correct | Defaulted to individual Model Short Answer records; only the last displayed | Feedback was not saved |
| Algorithmic | | Instructions and link to Flash movie displayed; link fails due to failure to import SWF | Value correct | Defaulted to 5 individual Model Short Answer records | n/a |
Of note here is that Samigo's feedback model is more numerous and flexible than Respondus's. Samigo offers both question-level and answer-level correct and incorrect feedback, depending on the question type; Respondus offers only general feedback, except for true-false. This will make for fewer required translation points.
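Since Respondus carries a single general feedback string per question (plus per-answer feedback only for true-false), collapsing it into Samigo's richer model is mostly a one-to-many copy. A minimal sketch of that translation point follows; all field names here are hypothetical, invented for illustration, and do not match the real Samigo or Respondus schemas.

```python
def map_feedback(respondus_item: dict) -> dict:
    """Map Respondus's single general feedback onto Samigo's
    question-level correct/incorrect feedback slots.
    Field names are hypothetical, for illustration only."""
    general = respondus_item.get("general_feedback", "")
    samigo = {
        # Samigo distinguishes correct vs. incorrect question-level
        # feedback; Respondus supplies one string, so reuse it for both.
        "correct_feedback": general,
        "incorrect_feedback": general,
        "answer_feedback": {},
    }
    # True-false is the one Respondus type with per-answer feedback.
    if respondus_item.get("type") == "true-false":
        samigo["answer_feedback"] = dict(
            respondus_item.get("answer_feedback", {})
        )
    return samigo
```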
Also of note here is that Samigo does not currently support algorithmic question types. Respondus exports this question type as a .swf (Flash movie): the question text is actually the JavaScript launch code, and the XML answer output is similar to a multiple-choice single-correct answer. It would be interesting to determine whether the SWF could be imported as an attachment and whether the user's response could be sent back for scoring.
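If the SWF-as-attachment idea is explored, the first step would be recovering the movie file name from the JavaScript launch code embedded in the question text. The sketch below is speculative: the regex and the launch-code shape are assumptions, not the documented Respondus output format.

```python
import re
from typing import Optional

def extract_swf_name(question_text: str) -> Optional[str]:
    """Pull a .swf file name out of JavaScript launch code embedded in
    an algorithmic question's text. The launch-code format shown in the
    test below is an assumption; Respondus's actual output may differ."""
    match = re.search(r'["\']([^"\']+\.swf)["\']', question_text)
    return match.group(1) if match else None
```

The recovered name could then be matched against files in the content package and stored as a Samigo attachment.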
Finally, there are differences in grading method that should be noted in the gap analysis so that instructors understand the potential grading changes in their translations.
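For example, the two Respondus multiple-response grading methods seen in the test table score the same set of selections very differently, which is exactly the kind of change instructors should be warned about. The sketch below reflects the common definitions of these methods, not verified Respondus or Samigo internals.

```python
def score_right_less_wrong(correct: set, selected: set,
                           points: float) -> float:
    """'Right Less Wrong': each correct selection earns an equal share
    of the points, each incorrect selection deducts a share; the score
    is floored at zero. Common definition, not verified internals."""
    per_choice = points / len(correct)
    right = len(selected & correct)
    wrong = len(selected - correct)
    return max(0.0, (right - wrong) * per_choice)

def score_all_or_none(correct: set, selected: set,
                      points: float) -> float:
    """'All Points or None': full credit only for exactly the correct
    set of selections; otherwise zero."""
    return points if selected == correct else 0.0
```

A student selecting one of two correct answers would earn half credit under Right Less Wrong but nothing under All Points or None, so a silent change of method during translation would change grades.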