answer-single-question.wf

Perform stricter checking of parent_ids

bump version number of xowf to 5.10.1d34

  … 2 more files in changeset.
do not allow editing of unresolved links during the exam, but allow this in preview mode

  … 1 more file in changeset.
streamline substitution handling

Perform the same substitutions as in the other test-item workflows.

Support for pool questions in the test-item family

Features:

- Select random questions from some folder
  (currently siblings, i.e., folders of the same package instance).

- One exam can have multiple pool questions, potentially drawing from
  different pools.

- Pools can be links to other folders (which are not siblings).

- The current folder can be used as a pool folder as well. In this case,
  other not-yet-used items can be selected as replacement items, provided
  these match the specified filter characteristics.

- Question filtering (see the sketch after this list):

  * Filter by item type: allows one to select all or some of the item types
    (mc, sc, short text, reordering, composite, ...); only items of these
    types are used as replacement items.

  * Filter by minutes and/or points: only items with matching points/minutes
    are used as replacement items.

  * Filter by language: only items in a certain language are used as
    replacement items.

  * Filter by item name pattern: only items matching a name pattern are used
    as replacement items. When (short) names are used systematically, one can
    e.g. encode the date in the name and select only items from one year ago
    via "*2020*", or from some chapter via "*ch01", ... Of course, it is also
    possible to use separate item folders for this.
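
These filters are combined conjunctively: a pool item qualifies as a
replacement only if it satisfies every configured criterion, and only items
not yet used in the exam are drawn. A minimal sketch of this selection logic,
in Python for illustration only (the actual xowf code is Tcl; all field and
function names below are hypothetical):

    import fnmatch
    import random

    def matching_candidates(pool_items, item_types=None, minutes=None,
                            points=None, language=None, name_pattern=None):
        # Keep only pool items that satisfy every configured filter.
        candidates = []
        for item in pool_items:            # item: dict with hypothetical keys
            if item_types and item["type"] not in item_types:
                continue
            if minutes is not None and item["minutes"] != minutes:
                continue
            if points is not None and item["points"] != points:
                continue
            if language and item["language"] != language:
                continue
            if name_pattern and not fnmatch.fnmatch(item["name"], name_pattern):
                continue
            candidates.append(item)
        return candidates

    def draw_replacement(pool_items, already_used, **filters):
        # Pick one random, not-yet-used item from the filtered candidates.
        available = [i for i in matching_candidates(pool_items, **filters)
                     if (i["pool"], i["name"]) not in already_used]
        return random.choice(available) if available else None

For instance, draw_replacement(items, set(), item_types={"mc", "sc"},
name_pattern="*2020*") would draw one random single- or multiple-choice item
whose name contains "2020", in line with the naming example above.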

Potential further steps:

- Currently, exams containing pool questions are treated as auto-correctable
  (which implies automated exam review (Einsicht)) when only questions of
  strictly closed question types (MC, SC, Ordering) are selected from the
  question pools. Depending on the detailed settings, other item types could
  be possible as well (via correct-when), but this requires a deeper analysis
  of every question, which is so far not performed. (A sketch of the current
  rule follows after this list.)

- Categorized items: technically, the infrastructure is mostly in place to
  allow filtering by categories as well. This would allow one to select e.g.
  from technical questions, case examples, knowledge-transfer questions,
  research-oriented questions, ... which are orthogonal to the filtering
  currently available. Every lecturer can define their own categories
  depending on their needs, we could provide university-wide category trees,
  etc. Of course, one could also define separate pools for these purposes,
  but categorizing is probably more convenient and more flexible.

- Performance: when question pools become large (500+ questions) and the
  cohorts as well (500+ students), the current version might require more
  tuning. The only critical time is the exam start, where the random question
  placeholders have to be resolved for every student. The approach from this
  weekend uses basic caching, but maybe this has to be extended.

- Protecting selected questions: question pools are more detached from an
  exam than the single exercises a lecturer might have in mind. It should not
  be possible to delete questions/question pools while these are in use.
  Probably deletion should be a move to a trash can, which is actually an
  issue for all exams, but it becomes more important with pool questions.

- Handling of potential duplicates: when items are pulled from a question
  pool, a replacement item is selected by making sure that this item does not
  already occur in the current selection. Therefore, one can safely draw two
  questions from the same question pool without fearing that a student gets
  the same question twice. This duplicate checking might require some
  fine-tuning (see the sketch after this list):

  * The system checks for duplicates in an exam via POOL/NAME.

  * If a lecturer uses two pool questions in an exam pointing to the same
    pool, the system makes sure that the same question is not used twice.

  * However, if a teacher adds a question q1 to pool1 and the same question
    to pool2, these two instances have different item ids and are regarded as
    two different questions. One could check only NAME (without POOL), but
    then it might happen that certain questions are rejected although they
    are different, just because they have the same name.

- Statistics: so far, I have not provided any special answer statistics for
  pool questions.
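
Regarding the auto-correctability rule mentioned above, a minimal
illustration in Python (hypothetical names; the actual implementation is in
Tcl) of the condition under which an exam with pool questions remains
auto-correctable:

    CLOSED_TYPES = {"mc", "sc", "ordering"}   # strictly closed question types

    def exam_is_autocorrectable(pool_questions):
        # True only when every pool question restricts its item-type filter
        # to strictly closed types; open item types would require a deeper
        # per-question analysis (correct-when), which is not performed yet.
        return all(
            q.get("item_types") and set(q["item_types"]) <= CLOSED_TYPES
            for q in pool_questions
        )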
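
Regarding the duplicate handling via POOL/NAME described above, a sketch of
the bookkeeping (again Python for illustration; draw_one stands for the
actual replacement-item lookup and is hypothetical):

    def resolve_pool_questions(pool_questions, draw_one):
        # Resolve all pool questions of one exam for one student, making
        # sure that the same POOL/NAME combination is never drawn twice.
        # Note: the same question stored in two different pools has two
        # distinct keys and is not detected as a duplicate (caveat above).
        used = set()
        selection = []
        for pq in pool_questions:
            item = draw_one(pq, exclude=used)
            if item is not None:
                used.add((item["pool"], item["name"]))
                selection.append(item)
        return selection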

  … 10 more files in changeset.
strengthen parameter checking

  … 4 more files in changeset.
added exam-overview

  … 6 more files in changeset.
fixed ckeditor form field for inline mode.

use inline mode for editing test items

added a describe function for MC items

  … 6 more files in changeset.
added preview/testrun to edit item workflow

  … 3 more files in changeset.
file answer-single-question.wf was initially added on branch oacs-5-10.
