- Non-standard closed-format items
These items usually present some kind of table in the question
section, and the respondents' task is to mark one or several of its cells.
The respondent has to associate the two given dimensions and select the
correct combinations. A typical question might ask, for example, which
persons should carry out which chores.
- Open-answer items
The respondents are required to generate their own answer and write it
down in the space provided. This may involve writing down one or more
numbers or letters or a combination of both, filling in forms, or providing
the required information. For example, respondents may have to write down
a price or a date, or specify errors they have found. They may be asked to
indicate the sequence in which they would carry out certain actions.
Sometimes respondents have to explain a response they gave previously.
The majority of the items have either a multiple-choice or some other non-standard
closed format. The nature of the tasks and the relative opacity of the problems
posed make for a generally high difficulty level. Structuring the answer
possibilities in this way, however, counteracts that difficulty somewhat.
It should also be noted that closed formats greatly reduce the data-processing
load and make it less error-prone. Including a restricted number of open-format
items ensures greater real-life relevance and broadens the scope of the test.
4.4 Conclusions
As presented above, the project approach to conceptualizing essential
subsets of problem-solving competency aims particularly at analytical problem
solving in well-defined, contextualized problem situations. The model of problem
solving outlined above serves as a framework for item development and puts
analytical problem-solving tasks in context.
Solving the project tasks requires analytical operations such as searching for,
understanding, systematizing, organizing, evaluating, reasoning about, and
combining information. These cognitive operations are essential for problem
solving defined as an information-processing activity. In addition, the tasks
sometimes demand a certain kind of practical reasoning, which is best described
as the application of common sense or everyday knowledge.
Of course, some aspects of problem solving cannot be measured within this
approach: the dynamic aspects of task regulation (continuous processing of
incoming information, coping with processes that cannot be influenced directly,
coping with feedback and critical incidents) can only be addressed by
computer-simulated tasks (Complex Problem Solving). The motivational, affective,
and self-regulatory aspects of task regulation, although implicit in the test
tasks, can only be addressed explicitly by a questionnaire or a similar method.
Problem-solving behavior triggered by this test will depend on general,
context-specific, domain-specific, and situation-specific processes.
Nevertheless, the test is designed to tap a general (latent) competency for
analytical problem solving as an essential part of problem solving. In fact,
such a latent dimension has been established in large-scale assessments among
student populations that used the project approach (Klieme, Ebach et al.,
in press; Klieme, Funke et al., 2001), and the data of the ALL pilot study
corroborate these results, as will be reported in the next chapter.