But if an individual or a group of individuals performed poorly on this measure, one would be hard-pressed to understand or explain why. Were there underlying reading or language problems? Did test takers have sufficient technical knowledge to complete the tasks presented in the ICT measures? To understand what role these other domains contributed, one would have to include cognitive and technical tasks in the assessment or test. Alternatively, one might want to focus on particular ICT proficiencies (for example, how well a person can access and manage information) and their underlying cognitive and technical components. This would involve creating tasks that measured these types of skills and knowledge across the three proficiency domains. These measures would provide evidence separating literacy and technology proficiencies from ICT proficiency. Such information would be useful for constituencies such as adult basic education centers interested in diagnosing and remediating problems students are having accessing information on the Internet. A series of tasks that might be appropriate in this context is presented below (and in more detail in Appendix C).

Scenario: Following a stroke, your mother has been diagnosed with an atrial septal defect, or a hole in one section of her heart. While not an emergency, her doctor has recommended open-heart surgery to repair the hole and reduce the risk of additional strokes. You would like to find several reliable sources on the Web that recommend treatment options for this condition.

Access: Using a search engine, locate sites that have articles about holes in the heart, or atrial septal defects.

Students having trouble with this basic ICT task could be presented with related cognitive and technical tasks to help diagnose what was causing their difficulty. For example, students might be presented with multiple-choice questions asking them to select the best word or phrase to use when searching for some specified information. Included among the choices might be terms that are overly general or specific. Students having difficulty with this type of task might need practice in defining categories and efficient search strategies. In addition, very basic computer tasks, such as opening a search engine, clicking on sites, and navigating back to the search engine from those sites, might uncover technical skills requiring review or training.

Currently, there are various measures of literacy, numeracy, and problem solving being used in large-scale assessments of school-age and adult populations. There is also a measure of technical knowledge and understanding that is being used with school-age populations. These are traditional paper-and-pencil measures. No attempt has been made, however, to build computer-based tasks to measure the integration of these cognitive and technical domains or to separate out the role each plays in the development of these more generative ICT proficiencies. The panel believes that measuring ICT literacy with paper and pencil will limit the ability to assess the full domain of knowledge and skills. Valuable information will be lost if assessment tasks are not embedded in real-world settings incorporating technology. For example, the measurement of an individual's ability to search for and access information would be hindered if the assessment did not provide an opportunity to log onto the Internet or a similar type of environment.