Open response items require students to construct a response to the question or prompt, rather than select the correct response as they would for multiple choice and true/false questions. Teachers and other school or district staff can then review and score student responses. Students may be instructed to write their answers directly on the generated answer sheet, to write them on separate paper, or to enter their responses online.
Open response items can also be aligned to a rubric that is used during scoring to measure student performance. In Schoolnet, a rubric is a table whose rows define a measured quality or skill and whose columns define performance levels for each row. When you associate a rubric with an open response item, the weights defined in the rubric for each performance level are used to calculate the points awarded and the maximum points possible for the item. When you modify a rubric for a scheduled test item, the changes are reflected only on open response items; all other associated items retain the original rubric.
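As a concrete illustration of how performance-level weights translate into points, the following minimal sketch computes an item's maximum points and a scored response's earned points. The rubric rows, performance levels, and weights are made up for the example and do not reflect Schoolnet's actual rubric data model.

```python
# Minimal sketch of rubric-based scoring for an open response item.
# The rubric rows, performance levels, and weights below are illustrative
# assumptions, not Schoolnet's actual data model.

rubric = {
    "Ideas and Content": {"Below Basic": 1, "Basic": 2, "Proficient": 3, "Advanced": 4},
    "Organization":      {"Below Basic": 1, "Basic": 2, "Proficient": 3, "Advanced": 4},
    "Conventions":       {"Below Basic": 1, "Basic": 2, "Proficient": 3, "Advanced": 4},
}

# Maximum points for the item: the sum of the highest weight in each row.
max_points = sum(max(weights.values()) for weights in rubric.values())

# Performance levels a scorer selected for one student response.
selected = {
    "Ideas and Content": "Proficient",
    "Organization": "Advanced",
    "Conventions": "Basic",
}

# Earned points: the sum of the weights for the selected levels.
earned_points = sum(rubric[row][level] for row, level in selected.items())

print(f"{earned_points} of {max_points} points")  # 9 of 12 points
```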
In TestNav, open response answers are saved automatically every 30 seconds.
The Text to Speech accommodation will read the question content, but not the student's written response.
To view or edit the following fields, click Additional Properties. For items that have been purchased from external vendors, these fields are imported. Your system may have up to three additional custom properties for items.
| Field | Description |
|---|---|
| Name | Searchable names used to locate the item in item banks. This field is required for task item types. |
| Author | The person who created the item. |
| Publisher | Vendor that wrote the item or passage, such as Equella, Kaplan, or ETS. |
| Keywords | Searchable keywords used to locate the item in item banks. |
| Anchor Item | Any item that has been flagged by a teacher or administrator for identification as an anchor item. This flag is user-defined, visible only to its creator, and may be used for a variety of purposes. |
| Question Language | The language used to author the content. The item can be used to administer assessments to students in a language other than English. This option is available for school districts with system configurations that support multiple languages. For the Text to Speech accommodation, English or Spanish pronunciation will be used based on this setting. Note that the test-level Text to Speech language setting will override the item-level Question Language setting. |
| Response Language | The language in which students respond to the question, for responses in a language other than English. This option is available for school districts with system configurations that support multiple languages. The Response Language defaults to the option selected for the Question Language, with the exception of Chinese: if Chinese is selected as the Question Language, the Response Language defaults to English, because Chinese text cannot be entered for student answers. For an illustration of these language rules, see the sketch after this table. |
| Authored Difficulty | Difficulty determined for this item after it has been field tested. |
| Batch | The batch number. This field is used by Florida only. |
| Bloom's Taxonomy | A multi-tiered scale for learning objectives used to define how well a skill or competency is learned or mastered. Represents levels of learning: Remembering, Understanding, Applying, Analyzing, Evaluating, and Creating. |
| Course ID | The course ID the item is associated with. |
| Cognitive Demand Level | Low, Moderate, or High. |
| Item Category | An attribute of open response item types. |
| Reading Level | The reading level range for this item. |
| Additional Item Identifier | A unique item number that may be provided when importing items from a publisher, such as Equella. |
| Webb | Norman Webb's Depth of Knowledge Levels: Recall, Skill/Concept, Strategic Thinking, and Extended Thinking. |
| Year | The year the test item was imported or created. |
| Hard to Measure Content Area | Indicates whether the test item is in a hard-to-measure content area, such as World Languages. Items imported through QTI Import that are flagged as hard-to-measure retain the flag when they are imported. |
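The Question Language, Response Language, and Text to Speech settings described above amount to a small set of defaulting and override rules. The sketch below summarizes them; the function names and parameters are hypothetical and are not part of Schoolnet's API.

```python
# Hypothetical helpers summarizing the language rules described in the table
# above; the names and signatures are illustrative, not Schoolnet's API.
from typing import Optional


def default_response_language(question_language: str) -> str:
    """The Response Language defaults to the Question Language, except that
    Chinese falls back to English because Chinese text cannot be entered
    for student answers."""
    return "English" if question_language == "Chinese" else question_language


def text_to_speech_language(question_language: str, test_tts_language: Optional[str]) -> str:
    """The test-level Text to Speech language setting, when set, overrides
    the item-level Question Language."""
    return test_tts_language if test_tts_language else question_language


print(default_response_language("Spanish"))           # Spanish
print(default_response_language("Chinese"))           # English
print(text_to_speech_language("Spanish", None))       # Spanish
print(text_to_speech_language("Spanish", "English"))  # English
```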