Pillar Practices
Downloads
- Pillar Practices & New Core Principles (AERA 2024 paper)
- Radical Empathy: Radical Empathy Personas Packet
- Item Review: RTD Feedback Typology Packet, RTD Item Block Feedback Packet
- Unpacking Standards
- Item Refinement: Item Refinement Packet
- Balancing Confidence and Humility
While there is a lot to Rigorous Test Development, there are five practices that content development professionals (CDPs) rely on constantly. Each is a non-linear practice that cannot simply be proceduralized or reduced to an algorithmic set of instructions. Instead, each is built on the RTD Core Principles and expands them with its own set of principles, ideas, and goals. The pillar practices make use of various RTD procedures and tools, but their real power comes through CDPs' growing professional judgment, producing truly proficient professional practice. Over time, as CDPs hone their professional judgment, their use of each of the pillar practices necessarily grows in sophistication.
These pillar practices are the tent poles that support validity-focused item development work.
Radical Empathy
Test developers create tests for broad groups of diverse individuals, usually with the goal of supporting valid inferences about each test taker. Even when tests are intended to support inferences about groups of test takers, they require the evidence elicited by items to be valid for individual test takers. But test takers vary in an enormous number of ways. Items must be understood equivalently by the range of typical test takers and must support valid inferences across that whole range as well. This means that item developers must be able to view items through the perspectives of a range of test takers unlike themselves. We call this practice of viewing items and tests through the perspectives of multiple test takers unlike oneself radical empathy. (For more information on this pillar practice, see the Radical Empathy page of this site and/or the downloads available from the sidebar to the left.)
Unpacking Standards
Individual content standards and assessment targets are written in a brief, even pithy, form. But they are dense with meaning. It can be easy to miss distinctions or blur the lines between different standards, especially across grades or levels. It is important for CDPs to understand the depth, breadth, and meaning within any single content standard or assessment target, to understand the boundaries between them, and to recognize which aspects or facets are worth assessing. After all, valid items elicit evidence of the targeted cognition, not just any vaguely germane cognition. Hence, CDP work is constantly informed by the meaning they have unpacked from the concise language in which content standards and/or assessment targets are presented. (Download the Unpacking Standards white paper from the sidebar to the left for more information on this pillar practice.)
Item Review
So much of content development work is about reading items and analyzing what (and how) they will prompt from test takers. This practice, item review, is quite different from reading an item simply in order to respond to it. Instead, item review is an analytical process of breaking down how an item works: its relationship to the targeted cognition, and how its elements (e.g., stimuli, stem, answer options) fit together, depend upon each other, and can shape test takers' understanding of one another. To borrow a legal term, item review is often about issue spotting, which includes recognizing, naming, and understanding the issue in question. The practice of item review can be built into item refinement (the pillar practice of altering, reworking, and improving an item), but it often stands alone. That is, items are often reviewed without yet being seriously altered, as with review panels or the initial intake of items from item writers. Item review requires understanding how item hygiene issues and item content and cognition traits bear on an item's ability to elicit evidence of the targeted cognition for the range of typical test takers. Because item review is so often part of the collaborative dimension of test development, the ability to communicate analysis of item issues (i.e., feedback) is another critical piece of item review. (For more information on this pillar practice, see the Item Review page of this site and/or the downloads available from the sidebar to the left.)
Item Refinement
Item refinement is the non-linear, iterative process of revising an item until it is acceptable, that is, until the CDP believes that it can successfully elicit evidence of the targeted cognition for the range of typical test takers. Though we use the language of drafting, drafts, and revision, the pillar practice of item refinement is quite different from editing or revising more common kinds of prose writing. While in later stages item refinement can focus on getting small details right, in earlier stages it can require wholesale revamping of an item or large changes to its major elements. Item refinement is the process of recognizing issues in items (often via feedback from others), figuring out a plan or approach for addressing them, and then actually implementing those changes. It differs from more common types of editing because of the delicate and interconnected nature of items. Even minor edits to one part of an item can take it off target, create bias, undermine the usefulness of a distractor, or even change which answer option is the key. Hence, CDPs must simultaneously evaluate ideas for item alterations as solutions to particular localized problems and in light of the overall functioning of the item. CDPs must bring an understanding of the logic of the particular item and of how the item type works across different test taker strategies. They must hold the whole item in their heads as they consider the different entry points and paths through the item that test takers may attempt. This non-linear and iterative form of simultaneously holistic and detailed editing is the pillar practice of item refinement. (Download the Item Refinement white paper from the sidebar to the left for more information on this pillar practice.)
Balancing Confidence and Humility
Item development is collaborative work, as is the entire test development process. Different people contribute different expertise along the way. CDPs must balance confidence (i.e., knowing when to talk and to answer questions) with humility (i.e., knowing when to listen and to ask questions). Without appropriate confidence, they will fail to contribute to items and test development as they should. Without appropriate humility, they will prevent others from contributing as they should, while cutting themselves off from important learning opportunities. Hence, finding the appropriate balance of confidence and humility among shifting groups of colleagues and collaborators, across different stages and processes, and as one's own abilities and expertise grow is a critical ongoing practice. Failing to find the right balance hurts the quality of items and of tests. (Download the Expertise, Confidence and Humility white paper from the sidebar to the left for more information on this pillar practice.)