AKA “Is everything in the rubric equally important to assess?”
This activity was co-created by Alysa Cummings and Karin Hess while working at the NJ Department of Education (1989). It has been revised and used with students and educators for many years to introduce types of scoring criteria and to provide a rationale for weighting some criteria over others. Any assessment task can have a “taste” criterion – the thing that is most important to the final product, and therefore gets more instructional attention and perhaps more scoring emphasis.
This activity introduces the use of rubrics with multiple and varied criteria. It gives teachers (and students) a concrete experience in developing and using an assessment tool, and then in transferring that learning to an examination of rubrics currently in use.
What This Activity Accomplishes
The universal appeal of chocolate chip cookies makes this task both accessible and enjoyable. The development process helps participants quantify the quality of “good enough,” which they later apply to content-specific assessment work. A deeper examination of the criteria used in the rubric leads to two key ideas for using rubrics with multiple criteria: (a) a common understanding of the purpose and intent of each criterion is needed before one begins to score student work; and (b) not all criteria are equally important, so some criteria may need to be weighted to better describe “good enough” performance.
Creating The Cookie Rubric
Groups brainstorm and record the qualities of a great chocolate chip cookie (e.g., taste, texture, number of chocolate chips, size). Once the categories (criteria) have been identified, each group defines performance levels for each criterion. I suggest breaking each criterion into four levels, starting with a description of the “proficient” level. Once proficient performance is defined by group consensus, the other three levels are described: from the lower/novice end of the range to the optimum or expert level. Descriptors are written into a group rubric so they can be displayed for the whole group. Sometimes it’s helpful to suggest adding specific examples of what raters might be looking for at each level, such as the brand of chocolate used. (Raters will not see the two cookies to be tested and scored until the rubric is completed.)
Each group rubric is then used to taste test two “blind” brands of chocolate chip cookies, cookie A and cookie B. (I try to find two similar cookies with distinguishing features, such as one crispy and one chewy, to make the rating decisions difficult/subjective.) Hand out cookies A and B but do not reveal the brands. After rating the two cookies, a spokesperson for each group states the group’s conclusions using the data/evidence recorded. Score comparisons are then made across groups: which criteria were used, how the criteria were defined, and the overall scores given to each cookie.
Conclusions will vary depending on each group’s criteria. Discuss how each rubric's criteria and descriptors influenced resulting scores. For example, is the score for cookie size as important as the score for taste?
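The weighting question can be made concrete with a small calculation. Below is a minimal sketch of weighted rubric scoring; the criteria names, weights, and scores are hypothetical examples for illustration, not values from the activity itself:

```python
# Hypothetical example: rubric criteria scored on a 4-level scale
# (1 = novice .. 4 = expert), with "taste" weighted double because
# it matters most to the final product, and "size" weighted half.
weights = {"taste": 2.0, "texture": 1.0, "chip_count": 1.0, "size": 0.5}

def weighted_score(scores, weights):
    """Weighted average of rubric scores, reported on the same 1-4 scale."""
    total = sum(weights[criterion] * score for criterion, score in scores.items())
    return total / sum(weights[criterion] for criterion in scores)

# Example ratings for the two blind-tested cookies.
cookie_a = {"taste": 4, "texture": 3, "chip_count": 2, "size": 3}
cookie_b = {"taste": 2, "texture": 4, "chip_count": 4, "size": 4}

print(round(weighted_score(cookie_a, weights), 2))  # 3.22
print(round(weighted_score(cookie_b, weights), 2))  # 3.11
```

Note that an unweighted average would favor cookie B (3.5 vs. 3.0), while weighting taste most heavily flips the ranking in favor of cookie A. This is exactly the kind of difference groups can discover when they compare their rubrics.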
Examining Criterion Types
Questions Typically Answered by Each Criterion
Process (Following the recipe)
Did the student follow the correct/appropriate steps or processes (e.g., procedures for a science investigation; data collection; measuring and recording; notetaking; developing an outline; following a routine or recipe; validating credibility of sources)? (Usually DOK 1 or 2)
Form (producing a batch of cookies)
Did the student apply correct formats and rules (e.g., handed in on time; used correct citation format; organized parts of task properly; used required camera shots/visuals; edited for grammar and usage)? (Usually DOK 1)
Accuracy of Content (product is chocolate chip cookies)
Is the answer/terminology/calculation correct? Is the relationship explained in enough detail, with elaboration/examples? Is the concept understood or accurately applied? Does the representation align with appropriate content and intended purpose? Are diagrams/representations correctly constructed and labeled? (Usually DOK 1 or 2)
Construction of New Knowledge (what you learned making cookies)
Did the student go beyond the accurate solution and correct processes to gain new insights, raise new questions and provide supporting evidence for claims or judgments made? Was an alternative solution presented with supporting evidence? (FAR Transfer - Usually DOK 3 or 4)
Impact
Did the final product achieve its intended purpose and provide supporting evidence for claims or judgments made (e.g., solved a problem; persuaded the audience; synthesized information to create a new product/performance)? (FAR Transfer - Usually DOK 3 or 4)
Applying the Cookie Taste Test to Your Rubrics
Does the assessment task for your rubric ask students to think deeply and stretch their thinking?
Do the wording of the assessment task prompt and the wording of the rubric match a range of Depth of Knowledge levels (form–accuracy–processes–knowledge production–impact)?
Are the rubric descriptors descriptive rather than subjective (e.g., good, often, rarely), leading to better scoring agreement?