As children carry out investigations and design tasks; as they model, write, and draw; and as they reason, discuss, argue, and engage in other activities described in the previous chapters, they are learning. They are also revealing information about how they think, what they know and don’t know, and where they need more support. You can use this information to plan and adjust your instruction to better meet their needs as they progress toward learning targets—in other words, to engage in assessment.
Assessment is an integral part of instruction. You no doubt use it regularly to get feedback on learning for both individual children and the whole class, and then use that feedback to figure out your next instructional steps. In classrooms centered on actively engaging students in scientific investigation and engineering design, opportunities for assessment are all around you. Often, these opportunities are so closely intertwined with effective instructional strategies as to be inseparable.
This chapter is not about the standardized tests given at the end of a unit or school year, nor is it about grading. It is about classroom-based formative and summative assessment. This includes assessment evidence you gather through children’s discussions and written work, self/peer assessment, and more formal summative assessments. This chapter discusses and gives examples of various strategies you can readily use to better understand how your students are engaging in three-dimensional learning.
In a unit designed to help second graders learn how water shapes land, students make models, build small dams, and engage in other tasks. The following case shows how models and explanations can be embedded as a way to assess children’s understanding of disciplinary core ideas and engagement in science practices. Notice how the teacher, Ms. Vaughan, not only analyzes children’s models to assess each child’s thinking, but also organizes information from the models to advance learning for the whole class.
___________________
2 Shim et al., 2018.
The sections that follow focus on how you can support students in making their ideas and capabilities visible for purposes of assessment and instruction, with additional examples from Ms. Vaughan’s class.
Typically, assessments focus on determining what children know and understand. This type of information about children’s understanding (and misunderstanding) of disciplinary core ideas will help you plan instruction and determine what kinds of supports children need. In classrooms anchored in investigation and design, there’s an additional consideration—namely, that assessment, like instruction, should address all three dimensions of science and engineering learning.
Therefore, when you’re working with preschool and elementary children in a classroom oriented toward three-dimensional learning, assessment can help you accomplish the following purposes:
Ultimately, to enact instruction aligned to the new standards, we must think about assessment differently. First, we need to think about assessment as ongoing rather than something that happens only at the end of instruction. When formative assessments are embedded throughout science instruction, teachers can use assessment information to improve teaching and learning.
—Lorena Llosa, Scott Grapin, and Alison Haas3
Just as you provide scaffolding during instruction to help children learn, you can provide specific types of supports during assessment to help children demonstrate what they know, care about, and can do. Instruction anchored in investigation and design opens up a variety of approaches, formats, and contexts for assessment. As you weigh different assessment approaches, you might keep in mind the general suggestions in Box 6-1.
Several assessment approaches have already been mentioned throughout the book. For example, as described in Chapter 5, classroom discussions often reveal what children know and wonder about. During discussions, you can gather multiple forms of evidence by allowing children to record a word, phrase, or drawing on an individual whiteboard; use gestures; or answer in multiple languages if necessary. When you ask students to demonstrate their understanding through drawings, models, and other artifacts, you can also encourage them to label and talk about the artifacts, instead of assuming the artifact alone is enough to show their thinking. In other classroom tasks, you can make sure to allow students to express their understanding in multiple ways—not only across the scope of a unit but for individual assignments.
___________________
3 Llosa, L., Grapin, S., & Haas, A. (n.d.). Integrating science and language for all students with a focus on English language learners: Formative assessment in the science classroom [Brief 7]. SAIL, New York University. https://www.nysed.gov/sites/default/files/programs/bilingual-ed/brief-7-formative-assessment-in-the-science-classroom-a.pdf
Below are some additional examples of how you can support students in conveying their ideas as they engage in assessment tasks.
Classroom learning activities that connect with children’s home and community experiences can not only make instruction more relevant and meaningful but also serve as a source of assessment information about the interests, experiences, and prior knowledge of children and families. Here are some examples of activities that can surface important information about students’ time outside of school, which you can build on in your teaching:4
As children develop artifacts to document their thinking, they often need guidance about focus, structure, and expectations for content. For example, the value of notebooks as an assessment tool depends heavily on the guidance teachers give children about what information to include in their entries.6 Either too little or too much guidance limits what you are able to assess. Providing children with a moderate amount of support—such as giving them prompts while letting them draw and write what they learned in their own words—works better for both assessment and learning.
___________________
4 The RISE project, Home-School Connections, https://rise.as.tufts.edu/home-to-school/
5 https://rise.as.tufts.edu/wp-content/uploads/2019/10/Fruits-and-Vegetables-in-My-Home.pdf
In Figure 6-5, you can see how a four-year-old first drew the seeds of a pumpkin, rendered as circles inside the pumpkin’s outline. The child then colored over them to illustrate that the seeds are underneath the pumpkin’s flesh and skin. Around the edges, a paraprofessional wrote quotes from the child’s description of the drawing. In other words, with guidance from the teacher, the child drew what they knew, not just what they could literally see in front of them at the time.
Additionally, in an upper-elementary lesson about electric circuits, some teachers gave minimal guidance to students about what to put in their science notebooks, such as “write what you learned today.” The responses were often so lacking in detail that it was difficult to tell how much students understood. Other teachers were so prescriptive about format and content that the notebook entries looked as if students had just copied “right answers” from the board. A teacher named Gloria Diaz found the middle ground, offering a moderate degree of support. She provided enough structure that students knew what to focus on—in this case, comparing the brightness of a bulb in a series circuit versus a parallel circuit—so they had a good indication of what to represent in their notebooks. At the same time, students were encouraged to represent their ideas about the lesson in their own words, with symbols and/or diagrams. Their entries conveyed sufficient information for Ms. Diaz and others to monitor their understanding.7
Similarly, children benefit from guidance about important features to include in models and explanations. The self- and peer-assessment checklist in Figure 6-6 about elements of a landfill bottle could be used in this way. But a checklist doesn’t have to be a prepared handout. Instead, you can work with students to co-develop “gotta have it” checklists for the class. This option is suggested in the materials for the water and landforms unit that Ms. Vaughan taught in the case presented earlier in this chapter. These materials include a sample checklist with examples of elements that could be addressed in models, drawings, and explanations, such as “evidence that water can slowly and quickly change the shape of the land” and explanations of “where this particular glacial moraine comes from, what is inside [it], and why the town became flooded.”8 This sample doesn’t have to be used verbatim; you could adapt something like it to guide students in collaboratively designing their own assessment rubrics.
___________________
6 Aschbacher, P., & Alonzo, A. (2006). Examining the utility of elementary science notebooks for formative assessment purposes. Educational Assessment, 11(3-4), 179–203.
7 Aschbacher & Alonzo, 2006.
Guidance could also take the form of an “anchor chart” that displays useful strategies for a particular task. An anchor chart about science observations might include such strategies as drawing what you see, focusing on what is most important, and using close-ups and different views. Anchor charts can be revised or expanded as you make children’s work public and discuss what kind of information helps everyone in the class know what the creator of the work is thinking.
A relatively simple assessment support strategy is to make resources available in the classroom where children can refer to them as needed—and to explicitly tell students why they are there. You can also point to these resources to clarify questions that come up during discussions or group work (see Chapter 5 for more on this technique). Examples include but are not limited to anchor charts, dictionaries, and lists of key terms with illustrations. To the extent possible, these resources can be provided in multiple languages to further support multilingual learners.
Previous chapters emphasized the value of asking specific questions to guide students during discussions, investigations, and other tasks. These types of probing questions also serve an assessment purpose by bringing to light information about children’s thinking. Here are some examples:
___________________
8 All Circles of Learning Water Unit Guide, https://www.allciclesoflearning.com
[Our preschool assessment] was all observation-based, no testing. So, as you were leading the small group activity, you would just observe how the kids answer your questions . . . how are they engaging with the content? . . . So, you would look at your data and say, you know, a lot of the kids really needed extra help around x skill. And you could go back and pull that activity again, and redo it with some kids.
—Jessica Silver, a former preschool teacher and current professional development provider9
The following example from Ms. Vaughan’s class highlights how probes can bring out children’s thinking.
___________________
9 Interview, May 16, 2022.
Once you’ve gathered assessment evidence about children’s understanding and proficiency, it can still be difficult to make sense of all the information. Children’s activities, talk, and work products are complex and can be interpreted through many lenses. You need to be clear about what you’re looking for and aware of the biases that can affect your interpretations.
A classroom assessment system is, at its heart, a structured process for examining and interpreting children’s learning over time through their classroom work. Without such a process, it’s easy to get caught up in looking for textbook explanations of a phenomenon or fixating on surface features like vocabulary, neatness, or basic writing conventions. What matters most is children’s progress in developing proficiency toward specific learning goals. Therefore, a process that supports you in making inferences about children’s progress and adapting your instruction accordingly should focus on these aspects:
___________________
10 Salgado, M., & Phelps, D. (2021). Approaches to research and design: Engaging young children in science and engineering practices [Conference session]. NARST.
From the beginning of an instructional unit, you can start thinking about what types of classroom activities and student work will yield evidence of students’ progress toward designated learning goals and uncover the kinds of thinking you want to encourage.
Consider how Ms. Vaughan approached children’s modeling work at the beginning of the water and landforms unit. The decisions she made were intended not only to get evidence about students’ grasp of the targeted disciplinary core ideas and practices for the unit, but also to create opportunities for students to grapple with these ideas and apply these practices. In looking at children’s initial models, she paid less attention to whether children “correctly” understood that the water went through the porous land. In fact, assessing for correctness at that point would likely have gotten in the way of future learning. Instead, she organized these models according to their main ideas about how water and land interact, as a way to set up future investigations that would enable students to build on those ideas over time. She asked the children themselves what experiences they had had that supported or challenged these ideas. Doing so served a dual purpose: Ms. Vaughan obtained more information about student thinking and personal contexts for learning, while the children had a chance to organize their thinking and use evidence to support and refute claims.
In general, rubrics are a vital tool to help you make inferences about children’s learning. You can structure rubrics so they specify criteria for assessing students’ work along a continuum of increasing proficiency in disciplinary core ideas, science practices, or both. Once again, an example comes from Ms. Vaughan’s instruction.
Note how the criteria for evaluating the depth of children’s explanations are based on their understanding of the specific mechanisms and ideas at play in the Moncton scenario. The criteria consider children’s ability to explain what happened, attend to how it happened, and identify the specific mechanisms for why it happened. The criteria are also connected to the big ideas that students explored in the unit, such as the movement of water through land, the permeability of landforms, and the role of human-made features like dams in changing the land. These types of criteria allow the teacher to make inferences about how an individual student is progressing toward a learning goal, but they should not be used to make comparisons between students.

Assessment, like other aspects of instruction, is most effective when all children have fair and multiple means of demonstrating what they know and can do. Equitable and relevant assessment presumes that children bring important knowledge, interests, and experience from their daily lives, which you can elicit and use to inform instruction.11
Some assessment tools or tasks may not accurately reveal the full range of understanding and proficiency for all children, including those from historically marginalized groups, because the tools were constructed for more homogeneous groups of children. For example, a child who doesn’t remember the meaning of the word “plate” in its geologic sense may do poorly on a question about how mountains are formed that uses the word “plate.” The child may be thinking of a dinner plate but may nevertheless understand in some way that pieces of the earth move and collide.
In addition, when some teachers interpret information from assessments, they may judge the skills of some children differently from others based on irrelevant characteristics such as gender, race, ethnicity, disability, or English language proficiency. This, in turn, may influence how teachers facilitate and modify instruction for children.
The potential for bias in conducting and interpreting assessments is real, and reducing it requires self-monitoring and proactive steps. Many of the suggestions in earlier chapters for making instruction equitable and culturally and linguistically relevant also apply to assessment. When you’re implementing and interpreting evidence from assessments, you can address potential bias with strategies like these:
___________________
11 Bell, P., Neill, T., Stromholt, S., & Shaw, S. (2018). Making science instruction compelling for all students: Using cultural formative assessment to build on learner interest and experience [Conference session]. https://www.oercommons.org/courseware/lesson/14482/overview
An example from a semi-urban school in the Northeastern U.S. highlights strategies used by Jesse Greene, an early-career kindergarten teacher, to engage and assess his students equitably in an investigation of what worms need to survive. Many of the children are from immigrant families, and many are emergent multilingual learners. Mr. Greene, who is bilingual in English and Spanish, has thought a lot about how to effectively assess multilingual learners.
___________________
12 Brown, M., Zembal-Saul, C., & Lee, M. (2022). Formative assessment case and tools for kindergarten worm unit. Science 20/20: Bringing language learners into focus through school–university–community partnership. Centre for Educational Research and Innovation.
As the examples in this chapter illustrate, effective assessment is flexible. It takes advantage of both planned and impromptu opportunities to glean more about students’ thinking and practices. Effective assessment not only reveals learning but advances it.
If you incorporate investigation and design in your instruction to some extent, even if not yet as much as you’d like, you’re already engaging children in activities and practices that yield rich evidence for assessment. The key is to be more intentional when designing classroom activities and guiding students’ discussions and work products, so that you’re thinking from the outset about both their assessment value and their learning value.