Friday, November 3, 2017

Creating NGSS-Aligned Performance Tasks – Part 2: An Example


As noted in the previous post, performance tasks provide a means to more authentically assess students’ ability to think and work like scientists (3D learning in NGSS parlance). Ideally, students shouldn’t feel like they’re “taking a test.” Authentic assessment allows them to show their learning in a meaningful context that’s part of the flow of daily instruction—it’s not a “drop everything and test” approach.

In this case, I’m going to imagine I was back teaching fifth graders and doing a physical science unit to support students in understanding properties and changes of matter (5-PS1). Below is my thinking as I designed a task, using the steps noted in the previous post on designing performance tasks.

1)   Determine a phenomenon – This unit's overarching question is, "How do substances change under different conditions?" With that in mind, I look for an engaging phenomenon for students to investigate related to this learning. I decide to have them observe and investigate a burning match (with the added benefit of supporting students in proper fire safety!). Criteria for evaluating phenomena from NSTA can help in choosing a phenomenon. I see the burning match as engaging to students, something they've likely experienced (or can experience in class), easily connected to the intended standards, and containing a bit of mystery as to what exactly is going on.

2)   Work with practices – I next decide how and whether this phenomenon can connect to the science practices I feel will bring this learning alive for students. Ideally, I'd like to engage them in practices where I know they struggle, so I can have another data point on their progress. Modeling fits the bill on both fronts. Based on ideas from Appendix F of the NGSS within the 3-5 grade band, I've already had students doing collaborative modeling and using models we've created (or I've provided) to support explanations. I decide to have them try to individually develop their own models here to describe this phenomenon; that's also a sub-skill noted in Appendix F. That means students will need some extra guidance on not getting help from others (yet), so I can get a better sense of where they're at individually.

3)   Form a learning target – In conjunction with thinking about practices, I dig further into the disciplinary core ideas (DCIs) and performance expectations (PEs) to flesh out a learning target for this assessment task. I see this work as building toward PEs 5-PS1-1 through 5-PS1-4. I also see links to the PS1.A and PS1.B DCIs: matter is made of particles too small to be seen, the amount of matter is conserved when it changes form, and when substances are mixed a new substance may be formed. Within this focus, students will be most explicitly working with the crosscutting concept of energy and matter (though others, such as patterns and scale, could also fit). My learning target thus becomes, "Students create a model to help explain what happens at the particle level when a match burns." I want them to be able to zoom in to show that we start with a mix of air and match particles and end up with different particles: smoke, ash, and water (though noticing the water won't be critical here). I also want them to use that model to help describe that properties (such as color, texture, and weight) have changed—realizing that some of that weight literally went up in smoke. Because I'm also going to see whether students can describe conservation of matter during this change, there's a formative assessment of finding weight as well (mass vs. weight is not differentiated at this grade). Conservation of matter, though, is a secondary aspect of the learning target that we'll work with more after this assessment.
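
For my own background (well beyond what I'd expect fifth graders to articulate), here's a simplified sketch of the chemistry, treating the match as mostly cellulose. This is an idealization on my part: real matches also involve sulfur and phosphorus compounds, and plenty of incomplete combustion (that's the smoke).

\[
(\mathrm{C_6H_{10}O_5})_n \;+\; 6n\,\mathrm{O_2} \;\longrightarrow\; 6n\,\mathrm{CO_2} \;+\; 5n\,\mathrm{H_2O}
\]

Non-combustible minerals stay behind as ash. This is the grown-up version of the particle story I want students to tell: match particles and air particles rearrange into new kinds of particles, and no matter appears or disappears along the way.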

4)   Flesh out the scenario – As a class, we'd discuss which substances they're starting with (the match and the air around it; I want them to consider the air, though I don't think it's necessary they come up with that on their own here) and which properties to consider. I'd lead them toward weight if it didn't come up, staying open to others as well. Students then individually find the weight of the match and make further observations of it (another formative assessment; see the sketch below). With teacher support as needed, they light the match and let it burn on a safe surface, making further observations, including touching it once it has cooled off if they choose.
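
To make the weight piece concrete, here's the kind of before-and-after arithmetic I'd expect to surface. The numbers are hypothetical, just for illustration:

\[
\underbrace{0.10\ \text{g}}_{\text{match before}} \;-\; \underbrace{0.03\ \text{g}}_{\text{remnant after}} \;=\; \underbrace{0.07\ \text{g}}_{\text{carried off as gases and smoke}}
\]

The scale reads less afterward, but no matter was destroyed. If the match were burned in a sealed container, so the air and gases were weighed too, the total weight before and after would match. That's exactly the "went up in smoke" idea I want their models to eventually support.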

Thinking about questions to ask them during this process, I look to the Research + Practice Collaboratory's NGSS Task Formats [link] and the modeling components of Appendix F for ideas. I decide to provide this instruction: "Draw a model (picture) of the match before and after it burns that helps explain what the particles are doing in the match and around the match." I also look at the CCCs for question ideas and decide to ask, "How did you show things in your model that are too small to see?" and "How does your model show the same amount of matter at the beginning and end?" In this unit, students would have previously worked with models of particles and models of particles in matter that's undergoing changes.

5)   Create a vision of proficiency – While there are several skills and content pieces in play here, I want a straightforward rubric focused on my key learning target: students being able to create a model that helps explain the particle nature of matter and that a change has taken place. I would create and use a rubric at this point, however, only if I had a clear sense of the expected elements of student proficiency. If I didn't have a sense of how to lay out this topic, skill, or way of thinking as a progression, I would instead work with Facets of Student Understanding to gather and organize students' work into categories showing what they can do and understand at this point. Those categories could later be shaped into a rubric showing a progression of abilities and understanding.

In this case, I’m planning use of a rubric. At a more advanced level, I’m looking for students showing conservation of matter in their models (the frame of the crosscutting concept); that’s something that I expect them to collaboratively begin to be able to describe, but I see it as a more advanced skill at this time at the individual level. Below is an image of what that rubric might look like. Here’s a word file of this rubric [link] and a pdf. To create the rubric I used Appendix F and the evidence statements of the NGSS, following principles described in a previous post.

Main Target: Students create a model to help explain what's happening at the particle level when a match burns.

Level 1: Student creates a model that shows visible objects (in this case, the match before and after burning) and provides some observations of those objects.

Level 2: Student's model shows a connection between visible matter and particles too small to be seen. Through before-and-after models, the student shows that a change has taken place in this phenomenon.

Level 3: Student creates a model that shows and describes both visible objects (the match) and particles too small to be seen in the air and the match. The model shows and describes that the particles before and after are different, because new substances have formed (e.g., ash and smoke). With scaffolding, the student is able to describe how there's the same amount of stuff before and after within this phenomenon.

Level 4: Student's series of models clearly describes visible and particle-level changes, providing evidence that a chemical change has taken place. Through the model, the student describes how the amount of stuff is the same before and after, even though the weight measured on the scale suggests it's less.


6)   Reflect – I would walk around with the rubric in hand, writing students' names on it, noting where each student is, and adding relevant notes about other elements of their understanding. I'd reflect on the results to determine how best to structure our next investigation(s) and who might need further scaffolding, mentoring, or other support within those investigations to build understanding of these topics and the skills of modeling. I'd also gain a sense of where we're at as a class overall. Note: I wouldn't be grading students, and they wouldn't be grading each other! That's not what formative assessment is about.

To more directly support students’ learning, I’d have them share models with a partner. Students would each share their model, talking about its components and what it means. They would then take turns discussing one another’s models/thinking in relation to the rubric. As a class we would share a few models with important learning elements, and I’d provide students time to revise their models based on that learning. We’d also revisit these models at the end of the unit, enhancing them with further learning.



Tuesday, March 21, 2017

Creating NGSS-Aligned Performance Tasks – Part 1

Whether you’re at the end of the unit or want to check for understanding earlier, performance tasks provide a way to gauge students’ abilities to engage in scientific thinking and use their content knowledge. It’s difficult to truly determine their depth of understanding of a concept or their ability to create scientific models and explanations through multiple choice or brief-response questions. As seen in the image to the right, people training to be astronauts don’t just answer multiple-choice questions! Performance tasks have the potential to provide more meaningful information to guide instruction and to frame feedback for students. But how do you create high-quality, NGSS-aligned tasks? Here’s one idea for a process to do so, and my next blog post will detail an example of going through this process.
  1. Determine a phenomenon – Considering the current unit, what relevant phenomenon would make students go "hmmmm"? One new resource I found with some fabulous phenomena is BioGraphic, from the California Academy of Sciences. Generally, phenomena don't need to be earth-shattering ideas. I like having an interesting question to guide a unit, then connecting that to large- and/or small-scale experiences and engaging stories; for example, declining bat populations or dropping a bowling ball and a feather in a vacuum. Your selected phenomenon could provide the context for a performance task at the beginning, middle, or end of the unit.

  2. Work with practices – After determining a relevant phenomenon, I consider which science and engineering practice (SEP) would bring it to life and which SEP my students need more work with. It would be great if I were collaboratively working on a particular SEP with my department, making that a natural choice. I would not try to assess a practice as a whole, such as analyzing and interpreting data; it's more useful to focus on a particular subskill in order to design the task, clearly determine students' abilities, and provide specific feedback. Handy ideas for subskills can be found in Appendix F, the progression of SEPs, and the NGSS Evidence Statements, which break down each performance expectation by the subskills of each practice.

  3. Form a learning target – My primary learning target would be having students use a subskill of a science practice to work with a specific disciplinary core idea (DCI). To achieve three-dimensionality, a crosscutting concept (CCC) might be an implicit part of this learning target. Once I start framing learning targets that are three-dimensional, I start stuttering in the process of rubric creation (as noted in the last blog post). Instead, I often use two learning targets: one that connects practice and content and a second that connects content and big ideas (CCCs).

  4. Flesh out the scenario – With the goals and context of the task in mind, I begin to craft the story and related questions. Which part of the story are students exploring in this task? How does it fit into the overall storyline of the unit? My task might begin a unit, such as engaging students in data describing concentrations of various chemicals in a nearby lake over the past 50 years; students would go on to explore ecosystems, water chemistry, and human impacts. Crosscutting concepts are a wonderful resource for creating questions for the task, as each can be transformed into an authentic scientific question. For example, "What is the scale of the agricultural runoff problem?" Or, "What are the important inputs and outputs to consider in the sturgeons' ecosystem?" This could also be an opportunity for students to ask their own questions. Another great resource for framing questions based on the practices is the NGSS Task Formats from the Research + Practice Collaboratory, which provides a series of question templates that can be adapted to wide-ranging contexts. In the end, consider whether the question or series of questions in the task will move students further toward expertise in relation to performance expectations (PEs). The goal isn't checking off proficiency on PEs; rather, it's building student progress toward them through multiple authentic tasks.

  5. Create a vision of proficiency – I outline my main ideas on proficiency in my previous blog post on rubrics. For proficiency with explanations, I also like the "What, How, Why" rubric by Thompson et al. Notably, expectations for proficiency may start out a bit vague; having sample student work will help clarify what proficiency looks like, and rubrics will improve over time. It's a process! Teachers should also reflect on whether these individual pieces of proficiency will add up to an assessment of their overall vision for students' learning in science. Additionally, it's important to consider whether you need individual proficiency or can glean enough information to guide instruction from group work. Or can students' self- and peer-assessment provide the critical learning at this point? Rubrics or other proficiency guidance should be accessible to students.

  6. Reflect – Both students and teachers should take time to reflect on the task. Teachers would reflect on the evidence of student learning and on how the task performed: Did it provide the information wanted in relation to the practice and content? Was it clear to students? Students should receive feedback sufficient to understand where they are in their learning in relation to the goals put forth; that reflection can be supported by personal, peer, or teacher feedback. A key question with all assessment is: how are you giving feedback to students, and how are they acting on that feedback? Honestly, I wouldn't do in-depth reflection with every task; that could quickly become overwhelming. I'd recommend at least once per unit, with a range of practices throughout the year. Teachers will need at least a few common tasks and rubrics to use and discuss collaboratively through the year.
As noted above, my next blog post will provide an example of going through this process to create a performance task.

Image courtesy of NASA: https://www.nasa.gov/feature/simulators-give-astronauts-glimpse-of-future-flights