Tuesday, March 21, 2017

Creating NGSS-Aligned Performance Tasks – Part 1

Whether you’re at the end of the unit or want to check for understanding earlier, performance tasks provide a way to gauge students’ abilities to engage in scientific thinking and use their content knowledge. It’s difficult to truly determine their depth of understanding of a concept or their ability to create scientific models and explanations through multiple choice or brief-response questions. As seen in the image to the right, people training to be astronauts don’t just answer multiple-choice questions! Performance tasks have the potential to provide more meaningful information to guide instruction and to frame feedback for students. But how do you create high-quality, NGSS-aligned tasks? Here’s one idea for a process to do so, and my next blog post will detail an example of going through this process.
  1. Determine a phenomenon – Considering the current unit, what relevant phenomenon would make students go “hmmmm”? One new resource I found that includes some fabulous phenomena comes from the California Academy of Sciences, called BioGraphic. Generally, phenomena don’t need to be earth-shattering ideas. I like having an interesting question to guide a unit, then connecting that question to large- and/or small-scale experiences and engaging stories. For example, that could be declining bat populations or dropping a bowling ball and a feather in a vacuum. Your selected phenomenon could provide the context for a performance task at the beginning, middle, or end of the unit.

  2. Work with practices – After determining a relevant phenomenon, I consider which science and engineering practice (SEP) would bring it to life and which SEP my students need more practice with. It would be great if I were collaboratively working on a particular SEP with my department, making that a natural choice. When considering practices, I would not try to assess a practice as a whole, such as analyzing and interpreting data. It’s more useful to focus on a particular subskill in order to design the task, clearly determine students’ abilities, and provide specific feedback. Handy ideas for subskills can be found within Appendix F, the progression of SEPs, and the NGSS Evidence Statements, which break down each performance expectation by subskills of each practice.

  3. Form a learning target – My primary learning target would be having students use a subskill of a science practice to work with a specific disciplinary core idea (DCI). To achieve three-dimensionality, a crosscutting concept (CCC) might be an implicit part of this learning target. Once I start framing learning targets that are three-dimensional, I start stuttering in the process of rubric creation (as noted in the last blog post). Instead, I often use two learning targets: one that connects practice and content and a second that connects content and big ideas (CCCs).

  4. Flesh out the scenario – With the goals and context of the task in mind, I begin to craft the story and related questions. Which part of the story are students exploring in this task? How does it fit into the overall storyline of the unit? My task might begin a unit, such as engaging students in data that describe concentrations of various chemicals in a nearby lake over the past 50 years. Students would go on to explore ecosystems, water chemistry, and human impacts. Crosscutting concepts are a wonderful resource for creating questions for the task, as each can be transformed into an authentic scientific question. For example, “What is the scale of the agricultural runoff problem?” Or, “What are the important inputs and outputs to consider in the sturgeons’ ecosystem?” This could be an opportunity for students to ask their own questions. Another great resource for framing questions based on the practices is the NGSS Task Formats from the Research + Practice Collaboratory. It provides a series of question templates that can be adapted to wide-ranging contexts. In the end, you’ll want to consider whether the question or series of questions in the task will move students further toward expertise in relation to performance expectations (PEs)—not that you’d have a goal of checking off proficiency in relation to PEs, but rather that you’d consider building student progress toward them through multiple authentic tasks.

  5. Create a vision of proficiency – I outline my main ideas on proficiency in my previous blog post on rubrics. For proficiency with explanations, I also like the “What, How, Why” rubric by Thompson et al. Notably, expectations for proficiency may start out a bit vague – having sample student work will help clarify what proficiency looks like, and rubrics will improve further over time. It’s a process! I also believe that teachers should reflect on whether these individual pieces of proficiency will add up to an assessment of their overall vision for students’ learning in science. Additionally, it’s important to consider whether you want evidence of individual proficiency or whether you can glean important information to guide instruction from group work. Or, can students’ self- and peer-assessment provide the critical learning at this point? Rubrics or other proficiency guidance should be accessible to students.

  6. Reflect – Both students and teachers should take time to reflect on the task. Teachers would reflect on evidence of student learning and how the task performed. Did it provide the information wanted in relation to the practice and content? Was it clear to students? Students should receive feedback sufficient to understand where they are in their learning in relation to the goals put forth. That reflection can be supported by personal, peer, or teacher feedback. A key question with all assessment is: How are you giving feedback to students, and how are they acting on that feedback? Honestly, I wouldn’t do in-depth reflection with every task; that could quickly become overwhelming. I’d recommend at least once per unit, with a range of practices throughout the year. Teachers will need at least a few common tasks and rubrics to use collaboratively and discuss throughout the year.
As noted above, my next blog post will provide an example of going through this process to create a performance task.

Image courtesy of NASA: https://www.nasa.gov/feature/simulators-give-astronauts-glimpse-of-future-flights
