Monday, August 22, 2016

Creating Rubrics for Performance Tasks Aligned to NGSS – Part 2

I created the three-dimensional rubric below in an attempt to help get the ball rolling. I have honestly not yet seen a rubric whose creator claims it is three-dimensional. I’m not sure I’m there yet, so critique away! Most rubrics I’ve found focus only on the practices, which I agree is a good place to start (see the resource list at the end of this post). I would, however, like to see practices and crosscutting concepts linked to content within a rubric, so I attempted to do that here. Importantly, column three represents where a proficient student should be, while column four provides ideas for more advanced study.

Some background on this unit of study and the related performance task:

  • High school biology students are investigating ecosystems (LS2.C), human impacts on those ecosystems (LS4.D), and related pollution chemistry (PS1.B).
  • Imagining I’m still teaching… I engage the class in this unit by having them walk over to a nearby lake to make observations, ask questions, and take multiple water samples, highlighting the presence of large amounts of algae if students don’t bring it up. We meet the regional limnologist there, and she briefly shares some information about pollution in the lake system and is on hand for questions (we could alternatively Skype w/ a scientist or watch a short video detailing pollution challenges – such as this news story).
  • The next day students discuss their observations and consider how and why the ecosystem in their local lake may be changing. They model the ecosystem of the lake, detailing relationships within and across biotic and abiotic elements, including what might be causing ecosystem changes. The models provide a formative assessment of students’ modeling ability and their background understanding of ecosystems, both generally and within the lake context. After completion, class sharing and discussion of those models serve to build common background knowledge about topics such as farm runoff and other pollutants affecting the lake.
  • I want to know where students are in their ability to ask testable questions within an ecosystem modeling framework (Practice – Asking Questions; Crosscutting Concept – Systems and System Models). So, toward the end of that class I ask them to individually develop questions for studying changes to the lake ecosystem, framing those questions through the lens of the full system and available data on lake chemistry (e.g., data like this). I use the following rubric to score students’ individual responses before having them revise their questions in groups the next day.
Here are some of my considerations in crafting this rubric:
  • I developed goals for the unit first and then created the rubric in conjunction with creating the investigations within the unit. I want multiple opportunities to assess student learning in a more formal way throughout a unit, and this performance task and rubric flowed out of the progression being built. So, the goals for learning represented in the rubric were in mind throughout the process, not an afterthought.
  • Our state vision for science learning in Wisconsin comes from page one of the summary of the NRC Science Education Framework. I’d want my assessment to provide information on whether students are progressing toward that vision as well as through the NGSS progression we’d laid out for the year. The goals of this lesson – students being able to ask meaningful questions about local water pollution and its chemical impact on ecosystems – fit within those broader goals.
  • Possibly the most important resource for designing the rubric was Appendix F, the progression document for the practices. The asking-questions progression detailed for grades K-2, 3-5, 6-8, and 9-12 provided ideas for where students should be and where they’re coming from, supporting the development of the columns within the rubric. It offers a developmental progression of learning without resorting to terms like never, somewhat, and always. Specifically, based on that progression, I included having students connect questions to an analysis of data and systems.
  • Another important resource for designing the rubric was the NGSS Evidence Statements document. The evidence statements provide a concrete way to break down a practice into specific subskills, which is very useful in articulating the multiple rows of a rubric. In my case, they were most useful in suggesting that the question needs to be practicably testable (in the classroom) and relate to cause and effect.
  • Finally, I also used Appendix G, the progression document detailing the crosscutting concept of systems and system models. From this progression, I pulled the ideas of inputs and outputs within the system and of understanding the boundaries of the system to better formulate the question. So, the rubric pushes students to consider how timeframes and a narrowed focus on particular chemicals and lake inputs could lead to a better question.
  • The specific NGSS components targeted here are: SEP Ask Questions, CCC Systems and System Models, and DCIs HS-LS2.C, HS-LS4.D, and HS-PS1.B. 
  • I also wanted to focus on questioning because the NGSS performance expectations (PEs) have limited connections to the questioning practice (only two in middle school and two in high school). Because some teachers make the mistake of designing their instruction from the PEs alone, I worry students won’t have as many opportunities as they should to ask questions.
  • I used the idea of “with guidance” as part of the progression. It was a tough decision to include that. I felt that if we’re talking about a true developmental progression, the first step is often being able to do it with some help. Some students need scaffolding to get going with a skill, and they’re not going to be independent at first. So, I reflected that within this rubric.
  • Additionally, I’d want student responses to the performance task to serve as examples (anchors) of the varying levels within the rubric. I didn’t feel I could meaningfully create those on my own, so I hope to get some teachers to try this rubric, or something similar, and share anonymized samples of student work.
For the best outcomes, teachers should collaboratively create these rubrics or collaboratively refine and revise an existing rubric to meet their needs/vision. To improve instruction for all students, it’s also essential that they collaboratively review student work in light of the rubric. It won’t be perfect the first time! Teachers will have to improve the rubric over time along with other elements of their instruction based on formal and informal assessment data.

My next blog post will discuss strategies for developing NGSS-based performance tasks.

Annotated links to other resources w/ rubrics – please add a link to yours in the comments!

  • “Collaborative Inquiry into Students’ Evidence-based Explanations: How Groups of Science Teachers Can Improve Teaching and Learning” is an article by Jessica Thompson, Melissa Braaten, Mark Windschitl, et al. It provides details on how to create rubrics that detail learning progressions in terms of the what, how, and why of explanations. A sample rubric with embedded anchors of explanations, showing what student reasoning might look like, is provided.
  • The Design-Based Implementation Research team created a first draft of a rubric on the practice of scientific modeling. It provides super useful details on what constitutes effective modeling. One problem is that it’s a bit long to be practical, though perhaps portions of it could be pulled out to assess subskills. I also don’t think progressions of ability using language such as “does not,” “some,” and “all” are as straightforward as denoting what students at different levels can do.
  • The Instructional Leadership for Science Practices group provides a series of rubrics, one per practice, that can be used to evaluate student performance. There’s also another version of the rubrics that an observer could use to give teachers feedback on how the practices are being used in their classrooms. Both versions, though, tend to focus more on what students have the opportunity to do than on what they have the capacity to do.
  • Wisconsin's Marshall High School has been working on standards-based grading and created a rubric based on the practices and life sciences DCIs.
  • Arapahoe Elementary in the Adams County Five Star School District provides standards-based grading rubrics linked to NGSS. It gives a generic rubric template you’d use to plug in specifics for each particular CCC, SEP, or DCI, but it might not provide sufficient information or nuance for individual SEPs and CCCs.
  • Edutopia provides a rubric for science projects, which has some good ideas for progressions of abilities but remains fairly traditional, built from “scientific method” steps.
  • And, thanks to Cathy Boland, @MsBolandSci, for sharing a rubric for explanations through Twitter - I hope others will share too! 

4 comments:

  1. Thank you so much for the really helpful article and wealth of resources.

  2. Fantastic work! The resources you list will be immensely helpful towards writing similar rubrics for middle school.

  3. Not science specific, but here's a nice resource by Rick Wormeli on what makes a quality rubric - https://www.amle.org/BrowsebyTopic/WhatsNew/WNDet/TabId/270/ArtMID/888/ArticleID/539/Rubrics-and-Grading-Scales.aspx

  4. The Museum of Science and Industry in Chicago is supporting teachers with the curricular, instructional, and assessment transformations required to meet the expectations of the NGSS. We engage teachers in an unpacking process that identifies key ideas within DCI components; those key ideas are aggregated into Lesson Level Conceptual Pieces. Those pieces are then linked to the CCC element that best provides explanatory value. That pairing is then connected to the SEP element that provides the most appropriate engagement for students to make sense of whatever phenomenon is investigated. From that unpacking process, teachers can create lesson-level Learning Performances (~3D lesson objectives). The unpacking process and document then facilitate the creation of lesson-level assessment task rubrics based on Marzano's proficiency scales:

    https://docs.google.com/document/d/1OZ9KhJpZqBMwF17d-LN_RFEkVpsh2m-ogpKNPf9vPo8/edit?usp=sharing

    The rubric aspect of this process is still in development and in need of field testing.
