Use Case: Supporting Students in Science

Research Related to This Challenge:

“In the PISA 2006 science literacy assessment, students completed exercises designed
to assess their performance in using a range of scientific competencies, grouped and
described as ‘competency clusters.’ These clusters—identifying scientific issues,
explaining phenomena scientifically, using scientific evidence—describe sets of skills
students may use for scientific investigation. PISA 2006 provides scores on three
subscales based on these competency clusters in addition to providing a combined
science literacy score.

      •    Identifying scientific issues includes recognizing issues that are possible to
           investigate scientifically; identifying keywords to search for scientific
           information; and recognizing the key features of a scientific investigation.
      •    Explaining phenomena scientifically covers applying knowledge of science in a
           given situation; describing or interpreting phenomena scientifically and
           predicting changes; and identifying appropriate descriptions, explanations, and
           predictions.
      •    Using scientific evidence includes interpreting scientific evidence and making
           and communicating conclusions; identifying the assumptions, evidence, and
           reasoning behind conclusions; and reflecting on the societal implications of
           science and technological developments.

Combined science literacy scores are reported on a scale from 0 to 1,000 with a mean
set at 500 and a standard deviation of 100. Fifteen-year-old students in the United
States had an average score of 489 on the combined science literacy scale, lower than
the OECD average score of 500 (tables 2 and C-2). U.S. students scored lower in
science literacy than their peers in 16 of the other 29 OECD jurisdictions and 6 of the 27
non-OECD jurisdictions. Twenty-two jurisdictions (5 OECD jurisdictions and 17
non-OECD jurisdictions) reported lower scores than the United States in science literacy.”

Baldi, S., Jin, Y., Skemer, M., Green, P.J., and Herget, D. (2007). Highlights From PISA
2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in
an International Context (NCES 2008–016). National Center for Education Statistics,
Institute of Education Sciences, U.S. Department of Education. Washington, DC.










Challenge: Supporting Student Success in Science

    •    Given the 30% decline in the number of students choosing a college major in
         STEM fields (science, technology, engineering, and math), science teachers
         need to develop innovative strategies for engaging students and building their
         interest in the various branches of science.
    •    Students are struggling to identify scientific issues, explain phenomena
         scientifically, and use scientific evidence to reach conclusions (as discussed in
         the research above).
    •    Science teachers have limited classroom time to both conduct dynamic
         labs/experiments and hold meaningful follow-up collaborative discussions
         about the results.

Proposed Solutions

    •    Use Collaborize Classroom™ as a space for students to conduct follow-up
         conversations about lab results and discuss the implications and relevance of
         those results.
    •    Use Collaborize Classroom to allow students to work collaboratively to reach
         conclusions, address concerns, clarify points of confusion, make connections,
         and analyze and synthesize results.


      •    Post questions online to facilitate focused, high-quality discussions. Students
           could then use those conversations and the information gleaned to write more
           insightful, dynamic lab reports that demonstrate a thorough understanding of
           the lab/experiment and the implications of its outcome(s).
      •    Post questions online that require students to make real-world connections and
           discuss possible larger-scale applications of their findings. These extension
           questions would make the material more meaningful for students.1


Expected Results

      •    These conversations would facilitate a deeper comprehension of the scientific
           principles at work, provide the follow-up needed to engage and interest
           students, and produce a tangible outcome that could be discussed in class.
      •    Students would be more engaged in the process of performing the lab/
           experiment because they would be held accountable for their findings in the
           online forum.
      •    Students struggling with particular labs/experiments/scientific principles would
           have a supportive venue in which to have their questions and concerns
           addressed by their peers.
      •    Lab work would truly become a collaborative, team-building experience for
           students, positively impacting the classroom community and culture.




1 Wheaton Shorr, Pamela. "The Science Crisis." Scholastic. April 10, 2010. <http://www2.scholastic.com/browse/article.jsp?id=7153>.




                   Go to www.CollaborizeClassroom.com for more information
