RESEARCH PAPER
Data Explorations: Secondary Students’ Knowledge, Skills and Attitudes Toward Working with Data
 
1. University of Hawaii at Hilo, School of Education, 200 W. Kawili St., 96720, USA
2. Cary Institute of Ecosystem Studies, Box AB, Millbrook, NY 12545, USA
 
 
Online publication date: 2019-01-21
 
 
 
 
EURASIA J. Math. Sci. Tech. Ed. 2019;15(6):em1686
 
ABSTRACT
The Data Explorations in Ecology Project was a professional development and research project designed to address data literacy in secondary science classrooms. Curricular modules focused on locally relevant environmental issues were developed and implemented to help students gain proficiency with a variety of data exploration practices. The research examined students’ knowledge, skills, and attitudes toward these practices. The findings indicate that students across grade levels can demonstrate proficiency with some data exploration practices but have difficulty applying those skills to creating and evaluating data-driven arguments about the environment. The findings also indicate that, despite these difficulties, some secondary students still enjoy working with data and appreciate its usefulness. The study introduces a conceptual framework that illustrates some of the data exploration activities occurring at different phases of inquiry and critique processes.
 
eISSN: 1305-8223
ISSN: 1305-8215