Assessment of learning plays a dominant role in formal education: it determines which features of the curriculum are emphasized, shapes the pedagogic methods that teachers use with their students, and informs parents' and employers' understanding of how well students have performed. A common perception is that fair assessment applies the same mode of assessment and the same content focus to all students, the approach taken in international comparative studies of science achievement. This article examines research evidence demonstrating that the act of assessment is not neutral: different forms of assessment advantage or disadvantage groups of students on the basis of family background, gender, race, or disability. Assessment that implicitly or explicitly captures the social capital of the child serves to consolidate, not address, educational inequity. The article provides an overview of the ways in which science curriculum focus and assessment can introduce bias into the identification of student achievement. It examines the effects of changes to curriculum and assessment approaches in science, and the relationships between science assessment and the cultural context of the student. Recommendations are provided for science assessment research to address bias against different groups of students.