博碩士論文 etd-0726112-170327 詳細資訊
Title page for etd-0726112-170327
論文名稱
Title
科學概念學習進程的發展、評量與教學:以氧化還原為例
Development, Assessment, and Instruction of Learning Progression for Scientific Concepts: An Example of Learning Oxidation-Reduction
系所名稱
Department
畢業學年期
Year, semester
語文別
Language
學位類別
Degree
頁數
Number of pages
174
研究生
Author
指導教授
Advisor
召集委員
Convenor
口試委員
Advisory Committee
口試日期
Date of Exam
2012-05-28
繳交日期
Date of Submission
2012-07-26
關鍵字
Keywords
概念理解、氧化還原、Rasch測量、可能發展區、學習進程
zone of proximal development, learning progression, oxidation-reduction, conceptual understanding, Rasch measurement
統計
Statistics
本論文已被瀏覽 5794 次,被下載 1300 次。
The thesis/dissertation has been browsed 5794 times, has been downloaded 1300 times.
中文摘要
本研究旨在結合學習進程(learning progressions)與學習可能發展區(zone of proximal development; ZPD)的理念,開發氧化還原學習進程的評量工具,並透過實施評量回饋教學界定學習的ZPD,以幫助教育現場的教與學。
首先,本研究根據BEAR(Berkeley Evaluation and Assessment Research)評量架構,發展氧化還原學習進程的評量試題。評量發展團隊成員含括6位化學、科學教育和教育評量領域學者專家,以及3位教學經驗豐富的國中和高中資深化學教師。團隊成員歷經24次的工作會議,完成28題可評量氧化還原學習進程的次序性選擇題(ordered multiple-choice; OMC)。試題預試對象為626位國中學生(國二304位、國三322位);正式施測對象為903位國三學生。
其次,研究者與2位國中理化教師透過7次的工作會議,設計評量回饋教學的流程,並完成教材的開發。最後,透過實施教學實驗,檢驗評量回饋教學對學生氧化還原概念理解的影響,並界定學習氧化還原概念的ZPD。實驗設計採用不等組前後測準實驗設計,參與對象包括高雄市3所學校的196位國二學生(男101位、女95位)。
研究主要發現:(1)BEAR評量架構和Rasch測量模式的分析方法,能開發具多元效度證據的學習進程評量工具。(2)實徵資料支持氧化還原的學習進程是「從單一概念的理解到多重概念的理解,從分散的次概念到整合的主概念。」(3)評量回饋教學能有效地增進學生的科學概念理解。(4)學習進程的評量方式有助於界定學習的ZPD,讓抽象的ZPD落實於實務教學中。
本研究的主要貢獻在於結合學習進程與ZPD的理念,提供一個落實ZPD理念於教學現場的操作範例,並呈現一套如何運用BEAR評量架構和Rasch分析技術,開發學習進程評量工具的方法,以及開發一套適用於國中階段的氧化還原學習進程評量工具和評量回饋教學教材,希冀對於科學教育研究和實務工作能有所啟發與助益。
Abstract
This study aims to develop assessments that measure learning progressions for important scientific concepts, using oxidation-reduction (redox) as an example, and to identify students' zone of proximal development (ZPD) through teaching practice that incorporates assessment feedback.
The redox assessment items were developed within the framework of the BEAR (Berkeley Evaluation and Assessment Research) Assessment System. The assessment team comprised six experts in chemistry, science education, and educational assessment, together with three experienced middle- and high-school chemistry teachers. Over 24 panel discussions, the team developed 28 ordered multiple-choice (OMC) items. Two samples of Taiwanese middle-school students participated in test development, one for item revision and the other for validation: Sample 1 consisted of 626 students (304 eighth graders and 322 ninth graders), and Sample 2 of 903 ninth graders.
The materials for instruction integrating assessment feedback were designed by the researcher and two middle-school science teachers over seven group meetings. A teaching experiment was then conducted to examine the effect of assessment feedback on students' understanding of redox and to identify their ZPD. The experiment used a nonequivalent-groups pretest-posttest quasi-experimental design; participants were 196 eighth graders (101 boys and 95 girls) from three middle schools in Kaohsiung.
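The pretest-posttest group comparison described above is typically summarized with a standardized effect size such as Cohen's d (Cohen, 1988, appears in the reference list). The following is a minimal illustrative sketch, not code or data from the thesis; all numbers are hypothetical:

```python
def cohens_d(m1, m2, s1, s2, n1, n2):
    """Cohen's d with a pooled standard deviation.
    m = group mean, s = group SD, n = group size (all hypothetical here)."""
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled_sd

# Hypothetical posttest statistics for an experimental vs. comparison group:
print(round(cohens_d(10.0, 8.0, 2.0, 2.0, 101, 95), 3))  # prints 1.0
```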
The findings showed that (a) the BEAR Assessment System and Rasch measurement approaches provided a feasible framework for developing learning-progression assessments with multiple sources of validity evidence; (b) the empirical data supported a redox learning progression that moves from the understanding of single concepts to the understanding of multiple concepts, and from discrete sub-concepts to integrated concepts; (c) teaching that integrated assessment feedback effectively facilitated students' understanding of scientific concepts; and (d) assessing learning progressions provided a mechanism for identifying students' ZPD, helping to realize the abstract idea of the ZPD in teaching practice.
The main contributions of this study are (a) demonstrating how the idea of the ZPD can be carried into teaching practice by linking learning progressions with the ZPD; (b) presenting how the BEAR Assessment System and Rasch techniques can be applied to develop tools for assessing learning progressions; and (c) providing a set of items for assessing the redox learning progression and a series of materials for teaching that integrates assessment feedback.
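Ordered multiple-choice items like those described above are commonly analyzed with a polytomous Rasch model such as Masters' (1982) partial credit model, which the reference list cites. The sketch below is illustrative only, with hypothetical threshold values; it is not the thesis's actual analysis, which the references suggest was run in dedicated software such as ConQuest or ConstructMap:

```python
import math

def pcm_probs(theta, thresholds):
    """Category-response probabilities under the partial credit model.
    theta: person ability (logits); thresholds[k]: step difficulty for
    moving from score k to k+1. Values used here are hypothetical."""
    logits = [0.0]  # cumulative sums of (theta - tau); the empty sum is score 0
    for tau in thresholds:
        logits.append(logits[-1] + (theta - tau))
    denom = sum(math.exp(v) for v in logits)
    return [math.exp(v) / denom for v in logits]

# A four-category OMC item with hypothetical step difficulties:
probs = pcm_probs(0.5, [-1.0, 0.0, 1.2])
print([round(p, 3) for p in probs])  # four probabilities that sum to 1
```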
目次 Table of Contents
論文審定書………………………………………………………… i
誌謝………………………………………………………………… ii
中文摘要……………………………………………………………iii
英文摘要……………………………………………………………iv
第一章 緒論
第一節 研究背景 ……………………………………………1
第二節 研究問題 ……………………………………………10
第三節 研究目的 ……………………………………………10
第二章 文獻探討
第一節 氧化還原概念的學習與教學 ………………………11
第二節 BEAR評量系統 ……………………………………16
第三節 評量與教學的連結 …………………………………21
第三章 研究方法
第一節 氧化還原評量試題的開發 …………………………26
第二節 評量試題的施測與分析 ……………………………32
第三節 評量回饋教學的實施流程 …………………………36
第四節 評量回饋教學實驗 …………………………………38
第四章 研究結果
第一節 預試試題品質的檢核 ………………………………47
第二節 正式測驗的信、效度 ………………………………50
第三節 評量回饋教學效果 …………………………………63
第四節 學習可能發展區之界定 ……………………………70
第五章 討論與結論
第一節 討論 …………………………………………………73
第二節 結論與建議 …………………………………………79
參考文獻 …………………………………………………………81
附錄
附錄1評量發展團隊工作歷程………………………………91
附錄2氧化還原概念圖………………………………………93
附錄3氧化還原命題陳述……………………………………94
附錄4評量回饋教學規劃歷程………………………………96
附錄5教學簡報………………………………………………98
附錄6學習診斷圖範例說明…………………………………105
附錄7學習單填寫範例說明…………………………………108
附錄8試題選項概念理階層次說明…………………………114
附錄9 評量回饋知覺問卷……………………………………141
附錄10專家審查說明…………………………………………142
附錄11二階段診斷試題………………………………………144
附錄12試題命題陳述分配表…………………………………163
參考文獻 References
王文中(2004)。選擇性題目與題組的編製。在王文中、呂金燮、吳毓瑩、張郁雯、張淑慧合著,教育測驗與評量-教室學習觀點(第二版)(頁157-199)。台北:五南。
周進洋、吳裕益(2001)。促進中學生化學反應概念及氧化還原概念改變之研究。行政院國家科學委員會研究計畫成果報告(NSC 89-2511-S-017-049)。高雄:國立高雄師範大學科學教育研究所。
邱鴻麟、周進洋(2002)。我國中學生氧化還原概念內容和迷思概念成因之研究(Ⅱ)。行政院國家科學委員會研究計畫成果報告(NSC 90-2511-S-017-022)。高雄:國立高雄師範大學化學系。
張容君、張惠博(2007)。國中學生「燃燒」概念診斷之研究。科學教育學刊,15,671-701。
教育部(2008)。國民中小學九年一貫課程綱要自然與生活科技學習領域。台北:作者。
詹耀宗、邱鴻麟(2004)。以多元觀點探討中學生氧化還原迷思概念。高雄師大學報,17,337-358。
趙匡華(1998)。化學通史(上)。新竹:凡異出版社。
Adadan, E., Trundle, K. C., & Irving, K. E. (2010). Exploring grade 11 students' conceptual pathways of the particulate nature of matter in the context of multirepresentational instruction. Journal of Research in Science Teaching, 47, 1004-1035.
Allal, L., & Ducrey, G. P. (2000). Assessment of-or in-the zone of proximal development. Learning and Instruction, 10, 137-152.
Alonzo, A. C., Briggs, D. C., & Wilson, M. (2009, April). Learning progressions for formative classroom assessment. In A. C. Alonzo (Chair), Learning progressions in science: Tools for teaching, learning, and assessment. Symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.
Alonzo, A., & Steedle, J. T. (2009). Developing and assessing a force and motion learning progression. Science Education, 93, 389-421.
Ames, C. (1992). Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology, 84, 261-271.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Barke, H. D., Hazari, A., & Yitbarek, S. (2009). Misconceptions in chemistry: Addressing perceptions in chemical education. Berlin: Springer.
Bauer, M. (2012, April). The Use of Three Learning Progressions in Formative Assessment in Middle School Mathematics. In M. Bauer (Chair), Current research on the use of learning progressions in mathematics formative assessment: Explorations and synthesis. Symposium conducted at the annual meeting of the American Educational Research Association, Vancouver, BC, Canada.
Beerenwinkel, A., Parchmann, I., & Gräsel, C. (2011). Conceptual change texts in chemistry teaching: a study on the particle model of matter. International Journal of Science and Mathematics Education, 9, 1235-1259.
Bell, B. (2007). Classroom assessment of science learning. In S. K. Abell & N. G. Lederman (Eds.), Handbook of research on science education (pp. 965-1006). Mahwah, NJ: Lawrence Erlbaum.
Bell, B., & Cowie, B. (2001). The characteristics of formative assessment in science education. Science Education, 85, 536-553.
Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94, 765-793.
Biggs, J. B., & Collis, K. F. (1982). Evaluating the quality of learning: The SOLO taxonomy. New York: Academic Press.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan, September, 9-21.
Black, P., & Wiliam, D. (1998a). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, October, 139-148.
Black, P., & Wiliam, D. (1998b). Assessment and classroom learning. Assessment in Education, 5, 7-75.
Black, P., Wilson, M., & Yao, S. Y. (2011). Road maps for learning: A guide to the navigation of learning progressions. Measurement, 9, 71-123.
Boone, W. J., Townsend, J. S., & Staver, J. (2011). Using Rasch theory to guide the practice of survey development and survey data analysis in science education and to inform science reform efforts: An exemplar utilizing STEBI self-efficacy data. Science Education, 95, 258-280.
Boone, W., & Scantlebury, K. (2006). The Role of Rasch analysis when conducting science education research utilizing multiple-choice tests. Science Education, 90, 253-269.
Bozhovich, E. D. (2010). Zone of proximal development: The diagnostic capabilities and limitations of indirect collaboration. Journal of Russian and East European Psychology, 47(6), 48-69.
Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11(1), 33-63.
Brown, A. L., & Ferrara, R. A. (1999). Diagnosing zones of proximal development. In P. Lloyd & C. Fernyhough (Eds.), Lev Vygotsky: Critical assessments, Volume III: The zone of proximal development (pp. 225-256). New York, NY: Routledge.
Brown, N. J. S., Kulikowich, J. M., & Wilson, M. (2010, April). Where do we go from here? Modeling and measuring learning progressions. In S. J. Loper (Chair), Learning progressions in science: Theoretical, curricular, assessment, and psychometric considerations. Symposium conducted at the annual meeting of the American Educational Research Association, Denver, CO.
Chan, C., Burtis, J., & Bereiter, C. (1997). Knowledge building as a mediator of conflict in conceptual change. Cognition and Instruction, 15, 1–40.
Chandrasegaran, A. L., Treagust, D. F., & Mocerino, M. (2007). The development of a two-tier multiple-choice diagnostic instrument for evaluating secondary school students’ ability to describe and explain chemical reactions using multiple levels of representation. Chemistry Education Research and Practice, 8, 293-307.
Chen, J., & Anderson, C. W. (2011, April). The reliability and validity of items in different formats in assessing K-12 students’ learning progression of carbon cycling. In K. L. Huff (Chair), Assessment of learning progressions. Paper session conducted at the annual meeting of the American Educational Research Association, New Orleans, LA.
Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science concepts. Learning and Instruction, 4, 27-44.
Chiu, M. H. (2007). A national survey of students' conceptions of chemistry in Taiwan. International Journal of Science Education, 29(4), 421-452.
Chiu, M. H., Guo, C. J., & Treagust, D. F. (2007). Assessing students' conceptual understanding in science: An introduction about a national project in Taiwan. International Journal of Science Education, 29(4), 379-390.
Claesgens, J., Scalise, K., Wilson, M., & Stacy, A. (2009). Mapping student understanding in chemistry: The perspectives on chemists. Science Education, 93, 56-85.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95, 256-273.
Dreyfus, A., Jungwirth, E., & Eliovitch, R. (1990). Applying the "cognitive conflict" strategy for conceptual change. Some implications, difficulties and problems. Science Education, 74, 555-569.
Duit, R., & Treagust, D. F. (2003). Conceptual change: A powerful framework for improving science teaching and learning. International Journal of Science Education, 25, 671-688.
Duncan, R.G., & Hmelo-Silver, C. E. (2009). Learning progressions: Aligning curriculum, instruction, and assessment. Journal for Research in Science Teaching, 46, 606-609.
Duncan, R. G., Rogat, A. D., & Yarden, A. (2009). A learning progression for deepening students’ understandings of modern genetics across the 5th–10th grades. Journal of Research in Science Teaching, 46, 655-674.
Duncan, R. G., & Tseng, K. A. (2011). Designing project-based instruction to foster generative and mechanistic understandings in genetics. Science Education, 95, 21-56.
Freidenreich, H.B., Duncan, R. G., & Shea, N. (2011). Exploring middle school students' understanding of three conceptual models in genetics. International Journal of Science Education, 33, 2323-2349.
Garnett, P. J., Garnett, P. J., & Hackling, M. W. (1995). Students' alternative conceptions in chemistry: A review of research and implications for teaching and learning. Studies in Science Education, 25, 69-96.
Garnett, P. J., & Treagust, D. F. (1992). Conceptual difficulties experienced by senior high school students of electrochemistry: Electric circuits and oxidation-reduction equations. Journal of Research in Science Teaching, 29, 121-142.
Gebhardt, E., & Adams, R. J. (2007). The influence of equating methodology on reported trends in PISA. Journal of Applied Measurement, 8, 305–322.
Gotwals, A. W., & Songer, N. B. (2010). Reasoning up and down a food chain: Using an assessment framework to investigate students' middle knowledge. Science Education, 94, 258-281.
Griffin, P. (2007). The comfort of competence and the uncertainty of assessment. Studies in Educational Evaluation, 33, 87-99.
Guzzetti, B. J., Snyder, T. E., Glass, G. V., & Gamas, W. S. (1993). Promoting conceptual change in science: A comparative meta-analysis of instructional interventions from reading education and science education. Reading Research Quarterly, 28(2), 116–159.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, 81-112.
Herrmann-Abell, C. F., & DeBoer, G. E. (2011). Using distractor-driven standards-based multiple-choice assessments and Rasch modeling to investigate hierarchies of chemistry misconceptions and detect structural problems with individual items. Chemistry Education Research and Practice, 12, 184-192.
Herron, J. D. (1996). The chemistry classroom: Formulas for successful teaching. Washington, DC: American Chemical Society.
Hewson, P. W., & Hewson, M. G. (1984). The role of conceptual conflict in conceptual change and the design of science instruction. Instructional Science, 13, 1-13.
Holland, P. W., & Wainer, H. (Eds.) (1993). Differential item functioning. Hillsdale, NJ: Lawrence Erlbaum.
Jensen, M. S., & Finley, F. N. (1995). Teaching evolution using historical arguments in a conceptual change strategy. Science Education, 79, 147–166.
Johnson, P., & Tymms, P. (2011). The emergence of a learning progression in middle school chemistry. Journal of Research in Science Teaching, 48, 849-877.
Kennedy, C. A., Wilson, M., & Draney, K. (2008). ConstructMap [Computer program]. Berkeley, CA: BEAR Center, University of California, Berkeley.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119, 254-284.
Kolen, M. J., & Brennan, R. L. (2004). Test equating, scaling, and linking: Methods and practices (2nd ed.). New York, NY: Springer.
Lee, H. S., & Liu, O. L. (2010). Assessing learning progression of energy concepts across middle school grades: The knowledge integration perspective. Science Education, 94, 665-688.
Limón, M., & Carretero, M. (1997). Conceptual change and anomalous data: A case study in the domain of natural sciences. European Journal of Psychology of Education, 12, 213-230.
Limón, M. (2001). On the cognitive conflict as an instructional strategy for conceptual change: A critical appraisal. Learning and Instruction, 11, 357–380.
Linn, R. L., & Miller, M. D. (2005). Measurement and assessment in teaching (9th ed.). Upper Saddle River, NJ: Pearson Education/ Prentice Hall.
Lipnevich, A., & Smith, J. (2009). Effects of differential feedback on students’ examination performance. Journal of Experimental Psychology: Applied, 15, 319-333.
Liu, X., & Lesniak, K. M. (2005). Students' progression of understanding the matter concept from elementary to high school. Science Education, 89, 433-450.
Liu, X. F., & Lesniak, K. (2006). Progression in children's understanding of the matter concept from elementary to high school. Journal of Research in Science Teaching, 43(3), 320-347.
Liu, X., & Boone, W. J. (Eds.) (2006). Applications of Rasch measurement in science education. Maple Grove, Minnesota: JAM Press.
Masters, G. (1982). A Rasch model for partial credit scoring. Psychometrika, 49, 359–381.
Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46, 675-698.
National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds). Washington, DC: National Academy Press.
National Research Council. (2006). Systems for state science assessment. Committee on Test Design for K-12 Science Achievement. M. Wilson & M. Bertenthal (Eds.), Board on Testing and Assessment, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.
National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Committee on Science Learning, Kindergarten through Eighth Grade. R. A. Duschl, H. A. Schweingruber, & A. W. Shouse (Eds.). Washington, DC: National Academy Press.
Novak, J. D. & Gowin, D. B. (1984). Learning how to learn. Cambridge, UK: Cambridge University Press.
Österlund, L. L., Berg, A., & Ekborg, M. (2010). Redox models in chemistry textbooks for the upper secondary school: friend or foe? Chemistry Education Research and Practice, 11, 182-192.
Pintrich, P. R., Marx, R. W., & Boyle, R. A. (1993). Beyond cold conceptual change: the role of motivational beliefs and classroom contextual factors in the process of conceptual change. Review of Educational Research, 63 , 167–200.
Pintrich, P. R., & De Groot, E. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33-40.
Plummer, J. D., & Krajcik, J. (2010). Building a learning progression for celestial motion: Elementary levels from an earth-based perspective. Journal of Research in Science Teaching,47, 768-787.
Pinarbaşi, T., Canpolat, N., Bayrakçeken, S., & Geban, Ö. (2006). An investigation of effectiveness of conceptual change text-oriented instruction on students’ understanding of solution concepts. Research in Science Education, 36, 313–335.
Piquette, J. S., & Heikkinen, H. W. (2005). Strategies reported used by instructors to address student alternate conceptions in chemical equilibrium. Journal of Research in Science Teaching, 42, 1112–1134.
Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: toward a theory of conceptual change. Science Education, 66 , 211–227.
Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen: Institute of Educational Research. (Expanded edition, 1980. Chicago: The University of Chicago Press.)
Raju, N. S., Price, L. R., Oshima, T. C., & Nering, M. L. (2007). Standardized conditional SEM: A case for conditional reliability. Applied Psychological Measurement, 31, 169-180.
Roth, K. J. (1985, April). Conceptual change learning and student processing of science texts. Paper presented at the Annual Meeting of the American Educational Research Association. Chicago, IL.
Schmidt, H. J. (1997). Students’ misconceptions-Looking for a pattern. Science Education, 81, 123-135.
Schmidt, H. J., & Volke, D. (2003). Shift of meaning and students’ alternative concepts. International Journal of Science Education, 25, 1409-1424.
Schumacker, R. E., & Smith, E. V., Jr. (2007). Reliability: A Rasch perspective. Educational and Psychological Measurement, 67, 394-409.
Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Achér, A., Fortus, D., ... Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46, 632-654.
Shepard, L. A. (2005). Linking formative assessment to scaffolding. Educational Leadership, 63, 66-70.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78, 153–189.
Songer, N. B., & Gotwals, A. W. (2012). Guiding explanation construction by children at the entry points of learning progressions. Journal of Research in Science Teaching, 49, 141-165.
Songer, N. B., Kelcey, B., & Gotwals, A. W. (2009). How and when does complex reasoning occur? Empirically driven development of a learning progression focused on complex reasoning about biodiversity. Journal of Research in Science Teaching, 46, 610–631.
Steedle, J. T., & Shavelson, R. J. (2009). Supporting valid interpretations of learning progression level diagnoses. Journal of Research in Science Teaching, 46, 699-715.
Stevens, S. Y., Delgado, C., & Krajcik, J. (2007, April). Developing a learning progression for the nature of matter. In A. C. Alonzo (Chair), Assessing understanding through the use of learning progressions. Symposium conducted at the annual meeting of the American Educational Research Association, Chicago, IL.
Stevens, S. Y., Delgado, C., & Krajcik, J. (2010). Developing a hypothetical multi-dimensional learning progression for the nature of matter. Journal of Research in Science Teaching, 47, 687-715.
Strike, K. A., & Posner, G. J. (1992) A revisionist theory of conceptual change. In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp. 147–176). New York: State University of New York Press.
Tam, H. P., & Li, L. A. (2007). Sampling and data collection procedures for the national science concept learning study. International Journal of Science Education, 29(4), 405-420.
Taştan, Ö., Yalçinkaya, E., & Boz, Y. (2008). Effectiveness of conceptual change text-oriented instruction on students' understanding of energy in chemical reactions. Journal of Science Education & Technology, 17, 444-453.
Taylor, C. S., & Nolen, S. B. (2008). Classroom assessment: Supporting teaching and learning in real classroom (2nd ed.). Upper Saddle River, NJ: Pearson Education.
Towns, M. H. (2010). Developing learning objectives and assessment plans at a variety of institutions: Examples and case studies. Journal of Chemical Education, 87, 91-96.
Treagust, D.F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10, 159–169.
Treagust, D. F., Chandrasegaran, A. L., Zain, A. N. M., Ong, E. T., Karpudewan, M., & Halim, L. (2011). Evaluation of an intervention instructional program to facilitate understanding of basic particle concepts among students enrolled in several levels of study. Chemistry Education Research and Practice, 12, 251-261.
Treagust, D. F., Jacobowitz, R., Gallagher, J. L., & Parker, J. (2001). Using assessment as a guide in teaching for understanding: A case study of a middle school science class learning about sound. Science Education, 85, 137-157.
Uzuntiryaki, E., & Geban, O. (2005). Effect of conceptual change approach accompanied with concept mapping on understanding of solution concepts. Instructional Science, 33, 311-339.
Vosniadou, S. (1994). Capturing and modeling the process of conceptual change. Learning and Instruction, 4, 45–69.
Vygotsky, L. S. (1935/1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds. and Trans.). Cambridge, MA: Harvard University Press. (Original works published 1930-1935).
Wang, W. C. (2008). Assessment of differential item functioning. Journal of Applied Measurement, 9, 387-408.
Wang, T., & Andre, T. (1991). Conceptual change text versus traditional text and application questions versus no questions in learning about electricity. Contemporary Educational Psychology, 16, 103–116.
Warbruck, F., & Stachelscheid, K. (2010). Adaptive learning environment for chemical reactions. In G. Cakmakci & M.F. Taşar (Eds.), Contemporary science education research: learning and assessment (pp. 425-429). Ankara, Turkey: Pegem Akademi.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Erlbaum.
Wilson, M. (2008a, March). Measuring progressions. In A. C. Alonzo & A.W. Gotwals (Chairs), Diverse perspectives on the development, assessment, and validation of learning progressions in science. Symposium conducted at the annual meeting of the American Educational Research Association, New York, NY.
Wilson, M. (2008b). Cognitive diagnosis using item response models. Zeitschrift für Psychologie / Journal of Psychology, 216(2), 74-88.
Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46, 716-730.
Wilson, M., & Carstensen, C. (2005). Assessment to improve learning in mathematics: The BEAR assessment system. Journal of Educational Research and Development, 1(3), 27-50.
Wolfe, E. W., & Smith, E. V., Jr. (2007a). Instrument development tools and activities for measure validation using Rasch models: Part I-Instrument development tools. Journal of Applied Measurement, 8, 97−123.
Wolfe, E. W., & Smith, E. V., Jr. (2007b). Instrument development tools and activities for measure validation using Rasch models: Part II-Validation activities. Journal of Applied Measurement, 8, 204-233.
Wright, B. D., & Linacre, J. M. (1994). Chi-square fit statistics. Rasch Measurement Transactions, 8(2), 350.
Wu, M. L., Adams, R. J., & Wilson, M. R. (2007). ConQuest [Computer software and manual]. Camberwell, Victoria, Australia: Australian Council for Educational Research.
Zohar, A., & Aharon-Kravetsky, S. (2005). Exploring the effects of cognitive conflict and direct teaching for students of different academic levels. Journal of Research in Science Teaching, 42(7), 829-855.
電子全文 Fulltext
本電子全文僅授權使用者為學術研究之目的,進行個人非營利性質之檢索、閱讀、列印。請遵守中華民國著作權法之相關規定,切勿任意重製、散佈、改作、轉貼、播送,以免觸法。
This electronic full text is licensed only for personal, non-profit searching, reading, and printing for the purpose of academic research. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
論文使用權限 Thesis access permission:自定論文開放時間 user define
開放時間 Available:
校內 Campus: 已公開 available
校外 Off-campus: 已公開 available


紙本論文 Printed copies
紙本論文的公開資訊在102學年度以後相對較為完整。如果需要查詢101學年度以前的紙本論文公開資訊,請聯繫圖資處紙本論文服務櫃台。如有不便之處敬請見諒。
Availability information for printed copies is relatively complete from academic year 102 (ROC calendar) onward. To check the availability of printed copies from academic year 101 or earlier, please contact the printed-thesis service counter of the Office of Library and Information Services. We apologize for any inconvenience.
開放時間 available 已公開 available
