Assessing Knowledge Integration in Science: Construct, Measures, and Evidence

Cited by: 97
Authors: Liu, Ou Lydia [1]; Lee, Hee-Sun [2]; Hofstetter, Carolyn [3]; Linn, Marcia C. [1]
Affiliations:
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] Tufts Univ, Medford, MA 02155 USA
[3] Univ Calif San Diego, San Diego, CA 92103 USA
Funding: U.S. National Science Foundation
DOI: 10.1080/10627190801968224
Chinese Library Classification (CLC): G40 [Education]
Discipline classification codes: 040101; 120403
Abstract:
In response to the demand for sound science assessments, this article presents the development of a latent construct called knowledge integration as an effective measure of science inquiry. Knowledge integration assessments ask students to link, distinguish, evaluate, and organize their ideas about complex scientific topics. The article focuses on assessment topics commonly taught in 6th- through 12th-grade classes. Items from both published standardized tests and previous knowledge integration research were examined in 6 subject-area tests. Results from Rasch partial credit analyses revealed that the tests exhibited satisfactory psychometric properties with respect to internal consistency, item fit, weighted likelihood estimates, discrimination, and differential item functioning. Compared with items coded using dichotomous scoring rubrics, those coded with the knowledge integration rubrics yielded significantly higher discrimination indexes. The knowledge integration assessment tasks, analyzed using knowledge integration scoring rubrics, demonstrate strong promise as effective measures of complex science reasoning in varied science domains.
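For context, the "Rasch partial credit analyses" named in the abstract rest on the partial credit model, a standard polytomous Rasch model; a conventional statement of that model (supplied here for reference, not taken from the article itself) is

P(X_{ni} = x) = \frac{\exp\!\big(\sum_{k=0}^{x} (\theta_n - \delta_{ik})\big)}{\sum_{h=0}^{m_i} \exp\!\big(\sum_{k=0}^{h} (\theta_n - \delta_{ik})\big)}, \qquad x = 0, 1, \ldots, m_i,

where \theta_n is the ability of student n, \delta_{ik} is the k-th step difficulty of item i, m_i is the maximum score of item i, and the sum in the numerator is taken to be zero when x = 0.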
Pages: 33-55
Page count: 23
Related papers (50 in total):
  • [21] Revisiting the construct of "relational integration" and its role in accounting for general intelligence: The importance of knowledge integration
    Hannon, Brenda
    Daneman, Meredyth
    INTELLIGENCE, 2014, 47 : 175 - 187
  • [22] Assessing the construct validity of strategic and SIC-based measures of corporate diversification
    Sambharya, RB
    BRITISH JOURNAL OF MANAGEMENT, 2000, 11 (02) : 163 - 173
  • [23] Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding
    McConnell, Tom J.
    Parker, Joyce M.
    Eberhardt, Jan
    JOURNAL OF SCIENCE TEACHER EDUCATION, 2013, 24 (04) : 717 - 743
  • [24] Tracking Truth: Knowledge, Evidence, and Science
    Sankey, Howard
    REVIEW OF METAPHYSICS, 2007, 61 (01): 158 - 159
  • [25] Examining pedagogical content knowledge: The construct and its implications for science education
    Barnett, J
    SCIENCE EDUCATION, 2003, 87 (04) : 615 - 618
  • [26] EXAMINING THE CONSTRUCT-VALIDITY OF 3 PARENTING KNOWLEDGE MEASURES USING LISREL
    HAMILTON, MA
    ORME, JG
    SOCIAL SERVICE REVIEW, 1990, 64 (01) : 121 - 143
  • [27] Assessing Teachers' Educational Knowledge Construct Specification and Validation Using Mixed Methods
    Linninger, Christina
    Kunina-Habenicht, Olga
    Emmenlauer, Simone
    Dicke, Theresa
    Schulze-Stocker, Franziska
    Leutner, Detlev
    Seidel, Tina
    Terhart, Ewald
    Kunter, Mareike
    ZEITSCHRIFT FÜR ENTWICKLUNGSPSYCHOLOGIE UND PÄDAGOGISCHE PSYCHOLOGIE, 2015, 47 (02): 72 - 83
  • [28] Assessing the Overlap of Science Knowledge Graphs: A Quantitative Analysis
    Tabita Ciuciu-Kiss, Jenifer
    Garijo, Daniel
    NATURAL SCIENTIFIC LANGUAGE PROCESSING AND RESEARCH KNOWLEDGE GRAPHS, NSLP 2024, 2024, 14770 : 171 - 185
  • [29] MECHANISMS OF ASSESSING AND FORMING KNOWLEDGE IN NATURAL-SCIENCE
    MIRSKAYA, YZ
    VOPROSY FILOSOFII, 1979, (05) : 119 - 130
  • [30] Assessing Anger Expression: Construct Validity of Three Emotion Expression-Related Measures
    Jasinski, Matthew J.
    Lumley, Mark A.
    Latsch, Deborah V.
    Schuster, Erik
    Kinner, Ellen
    Burns, John W.
    JOURNAL OF PERSONALITY ASSESSMENT, 2016, 98 (06) : 640 - 648