Finding the Connections: A Scoping Review of Epistemic Network Analysis in Science Education

  • Open Access
  • 28-12-2024

Abstract

The article presents a scoping review of epistemic network analysis (ENA) in science education, examining its use in capturing the complexity of mental models. It discusses the shift in understanding of learning from isolated facts to interconnected knowledge, and how ENA has emerged as a robust tool to analyze these connections. The review covers the fundamental components of ENA, its applications in various contexts such as learning analytics and curriculum analysis, and the trends found in research conducted with ENA. The article highlights the potential of ENA to extend the utility of traditional assessments and to model complex phenomena in novel ways, providing a promising tool for researchers in science education.

Supplementary Information

The online version contains supplementary material available at https://doi.org/10.1007/s10956-024-10193-x.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Introduction

In the last three decades, researchers’ understanding of how people learn has shifted beyond the learning of isolated facts towards the interconnectedness of knowledge (Bransford et al., 2000; American Association for the Advancement of Science, 2011; NGSS Lead States, 2013). As such, science education has improved its understanding of student learning, and instructors have begun asking students to synthesize ideas into cognitive models or networks (e.g., systems thinking, Momsen et al., 2022). Therefore, the methodologies we use to understand students’ and instructors’ ways of knowing need to capture the complexity of these mental models.
Within education, one methodology that has emerged to capture this complexity is epistemic network analysis (ENA). ENA focuses on epistemic elements, or cognitive structures, as a unit of analysis. The connections, or co-occurrences, among these elements can be identified and visualized as networks (Shaffer et al., 2009). Therefore, ENA is a robust tool for understanding connections between people’s conceptualizations of their knowledge by modeling co-occurrences in data via quantitative measures and network visualization (Shaffer et al., 2016).
ENA draws upon quantitative ethnography, a mixed-methods approach that begins with the researcher qualitatively coding data followed by quantitative analysis to create epistemic networks that can be compared using inferential statistics (Shaffer et al., 2016). This mixed methods approach enables the ENA methodology to provide a depth of analysis and synthesis to compare large quantities of qualitative data. While much of the ENA research and publications have focused on learning analytics, researchers within science education have begun to embrace it as a useful methodology to explore the connections learners make among ideas. In this review, we will present the trends found in research conducted with ENA in science education.
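The workflow described above — qualitative coding first, quantification of connections second — can be sketched in miniature. In the following illustration, the stanzas and the codes applied to them are hypothetical, and the pair-counting stands in for what dedicated ENA tools do with far more sophistication:

```python
from itertools import combinations
from collections import Counter

# Hypothetical coded stanzas: each stanza is the set of codes
# (epistemic elements) a researcher applied to one segment of data.
stanzas = [
    {"Observation", "Evidence"},
    {"Evidence", "Theory", "Observation"},
    {"Theory", "Creativity"},
]

# A tie is recorded whenever two codes co-occur in the same stanza;
# the counts become the weighted edges of the epistemic network.
ties = Counter()
for stanza in stanzas:
    for pair in combinations(sorted(stanza), 2):
        ties[pair] += 1

print(ties[("Evidence", "Observation")])  # co-occur in two stanzas -> 2
```

The resulting tie counts are what inferential statistics would then be run on when comparing networks across groups or time points.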

Knowledge Is an Interconnected System of Elements

Drawing from Schema Theory, knowledge construction can be viewed as sets of interconnected elements that allow individuals to understand relationships across concepts (McVee et al., 2005). Schemata can build upon each other, with connections continually changing as new information is brought into the schema. diSessa (1993), diSessa and Sherin (1998), and Smith et al. (1994) conceptualized this idea as Knowledge in Pieces (KiP), which focuses on the change of less abstract schemata (phenomenological primitives) into a more abstract schema. In addition to Schema Theory and KiP, there has been an emphasis on how social interactions influence learning. In particular, social constructivism highlights the impact social interactions have on learners’ mental construction of knowledge (Palincsar, 1998). Therefore, while Schema Theory and KiP can provide insight into how learners create these mental maps, social constructivism can provide a perspective on what influences the construction of those maps.

From Network Theory to Epistemic Network Analysis

Network analysis has historically been used to understand the structure and content of relationships between elements (e.g., students, companies, words, academic papers, cognitive constructs) within the boundaries of a community (Borgatti & Ofem, 2010). Because learning is a process of connecting elements, network analysis has become an important component of many different disciplinary fields. These include teacher education (Reid et al., 2023), biology (Grunspan et al., 2014), science education (Peters-Burton et al., 2019), and social and organizational psychology (Shipilov et al., 2014). The use of network analysis allows researchers to move beyond identifying the presence and absence of variables and toward understanding how the entities are connected in specific contexts. For instance, Hora and Ferrare (2013) used network analysis techniques to highlight subtle nuances in faculty instructional practices by examining the interrelationships between actors, activities, and tasks in a classroom. Depending on the goal of the research, different types of network analyses may be used.
There are numerous types of network analysis approaches. One type is Social Network Analysis (SNA), the process of modeling the interactions among people to investigate social structures. These networks can represent the network of a single person (ego-centric) or a collective of actors, such as an organization or department (whole network). For instance, in their essay in CBE-Life Sciences Education, Grunspan and colleagues (2014) discuss how SNA can be used to understand student interactions and behavior in classroom settings (whole-class network). Alternatively, other network studies have focused on individual networks. For instance, Polizzi et al. (2021) found that teachers’ communities of practice (as described by their networks) are related to their identity development. In addition to focusing on either whole or ego-centric networks, SNA has been applied to both online and in-person settings. For instance, while Grunspan and colleagues (2014) discussed collecting network data in in-person classroom settings, other studies have turned their attention to online settings and how communities and relationships form in virtual spaces (Mulvey et al., 2020). In their study of #NGSSchat on Twitter, Mulvey and colleagues used social network analysis to model the behavior of educational stakeholders (teachers, principals, etc.) in this affinity group.
The techniques used in SNA gave rise to ENA, which focuses on capturing and exploring relationships among epistemic variables (e.g., cognition, social gaze coordination, and operative skills; Shaffer et al., 2016) and a wide variety of latent variables that can be identified via discourse analysis (e.g., identity). Similar to how SNA can model interactions among social actors in a given system, ENA can model interactions among epistemic and cognitive elements in a system. ENA has been used by researchers to investigate a variety of knowledge and skills by modeling the connections participants make among ideas (Peters-Burton et al., 2019; Shaffer et al., 2009; Zhang et al., 2019; Csanadi et al., 2018). For instance, Cheung (2020) found a misalignment between the biology curriculum and assessment. Furthermore, similar to SNA, ENA has been applied to both group and individual settings. Mulvey et al. (2021) used ENA in a multi-case study to collect individual epistemic networks of graduate students taking a Nature of Science course. Alternatively, others have utilized ENA in a collective sense, modeling the group knowledge construction processes in learning environments.

Fundamental Components of Epistemic Network Analysis

Epistemic network models consist of two main features: nodes and ties. Nodes refer to the coded elements from a data source. In ENA, data sources often consist of discourse transcripts, chat dialogues, and open-ended assessments. These data are then coded for elements to be included in the epistemic network model as nodes. Nodes are connected through different types of connections called ties. Furthermore, nodes can have characteristics of their own called attributes, which can provide more information about the functioning of the network, how an individual node is affected by the network structure, and how the network structure is affected by individual nodes. In ENA, nodes represent the epistemic or cognitive elements (e.g., NOS constructs) within a network and can be annotated with different attributes, such as being related to epistemic frames, beliefs, or knowledge structures. The diversity of elements present in a network can be used as a measure of the size of the epistemic network.
Ties (also called edges) refer to the connection between nodes. Ties represent the co-occurrence of two nodes within a stanza of a data source. These ties can be weighted (for example, there are more co-occurrences between node A and node B than between node A and node C) or unweighted. These ties can be used to better understand how epistemic elements are connected and related to one another within a schema. For instance, Peters-Burton and colleagues (2022) used ENA to compare the structural complexity of students’ and scientists’ views (i.e., schema) of the nature of science (NOS).
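The distinction between weighted and unweighted ties can be made concrete with a small sketch. The tie counts below are hypothetical; the point is only that a weighted model retains relative connection strength that a binary model discards:

```python
# Hypothetical tie weights: co-occurrence counts between coded nodes.
weighted_ties = {
    ("A", "B"): 5,  # nodes A and B co-occur in five stanzas
    ("A", "C"): 1,  # nodes A and C co-occur in one stanza
    ("B", "C"): 3,
}

# An unweighted model records only whether any co-occurrence exists.
unweighted_ties = {pair for pair, w in weighted_ties.items() if w > 0}

# The weighted model preserves relative strength: A-B is a stronger
# connection than A-C, information the unweighted model discards.
strongest = max(weighted_ties, key=weighted_ties.get)
print(strongest)  # ('A', 'B')
```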
As individuals make ties among different epistemic elements, smaller clusters can form within a larger network that can be analytically useful. The smallest potential group formed within a network is referred to as a dyad (two nodes connected), from which larger clusters can form, such as triads (three nodes connected) and cliques (multiple nodes connected) (Carolan, 2014). Once these components are put together to form a network, researchers can gain quantifiable information about the network by calculating standardized network parameters. Table 1 describes some of these standardized parameters.
Table 1
Description of standardized network parameters used in ENA
Parameter
Description
Centralization
Measures the extent to which ties in a network focus on a single node (more centralized) or multiple nodes (less centralized)
Network density
The ratio of existing ties to all possible ties. The number of possible ties in an undirected network is C = N(N − 1)/2, where C refers to the number of possible ties and N refers to the number of nodes. For instance, a network of five nodes has a maximum of 10 unique ties (not counting reciprocal ties); if only four ties were present, the network would have a density of 4/10 = 0.4, or 40%
Degree centrality
Measure of the number of ties connecting a node to other nodes
Cluster analysis
Allows for identification of sub-groups in the network. A sub-group would be a group of nodes with larger edge weights than other groups within the same network
Centroid comparison
Centroids are the arithmetic mean of edge weights. Each network can be summarized by a single centroid measure. This allows for comparisons between multiple networks
Edge weight
The frequency with which two nodes co-occur; the more frequent the co-occurrences, the larger the tie weight
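Two of the parameters in Table 1 — network density and degree centrality — reduce to simple counting on an undirected network. A minimal sketch with a hypothetical five-node network:

```python
from itertools import combinations  # not needed below, shown for symmetry with pair generation

# Hypothetical undirected epistemic network: five nodes, four ties.
nodes = {"A", "B", "C", "D", "E"}
ties = {("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")}

# Network density: actual ties divided by possible ties, where a
# network of N nodes allows N(N - 1) / 2 unique undirected ties.
possible = len(nodes) * (len(nodes) - 1) // 2  # 5 * 4 / 2 = 10
density = len(ties) / possible                 # 4 / 10 = 0.4

# Degree centrality: the number of ties incident to each node.
degree = {n: sum(n in pair for pair in ties) for n in nodes}

print(density)      # 0.4
print(degree["C"])  # 3 (C is the most central node)
```

Centroid comparison, by contrast, summarizes each whole network as the mean of its edge weights, which is what allows two networks to be compared with a single statistic.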

Research Questions

As ENA becomes more widely used as a research tool, it is important to reflect on what we can glean from ENA compared to other methodologies. Knowing how this method is being used will provide insight into its utility to the field broadly. For this review, we were interested in both practical and methodological applications of ENA in science education. By practical applications, we are referring to what we have learned from using ENA in science education. This includes an analysis of the theoretical and conceptual framing used in each study as well as a synthesis of the research findings. By methodological applications, we are referring to how ENA is being used in science education research (i.e., individual versus group design, research methods used). The purpose of this review is to identify variations found in research methodologies and applications conducted with ENA in science education. This review was guided by two research questions:
1. In what contexts has ENA been used within science education research?
2. What are the applications of ENA in science education research?

Methodology

In this study, we followed the five steps of scoping review outlined by Arksey and O’Malley (2005): (1) identifying the research question; (2) identifying relevant studies; (3) study selection; (4) charting the data; and (5) collating, summarizing, and reporting the results. Reporting of data followed the guidelines set forth by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 framework, which has been used extensively in health and social sciences as a systematic way to conduct and report findings from literature reviews (Tricco et al., 2018).

Search and Data Extraction Process

Identify the Purpose

Our study was driven by the absence of a comprehensive synthesis of ENA research within science education. Consequently, this review is aimed at systematically reviewing ENA in science education to identify variations found in research methodologies and applications of ENA.

Draft Protocol

Upon establishing the purpose and scope of the review, the authors proceeded to draft a data extraction protocol. This protocol serves as a blueprint for the review process, outlining strategies to minimize researcher bias during study selection and data extraction. The team regularly convened to refine the protocol, ensuring its adherence to predefined strategies for literature search, selection criteria, study assessment, data extraction, and a planned timetable for review execution.

Apply Practical Screen

The selection of studies (Fig. 1) was governed by inclusion and exclusion criteria aligned with the research questions and informed by prior reviews. The criteria for inclusion were as follows:
1. Articles must be written in English.
2. Studies must be published as an article in a peer-reviewed journal and available in full text. Consequently, dissertations, theses, editorials, conference abstracts, and workshop proposals were excluded.
3. The study must be empirical, involving the collection and analysis of data with an appropriate methodology and results. Reviews, theoretical works, and incomplete reports were excluded.
4. Only studies focusing on ENA in science education (or closely related fields) were included in this review.
5. Studies must have included ENA as a methodology and/or presented network models as part of their data analysis and interpretations.
Fig. 1
The study selection process

Database Search Process

For our search, we worked to identify studies within science and STEM education research journals that utilized ENA or quantitative ethnography as a methodology or framework. Three databases were selected for their comprehensive coverage of relevant research and accessibility: Google Scholar, ERIC (Education Resources Information Center), and our institutional libraries, which include a host of databases such as PsycINFO, PubMed, and those already mentioned. Additionally, 17 prominent science education journals were individually selected and scrutinized (Supplementary Table 02) until no new articles were identified, ensuring we had reached data saturation. For our search, we used several keywords and permutations of those keywords to search in the databases. Although this review focuses on science education, STEM education articles were still included at this stage because several of these articles had a primary content focus on science, life sciences, or environmental sciences. The following search formula was utilized:
  • “STEM” and “epistemic network”
  • “Science” and “epistemic network”
  • “Science” and “quantitative ethnography”
  • “STEM” and “quantitative ethnography”
  • “STEM” and “ENA”
  • “Science” and “ENA”
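The six search strings above are the pairings of two domain keywords with three methodology keywords. A quick sketch of that Cartesian product (variable names are illustrative):

```python
from itertools import product

# Reconstructing the review's search formula: every pairing of a
# domain keyword with a methodology keyword.
domains = ['"STEM"', '"Science"']
methods = ['"epistemic network"', '"quantitative ethnography"', '"ENA"']

queries = [f"{d} and {m}" for d, m in product(domains, methods)]
print(len(queries))  # 6 keyword permutations, matching the list above
```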
The search was conducted from Fall 2023 to Spring 2024. It yielded a total of 6548 articles across the selected databases: 190 from ERIC, 5804 from Google Scholar, and 554 from the university library databases. Each article was subjected to a title and abstract scan. After removing duplicates and excluding articles not focused on science education or STEM education, 114 manuscripts remained and were eligible for full-text review, which ultimately led to the identification of 19 eligible studies exclusively focused on science education. Any conflicts encountered during the review process were discussed and resolved by the authors in subsequent meetings.

Data Extraction

We extracted several key pieces of data from the studies and recorded them using a data extraction form (Supplemental Material 1). We collected data on (1) study citation information, (2) study demographics, (3) study research design, and (4) study analysis and interpretations. To reduce bias in the data extraction process, each eligible study was coded independently by two members of the research team. Any discrepancies in coding were discussed and revisions were made based on agreement reached.

Study Citation Information

To assist with data organization and management, we collected several key pieces of information about each article. The coder’s initials recorded who extracted and recorded the data from the article, serving as a way to ensure reliability in coding practices. Publication year served as a metric demonstrating how often ENA has been used in recent years.

Extracted Variables

Study Demographics

The disciplinary field was collected to ensure each study was based on science education and to compare which sub-disciplines of science education are using ENA (i.e., biology, chemistry, physics). We also extracted which conceptual and/or theoretical framing was used by each study. Finally, we extracted data about the participant samples used in each study including a description of the sample (i.e., K-12 students, undergraduates, teachers) and the sample size.

Study Research Design

We extracted data on the number and context of the research questions that were asked to deduce whether each study was causal, mechanistic, or descriptive in nature. Causal studies ask whether one variable leads to another variable (i.e., does X lead to Y?) and examine the relationships between variables, whereas mechanistic studies often ask how or why one variable leads to another, further examining causal relationships (National Research Council [NRC], 2002). Descriptive studies are exploratory in nature, typically not hypothesis-driven, and often ask “What is happening?” (NRC, 2002). We then extracted the overall research methodology (quantitative, qualitative, or mixed-methods) and research design (random sample, quasi-experimental, descriptive/interviews). Finally, we extracted the data collection level (individual, small groups, large groups/whole class) and how this data was collected (card sorts, drawings, verbal/audio recordings, etc.).

Study Analysis and Interpretations

To help answer questions related to the application and utility of ENA methodology within science education, we extracted data on the type of coding of qualitative data that occurred to establish the key elements in the network. The coding type was reported as either inductive, deductive, or both inductive and deductive. If deductive coding was used then we also extracted which frameworks were used. We also identified which reliability and statistical analyses were reported and the tools used to generate ENA network models and metrics. Reliability was extracted as either “none reported,” “Cohen’s Kappa,” or “Other (provide the description)”. Statistical analysis was recorded as “None conducted,” “parametric analysis,” “non-parametric analysis,” and “effect size reported.” Then we recorded which software program(s) were used to conduct the ENA (i.e., epistemicnetwork.org; UCINET). We recorded whether each study reported network metrics (i.e., centrality, centroid analysis, density) and which metrics were reported. Finally, our data extraction form included an open-ended question for the coders to report the justification for the use of ENA provided by each study. These were either direct quotes taken from the articles or the coders’ interpretations. All coding was discussed by the research team until a consensus was reached.
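Cohen’s kappa, one of the reliability measures extracted above, corrects the raw agreement between two coders for the agreement expected by chance. A minimal sketch with hypothetical binary codes applied to ten excerpts:

```python
# Hypothetical ratings: 1 = code present, 0 = code absent.
coder1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
coder2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

n = len(coder1)
# Observed agreement: proportion of excerpts coded identically.
observed = sum(a == b for a, b in zip(coder1, coder2)) / n

# Chance agreement, from each coder's marginal rate of coding "1".
p1_yes = sum(coder1) / n
p2_yes = sum(coder2) / n
expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)

# Kappa scales observed agreement beyond chance into [-1, 1].
kappa = (observed - expected) / (1 - expected)
print(round(kappa, 2))  # 0.58
```

Values near 1 indicate near-perfect agreement beyond chance; values near 0 indicate agreement no better than chance, which is why kappa is preferred over raw percent agreement for double-coded extraction.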

Results

See Tables 2, 3, and 4 (end of paper) for a list of selected studies and extracted variables. Our final collection of articles that met the inclusion criteria consisted of 19 published, peer-reviewed journal articles published between 2012 and 2023 (Fig. 2). Reporting of ENA as a research methodology has increased markedly: two publications appeared between 2012 and 2018, compared to 17 between 2019 and 2023. The maximum was in 2022, with five studies, followed by four studies each in 2021 and 2023, two each in 2019 and 2020, and one each in 2012 and 2015.
Table 2
References and extracted variables for selected studies
Authors, date [reference ID]
Title
Research question(s)/aim of study
Cheung and Winterbottom (2023) [1]
Students’ integration of textbook representations into their understanding of photomicrographs: epistemic network analysis
How do students with different levels of competence for visualizing photomicrographs integrate textbook representations into their understanding of a photomicrograph of villi?
Peters-Burton et al. (2023) [2]
Student, Teacher, and Scientist Views of the Scientific Enterprise: An Epistemic Network Re-analysis
1. What features characterize students’, teachers’, and scientists’ epistemic network models when their views are interpreted by the FRA framework?
2. What similarities and differences emerge across students’, teachers’, and scientists’ epistemic network models when interpreted by the FRA framework?
Fan et al. (2023) [3]
Possible future selves in STEM: an epistemic network analysis of identity exploration in minoritized students and alumni
How do minoritized students and alumni describe their identity exploration processes when asked to reflect on their possible future selves in STEM?
Mulvey et al. (2021) [4]
Making Connections: Using Individual Epistemic Network Analysis to Extend the Value of Nature of Science Assessment
1. How can we interpret changes in individual participants’ understanding of NOS through traditional VNOS analysis?
2. How might iENA extend those interpretations?
Petrovich et al. (2022) [5]
Identity exploration for maker educators: constructing meaning in after-school environmental science
To what extent did environmental educators change over time as they explored the identity of a teacher who adopts making activities in their instruction?
Wright and Delgado (2023) [6]
Generating a framework for gender and sexual diversity-inclusive STEM education
1. What are the underlying constructs of GSD-inclusive STEM teaching?
2. How do these constructs relate to one another
3. How do these constructs manifest in the different STEM disciplines?
Oshima et al. (2020) [7]
Analysis of students’ ideas and conceptual artifacts in knowledge-building discourse
1. How are learners’ ideas improved through their collaborative discourse?
2. How do learners engage in collectively improving their ideas by using their conceptual artifacts?
Omarchevska et al. (2022a, 2022b) [8]
It takes two to tango: How scientific reasoning and self-regulation processes impact argumentation quality
How do the temporal patterns associated with self-regulation and scientific reasoning processes differ in relation to argumentation quality?
Caramaschi et al. (2022) [9]
Mapping the nature of science in the Italian Physics curriculum: from missing links to opportunities for reform
How can FRA and ENA be used for analyzing NOS content in science curricula?
Bressler et al. (2019) [10]
Using Epistemic Network Analysis to Examine Discourse and Scientific Practice During a Collaborative Game
1. How does scientific practice evolve during the course of gameplay?
2. Which elements of collaborative discourse support the development of scientific practice?
Cheung (2020) [11]
Exploring the Inclusion of Nature of Science in Biology Curriculum and High-Stakes Assessments in Hong Kong
1. Which NOS categories are represented in the Hong Kong biology curriculum?
2. Which NOS categories are selected and how are NOS categories examined in the Hong Kong biology high-stakes assessments? Are these NOS categories different from those represented in the Hong Kong biology curriculum?
Li et al. (2022) [12]
Examining the Interplay between Self-regulated Learning Activities and Types of Knowledge within a Computer-Simulated Environment
1. Is task complexity associated with students’ use of SRL activities, types of knowledge, and their interplay during clinical problem-solving within a computer-simulated environment?
2. Are there any differences in the SRL activities and types of knowledge between high and low performers?
Peters-Burton et al. (2023) [13]
Extending the Utility of the Views of Nature of Science Assessment through Epistemic Network Analysis
1. To what extent did the group understand selected NOS concepts pre- and post-course, as indicated by the “traditional” rating of participant VNOS responses?
2. In what ways did the group make connections among NOS concepts, as illustrated by an ENA model derived from participants’ pre- and post-course VNOS responses?
3. How do the results for traditional VNOS analysis and ENA compare?
4. What are the benefits and drawbacks of administering the VNOS assessment with an ENA extension?
Omarchevska et al. (2022a, 2022b) [14]
Do Video Modeling and Metacognitive Prompts Improve Self-Regulated Scientific Inquiry?
1. Can video modeling and metacognitive prompts improve scientific reasoning ability?
2. What are the immediate effects of video modeling and metacognitive prompts while working on an inquiry training task at the product level (hypothesis and argumentation quality) and process level (scientific reasoning and self-regulation)?
3. Do the effects of video modeling and metacognitive prompts on scientific reasoning products and processes transfer to a novel task?
Rachmatullah and Wiebe (2022) [15]
Building a computational model of food webs: Impacts on middle school students’ computational and systems thinking skills
1. What are the differences between paper-based pictorial and computational modeling activities regarding their impact on middle school students’ systems thinking when learning about food web concepts?
2. To what extent do paper-based pictorial and computational modeling activities differ in improving students’ CT skills?
3. Is there any correlation between systems thinking and CT skills?
Peters-Burton (2015) [16]
Outcomes of a Self-Regulated Learning Curriculum Model: Network Analysis of Middle School Students’ Views of Nature of Science
1. What are the shifts in the connections among ideas about NOS for the student group before and after a yearlong course that used a curricular model based on self-regulated learning?
2. What are the influences of the yearlong curriculum based on self-regulated learning theory on the ways student ideas about NOS shifted before and after the course?
Gao et al. (2023) [17]
Investigating the Nature of Science in Reformed Chinese Biology Curriculum Standards
1. Based on the FRA framework, what NOS content is required in the Chinese biology curriculum standards?
2. Are the descriptions and requirements of the NOS coherent within and across the Chinese biology curriculum standards?
Bodin (2012) [18]
Mapping university students’ epistemic framing of computational physics using network analysis
1. What are the students focusing on, in terms of knowledge and beliefs, when describing a numerical problem-solving task, before and after doing the task?
2. What role does physics knowledge take when students describe a computational physics problem-solving situation?
3. Are the proposed epistemic networks of the learning situation in this study useful for describing epistemic framing?
Chang and Tsai (2023) [19]
Epistemic Network Analysis of Students’ Drawings to Investigate Their Conceptions of Science Learning with Technology
What would be the overall structure of the students’ conceptions of science learning with technology as expressed through students’ drawings? (the research question addressed with ENA)
Note: Data is split across three tables for organization and keeping all data in a legible format
Table 3
References and extracted variables for selected studies
Authors, date [reference ID]
Disciplinary field (content area)?
Study context
Location
Participant description
Sample size
Conceptual or theoretical framing
Cheung and Winterbottom (2023) [1]
Life sciences/biology
Formal (in class)
Europe
9–12 students (high school/secondary)
12
Modified model of integration of texts and pictures
Peters-Burton et al. (2023) [2]
Life sciences/biology, chemistry, physics/astronomy/earth science (NOS)
Formal (in class)
United States
6–8 students (middle school/secondary), university faculty, in-service secondary teachers
113
Nature of science
Fan et al. (2023) [3]
Engineering, biochemistry
Informal (afterschool)
United States
Undergraduate majors in content area
21
Identity
Mulvey et al. (2021) [4]
Other: teacher Ed NOS course
Formal (in class)
United States
Graduate students, in-service elementary teachers, in-service secondary teachers
3
Nature of science
Petrovich et al. (2022) [5]
Environmental science, maker education
Informal (afterschool)
United States
In-service secondary teachers
3
Identity
Wright and Delgado (2023) [6]
STEM
Literature review
Not applicable
Artifacts/literature
13
Gender and sexual diversity—inclusive education
Oshima et al. (2020) [7]
Life sciences/biology
Formal (in class)
Asia
9–12 students (high school/secondary)
35
Structure-behavior-function
Omarchevska et al. (2022a, 2022b) [8]
Life sciences/biology
Formal (in class)
Europe
Undergraduate non-majors (not in the content area), undergraduate majors in the content area
30
Self-regulated learning, argumentation, scientific reasoning
Caramaschi et al. (2022) [9]
Physics/astronomy/earth science
Curriculum/standards
Europe
Artifacts/literature
1
Nature of science
Bressler et al. (2019) [10]
Other: forensic science
Formal (in class)
United States
6–8 students (middle school/secondary)
12
Collaborative gaming
Cheung (2020) [11]
Life sciences/biology
Curriculum/standards
Asia
Artifacts/literature
1
Nature of science
Li et al. (2022) [12]
Other: medical science
Formal (in class)
North America
Med school students
34
Self-regulated learning
Peters-Burton et al. (2019) [13]
Other: NOS/general science
Formal (in class)
United States
In-service elementary teachers, preservice secondary
23
Nature of science
Omarchevska et al. (2022a, 2022b) [14]
Other: general science
Formal (in class)
Europe
Undergraduate non-majors (not in the content area), undergraduate majors in the content area
127
Self-regulated learning, argumentation, scientific reasoning
Rachmatullah and Wiebe (2022) [15]
Life sciences/biology
Formal (in class)
Asia
6–8 students (middle school/secondary)
751
Systems thinking
Peters-Burton (2015) [16]
Other: NOS course
Formal (in class)
United States
6–8 students (middle school/secondary)
40
Nature of science
Gao et al. (2023) [17]
Life sciences/biology
Curriculum/standards
Asia
Artifacts/literature
82
Nature of science
Bodin (2012) [18]
Physics/astronomy/earth science
Formal (in class)
United States
Undergraduate majors in content area
6
Epistemic framing
Chang and Tsai (2023) [19]
Environmental science
Formal (in class)
Asia
9–12 students (high school/secondary)
95
Community of inquiry (coi), technology-supported learning environments
Note: Data is split across three tables for organization and keeping all data in a legible format
Table 4
References and extracted variables for selected studies
Authors, date [reference ID] | Research design | Data collection source (for generating networks) | Network model type | Network model use | ENA analysis tool | Network metrics collected
Cheung and Winterbottom (2023) [1] | Between-sample comparison (i.e., comparing two groups) | Verbal discourse, interviews | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Edge weight
Peters-Burton et al. (2022) [2] | Between-sample comparison (i.e., comparing two groups) | Written discourse, researcher-created survey, card sort | Collective | Multiple (comparison/changes) | UCINET | Network density, node centrality, node clustering, edge weight
Fan et al. (2023) [3] | Between-sample comparison (i.e., comparing two groups) | Interviews | Collective | Multiple (comparison/changes) | Unclear | Centroid comparison, edge weight
Mulvey et al. (2021) [4] | Case study | Validated survey | Individual | Multiple (comparison/changes) | UCINET | Cluster analysis
Petrovich et al. (2022) [5] | Within-sample comparison (i.e., pre-post design within the same sample) | Researcher-created surveys, participant reflections, researcher observations | Individual | Trajectories (more than 2 time points) | epistemicnetwork.org/ | Edge weight, centroid comparisons
Wright and Delgado (2023) [6] | Literature review | Curriculum/assessment documents/research articles | Collective | Stand-alone (no comparison) | Flourish | Edge weight
Oshima et al. (2020) [7] | Quasi-experimental, within-sample comparison (i.e., pre-post design within the same sample) | Verbal discourse | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Degree centrality
Omarchevska et al. (2022a, 2022b) [8] | Between-sample comparison (i.e., comparing two groups) | Verbal discourse, researcher-created survey, drawings | Collective, individual | Multiple (comparison/changes) | epistemicnetwork.org/ | Edge weight, centroid comparisons, cluster analysis
Caramaschi et al. (2022) [9] | Content analysis | Curriculum/assessment documents/research articles | Collective | Stand-alone (no comparison) | epistemicnetwork.org/ | Centralization
Bressler et al. (2019) [10] | Random assignment at the individual level | Verbal discourse | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Edge weight, centroid comparisons
Cheung (2020) [11] | Content analysis | Curriculum/assessment documents/research articles | Collective | Stand-alone (no comparison) | epistemicnetwork.org/ | Edge weight
Li et al. (2022) [12] | Random assignment at the individual level, between-sample comparison (i.e., comparing two groups) | Verbal discourse | Collective | Multiple (comparison/changes), trajectories (more than 2 time points) | epistemicnetwork.org/ | Centroid comparisons, edge weight
Peters-Burton et al. (2019) [13] | Descriptive or exploratory in nature | Card sort data | Collective | Multiple (comparison/changes) | UCINET | Degree centrality, cluster analysis, edge weight
Omarchevska et al. (2022a, 2022b) [14] | Random assignment at the individual level | Written discourse, researcher-created survey, validated survey, verbal discourse | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Centroid comparisons
Rachmatullah and Wiebe (2022) [15] | Quasi-experimental | Researcher-created survey, instructor-created course assessment, validated survey | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Centroid comparisons
Peters-Burton (2015) [16] | Descriptive or exploratory in nature | Researcher-created survey, card sort | Collective | Multiple (comparison/changes) | UCINET | Cluster analysis
Gao et al. (2023) [17] | Content analysis | Curriculum/assessment documents/research articles | Individual | Multiple (comparison/changes) | epistemicnetwork.org/ | Centroid comparisons, edge weight
Bodin (2012) [18] | Quasi-experimental | Interviews | Collective | Multiple (comparison/changes) | Unclear | Edge weight, node density
Chang and Tsai (2023) [19] | Between-sample comparison (i.e., comparing two groups) | Drawings | Collective | Multiple (comparison/changes) | epistemicnetwork.org/ | Centroid comparisons
Note: Data is split across three tables for organization and keeping all data in a legible format
Fig. 2
Number of published research articles in science education that used ENA between 2012 and 2023

RQ1: In What Settings has ENA been Used within Science Education Research?

Disciplinary Field, Study Context, and Geographic Location of Studies

The majority of studies were conducted in North America, with the bulk of those in the United States (n = 8 [42.1%]); one study was coded only at the continent level (North America, n = 1 [5.3%]). This was followed by Asia (n = 5 [26.3%]) and Europe (n = 4 [21.1%]), respectively (Fig. 3A). Most studies were conducted in formal learning environments (n = 13 [68.4%]), which highlights the need to explore the use of ENA in informal and other settings (Fig. 3B). There were some unique uses of ENA, including analyses of standards documents, curricula, and assessments (Fig. 3B), which suggests that ENA can be a powerful tool for individuals involved in policy decision-making and research. The majority of studies were in the life sciences, followed by the "Other" category, which consisted of five studies (NOS n = 3, forensic science n = 1, medical school n = 1; Fig. 3C). Because forensic science and medicine are closely related to biology/life sciences, the predominance of life sciences research is further inflated.
Fig. 3
Analysis of geographic location (A), study context (B), and disciplinary field (C) by research study. Total sample n = 19. The first donut chart (A) shows the relative frequency of research articles by geographic location (Asia, Europe, North America, United States, or not specified). The second donut chart (B) shows the relative frequency of research articles by study context (curriculum/standards, formal, informal, and literature review). The third donut chart (C; on the right) shows the relative frequency of research studies by disciplinary field (biochemistry, environmental science, general science, life sciences/biology, maker education, physics/astronomy/earth science, STEM, or other)

Description of Participants and Samples Studied

Figure 4 shows the frequency of participant group samples in the selected studies. Our sample included studies of undergraduate STEM majors (n = 4 [21.1%]), middle school students (n = 4 [21.1%]), and high school students (n = 4 [21.1%]). Furthermore, several studies were categorized as document/content analyses of standards or curricula (n = 4 [21.1%]). These were followed by in-service secondary teachers (n = 3 [15.8%]), in-service elementary teachers (n = 2 [10.5%]), and undergraduate non-STEM majors (n = 2 [10.5%]). A single study each sampled university faculty, pre-service secondary teachers, medical students, graduate students, and program alumni. Concerning sample size, the majority of studies had between 1 and 20 (n = 7 [36.8%]) or 21 and 40 (n = 6 [31.6%]) participants; relatively few had more than 40 (n = 6 [31.6%]) (Fig. 5).
Fig. 4
Number of research articles by participant descriptions. Note that some studies included multiple samples and therefore are counted twice; therefore, the total sum of studies presented here is greater than 19 (> 100%) of the sample
Fig. 5
Range of sample sizes included in the selected research articles. Most studies had sample sizes of less than 100 participants

RQ2: What are the Applications of ENA in Science Education Research?

To assess what ENA has been used to study, we grouped studies into one of four categories based on their theoretical or conceptual framework and scope: Nature of Science, Identity, Student and Teacher Learning, and Curriculum and Standards. We used the theoretical and conceptual framing codes to categorize each study. Note that percentages do not add up to 100% because some studies reported more than one theoretical or conceptual framework.

Nature of Science

Analysis of theoretical and conceptual framing highlighted that several studies (n = 7 [36.8%]) used the nature of science (NOS) as a conceptual framework (2, 4, 9, 11, 13, 16, 17). Aspects of NOS were used as coding frameworks for the elements in the network models. Two established NOS frameworks in science education were used: the Family Resemblance Approach (Erduran & Dagher, 2014) and the aspects of NOS described by Lederman (2004). In one study (Peters-Burton et al., 2022), the NOS framework served as an interpretive lens: network outcomes were compared to previously validated measures of views of NOS in order to contrast students' views of NOS with those of scientists. This is in contrast to other studies that compared alignment between standards and curriculum for NOS (9, 11).

Complex Phenomena: Identity and Systems Thinking

Two studies reported data on how learners construct an identity. In study #3, the authors explored how minoritized students navigate identity development through the lens of possible selves. In study #5, researchers used the Projective Reflection framework to study how teachers explored their identity development during an after-school professional development program focused on maker education. Finally, two studies were framed using systems thinking (7, 15). One study in particular used a structure-behavior-function framework to analyze students' connections between concepts related to the human immune system (15). Using ENA to explore systems-level thinking allowed researchers to go beyond traditional coding and counting techniques (Csanadi et al., 2018) and to explore relationships between elements within a system through co-occurrences.

Student and Teacher Learning

Seven studies were categorized as focusing on student or teacher learning. Three studies focused on capturing elements of students' self-regulated learning (8, 12, 14). One study examined how students reason about and interpret graphical representations, using a modified integration of text and pictures framework (1). Two studies were framed around argumentation practices and scientific reasoning (8, 14); both also included self-regulation as a framework and are counted in the number above. One study explored students' conceptions of science learning with technology (19).

Document Analysis

Apart from human subjects-based research, researchers used the ENA methodology to explore connections among elements in documents. We identified four studies that examined relationships among curriculum, standards, and assessment documents (6, 9, 11, 17). Three of these were specifically focused on the connections among NOS curriculum, standards, and assessments (9, 11, 17). One study (6) inductively analyzed themes in the gender and sexual diversity (GSD) inclusive education and equity literature in STEM education and used ENA to identify the connections among these themes to generate a framework for GSD‐inclusive STEM teaching.

Qualitative Data Source

As shown in Fig. 6, many studies used verbal discourse (n = 7 [36.8%]) or researcher-created surveys (n = 5 [26.3%]) as the primary qualitative data source for generating ENA network models. These were followed by interviews (n = 4 [21.1%]) and by validated surveys, curriculum/assessment documents/research articles, and card sort data (n = 3 each [15.8%]). Two studies used participants' drawings and written responses. Finally, researcher observations, participant reflections, and assessments each appeared in one study (n = 1 [5.3%]). This pattern suggests that ENA data may be more challenging to collect and analyze from open-ended sources (i.e., reflections, observations) than from verbal discourse and researcher-generated instruments.
Fig. 6
Frequency of studies by type of data collection source

Network Modeling

Network models can be used in many ways. We found that many studies used network models to discuss group-level network data (n = 15 [79%]), as opposed to individual-level (n = 3 [16%]) or a combination of individual- and group-level (n = 1 [5%]) (Fig. 7). Additionally, to compare model results, most studies used multiple networks to highlight differences between connections made and nodes present (n = 15 [79%]; Fig. 8). There was only one study that looked at longitudinal changes within a network (Fig. 8).
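The collective-versus-individual distinction can be made concrete with a small sketch. The Python code below is a simplified illustration, not the implementation of any published ENA tool; the participants, segments, and codes are invented. It builds one co-occurrence network per participant (the iENA-style unit) and then sums them into a collective network:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded data: for each participant, a list of talk segments,
# each coded for which epistemic elements are present.
coded_segments = {
    "student_A": [{"claim", "evidence"}, {"evidence", "reasoning"}],
    "student_B": [{"claim", "reasoning"}, {"claim", "evidence", "reasoning"}],
}

def cooccurrence_network(segments):
    """Count how often each pair of coded elements co-occurs in a segment."""
    edges = Counter()
    for codes in segments:
        for pair in combinations(sorted(codes), 2):
            edges[pair] += 1
    return edges

# Individual-level networks: one edge-weight table per participant.
individual = {p: cooccurrence_network(s) for p, s in coded_segments.items()}

# Collective network: sum edge weights across all participants.
collective = Counter()
for network in individual.values():
    collective.update(network)

print(collective[("claim", "evidence")])  # prints 2 (once per student)
```

In published ENA work, these raw co-occurrence counts are additionally normalized and projected into a shared low-dimensional space so that models can be compared statistically; the counting step shown here is only the starting point.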
Fig. 7
Relative frequency of research articles by data collection level. N = 19. Collective indicates network models were reported as a summation of a group of individuals. Individual indicates network models were reported at the individual participant level
Fig. 8
Relative frequency of network analysis design. N = 19. Multiple refers to the study reporting multiple network models (i.e., multiple individual models). Stand-alone refers to studies that reported only a single network model. Trajectories refer to studies that reported multiple networks but specifically compared changes across time (i.e., pre-post design)

Network Model Metrics Reported

We found that the most utilized network metrics were centroids (n = 9 [47.4%]) and edge weight (n = 12 [63.2%]) (Fig. 9). Due to variation in how these metrics were reported, some studies qualitatively compared edge weight and connections. Cluster analysis was utilized by three studies to identify subgroups within a network. For instance, Peters-Burton et al. (2023) used cluster analysis within the overall network models to identify how NOS constructs were grouped together within students’ epistemic frames. Finally, only five of the studies used more traditional network measures, such as centrality measures and density.
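For readers less familiar with these measures, the sketch below shows how two of them are computed from a weighted edge list. The data are hypothetical, and the functions are hand-rolled textbook definitions of degree centrality and density rather than calls to any particular network library:

```python
# Hypothetical weighted edge list for a small epistemic network.
edges = {
    ("claim", "evidence"): 5,
    ("claim", "reasoning"): 2,
    ("evidence", "reasoning"): 3,
    ("reasoning", "modeling"): 1,
}
nodes = sorted({node for pair in edges for node in pair})

def degree_centrality(nodes, edges):
    """Fraction of the other nodes that each node is connected to."""
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return {n: d / (len(nodes) - 1) for n, d in degree.items()}

def density(nodes, edges):
    """Edges present divided by edges possible in an undirected network."""
    possible = len(nodes) * (len(nodes) - 1) / 2
    return len(edges) / possible

total_edge_weight = sum(edges.values())              # 11
print(degree_centrality(nodes, edges)["reasoning"])  # 1.0: linked to all others
print(round(density(nodes, edges), 2))               # 0.67: 4 of 6 possible edges
```

Edge weight captures how strongly two elements are connected, while centrality and density summarize a node's position and the overall connectedness of the network, which is why the two kinds of metrics answer different research questions.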
Fig. 9
Analysis of network metrics reported in studies

Discussion

ENA began as a research method to model and visualize coded discourse data (Shaffer et al., 2009). Since then, it has emerged as a research tool for scholars addressing questions about how the components of a system relate to one another. In their review of ENA, Elmoazen and colleagues (2022) discussed how ENA has contributed to education as a field. Here, we discuss how findings from this review contribute to the discussion of ENA methodology, focusing specifically on the research questions being asked in science education.

ENA Is an Emergent Methodology in Science Education

ENA goes beyond traditional methods of identifying and analyzing patterns of complex thinking (Shaffer et al., 2009, 2016), and our findings suggest this method is emerging as a promising tool in science education research. Since 2012, the number of published science education studies utilizing ENA has increased. However, the method has yet to be used to its full potential in the field. This was evident in that all studies identified co-occurrences among variables and generated network models, but only two studies examined trajectories of change (5, 12). Furthermore, most of the studies in our sample relied on the network metrics readily available in free software (edge weight, network density, centroid comparisons) without exploring metrics commonly used in social network analysis, such as centralization and cluster analysis (Borgatti & Ofem, 2010); only four studies (4, 8, 13, 16) used cluster analysis.
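Examining trajectories of change, as only two of the reviewed studies did, can begin with something as simple as differencing edge weights across time points. The sketch below is a hypothetical illustration with invented codes and weights, not data from any reviewed study:

```python
from collections import Counter

# Hypothetical edge weights at two time points (pre- and post-instruction).
pre = Counter({("claim", "evidence"): 1, ("claim", "opinion"): 4})
post = Counter({("claim", "evidence"): 5, ("claim", "opinion"): 1,
                ("evidence", "reasoning"): 2})

# Edge-level change: positive values mean a connection strengthened over time.
change = {edge: post[edge] - pre[edge] for edge in set(pre) | set(post)}

strengthened = {e for e, delta in change.items() if delta > 0}
weakened = {e for e, delta in change.items() if delta < 0}
print(sorted(strengthened))  # connections that grew (e.g., claim-evidence)
print(sorted(weakened))      # connections that faded (e.g., claim-opinion)
```

With more than two time points, the same differencing extends to a sequence of networks, which is what trajectory-style ENA designs report.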
We found that ENA has been applied in science education research to understand how individuals or groups make connections among concepts, skills, and behaviors. Our review revealed two distinct approaches to using ENA—ENA that focuses on generating network models of individuals as the data collection unit (individual ENA or iENA) vs ENA that uses co-occurrence data to create network models that represent how a collective group connects ideas (collective ENA). iENA is a novel application of network analysis as evidenced by only one study employing this approach (4). Similar to how concept maps illuminate knowledge structures in individuals (Allen & Tanner, 2003; Gardner et al., 2018; Carley & Palmquist, 1992), iENA provides tools for understanding how knowledge statements or ideas are interconnected, which can help researchers to identify differences among participants (Mulvey et al., 2021). ENA extends what can be gleaned from concept maps by providing a quantitative analysis of the connections and concepts present. Most studies focused on collective groups to provide snapshots of how a whole class or subgroups within classes connected ideas. We noted that researchers often examined differences in network models among subgroups that were divided by ability levels to identify differences in a construct important in the field, such as scientific reasoning, argumentation, and metacognition. As such, ENA holds promise in science education research to examine trajectories of expertise development and learning. This approach goes beyond that of traditional approaches to studying learning and expertise trajectory by modeling the connections made between knowledge units or concepts.

The Utility of ENA for Science Education

We saw wide variation in how data were collected and how networks were generated and analyzed, demonstrating the utility of this tool for answering wide-ranging research questions. For instance, card sorting techniques are relatively simple tasks that have been used in cognitive psychology for decades to examine how participants cognitively organize abstract ideas (Weller & Romney, 1988). Card sorts, specifically open sorts where participants construct their own groups, provide access to tacit knowledge (Martine & Rugg, 2005). Given the strides that network analysis has enabled in the psychological and sociological sciences, science education is primed to make comparably large strides with this methodology.
This methodology allows for the analysis of large quantities of rich qualitative data which can then be quantified and statistically compared (Shaffer et al., 2016). Qualitative research is often associated with smaller sample sizes but with rich data. ENA can bring the power of quantitative research and large data sets to qualitative studies—making it a powerful mixed methods approach. ENA can be conducted with several types of data from several types of data sources. For instance, some studies we reviewed included data from verbal discourse, interviews, and surveys; other studies analyzed drawings and researcher observations using this methodology.
The emergence of ENA as a research methodology and framework in science education is not, in itself, novel: other fields, particularly the psychological sciences, sociological sciences, and business organization and management, have been using network methods, including social network analysis, for decades. The utility of network analysis methods lies in the ability to visualize, quantify, and compare connections (interactions, relationships, etc.) within a given context or system. For instance, researchers can more efficiently ask questions about shifts in students' epistemic networks as they progress through a program or course. ENA also provides opportunities for researchers to explore relationships among complex constructs (e.g., science identity and persistence in STEM) beyond the traditional qualitative "code and count" approach and quantitative descriptive approaches.

ENA Can Be Applied to Understand the Alignment Between Standards, Curriculum, and Assessment

Some studies in this review did not collect human subjects data but rather applied ENA methods to analyze documents that influence policies and practices in science education. This took the form of researchers comparing the alignment of curriculum and standards to a NOS framework. Because ENA uses co-occurrences to construct network models, it is especially useful for identifying how documents such as standards and curricula align with one another or with assessments. When the same elements co-occur in both the standards and the assessment, this may suggest good alignment and a valid assessment for that topic (e.g., NOS); when co-occurrences are absent between the documents, this may indicate poor alignment. Using network analysis gives researchers, and other stakeholders in educational policy, a visual way to see the alignment of their standards and curriculum. This holistic view can allow stakeholders to more easily identify gaps that need to be addressed.
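This alignment logic can be sketched as a comparison of co-occurrence (edge) sets. The example below is purely illustrative: the NOS codes are invented, and the published studies may quantify alignment differently. Set overlap (here, Jaccard similarity) serves as a rough alignment index:

```python
# Hypothetical NOS-code co-occurrences (edges) extracted from two document types.
standards_edges = {("empirical", "tentative"), ("creative", "tentative"),
                   ("empirical", "social")}
assessment_edges = {("empirical", "tentative"), ("empirical", "social"),
                    ("creative", "social")}

shared = standards_edges & assessment_edges    # connections both documents make
gaps = standards_edges - assessment_edges      # standards connections not assessed

# Jaccard similarity as a rough alignment index (1.0 = identical edge sets).
alignment = len(shared) / len(standards_edges | assessment_edges)

print(sorted(gaps))        # candidate gaps for curriculum or assessment revision
print(round(alignment, 2))
```

The `gaps` set is the practical payoff for stakeholders: it lists connections emphasized in the standards that the assessment never makes, which is precisely the kind of misalignment a network visualization makes easy to spot.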

ENA Allows Us to Model Complex Phenomena in Novel Ways

Most of the studies in this review focused on generating knowledge or extending theory about a phenomenon. ENA provides the ability to examine theoretical frameworks in our fields in novel ways, and using a deductive approach to identify thematic elements can improve our understanding of the underlying theory. For example, Wright and Delgado (6) used ENA to construct a new framework for gender and sexual diversity inclusive education. As such, ENA may have many applications for translating theory into practice. In another example, Mulvey et al. (2021) used ENA to extend what could be gleaned from the Views about Nature of Science (VNOS) assessment. Traditionally, the VNOS can only assess the presence or absence of participants' thinking about NOS ideas; Mulvey et al. (2021), however, used ENA to uncover which tenets might be more central or more peripheral in students' epistemic frames of NOS. Similarly, ENA can be applied to frameworks that are often more challenging to address, such as systems thinking. Systems-level thinking has been articulated as a core and crosscutting concept in science education (American Association for the Advancement of Science, 2011; NGSS Lead States, 2013), and epistemic network analysis provides a valuable methodological tool for understanding these systems.

Limitations

Our study highlights the utility of ENA as a research method in science education. However, due to our exclusion criteria, our sample consisted of only 19 studies. Several studies were excluded because they fell outside the realm of science education (e.g., general STEM or engineering education) or did not provide network visuals. While fields such as engineering, technology, and mathematics are distinct from science, they often overlap with it in the research literature; a broader focus on STEM education might reveal a different pattern of ENA use.

Conclusion

Overall, science education has explored the utility of ENA in several contexts and to answer several types of research questions. This is seen in the use of ENA to visualize co-occurrences in coded data, to compare standards and assessment documents, and to model complex phenomena such as students' systems-level thinking. The method moves beyond traditional qualitative approaches by transforming qualitative data into quantitative form. One reason ENA remains in its infancy in science education could be that it is time-intensive when an inductive approach is used to identify the key elements for generating network models. Discourse analysis, analyzing drawings, and conducting interviews take significant time, but with the introduction of AI in qualitative analysis (Jiang et al., 2021), the time required to generate network models may be reduced, enabling the ENA methodology to proliferate. While ENA provides a rigorous set of tools to quantify the relationships among ideas (Shaffer et al., 2009), science education as a field has only begun to explore the possibilities. Perhaps more communication about methodologies will support researchers in making this leap to harness the power of ENA.

Declarations

Ethical Approval

This paper is a systematic review that analyzes published studies, and, hence, it does not involve any primary data collection. Thus, no ethics approval was required for this review paper.

Competing Interests

The authors declare no competing interests.

Statement Regarding Research Involving Human Participants and/or Animals

Not applicable, because this research does not contain any studies with human participants or animals performed by any of the authors.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Title: Finding the Connections: A Scoping Review of Epistemic Network Analysis in Science Education
Authors: Joshua W. Reid, Jennifer Parrish, Shifath Bin Syed, Brock Couch
Publication date: 28-12-2024
Publisher: Springer Netherlands
Published in: Journal of Science Education and Technology / Issue 5/2025
Print ISSN: 1059-0145
Electronic ISSN: 1573-1839
DOI: https://doi.org/10.1007/s10956-024-10193-x

Supplementary Information

Below is the link to the electronic supplementary material.
Allen, D., & Tanner, K. (2003). Approaches to cell biology teaching: Mapping the journey—Concept maps as signposts of developing knowledge structures. Cell Biology Education, 2(3), 133–136.
American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC.
Bodin, M. (2012). Mapping university students’ epistemic framing of computational physics using network analysis. Physical Review Special Topics—Physics Education Research, 8(1), 010115.
Borgatti, S. P., & Ofem, B. (2010). Social network theory and analysis. Social Network Theory and Educational Change, 17, 29.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn (Vol. 11). Washington, DC: National Academy Press.
Bressler, D. M., Bodzin, A. M., Eagan, B., & Tabatabai, S. (2019). Using epistemic network analysis to examine discourse and scientific practice during a collaborative game. Journal of Science Education and Technology, 28, 553–566.
Caramaschi, M., Cullinane, A., Levrini, O., & Erduran, S. (2022). Mapping the nature of science in the Italian physics curriculum: From missing links to opportunities for reform. International Journal of Science Education, 44(1), 115–135.
Carley, K., & Palmquist, M. (1992). Extracting, representing, and analyzing mental models. Social Forces, 70(3), 601–636.
Carolan, B. V. (2014). Social network analysis and education: Theory, methods, and applications. Thousand Oaks, CA: SAGE.
Chang, H. Y., & Tsai, C. C. (2023). Epistemic network analysis of students’ drawings to investigate their conceptions of science learning with technology. Journal of Science Education and Technology, 32(2), 267–283.
Cheung, K. K. C. (2020). Exploring the inclusion of nature of science in biology curriculum and high-stakes assessments in Hong Kong: Epistemic network analysis. Science & Education, 29(3), 491–512.
Cheung, K. K. C., & Winterbottom, M. (2023). Students’ integration of textbook representations into their understanding of photomicrographs: Epistemic network analysis. Research in Science & Technological Education, 41(2), 544–563.
Csanadi, A., Eagan, B., Kollar, I., Shaffer, D. W., & Fischer, F. (2018). When coding-and-counting is not enough: Using epistemic network analysis (ENA) to analyze verbal data in CSCL research. International Journal of Computer-Supported Collaborative Learning, 13, 419–438.
diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10, 105–225.
diSessa, A. A., & Sherin, B. (1998). What changes in conceptual change? International Journal of Science Education, 20(10), 1155–1191.
Elmoazen, R., Saqr, M., Tedre, M., & Hirsto, L. (2022). A systematic literature review of empirical research on epistemic network analysis in education. IEEE Access, 10, 17330–17348.
Erduran, S., & Dagher, Z. R. (2014). Reconceptualizing the nature of science for science education: Scientific knowledge, practices and other family categories. Springer.
Fan, Y. K., Barany, A., & Foster, A. (2023). Possible future selves in STEM: An epistemic network analysis of identity exploration in minoritized students and alumni. International Journal of STEM Education, 10(1), 22.
Gao, Q., Cao, Y., Xie, H., & Li, X. (2023). Investigating the nature of science in reformed Chinese biology curriculum standards: Epistemic network analysis. Science & Education, 1–29.
Gardner, G. E., Lohr, M. E., Bartos, S., & Reid, J. W. (2018). Comparing individual and group-negotiated knowledge structures in an introductory biology course for majors. Journal of Biological Education.
Grunspan, D. Z., Wiggins, B. L., & Goodreau, S. M. (2014). Understanding classrooms through social network analysis: A primer for social network analysis in education research. CBE—Life Sciences Education, 13(2), 167–178.
Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multidimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences, 22(2), 212–257.
Jiang, J. A., Wade, K., Fiesler, C., & Brubaker, J. R. (2021). Supporting serendipity: Opportunities and challenges for human-AI collaboration in qualitative analysis. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1–23.
Lederman, N. G. (2004). Syntax of nature of science within inquiry and science instruction. In L. B. Flick & N. G. Lederman (Eds.), Scientific inquiry and nature of science (pp. 301–317). Kluwer Academic Publishers.
Li, S., Huang, X., Wang, T., Pan, Z., & Lajoie, S. P. (2022). Examining the interplay between self-regulated learning activities and types of knowledge within a computer-simulated environment. Journal of Learning Analytics, 9(3), 152–168.
Martine, G., & Rugg, G. (2005). That site looks 88.46% familiar: Quantifying similarity of web page design. Expert Systems, 22(3), 115–120.
McVee, M. B., Dunsmore, K., & Gavelek, J. R. (2005). Schema theory revisited. Review of Educational Research, 75(4), 531–566.
Momsen, J., Speth, E. B., Wyse, S., & Long, T. (2022). Using systems and systems thinking to unify biology education. CBE—Life Sciences Education, 21(2), es3.
Mulvey, B. K., Parrish, J. C., Reid, J. W., Papa, J., & Peters-Burton, E. E. (2021). Making connections: Using individual epistemic network analysis to extend the value of nature of science assessment. Science & Education, 30(3), 527–555.
National Research Council (NRC). (2002). Scientific research in education. National Academies Press.
Omarchevska, Y., Lachner, A., Richter, J., & Scheiter, K. (2022a). It takes two to tango: How scientific reasoning and self-regulation processes impact argumentation quality. Journal of the Learning Sciences, 31(2), 237–277.
Omarchevska, Y., Lachner, A., Richter, J., & Scheiter, K. (2022b). Do video modeling and metacognitive prompts improve self-regulated scientific inquiry? Educational Psychology Review, 34(2), 1025–1061.
Oshima, J., Oshima, R., & Saruwatari, S. (2020). Analysis of students’ ideas and conceptual artifacts in knowledge-building discourse. British Journal of Educational Technology, 51(4), 1308–1321.
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., & Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372.
Palincsar, A. S. (1998). Social constructivist perspectives on teaching and learning. Annual Review of Psychology, 49(1), 345–375.
Peters-Burton, E. E. (2015). Outcomes of a self-regulated learning curriculum model: Network analysis of middle school students’ views of nature of science. Science & Education, 24, 855–885.
Peters-Burton, E. E., Dagher, Z. R., & Erduran, S. (2023). Student, teacher, and scientist views of the scientific enterprise: An epistemic network re-analysis. International Journal of Science and Mathematics Education, 21(2), 347–375.
go back to reference Peters-Burton, E. E., Parrish, J. C., & Mulvey, B. K. (2019). Extending the utility of the views of nature of science assessment through epistemic network analysis. Science & Education, 28(9), 1027–1053.CrossRef
go back to reference Petrovich, M. E., Barany, A., Shah, M., & Foster, A. (2022). Identity exploration for maker educators: Constructing meaning in after-school environmental science. Journal of Research on Technology in Education, 54(4), 535–556.CrossRef
go back to reference Rachmatullah, A., & Wiebe, E. N. (2022). Building a computational model of food webs: Impacts on middle school students’ computational and systems thinking skills. Journal of Research in Science Teaching, 59(4), 585–618.CrossRef
go back to reference Reid, J. W., Polizzi, S. J., Zhu, Y., Jiang, S., Ofem, B., Salisbury, S., & Rushton, G. T. (2023). Perceived network bridging influences the career commitment decisions of early career teachers. International Journal of STEM Education, 10(1), 17.CrossRef
go back to reference Shaffer, D. W., Hatfield, D., Svarovsky, G. N., Nash, P., Nulty, A., Bagley, E., & Mislevy, R. (2009). Epistemic network analysis: A prototype for 21st-century assessment of learning. International Journal of Learning and Media, 1(2), 33–53.CrossRef
go back to reference Shaffer, D. W., Collier, W., & Ruis, A. R. (2016). A tutorial on epistemic network analysis: Analyzing the structure of connections in cognitive, social, and interaction data. Journal of Learning Analytics, 3(3), 9–45.CrossRef
go back to reference Shipilov, A., Labianca, G., Kalnysh, V., & Kalnysh, Y. (2014). Network-building behavioral tendencies, range, and promotion speed. Social Networks, 39, 71–83.CrossRef
go back to reference Smith, J. P., III., DiSessa, A. A., & Roschelle, J. (1994). Misconceptions reconceived: A constructivist analysis of knowledge in transition. The Journal of the Learning Sciences, 3(2), 115–163.CrossRef
NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.
Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., & Straus, S. E. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine, 169(7), 467–473.
Weller, S. C., & Romney, A. K. (1988). Qualitative research methods: Systematic data collection. SAGE Publications Inc.
Wright, G. W., & Delgado, C. (2023). Generating a framework for gender and sexual diversity-inclusive STEM education. Science Education, 107(3), 713–740.
Zhang, S., Liu, Q., & Cai, Z. (2019). Exploring primary school teachers’ technological pedagogical content knowledge (TPACK) in online collaborative discourse: An epistemic network analysis. British Journal of Educational Technology, 50(6), 3437–3455.