International large-scale assessment (ILSA): Implications for pre-service teacher education in the Philippines
Authors
Allen A. Espinosa – Philippine Normal University, Philippines
Ma. Arsenia C. Gomez – Philippine Normal University, Philippines
Allan S. Reyes – Philippine Normal University, Philippines
Heidi B. Macahilig – Philippine Normal University, Philippines
Leah Amor S. Cortez – Philippine Normal University, Philippines
Adonis P. David – Philippine Normal University, Philippines
Cite
Espinosa, A. A., Gomez, M. A. C., Reyes, A. S., Macahilig, H. B., Cortez, L. A. S., & David, A. P. (2023). International large-scale assessment (ILSA): Implications for pre-service teacher education in the Philippines. Issues in Educational Research, 33(2). Retrieved from https://phedro.org/international-large-scale-assessment-ilsa/
Abstract
Countries participate in international large-scale assessments (ILSA) for a variety of reasons, and in recent years the results of ILSA have gained growing popularity and importance. Recognising ILSA as a valuable source of feedback for improving the basic education system, the Philippines, through its Department of Education, has actively participated in these assessments. This discussion paper examines the current state of the Philippines in relation to ILSA and underscores the need to incorporate ILSA findings into the review of the pre-service teacher education program. Although the literature on how ILSA should inform Philippine pre-service teacher education is limited, existing studies suggest that the current teacher education curriculum falls short of the expectations of ILSA. By framing the discussion around the value of ILSA as an assessment system that can provide feedback for educational improvement, it becomes evident that considering ILSA in the program design of pre-service teacher education is a recognition of its significance, despite its Western origins.
Introduction
Studies on international large-scale assessments (ILSA) have become increasingly popular in the past decade, leading to much debate among academics and the general public (ILSA Gateway, n.d.). While ILSA is considered an important tool for studying education systems and for formulating evidence-based policies, some researchers argue that “the use of international assessment data can result in a range of unintended consequences, such as the shaping and governing of school systems ‘by numbers’” (Johansson, 2016, p. 139).
The idea of ILSA can be traced back to the 1950s when a group of education researchers floated the idea of assessing the academic achievement of students across multiple countries at the United Nations Educational, Scientific and Cultural Organization (UNESCO) Institute for Education in Hamburg, Germany (Husén, 1979). Today, ILSA is described as “studies in which both achievements of a certain age/grade in one or more subjects are compared across education systems and effects of contextual factors at the system, school, classroom, and student level on achievement are studied” (Bos, 2002, p.2). Some major ILSA include the Program for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), the Progress in International Reading Literacy Study (PIRLS), International Civic and Citizenship Education Study (ICCS), and the International Computer and Information Literacy Study (ICILS) (Hernández-Torrano & Courtney, 2021).
Kijima (2010) identified four models of motivation driving countries to participate in ILSA studies: (1) the financial aid model; (2) the macro-dissatisfaction perspective; (3) the policy diffusion model; and (4) the rational choice model. In the case of the Philippines, the Department of Education appears to be motivated by the rational choice model. Malaluan (2021) stated that the “objective [in participating in PISA] was to look in the mirror and find out how our learners compared with the rest of the world, and to generate important data to deepen our understanding of the major factors that impact student performance” (p. 4). The Philippines’ participation in ILSA is explicitly spelled out in the Department of Education’s Policy Guidelines on System Assessment Policy in the K to 12 Basic Education Program (DepEd, 2017a). The Philippines joined PISA 2018, the first ILSA in which the country participated since the current K to 12 Basic Education Program was implemented.
Given the growing importance of ILSA to educational policies, this paper examines ILSA in relation to pre-service teacher education in the Philippines. It reviews the performance of the Philippines in ILSA and assesses whether or not the country’s education system, including the provision of pre-service teacher education, is responding to the demands of ILSA. This paper also examines the consequential validity of ILSA and whether or not aligning the teacher education curricula with ILSA has the potential to aid in resolving pressing societal issues such as systemic inequities, social justice, and inclusion.
Full Text
International large-scale assessments (ILSAs) such as PISA and TIMSS have become prominent measures of mathematical literacy across countries and educational systems. As evidenced by released PISA and TIMSS items, mathematics items in such ILSAs are characterised by the application of mathematics to a problem that simulates a real-world situation, in which student test-takers must use their mathematical knowledge and skills to arrive at a solution. Mathematics test items in these assessments are nested in contexts defined in the relevant assessment framework (e.g., the Personal, Occupational, Societal, and Scientific context categories in PISA). This study followed the item-writing activities of four tertiary mathematics instructors in the Philippines as they constructed context-based mathematics items. After an orientation on the PISA Mathematics Assessment Framework, the respondents were asked to create PISA-like mathematics items to a given set of specifications for content and context categories during an item-writing seminar-workshop. The data consist of transcripts of a focus group discussion conducted after the seminar-workshop, which were analysed using thematic analysis. The results showed two themes that explain the phenomenon of writing PISA-like mathematics items: the phases of item-writing and the dimensions of item-writing. Three phases and three dimensions were captured in the participants’ narratives. The findings showed that the respondents primarily struggled to find realistic contexts that fit the indicated item specifications based on the PISA categories. Additionally, the writers engaged in a problem-solving task akin to solving a puzzle as they created items that satisfied the content, context, and process categories in the table of specifications.
This study contributes to filling the research gap on item-writing activities, particularly those of mathematics teachers in the Philippines, a country whose recent performance in PISA 2018 and TIMSS 2019 was nothing short of dismal in terms of mathematical literacy.