Examining the Effects of Language Competencies on Academic Achievements of Special Needs Students and Their Peers Using Standardized Test Scores
The purpose of this study is to investigate the impact of academic language on the standardized test scores of special needs students and their peers in elementary (n = 1140) and middle (n = 451) public schools across the state of Georgia. Several univariate (ANOVA) and multivariate (MANOVA) analyses of variance are conducted on student classification (English language learners "ELL," non-ELL, students with disabilities "SWD," non-SWD, economically disadvantaged students "EDA," and non-EDA) and student test scores in ELA, math, science, and social studies. Univariate (ANCOVA) and multivariate (MANCOVA) analyses of covariance are also conducted in which ELA is treated as a predictor of students' test scores in math, science, and social studies. MANOVA results reveal that the combined-subjects modeling of student test scores differs significantly by student classification, with relatively large effect sizes (0.44 to 0.63) for all grade levels. Follow-up ANOVAs indicate that individual modeling of core subjects also differs significantly by student classification, with effect sizes between 0.37 and 0.61. The results of ANCOVA and MANCOVA suggest a statistically significant effect of ELA on student test scores. SWD and ELL groups benefit the most when controlling for ELA test scores.
Introduction
The assessment of how students perform in school, including special needs students, is arguably the backbone of educational accountability (Cochran-Smith et al., 2013; Lee & Wu, 2017; Ro, 2018). However, standardized testing continues to be a controversial topic in education (Lewis & Young, 2013; Milner et al., 2021). Standards-based testing includes a broad spectrum of activities, from student assessment to school funding, from statewide accountability tests to district benchmark performance, and from summative tests to everyday classroom tests. Educators generally agree that the more information we have about students, the better we understand their achievements and where gaps may occur (Brown & Clift, 2010; Ro, 2018). With such knowledge, applicable state and federal legislation can be critically scrutinized, and new policies can be instituted based on the factors affecting student test scores and academic achievement.
In 2015, the state of Georgia launched the Georgia Milestones Assessment System (GMAS) to replace the 15-year-old Criterion-Referenced Competency Tests (CRCTs). The CRCT results were used by the State as a summative assessment tool to gauge schools' and teachers' success in providing the quality of education required by the State. While the CRCTs set low expectations for student performance, the new GMAS tests raised the bar for passing, and more students are now failing the tests (Price, 2019). GMAS tests are more closely aligned with standards in other states and are guided by the 2009 national Common Core State Standards initiative, which defines what students should learn at each grade level (CCSS, 2017). Students in Grades 3, 4, 6, and 7 must take end-of-grade (EOG) tests in the content areas of ELA and math, while students in Grades 5 and 8 take additional tests in the content areas of science and social studies (GaDOE, n.d.). The overall score for each content area and grade level ranges from 140 to 830. The domain structure for each content area consists of (GaDOE, n.d.):
- ELA: Reading, Writing, Vocabulary, and Language.
- Math: Algebra, Geometry, and Measurement and Data. Statistics for Grades 6 to 8.
- Science: Earth, Physical, Life (Grade 5). Matter, Force, and Motion, and Energy Transformation (Grade 8).
- Social studies: History, Geography, Government and Civics, and Economics (Grades 5 and 8).
To assess how well students are mastering the knowledge and skills outlined in the GMAS standards, student test scores are categorized by the following four achievement-level groups (GaDOE, n.d.):
- Beginning–Does not demonstrate proficiency in Georgia’s content standards. Students are subject to being held back from moving to the next level.
- Developing–Partially demonstrates proficiency in Georgia's content standards. Academic support is needed to move to the next level.
- Proficient–Demonstrates satisfactory proficiency in Georgia's content standards. Students are ready to move to the next level.
- Distinguished–Demonstrates advanced proficiency in Georgia's content standards. Students are well prepared to move to the next level.
Literature Review
Achievement gaps between special needs students and their peers are well documented in the literature. For example, several studies by the Program for International Student Assessment (PISA) concluded that EDA and ELL students continue to demonstrate lower performance than their peers (OECD, 2004, 2006, 2010). Numerous previous studies have shown that educational attainment is closely linked to mastery of academic language, and as such, language competency is expected to have a direct effect on students' test scores (Haag et al., 2014; Snow, 2016; Walker et al., 2008). Without mastery of academic language, students struggle to develop a complete understanding of concepts taught in class or to make connections across core subjects (Caponera et al., 2016). Unfortunately, in a typical classroom, teachers focus on delivering subject-area knowledge and do not necessarily spend time assessing students' literacy skills and their ability to fully comprehend the content (Galloway, 2016; Ryan et al., 2017). For example, in a math classroom, academic language is usually not the first priority for teachers, and when students are asked to solve "word" problems, they tend to miss the clues needed to solve them (Baird et al., 2020; Heppt et al., 2015). However, a consensus on whether academic language unequivocally influences student performance has not yet been reached (Galloway, 2016; Shaftel et al., 2006), and research addressing the impact of academic language on the standardized test scores of special needs students and their peers remains inconclusive (Caponera et al., 2016; Shaftel et al., 2006). Furthermore, prior research has often been limited by small sample sizes, narrow grade-level coverage, and/or a small number of core content areas.
ELL Students
ELLs represent a diverse group of students who consistently perform lower than their native English-speaking peers in all content areas (Abedi, 2002; Banse & Palacios, 2018; Hakuta et al., 2013; Sparks, 2016). The U.S. continues to see a year-over-year surge in this student group. In 2018, the National Center for Education Statistics (NCES) estimated the enrollment of ELLs in public schools at around 5 million students, or 10.2% of the student population (McFarland et al., 2018). Most of these students arrived in the U.S. at an early age or were born in the U.S. but did not learn English until they attended school (Baird et al., 2020; Kieffer, 2008; NCES, 2015). ELLs typically gain conversational proficiency in English after a few years in the U.S. but continue to struggle to attain academic language proficiency commensurate with that of their native English-speaking peers (Polat et al., 2016; Vaughn et al., 2017). While some ELL students may possess the language skills needed for daily classroom conversations and interactions, they normally struggle in the core content areas where more formal academic language is used. As such, teachers frequently miss the rippling effects of poor academic language on the test scores of ELL students and do not necessarily address it in their classrooms (Schleppegrell, 2012; Turkan et al., 2014). Furthermore, there is strong evidence in the literature that ELL students are greatly challenged by the implementation of the Common Core State Standards (Hakuta et al., 2013; NRC, 2002; Szpara, 2017; Young et al., 2008).
A study by Young et al. (2008) investigated the educational equity of standards-based assessments used by states to determine students' achievement in math and science in Grades 5 and 8. The study concluded that ELL students who were accommodated with translated glossaries/word lists achieved better test results. Abedi et al. (2001) examined the impact of several simplified language modifications in math tests on ELL and non-ELL student test scores. They found that simplified language math tests reduced the score gap between ELLs and non-ELLs; however, most of the reduction was due to lower scores by English-proficient students, whereas ELLs' score gain was marginal. A later study by Abedi (2002) revealed that the impact of language proficiency on ELLs' standardized test scores was greater in content areas where a higher level of language was required. Several previous studies suggested that ELLs can attain performance comparable to their peers in math when the language content is simplified (Baird et al., 2020; Caponera et al., 2016; Henry et al., 2014). Using a norm-referenced math instrument and computer-based math games, a study by Alt et al. (2014) concluded that there was a strong relationship between math test scores and high language demand for ELLs; for SWD, however, math test scores were related to limitations in both linguistic and nonlinguistic abilities. Furthermore, researchers have shown desired student outcomes by integrating academic language with science content. August et al. (2014) examined the efficacy of a language intervention method in science tests for ELL students and their peers in seven middle schools. The intervention was designed to promote the development of academic language in science and included additional instructional resources and professional training materials. Findings from this study showed that the intervention was effective and provided gains in academic language and science knowledge for both ELL and non-ELL groups. Other studies also concluded that science intervention for ELL students will be more impactful if it is preceded by language intervention (Lara-Alecio et al., 2012; Tong et al., 2014).
EDA Students
The socio-economic status (SES) of a student is generally determined by factors such as parents' income, education, and job type (Jeynes, 2002). Studies of EDA students over time have generally demonstrated a causal link between economic disadvantage and students' academic achievement at school (Eamon, 2005; Heppt et al., 2015; Mukherjee, 1995; Snow & Matthews, 2016). Parents of EDA students have less income and struggle more with time management due to the continual balancing and prioritizing of daily activities in their lives. Consequently, their children normally receive less educational support at home (Bradbury et al., 2001; Matejevic et al., 2014; Orlich & Gifford, 2006). Furthermore, due to economic hardships at home, EDA students exhibit greater behavioral problems at school and are more likely to drop out of school than their peers (Eamon, 2005; Hernandez, 2012; Kearney & Levine, 2016). However, it has been argued that the impact of low SES on students' academic achievement can be mediated and even neutralized by a variety of individual and parental contextual actions (Considine & Zappalà, 2002; Eamon, 2005; Zappala & Parker, 2000). For example, EDA parents can motivate and transmit educational aspirations to their children by fostering a safe and open learning environment at home and by offering adequate parental involvement at school (Jeynes, 2002; Matejevic et al., 2014).
While the relationship between low SES and student academic achievement is well established in the literature, its impact on the development of academic language and the subsequent contribution to students' test scores in other content areas is not (Galloway, 2016; Heppt et al., 2015). Some studies have linked EDA students' restricted exposure to high-level literacy activities at home to delays in their academic language development. EDA parents do not necessarily provide adequate levels of etymological and linguistic variety of speech when communicating with their children (Hochschild, 2003). Furthermore, middle-class parents are more likely to use richer language and more sophisticated sentence structures during joint reading activities with their children (Suizzo & Stapleton, 2007). Research suggests that academic language development starts as early as preschool age. Hence, EDA students start preschool with lower vocabulary knowledge, weaker grammatical competencies, and less mastery of academic language than their peers (Eamon, 2005; Thomas et al., 2019). However, while several researchers (Heppt et al., 2015; Mukherjee, 1995; Snow & Matthews, 2016) have indicated a negative impact of low SES on academic language development, there is no strong evidence that this disparity results in lower test scores in other content areas for this student group.
SWD
Students are generally classified as "disabled" if they experience a physical or mental impairment that can significantly limit their normal life activities (Connor & Cavendish, 2018). These students are expected to have lower test scores and academic achievement than those with no disability (Kirby, 2017; NRC, 2002; Shaftel et al., 2006). A study by Johnson and Monroe (2004) examined the responses of students in Grade 7 to two math test forms, one in original language and one in simplified language. The sample comprised 1,232 students, including 34 ELLs and 138 SWD. As expected, the ELL and SWD groups scored lower than the general education group on both test forms, with SWD scoring slightly better on the simplified language test, while the ELL and general education groups scored better on the original test. A similar study by Tindal et al. (2000) evaluated the performance of general education and SWD students in math using simplified language and standard language test forms. Their results showed no statistically significant differences in test scores for either group on the simplified or standard form of the test. Shaftel et al. (2006) investigated the effects of language on math test items for SWD, ELL, and general student groups. Their findings revealed a meaningful impact of linguistic features on math test items for all student groups, with larger effect sizes for lower grade levels. Their results also showed no statistically significant differences in math test scores among the three groups.
The Present Study
The purpose of this study is to investigate the impact of language competencies on the standardized test scores of special needs students and their peers in elementary (n = 1140) and middle (n = 451) public schools across the state of Georgia. The analyses are based on data collected by the Georgia Department of Education (GaDOE) during the 2018–2019 school year. The data include the entire student population in Grades 3 to 8 (n = 790,836) who participated in the Georgia Milestones Assessment System (GMAS) tests, an end-of-grade (EOG) summative assessment in the content areas of ELA and math. Students in Grades 5 and 8 are also required to take tests in science and social studies. Based on their GMAS tests, students are classified into four distinct achievement levels (Beginning, Developing, Proficient, and Distinguished) representing their mastery of the subject and its content as dictated by Georgia's core standards. Test scores range between 140 and 830 for EOG, with slight variation across subjects and grade levels. The conceptual framework for this study is designed to answer the following research questions:
- Does the classification of special needs students (SWD, ELL, and EDA) contribute to lower standardized test scores in core content areas (ELA, math, science, and social studies)?
- Are there any statistically significant mean differences in test scores according to student group membership?
- To what extent, if any, can ELA test scores be used as a predictor of students’ standardized test scores in other core content areas?
Method
Measures and Procedures
The methodology used for this study is threefold. The first method uses a combined core content model to investigate the main effects of student classification on standardized test scores in core content areas. For this model, a one-way MANOVA is conducted separately for each grade level using student classification with six levels (ELL, non-ELL, SWD, non-SWD, EDA, non-EDA) as the independent variable and standardized student test scores in ELA, math, science, and social studies as dependent variables. Follow-up univariate ANOVAs (individual modeling) are used to explore the impact of student classification on each core content area independently. The second method uses Tukey HSD post-hoc analysis to control Type I error and to explore the mean differences in standardized test scores between student classification groups for students in Grades 3 to 8. Each comparison is tested at the ANOVA alpha level divided by the number of independent variable levels (0.0042 for Grades 3, 4, 6, and 7 and 0.0021 for Grades 5 and 8). The third method uses a univariate analysis of covariance (ANCOVA) for Grades 3, 4, 6, and 7 and a multivariate analysis of covariance (MANCOVA) for Grades 5 and 8 to determine whether the combined student test scores in math, science, and social studies differ significantly by level of student classification after controlling for ELA test scores.
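For readers who want to reproduce the general workflow outside SPSS, the sketch below shows how a one-way MANOVA with follow-up univariate ANOVAs could be set up in Python with statsmodels. The data frame layout and column names (`classification`, `ela`, `math`) are assumptions for illustration, not the authors' actual variable names or syntax.

```python
# A minimal sketch (not the authors' SPSS workflow): one-way MANOVA on the
# combined core content scores, followed by univariate ANOVAs per subject.
# Assumes a pandas DataFrame `df` with hypothetical columns
# 'classification' (six student groups), 'ela', and 'math'.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.multivariate.manova import MANOVA

def combined_and_individual_models(df: pd.DataFrame) -> None:
    # Combined-subjects model: MANOVA of ELA and math on student classification.
    manova = MANOVA.from_formula('ela + math ~ C(classification)', data=df)
    print(manova.mv_test())  # reports Wilks' lambda, F, df, and p for the group effect

    # Follow-up individual modeling: one one-way ANOVA per core subject.
    for subject in ['ela', 'math']:
        fit = ols(f'{subject} ~ C(classification)', data=df).fit()
        print(subject)
        print(sm.stats.anova_lm(fit, typ=2))
```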
Data Analysis
SPSS software is used for all analyses of variance (ANOVA and MANOVA) and covariance (ANCOVA and MANCOVA). A confidence level of 95% is used for all analyses. The data from the Georgia Department of Education (GaDOE) contained a few missing values that were deleted; given the large data size, this is unlikely to affect the results. The testing of univariate and multivariate assumptions reveals no outliers in the data. Normality is checked using histograms, skewness and kurtosis measures, bivariate distributions, and the Shapiro-Wilk test. Transformation of variables is not needed since the results show a relatively normal distribution for all variables. The covariance matrices across groups are statistically equal. The tests of assumptions warrant no case removal and no data transformation. In the end, the data uphold all assumptions of univariate and multivariate normality. Furthermore, the evaluation of results for homogeneity of variance-covariance matrices, linearity, and multicollinearity is satisfactory.
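A rough illustration of the kinds of assumption checks described above is sketched below using SciPy. The column names are hypothetical and the checks are indicative only, not the authors' exact SPSS procedure or output.

```python
# Illustrative assumption checks (a sketch, not the authors' exact SPSS output):
# skewness/kurtosis, Shapiro-Wilk normality, and Levene's test for homogeneity
# of variance. Column names ('ela', 'classification') are assumptions.
from scipy import stats

def check_assumptions(df, score_col='ela', group_col='classification'):
    scores = df[score_col].dropna()
    print('skewness:', stats.skew(scores), 'kurtosis:', stats.kurtosis(scores))
    print('Shapiro-Wilk:', stats.shapiro(scores))  # p-value is approximate for very large n
    groups = [g[score_col].dropna() for _, g in df.groupby(group_col)]
    print("Levene's test:", stats.levene(*groups))  # equality of variances across groups
```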
Results
Research Question 1
The combined content model attempts to predict the main effects of student classification on the combined standardized test scores for Grades 3 to 8. Using Wilks' criterion, MANOVA results reveal that the combined core content models of student test scores differ significantly by group membership (student classification), with relatively large effect sizes (0.44 to 0.63, Table I) for all grade levels. A measure of effect size is determined using Wilks' lambda (Equation 1):
η² = 1 − Λ   (1)
Table I. One-way MANOVA results: combined core content test scores by student classification.

| Grade level | Wilks' lambda | F | df | p | Partial eta squared |
|---|---|---|---|---|---|
| 3 | 0.562 | 377 | 10 | 0.000 | 0.438 |
| 4 | 0.541 | 410 | 10 | 0.000 | 0.459 |
| 5 | 0.377 | 311 | 20 | 0.000 | 0.623 |
| 6 | 0.364 | 326 | 10 | 0.000 | 0.636 |
| 7 | 0.360 | 339 | 10 | 0.000 | 0.640 |
| 8 | 0.368 | 136 | 20 | 0.000 | 0.632 |
This equation represents the variance accounted for by the best linear combination of all dependent variables (core subjects). The multivariate effect size is lowest for Grade 3 (η² = 0.438), indicating that approximately 44% of the multivariate variance of student combined test scores is associated with student classification (η² = 0.438, F(10, 11296) = 377, p = 0.001). The highest multivariate effect size (η² = 0.640) is for Grade 7, indicating that 64% of the multivariate variance of student combined test scores is associated with student classification (η² = 0.640, F(10, 5082) = 339, p = 0.001).
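As a worked illustration of Equation 1, the snippet below converts the Wilks' lambda values reported in Table I into the multivariate effect sizes listed in the same table.

```python
# Worked illustration of Equation 1 using the Wilks' lambda values from Table I:
# the multivariate effect size is eta squared = 1 - lambda.
wilks_lambda = {3: 0.562, 4: 0.541, 5: 0.377, 6: 0.364, 7: 0.360, 8: 0.368}
eta_squared = {grade: round(1 - lam, 3) for grade, lam in wilks_lambda.items()}
print(eta_squared)  # {3: 0.438, 4: 0.459, 5: 0.623, 6: 0.636, 7: 0.64, 8: 0.632}
```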
Follow-up ANOVAs using individual modeling of core subjects show statistically significant differences in student test scores due to group membership for all grade levels, with effect sizes between 0.37 and 0.61 (Table II). Fifth-grade science has the lowest univariate effect size (η² = 0.366), indicating that approximately 37% of the univariate variance of student test scores in science is associated with student classification (η² = 0.366, F(5, 5489) = 634, p = 0.001). Seventh-grade ELA has the highest univariate effect size (η² = 0.605), indicating that approximately 61% of the univariate variance of student test scores in ELA is associated with student classification (η² = 0.605, F(5, 2542) = 780, p = 0.001). For all grade levels, the variability in students' ELA test scores is attributed more to group membership than that of any other core subject (Table II).
Table II. Follow-up one-way ANOVA results: individual core subject test scores by student classification.

| Grade level | Subject | F | df | p | Partial eta squared |
|---|---|---|---|---|---|
| 3 | ELA | 765 | 5 | 0.000 | 0.404 |
| | Math | 656 | 5 | 0.000 | 0.367 |
| 4 | ELA | 925 | 5 | 0.000 | 0.448 |
| | Math | 742 | 5 | 0.000 | 0.394 |
| 5 | ELA | 1,271 | 5 | 0.000 | 0.537 |
| | Math | 806 | 5 | 0.000 | 0.423 |
| | Science | 634 | 5 | 0.000 | 0.366 |
| | Social studies | 47 | 5 | 0.000 | 0.410 |
| 6 | ELA | 713 | 5 | 0.000 | 0.590 |
| | Math | 450 | 5 | 0.000 | 0.476 |
| 7 | ELA | 780 | 5 | 0.000 | 0.605 |
| | Math | 494 | 5 | 0.000 | 0.493 |
| 8 | ELA | 678 | 5 | 0.000 | 0.592 |
| | Math | 313 | 5 | 0.000 | 0.402 |
| | Science | 296 | 5 | 0.000 | 0.388 |
| | Social studies | 510 | 5 | 0.000 | 0.522 |
Research Question 2
The results of the pairwise comparisons between student classifications are shown in Table III. The Type I error across the univariate ANOVAs was controlled by testing each comparison at the 0.0042 alpha level for Grades 3, 4, 6, and 7 and the 0.0021 alpha level for Grades 5 and 8. For Grade 3, there are significant pairwise mean differences between all student classifications for math and ELA except for ELA between "non-SWD" and "non-ELL" (p = 0.094). For Grade 4, there are significant mean differences between all student classifications for both math and ELA. The post-hoc pairwise comparisons for Grade 5 show statistically nonsignificant mean differences between "SWD" and "ELL" in ELA (p = 0.398), science (p = 0.513), and social studies (p = 0.616). There are also statistically nonsignificant mean differences in social studies between "non-SWD" and "non-ELL" (p = 0.352), "non-SWD" and "EDA" (p = 0.138), "non-SWD" and "non-EDA" (p = 0.084), and "non-ELL" and "EDA" (p = 0.575). For Grades 6 and 7, all pairwise comparisons show statistically significant mean differences for ELA and math except in math between "SWD" and "ELL" (p = 0.381 for Grade 6 and p = 0.616 for Grade 7). For Grade 8, the only statistically nonsignificant pairwise comparison is in math between "SWD" and "ELL" (p = 0.171).
Table III. Summary of pairwise comparisons between student classifications (1 = same group; X = statistically significant mean difference; NS1–NS3 = nonsignificant differences for the grade levels and subjects noted in the text).

| | SWD | Non-SWD | ELL | Non-ELL | EDA | Non-EDA |
|---|---|---|---|---|---|---|
| SWD | 1 | X | NS1 | X | X | X |
| Non-SWD | X | 1 | X | NS2 | NS3 | NS3 |
| ELL | NS1 | X | 1 | X | X | X |
| Non-ELL | X | NS2 | X | 1 | NS3 | X |
| EDA | X | NS3 | X | NS3 | 1 | X |
| Non-EDA | X | NS3 | X | X | X | 1 |
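The pairwise step summarized in Table III can be approximated with statsmodels' Tukey HSD routine. The sketch below assumes the same hypothetical data frame as the earlier snippets and uses the adjusted alpha levels reported in the paper; it is not the authors' SPSS run.

```python
# Sketch of the pairwise-comparison step (assumed column names, not the authors'
# SPSS run): Tukey HSD on one subject's scores by classification, evaluated at
# the adjusted alpha used in the paper (0.0042 for Grades 3, 4, 6, and 7;
# 0.0021 for Grades 5 and 8).
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def pairwise_comparisons(df, score_col='ela', group_col='classification', alpha=0.0042):
    result = pairwise_tukeyhsd(endog=df[score_col], groups=df[group_col], alpha=alpha)
    print(result.summary())  # mean differences, adjusted p-values, and reject flags
```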
Research Question 3
A one-way ANCOVA is conducted to compare the impact of student classification on standardized test scores in math for Grades 3, 4, 6, and 7 after controlling for ELA test scores. ANCOVA results suggest significant differences in math test scores for all grade levels (Table IV). The effect size is large for all grade levels and ranges between 0.679 (Grade 4) and 0.811 (Grade 6). When the ELA test scores are removed from the analyses, the math test scores for all grade levels still differ statistically according to student classification; however, the impact is much lower, with small effect sizes (0.021 to 0.113).
Table IV. ANCOVA and MANCOVA results for math, science, and social studies test scores with ELA as a covariate and with ELA removed.

| Grade level | Covariate | Subject | F | df | p | Eta squared |
|---|---|---|---|---|---|---|
| 3 | ELA | Math | 16,838 | 1 | 0.000 | 0.749 |
| | ELA removed | Math | 69 | 5 | 0.000 | 0.058 |
| 4 | ELA | Math | 12,056 | 1 | 0.000 | 0.679 |
| | ELA removed | Math | 24 | 5 | 0.000 | 0.021 |
| 5 | ELA | Math | 12,652 | 1 | 0.000 | 0.697 |
| | | Science | 11,967 | 1 | 0.000 | 0.686 |
| | | Social studies | 3,160 | 1 | 0.000 | 0.365 |
| | | Combined subjects | 10,685 | 3 | 0.000 | 0.854 |
| | ELA removed | Math | 23 | 5 | 0.000 | 0.021 |
| | | Science | 55 | 5 | 0.000 | 0.048 |
| | | Social studies | 164 | 5 | 0.000 | 0.130 |
| | | Combined subjects | 78 | 15 | 0.000 | 0.066 |
| 6 | ELA | Math | 10,650 | 1 | 0.000 | 0.811 |
| | ELA removed | Math | 63 | 5 | 0.000 | 0.113 |
| 7 | ELA | Math | 10,406 | 1 | 0.000 | 0.804 |
| | ELA removed | Math | 49 | 5 | 0.000 | 0.088 |
| 8 | ELA | Math | 3,252 | 1 | 0.000 | 0.582 |
| | | Science | 1,772 | 1 | 0.000 | 0.431 |
| | | Social studies | 5,591 | 1 | 0.000 | 0.705 |
| | | Combined subjects | 5,586 | 3 | 0.000 | 0.878 |
| | ELA removed | Math | 22 | 5 | 0.000 | 0.045 |
| | | Science | 7 | 5 | 0.000 | 0.016 |
| | | Social studies | 15 | 5 | 0.000 | 0.031 |
| | | Combined subjects | 16 | 15 | 0.000 | 0.034 |
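A minimal sketch of the covariate analysis behind Table IV is shown below: the classification effect on math is estimated once with ELA entered as a covariate and once without it. Column names are assumed for illustration; this is not the authors' SPSS syntax.

```python
# Minimal ANCOVA sketch behind Table IV (assumed column names, not the authors'
# SPSS syntax): the classification effect on math scores with ELA entered as a
# covariate, and the same model with the covariate removed.
import statsmodels.api as sm
from statsmodels.formula.api import ols

def ancova_math_given_ela(df):
    with_covariate = ols('math ~ ela + C(classification)', data=df).fit()
    print(sm.stats.anova_lm(with_covariate, typ=2))    # classification effect adjusted for ELA

    without_covariate = ols('math ~ C(classification)', data=df).fit()
    print(sm.stats.anova_lm(without_covariate, typ=2))  # classification effect on raw scores
    return with_covariate
```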
Fig. 1 shows the impact of controlling the covariate (ELA test score = 486) on the math test scores for the six student groups in Grade 3. SWD and ELL students benefit the most when controlling for ELA test scores, with an 8.7% increase for SWD and a 5.1% increase for ELLs. EDA students see only a marginal increase in their math test scores (1.6%) after controlling for ELA test scores. The math test scores for non-SWD, non-ELL, and non-EDA students drop 2.2%, 1.7%, and 6.6%, respectively, when controlling for ELA test scores. Similar trends are obtained for Grades 4, 6, and 7 (Figs. 2, 4, and 5).
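The adjusted means plotted in Fig. 1 can be approximated by predicting each group's math score at a common ELA value (486 for Grade 3 in the paper), as sketched below with the hypothetical ANCOVA model from the previous snippet. Group labels and column names are again assumptions.

```python
# Hedged illustration of the adjusted means in Fig. 1: predict each group's math
# score at a common ELA value (486 for Grade 3 in the paper) from the fitted
# ANCOVA model sketched above. Group labels and column names are assumptions.
import pandas as pd

def adjusted_means(fitted_ancova, group_labels, ela_value=486):
    grid = pd.DataFrame({'classification': group_labels, 'ela': ela_value})
    grid['adjusted_math'] = fitted_ancova.predict(grid)
    return grid
```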
Fig. 1. ANCOVA, 3rd grade mean test scores.
Fig. 2. ANCOVA, 4th grade mean test scores.
Fig. 3. MANCOVA, 5th grade mean test scores.
Fig. 4. ANCOVA, 6th grade mean test scores.
Fig. 5. ANCOVA, 7th grade mean test scores.
A one-way MANCOVA is conducted to compare the impact of student classification on standardized test scores in math, science, and social studies for Grades 5 and 8 after controlling for ELA test scores. Using Wilks' criterion, MANCOVA results suggest significant differences in math, science, and social studies test scores by student classification for both grade levels (Table IV). The effect size is relatively large for fifth-grade math (0.697) and science (0.686) and for eighth-grade social studies (0.705), and relatively moderate for fifth-grade social studies (0.365), eighth-grade math (0.582), and eighth-grade science (0.431). The impact of student classification on the combined student test scores in math, science, and social studies is statistically significant after controlling for ELA test scores, with large effect sizes for Grades 5 and 8 (0.854 and 0.878, respectively). When the ELA test scores are removed from the analyses, the combined and individual test scores in math, science, and social studies still differ statistically according to student classification, but with much less impact than when ELA is included as a covariate. The effect sizes range between 0.016 (Grade 8 science) and 0.066 (Grade 5 combined subjects).
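For Grades 5 and 8, the equivalent multivariate step could be sketched as below, with the three content-area scores modeled jointly and ELA entered as a covariate. The column names are again assumptions rather than the authors' variable names.

```python
# Rough MANCOVA sketch for Grades 5 and 8 (assumed column names): the three
# content-area scores modeled jointly by classification with ELA as a covariate.
# statsmodels reports Wilks' lambda for each term in the design.
from statsmodels.multivariate.manova import MANOVA

def mancova_with_ela(df):
    model = MANOVA.from_formula(
        'math + science + social_studies ~ ela + C(classification)', data=df)
    print(model.mv_test())
```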
Fig. 3 shows the impact of controlling the covariate (ELA test score = 492) on the math, science, and social studies test scores for the six student groups in Grade 5. Similar to the ANCOVA results, SWD and ELLs benefit the most when controlling for ELA test scores, with up to 20% improvement in core content test scores for SWD and up to 22% improvement for ELLs. EDA students see only a marginal increase in their core content test scores (1.3%) after controlling for ELA test scores. The core content test scores for non-SWD, non-ELL, and non-EDA students drop between 2.1% and 11.8% when controlling for ELA test scores. Similar trends are obtained for Grade 8 students (Fig. 6).
Fig. 6. MANCOVA, 8th grade mean test scores.
Discussion
This study addresses some of the gaps in earlier research on the impact of academic language on special needs students and their peers. For example, previously described research yielded mixed results. Some studies provided strong evidence of the impact of academic language on student test scores for general education students and/or certain vulnerable student groups (Baird et al., 2020; Caponera et al., 2016; Snow, 2016; Uccelli et al., 2015). However, other studies argued that the disparities in assessment test scores between special needs and general education students are attributable to factors beyond academic language, and that any correlation between the two is inconclusive (Lara-Alecio et al., 2012; Shaftel et al., 2006; Smetana et al., 2019). In addition, most of these studies have been impeded by the lack of a diverse inclusion of special needs students and/or by small sample sizes. In contrast, this study offers comprehensive insights into the disparities in standardized assessment test scores associated with mastery of academic language across six student groups. It also encompasses all required EOG GMAS core subject tests (ELA, math, science, and social studies) and all students in Grades 3 to 8 across the state of Georgia. The raw data from GaDOE were used without modification and were imported into SPSS to conduct the rigorous statistical analyses used in this study.
The first research question of this study asks whether student classification has an impact on standardized test scores. The results of this study suggest that it does: special needs students (SWD, ELLs, and EDA students) do not attain test scores that are commensurate with their peers' in all core content areas and all grade levels. This result is consistent with findings from prior research assessing the impact of academic language on student achievement and success (Baird et al., 2020; OECD, 2010; Ro, 2018; Snow & Matthews, 2016). However, the results from this study provide better quantification of such impact and cover a broader spectrum of student classifications, core content areas, and grade levels. The MANOVA combined-subjects models indicate that the variability in test scores is attributed more to group membership in Grades 5 to 8 than in Grades 3 and 4. Attaining academic knowledge becomes more demanding as students advance to higher grades. Consequently, without proper mitigation plans, special needs students tend to struggle more at higher grade levels (Greenleaf et al., 2011; Lager, 2006). Follow-up ANOVAs show that individual modeling of core subjects differs significantly by group membership for all grades and all subjects. For all grade levels, the variability in test scores is more strongly related to group membership for ELA than for any other core subject. The domain structure of ELA (Reading, Writing, Vocabulary, and Language) is more influential and has a greater negative impact on the academic achievement of vulnerable groups than on the general student population (Smetana et al., 2019; Uccelli et al., 2015). Proficiency in academic language is generally understood as the attainment of linguistic knowledge that allows a student to comprehend textual language efficiently and to make inferences accurately (Perfetti & Stafura, 2014). As such, educational equity between special needs students and their peers is not easy to achieve due to the gap in acquiring academic language. For example, a study by Hakuta et al. (2000) indicated that it may take up to seven years for ELL students to acquire academic language knowledge that is on par with that of their peers.
In answer to the second research question, Table III shows statistically significant mean differences between the three special needs student groups and their counterparts in general education (SWD vs. non-SWD, ELL vs. non-ELL, and EDA vs. non-EDA). These results accentuate the disparity in standardized test scores between these groups despite the push by state and federal legislatures during the last few decades to address achievement gaps across the entire student population (Lee & Wu, 2017; Lewis & Young, 2013; Ro, 2018). Based on these results and similar results from previous studies (Brown & Clift, 2010; Cochran-Smith et al., 2013; Milner et al., 2021), it seems that schools are still struggling to implement effective, quality teaching programs that accurately measure and promote special needs students' learning and success. Furthermore, for the purpose of accountability, the No Child Left Behind Act (NCLB) of 2001 required individual states to put in place effective strategies to improve the academic achievement and progress of special needs students (NCLB, 2001). However, after the launch of the Common Core State Standards (CCSS, 2017) and assessments in 2009, special needs students have consistently underperformed their peers in all subjects and grade levels due to the rise in inclusion of these students in large-scale state assessments (Banse & Palacios, 2018; Ryan et al., 2017). Table III also shows statistically significant mean differences across various student groups and for all grade levels. The only exceptions are:
- Between SWD and ELLs (Grade 5: ELA, science, and social studies; Grades 6–8: Math)
- Between non-SWD and non-ELL (Grade 3: ELA; Grade 5: Social studies)
- Between non-SWD and EDA, non-SWD and non-EDA, and non-ELL and EDA (Grade 5: Social studies)
Unfortunately, GaDOE assigns each student to a single classification group and does not report the percentage of students who might belong to more than one group. Hence, caution must be exercised when making inferences about the mean differences across non-counterpart student groups.
When investigating the extent to which ELA test scores can be used as a predictor of students' standardized test scores in other core content areas, the ELA test score is used as a covariate while the other core content test scores are used as the dependent variables. When controlling for ELA test scores for third-grade students, there is a moderate improvement in math test scores for SWD and ELLs (8.7% and 5.1%, respectively), while the math test scores for EDA students improve by only 1.6%. Similar results are obtained for Grades 4 to 8, with an even higher impact of covarying the ELA test scores on math test scores. This result indicates that SWD and ELLs may require more intensive academic language enrichment than other student groups in order to improve their math test scores. It is possible that these two groups obtained lower math test scores because the assessment tests were conducted without accommodation, which could have impeded their ability to comprehend the test items. Since the mean ELA test score for EDA students is higher than that of SWD and ELLs and comparatively closer to the controlled ELA test score, the improvement in math scores for this student group is marginal. Increasing the controlled value of the ELA test score is expected to yield higher math test scores for this student group. Furthermore, covarying ELA test scores yields lower math test scores for the general student population, indicating that a decline in academic language proficiency for students in this group would have a negative impact on their math test scores. The variability in math test scores differs statistically according to group membership both when controlling for and when removing ELA test scores from the analyses (Table IV); however, the impact of covarying ELA test scores on the math test scores is considerably higher than that of removing it. Based on these results, one can conclude that mastery of academic language contributes to math test scores for all student groups, with a distinct level of impact on each group.
The state of Georgia requires students in Grades 5 and 8 to take additional EOG tests in the content areas of science and social studies. For these grade levels, the impact of language proficiency on science test scores is more prominent than on math or social studies. Science is a highly specialized content area that involves the use of complex language, scientific methods, and experimental investigations. Students are expected to be inquirers into the natural world around them and to systematically observe, report, and explain their own findings or the findings of other scientists (Caponera et al., 2016; Greenleaf et al., 2011). Numerous previous studies have recommended the integration of language literacy into science education (Akbash et al., 2016; Caponera et al., 2016; Fang & Wei, 2010; Galloway et al., 2019; Tong et al., 2014). Due to the high language demands in science, the literature has also shown that providing language accommodations such as translated glossaries/word lists can be effective in improving science test scores for ELL students (Fang & Wei, 2010; Young et al., 2008). As a result of such intervention programs, special needs students can improve their vocabulary and complex language use, gain new reading strategies, and actively engage in science knowledge and acquisition (Akbash et al., 2016; Galloway et al., 2019).
Conclusion
Academic language is frequently assumed to be a building block for all core subjects in a classroom. Without it, students cannot fully understand concepts or make connections within content areas to become adept learners (Smetana et al., 2019; Uccelli et al., 2015). This is even more true for special needs students such as SWD, ELLs, and EDA students (August & Shanahan, 2008; Banse & Palacios, 2018; Jeynes, 2002; Kirby, 2017; Saunders & O'Brien, 2006). The present study is designed to investigate the relationship between student classification and the standardized test scores of special needs students and their peers in elementary and middle schools across the state of Georgia. The efficacy of using ELA test scores as a predictor of student test scores in other core content areas is also investigated. The findings from this study indicate that there is evidence of disparities in ELA, math, science, and social studies test scores between special needs students and their peers. MANOVA results reveal that the combined-subjects models of student test scores differ significantly by student classification, with relatively large effect sizes (0.44 to 0.63) for all grade levels. Up to 64% of the variance in the combined-subjects models is attributed to group membership. Follow-up ANOVAs indicate that individual modeling of content subjects also differs significantly by group membership, with effect sizes between 0.37 and 0.61. ANCOVA and MANCOVA results (using Wilks' criterion) suggest significant differences in math, science, and social studies test scores by student classification for all grade levels after controlling for ELA test scores. SWD and ELLs benefited the most when controlling for ELA test scores, with up to 22% improvement in core content test scores for ELLs and up to 20% improvement for SWD. For EDA students, the improvement in core content test scores when controlling for ELA test scores is only marginal, at about 2%.
This study is beneficial in that it serves as an example of the type of research that should be conducted frequently to examine the efficacy of standardized assessments as a tool for closing achievement score gaps between special needs students and their peers. The increasing numbers of special needs students must compel us to rethink the role of assessment in promoting academic language proficiency and in driving student learning and success. Many researchers advocate the implementation of new assessment strategies utilizing new classroom practices, such as making student involvement an integral part of the student assessment system (Panadero et al., 2018; Stiggins et al., 2011; Thomas et al., 2019; Zeng et al., 2018). Stiggins et al. (2011) proposed the use of the so-called "Assessment for Learning" strategy, which occurs during classroom learning and uses proven research-based techniques to improve student achievement. Classroom teachers can take advantage of this assessment method by diagnosing the shortcomings of special needs students, planning instructional strategies accordingly, and providing the necessary day-to-day feedback to drive improvement. This study is also one of only a few, if any, to compare ELA test scores as a predictor of standardized test scores in math, science, and social studies for students in elementary and middle schools. The findings from this case study of the standards-based assessment system in the state of Georgia reinforce the importance of academic language in the academic success of all students, in particular students with special needs.
References
- Abedi, J. (2002). Standardized achievement tests and English language learners: Psychometric issues. Educational Assessment, 8(3), 234–257.
- Abedi, J., Hofstetter, C., Baker, E., & Lord, C. (2001). NAEP math performance and test accommodations: Interactions with student language background (CSE Tech. Rep. No. 536). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing.
- Akbash, S., Sahin, M., & Yaykiran, Z. (2016). The effect of reading comprehension on the performance in science and mathematics. Journal of Education and Practice, 7(16), 108–121.
- Alt, M., Arizmendi, G. D., & Beal, C. R. (2014). The relationship between mathematics and language: Academic implications for children with specific language impairment and English language learners. Language, Speech, and Hearing Services in Schools, 45(3), 220–233.
- August, D., Branum-Martin, L., Cardenas-Hagan, E., Francis, D. J., Powell, J., Moore, S., & Haynes, E. F. (2014). Helping ELLs meet the Common Core State Standards for literacy in science: The impact of an instructional intervention focused on academic language. Journal of Research on Educational Effectiveness, 7(1), 54–82.
- August, D., & Shanahan, T. (Eds.). (2008). Developing reading and writing in second-language learners: Lessons from the report of the National Literacy Panel on language-minority children and youth. Routledge.
- Baird, A. S., Garrett, R., & August, D. (2020). Math and English language development: MELDing content and academic language for English learners. NABE Journal of Research and Practice, 10(1), 1–12.
- Banse, H., & Palacios, N. (2018). Supportive classrooms for Latino English language learners: Grit, ELL status, and the classroom context. The Journal of Educational Research, 111(6), 645–656.
- Bradbury, B., Norris, K., & Abello, D. (2001). Socio-economic disadvantage and the prevalence of disability (SPRC Report 1/01). Social Policy Research Centre, University of New South Wales. https://www.aihw.gov.au/getmedia/d5b1b546-009c-451e-8af7-5c85df144e35/dis-54-10703.pdf
- Brown, A. B., & Clift, J. W. (2010). The unequal effect of adequate yearly progress: Evidence from school visits. American Educational Research Journal, 47(4), 774–798.
- Caponera, E., Sestito, P., & Russo, P. M. (2016). The influence of reading literacy on mathematics and science achievement. The Journal of Educational Research, 109(2), 197–204.
- CCSS. (2017). Application of Common Core State Standards for English language learners. Common Core State Standards (CCSS) Initiative. http://www.corestandards.org/assets/application-for-english-learners.pdf
- Cochran-Smith, M., Piazza, P., & Power, C. (2013). The politics of accountability: Assessing teacher education in the United States. The Educational Forum, 77(1), 6–27.
- Connor, D. J., & Cavendish, W. (2018). 'Sit in my seat': Perspectives of students with learning disabilities about teacher effectiveness in high school inclusive classrooms. International Journal of Inclusive Education, 24(3), 288–309.
- Considine, G., & Zappalà, G. (2002). The influence of social and economic disadvantage in the academic performance of school students in Australia. Journal of Sociology, 38(2), 129–148.
- Eamon, M. K. (2005). Social-demographic, school, neighborhood, and parenting influences on the academic achievement of Latino young adolescents. Journal of Youth and Adolescence, 34(2), 163–174.
- Fang, Z., & Wei, Y. (2010). Improving middle school students' science literacy through reading infusion. The Journal of Educational Research, 103(4), 262–273.
- GaDOE. (n.d.). Georgia Milestones Assessment System. Georgia Department of Education. https://www.gadoe.org/Curriculum-Instruction-and-Assessment/Assessment/Pages/Georgia-Milestones-Assessment-System.aspx
- Galloway, E. P. (2016). The development of core academic language and reading comprehension in pre-adolescent and adolescent learners (Doctoral dissertation). http://nrs.harvard.edu/urn-3:HUL.InstRepos:27112709
- Galloway, E. P., Qin, W., Uccelli, P., & Barr, C. D. (2019). The role of cross-disciplinary academic language skills in disciplinary, source-based writing: Investigating the role of core academic language skills in science summarization for middle grade writers. Reading and Writing, 33(1), 13–44.
- Greenleaf, C. L., Litman, C., Hanson, T. L., Rosen, R., Boscardin, C. K., Herman, J., Schneider, S. A., Madden, S., & Jones, B. (2011). Integrating literacy and science in biology: Teaching and learning impacts of reading apprenticeship professional development. American Educational Research Journal, 48(3), 647–717.
- Haag, N., Heppt, B., Roppelt, A., & Stanat, P. (2014). Linguistic simplification of mathematics items: Effects for language minority students in Germany. European Journal of Psychology of Education, 30(2), 145–167.
- Hakuta, K., Butler, Y. G., & Witt, D. (2000). How long does it take English learners to attain proficiency? (University of California Linguistic Minority Research Institute Policy Report 2000-1). University of California, Santa Barbara. www.lmri.ucsb.edu
- Hakuta, K., Santos, M., & Fang, Z. (2013). Challenges and opportunities for language learning in the context of the CCSS and the NGSS. Journal of Adolescent & Adult Literacy, 56(6), 451–454.
- Henry, D., Nistor, N., & Baltes, B. (2014). Examining the relationship between math scores and English language proficiency. Journal of Educational Research and Practice, 4(1), 11–29.
- Heppt, B., Haag, N., Böhme, K., & Stanat, P. (2015). The role of academic-language features for reading comprehension of language-minority students and students from low-SES families. Reading Research Quarterly, 50(1), 61–82.
- Hernandez, D. J. (2012). Double jeopardy: How third-grade reading skills and poverty influence high school graduation. The Annie E. Casey Foundation.
- Hochschild, J. L. (2003). Social class in public schools. Journal of Social Issues, 59(4), 821–840.
- Jeynes, W. H. (2002). Examining the effects of parental absence on the academic achievement of adolescents: The challenge of controlling for family income. Journal of Family and Economic Issues, 23(2), 189–210.
- Johnson, E., & Monroe, B. (2004). Simplified language as an accommodation on math tests. Assessment for Effective Intervention, 29(3), 35–45.
- Kearney, M. S., & Levine, P. B. (2016). Income inequality, social mobility, and the decision to drop out of high school. Brookings Papers on Economic Activity, 2016(1), 333–396.
- Kieffer, M. J. (2008). Catching up or falling behind? Initial English proficiency, concentrated poverty, and the reading growth of language minority learners in the United States. Journal of Educational Psychology, 100, 851–868.
- Kirby, M. (2017). Implicit assumptions in special education policy: Promoting full inclusion for students with learning disabilities. Child & Youth Care Forum, 46(2), 175–191.
- Lager, C. (2006). Types of mathematics-language reading interactions that unnecessarily hinder algebra learning and assessment. Reading Psychology, 27, 165–204.
- Lara-Alecio, R., Tong, F., Irby, B. J., Guerrero, C., Huerta, M., & Fan, Y. (2012). An experimental study of science intervention among middle school English learners: Findings from first year implementation. Journal of Research in Science Teaching, 49, 987–1011.
- Lee, J., & Wu, Y. (2017). Is the Common Core racing America to the top? Tracking changes in state standards, school practices, and student achievement. Education Policy Analysis Archives, 25, 35.
- Lewis, W. D., & Young, T. V. (2013). The politics of accountability: Teacher education policy. Educational Policy, 27(2), 190–216.
- Matejevic, M., Jovanovic, D., & Jovanovic, M. (2014). Parenting style, involvement of parents in school activities and adolescents' academic achievement. Procedia - Social and Behavioral Sciences, 128, 288–293.
- McFarland, J., Hussar, B., Wang, X., Zhang, J., Wang, K., Rathbun, A., Barmer, A., Cataldi, E. F., & Mann, F. B. (2018). The condition of education 2018 (NCES 2018-144). National Center for Education Statistics. https://eric.ed.gov/?id=ED583502
- Milner, A. L., Mattei, P., & Ydesen, C. (2021). Governing education in times of crisis: State interventions and school accountabilities during the COVID-19 pandemic. European Educational Research Journal, 20(4), 520–539.
- Mukherjee, D. (1995). The relationship between socio-economic background and participation in education (ACEE Research Monograph No. 1). ACEE.
- NCLB. (2001). No Child Left Behind Act of 2001, Pub. L. No. 107–110, § 115, Stat. 1425 (2002).
- NRC. (2002). Reporting test results for students with disabilities and English-language learners: Summary of a workshop. National Research Council, The National Academies Press.
- OECD. (2004). Learning for tomorrow's world: First results from PISA 2003. OECD Publishing. https://doi.org/10.1787/9789264006416-en
- OECD. (2006). Where immigrant students succeed: A comparative review of performance and engagement in PISA 2003. OECD Publishing. https://doi.org/10.1787/19963777
- OECD. (2010). PISA 2009 results: What students know and can do: Student performance in reading, mathematics, and science. OECD Publishing. https://doi.org/10.1787/19963777
- Orlich, D. C., & Gifford, G. (2006). Test scores, poverty and ethnicity: The new American dilemma. http://www.cha.wa.gov/?q=files/Highstakestesting_poverty_ethnicity.pdf
- Panadero, E., Broadbent, J., Boud, D., & Lodge, J. M. (2018). Using formative assessment to influence self- and co-regulated learning: The role of evaluative judgement. European Journal of Psychology of Education, 34(3), 535–557.
- Perfetti, C., & Stafura, J. (2014). Word knowledge in a theory of reading comprehension. Scientific Studies of Reading, 18(1), 22–37.
- Polat, N., Zarecky-Hodge, A., & Schreiber, J. B. (2016). Academic growth trajectories of ELLs in NAEP data: The case of fourth- and eighth-grade ELLs and non-ELLs on mathematics and reading tests. The Journal of Educational Research, 109(5), 541–553.
- Price, M. (2019, March 30). What parents should know about Georgia Milestones tests. The Ledger-Enquirer. https://www.ledger-enquirer.com/news/local/education/article68973067.html
- Ro, J. (2018). Lost in transition: Learning to teach in the era of test-based accountability. In C. Wyatt-Smith & L. Adie (Eds.), Innovation and accountability in teacher education: Setting directions for new cultures in teacher education (pp. 51–63). Springer.
- Ryan, S. V., von der Embse, N. P., Pendergast, L. L., Saeki, E., Segool, N., & Schwing, S. (2017). Leaving the teaching profession: The role of teacher stress and educational accountability policies on turnover intent. Teaching and Teacher Education, 66, 1–11.
- Saunders, W. M., & O'Brien, G. (2006). Oral language. In F. Genesee, K. Lindholm-Leary, W. M. Saunders, & D. Christian (Eds.), Educating English language learners: A synthesis of research evidence (pp. 14–63). Cambridge University Press.
- Schleppegrell, M. J. (2012). Academic language in teaching and learning. The Elementary School Journal, 112(3), 409–418.
- Shaftel, J., Belton-Kocher, E., Glasnapp, D., & Poggio, J. (2006). The impact of language characteristics in mathematics test items on the performance of English language learners and students with disabilities. Educational Assessment, 11(2), 105–126.
- Smetana, L. K., Sanei, J. C., & Heineke, A. J. (2019). Pedagogical language knowledge: An investigation of a science teacher candidate's student teaching strengths and struggles. Action in Teacher Education, 42(2), 149–166.
- Snow, C. E., & Matthews, T. J. (2016). Reading and language in the early grades. The Future of Children, 26(2), 57–74.
- Snow, M. A. (2016). Content-based language teaching and academic language development. In E. Hinkel (Ed.), Handbook of research in second language teaching and learning (1st ed.). Routledge.
- Sparks, S. D. (2016). Teaching English-learners: What does the research tell us? Education Week, 35(30), 3–6.
- Stiggins, R. J., Arter, J. A., Chappuis, J., & Chappuis, S. (2011). Classroom assessment for student learning: Doing it right – using it well. Educational Testing Service.
- Suizzo, M. A., & Stapleton, L. M. (2007). Home-based parental involvement in young children's education: Examining the effects of maternal education across U.S. ethnic groups. Educational Psychology, 27(4), 533–556.
- Szpara, M. (2017). Injustice for English language learners – Common Core raises academic standards without increasing supports: A descriptive case study of a Philadelphia teacher cohort. Literacy Practice & Research, 43(1), 26–33.
- Thomas, V., Muls, J., De Backer, F., & Lombaerts, K. (2019). Exploring self-regulated learning during middle school: Views of parents and students on parents' educational support at home. Journal of Family Studies, 27(2), 261–279.
- Tindal, G., Anderson, L., Helwig, R., Miller, S., & Glasgow, A. (2000). Accommodating students with learning disabilities on math tests using language simplification. Eugene, OR: University of Oregon Research, Consultation, and Teaching Program.
- Tong, F., Irby, B. J., Lara-Alecio, R., & Koch, J. (2014). Integrating literacy and science for English language learners: From learning-to-read to reading-to-learn. The Journal of Educational Research, 107(5), 410–426.
- Turkan, S., de Oliveira, L., Lee, O., & Phelps, G. (2014). Proposing a knowledge base for teaching academic content to English language learners: Disciplinary linguistic knowledge. Teachers College Record, 116(4), 1–30.
- Uccelli, P., Galloway, E. P., Barr, C. D., Meneses, A., & Dobbs, C. L. (2015). Beyond vocabulary: Exploring cross-disciplinary academic-language proficiency and its association with reading comprehension. Reading Research Quarterly, 50(3), 337–356.
- NCES. (2015). English language learners in public schools. U.S. Department of Education, National Center for Education Statistics, The Common Core of Data (CCD). https://nces.ed.gov/programs/coe/indicator/cgf/english-learners
- Vaughn, S., Martinez, L. R., Wanzek, J., Roberts, G., Swanson, E., & Fall, A.-M. (2017). Improving content knowledge and comprehension for English language learners: Findings from a randomized control trial. Journal of Educational Psychology, 109(1), 22–34.
- Walker, C. M., Zhang, B., & Surber, J. (2008). Using a multidimensional differential item functioning framework to determine if reading ability affects student performance in mathematics. Applied Measurement in Education, 21(2), 162–181.
- Young, J. W., Cho, Y., Ling, G., Cline, F., Steinberg, J., & Stone, E. (2008). Validity and fairness of state standards-based assessments for English language learners. Educational Assessment, 13(2–3), 170–192.
- Zappala, G., & Parker, B. (2000). The Smith Family's Learning for Life program a decade on: Poverty and educational disadvantage (Background Paper No. 1). Research and Advocacy Team, The Smith Family.
- Zeng, W., Huang, F., Yu, L., & Chen, S. (2018). Towards a learning-oriented assessment to improve students' learning: A critical review of literature. Educational Assessment, Evaluation and Accountability, 30(3), 211–250.