Most teachers want to know why one student scored one way and another student scored differently. When it comes to AAPPL testing, the story is no different. Here are a few common scenarios that tend to prompt this question, along with ways you can address them.


Concern about the Form Selected 

It is not uncommon to see teachers and administrators discussing the selection of AAPPL forms in Facebook groups and other professional networks. LTI provides information on how the forms are organized, which proficiency levels they target, and the potential ratings a student may receive on each form, so teachers and administrators can make informed decisions when selecting an AAPPL form. Sometimes teachers are surprised that a learner did not receive an I-5 on a Form A test because they were so impressed with the learner's responses. However, a Form A test cannot produce an I-5 rating: ratings cannot be issued outside the intended proficiency range of a prompt or test form, no matter how stellar the learner's responses may be.

  • Form E: assesses Novice to Intermediate Mid (Grades 3-4 only) 
  • Form A: assesses Novice to Intermediate Mid 
  • Form B: assesses Novice High to Advanced Low

Note that both Form A and Form B assess the Intermediate levels of proficiency; however, Form A does not provide a score beyond I-4. Thus, if you anticipate that students are likely to achieve beyond Intermediate Mid or have begun to exhibit emerging Advanced-level characteristics, Form B may be more appropriate: it gives them the opportunity to show what they can do at both the Intermediate and Advanced levels.

For the Seal of Biliteracy, some states require Form B of the AAPPL. Either form may also be used to qualify for varying levels of recognition via the Global Seal of Biliteracy. Some states, such as New York, also specify which form of the AAPPL to use to fulfill curriculum standards in middle and high school. Check with your state Department of Education to verify whether a specific AAPPL form is required for your testing purposes.

Difference in Reading and Listening Scores for the Same Student 

It is not uncommon to see different Interpretive Listening (IL) and Interpretive Reading (IR) scores for the same candidate. Language acquisition is not a linear process, and not all skills develop in the same way or at the same pace. It is a misconception that reading and listening, though both receptive skills, must develop in lockstep; likewise, the productive skills of speaking and writing do not necessarily develop synchronously.

Several studies have examined how language skills develop. One study of second- and third-grade students found that “reading comprehension explained 34% of the variance in listening comprehension and listening comprehension [explained] 40% of [variance in] reading comprehension,” noting that several factors may impact reading and listening development independently of one another (Wolf et al., 2019). In other words, the two skills overlap substantially but are far from identical, so candidates may well earn different scores on listening and reading.

Looking for Patterns  

You can identify patterns in your students' language performance using the robust reports available in the Client Site Portal. Within the reporting dashboard, you can filter data by school, language, and grade, with scores broken out by test component. You can also compare data from year to year to spot emerging trends.

You may see IR scores rising while IL scores decline, or an overall trend of scores increasing from grade 7 to grade 9, for example. Looking at patterns across grades, languages, or specific skills can help you identify areas that may need more attention in your curriculum. It can also help you identify and celebrate positive trends in language development!

So, What Factors Might Contribute to Test Scores? 

Many external factors can affect how students perform on a test, including: 

  • Test fatigue 
  • Personal or home issues 
  • Poor sleep or nutrition 
  • Apathy toward the test 
  • Timing of test administration (e.g., close to the end of the school day or lunch time) 
  • Testing on the same day as other subjects 
  • Changes in curriculum or teaching approaches 
  • COVID-related matters (e.g., lack of practice speaking and listening) 

For this reason, it is difficult to pinpoint why a particular student tested the way they did. However, it can be very useful to monitor students' language progress over several years via longitudinal reporting, and to compare results against national averages to contextualize what you are seeing. Consider listening to your learners' spoken responses and reading their written responses for added insight. And remember that you can request a rating review of a test via the Client Site Portal if you have persistent concerns about a score.

Contact Us Today to Bring Language Proficiency Testing to Your Organization

References 

ACTFL. (2024). ACTFL proficiency guidelines 2024. Alexandria, VA: American Council on the Teaching of Foreign Languages. Retrieved from https://www.actfl.org/uploads/files/general/Resources-Publications/ACTFL_Proficiency_Guidelines_2024.pdf  

Wolf, M. C., Muijselaar, M. M., Boonstra, A. M., & de Bree, E. H. (2019). The relationship between reading and listening comprehension: Shared and modality-specific components. Reading and Writing, 32, 1747-1767. 
