Jen Ouellette-Schramm & Jen Vanek
We are pleased to bring you the Spring 2020 issue of MinneTESOL Journal. This issue responds to some of the challenges and opportunities this coronavirus era has posed to English language learners and teachers. In her invited piece, “The Education of Latinx Bilingual Children in Times of Isolation: Unlearning and Relearning,” Dr. Ofelia García challenges conventional conceptions of minoritized Latinx bilinguals, language proficiency, and language teaching, and suggests new ways of teaching Latinx children with care. In her invited piece, “New Ways of Serving Adult ESOL Learners: Innovation Stems from Disruption,” Senior Editor Dr. Jen Vanek shares resources and strategies to help instructors who support adult ESOL learners employ educational technology not only to meet the demands of teaching at a distance during the pandemic, but also to rethink how lessons learned now might create opportunity for further expanding learner access and personalization after a return to face-to-face education is possible.
In their article, “How Dispositions Are(n’t) Addressed in the English Learner Case Study Assignment,” Dr. Miranda Schornack, Dr. Michelle Benegas, and Amy Stolpestad analyze the potential of a teacher education methods course assignment for promoting dispositional development toward equitably serving English language learners. Finally, in his piece, “Building an Integrative Classroom,” Matt Delaini reflects on applying social work concepts to ESL teaching to support student motivation and develop humane ways of responding to motivational challenges.
This issue also marks a change in MinneTESOL Journal leadership, as Dr. Jen Ouellette-Schramm concludes her three-year tenure co-editing the journal and welcomes Dr. Michelle Benegas as a new editor. We are all very excited to have Michelle take on this leadership role. Michelle Benegas, Ph.D., is an assistant professor at Hamline University. She has taught ESL in K-12, adult basic education, and college settings. In her work with teachers and schools, she promotes a model in which ESL teachers serve as site-based experts and coaches to their general education colleagues. Her research interests include ESL teacher leadership, teacher leader identity, and systemic approaches to improving EL services.
The “pause” offered during the coronavirus pandemic permits me to reflect on principles about language, children’s bilingualism, and their education long considered mainstream. I propose that this is a time to unlearn, and relearn anew. I address the invalidity of traditional principles for Latinx bilingual students and propose other understandings.
Key words: Bilingualism, language, language teaching, Latinx, pedagogical practices, translanguaging
I write this as I sit home in isolation after having recovered from the coronavirus. New York City is silent, except for the sounds of sirens carrying patients to hospitals. What can I still say about the education of Latinx minoritized bilinguals when interaction with others is limited and schools are closed? When standardized tests have been suspended and educational authorities have stopped talking about standards, academic language, and categories of children? There is much suffering and much darkness in this time of crisis, but there is also time to unlearn and relearn.
Children in the United States and all over the world are suffering. In New York City, children are questioning their isolation, the absence of parks and playgrounds, of friends and family. A health crisis like the one we are facing hits all children with fear, even if some can escape to summer homes and have the advantages of technology and homeschooling by parents whose jobs can be done from home. What will children know when they come out of this? How will educators continue to care for them, to relieve the fear? What lessons will we have learned? These are all questions that we will have to face.
The question for me now as I write this is: What understandings do I still hold on to when language education, as we knew it, has ceased to exist? How do we navigate the wounds, the heridas that have surfaced in these dark times to reconstruct life anew for all children, and especially for those like Latinx minoritized bilinguals who are most vulnerable?
In what follows, I reflect on some principles about language education and the education of Latinx bilingual children that have been considered mainstream understandings. I propose that we need to unlearn, so that we can relearn anew. I address three categories of mainstream understandings about language and education––1) our understandings about language, 2) our understandings about language proficiency and how these produce categories of learners, 3) our understandings of language teaching.
Here I take up the call made by the Portuguese philosopher Boaventura de Sousa Santos in his work on the “epistemologies of the South” (2007, 2014). Santos calls for a different logic, a way of knowing that includes the knowledge systems of those who have suffered most from the effects of colonialism and global capitalism. I first discuss mainstream understandings, and I propose some alternative thinking of alternatives (Santos, 2007). Thinking from “both sides of the line” allows us to adopt a measure of cognitive justice for these children. The education of these children then is refocused as we relearn what it means to educate with difficult loving care so as to attend to their suffering and fear.
I address here two mainstream understandings about the language of Latinx bilinguals:
These understandings of language have been constructed in ways that render these bilingual children deficient because they are compared to what is understood as the only valid knowledge—that of monolingual white middle-class children and their communities. When knowledge of language is seen only from the powerful side of the line, with what is said to be “modern science and scholarship,” what is, in reality, the practice of one group is then expected of those whose knowledge has been relegated to the other side of the line, and thus rendered invisible or non-existent.
When these monolingual white middle-class students learn another language in schools, their additive bilingualism, with two languages that reflect different nation-states and cultural systems, is then the only form of bilingualism that is validated. In this way, the dynamic bilingualism that characterizes bilingual communities who live their lives in what Gloria Anzaldúa has called “borderlands” (1987) is maligned. The community’s bilingualism is seen as a “mixture” of languages; their knowledge of language is rendered incomplete, full of errors. When their bilingualism is studied, it is to point out phenomena that do not conform to monolingual use—the use of loans, calques, and what is described as code-switching. In reality, however, the language of bilinguals in communities simply does not fall squarely within the boundaries that have been constructed around named languages like English or Spanish and what is fashioned as “standard language.” The concept of a standard language has been constructed by nation-states and their institutions in an effort to control whose language and knowledge systems are rendered valid. The language of bilingual communities has been made deficient by imposing the knowledge-system of white monolingual middle-class people. In so doing, those on “the other side of the line” have undergone a process of minoritization. Latinx bilingual children’s language is characterized by absences, by what is not there. This renders their translanguaging, that is, their own complex language which does not fit the constructed canons of what states and their institutions propose to be English or Spanish, more and more silent, until it is rendered inaudible and non-existent (for more on translanguaging, see especially García & Li Wei, 2014; Otheguy et al., 2015, 2018).
In the last few years, schools have imposed another language construct that restricts our view of Latinx bilingual students as knowledgeable about language. This construct is what has been called academic language. It is now said that Latinx bilingual students fail not just because they do not “have” English or Spanish, but because they also do not have academic language.
Although scholars have worked assiduously to try to define it (cf. Snow & Uccelli, 2009; Uccelli et al., 2015), we understand less and less what it is. Is it just the language of written academic texts used in the United States? And if this is so, does it include all texts said to be academic, including those in the Humanities and the Social Sciences? Does it include texts of Latin American philosophers, for example? Is it the language of teachers? Which teachers? Doing what?
Systemic Functional Linguistics (SFL) has been applied to the construct of academic language to identify how grammatical structures are derived from different types of socially relevant tasks within varied social contexts (Schleppegrell, 2012). But even when this work is done by critical sociolinguists who incorporate the language and cultural repertoires of Latinx bilingual students, SFL leaves out the knowledge-system, the forms of consciousness of those considered to be “on the other side of the line.” That is, since Latinx bilingual students are not considered valid members of the only culture and group that has been constructed as legitimate, their knowledge-practice, that is, the ways in which they think about and act on language, has been left out. Thus, the concept of academic language adds to the burden and the failure of Latinx bilingual students and renders their knowledge of language and bilingualism as non-academic, popular, intuitive, incomprehensible, or simply wrong.
There are two mainstream understandings about language proficiency and how it relates to the categorization of students that circulate as “truths” in educational circles. They are:
The concept of language proficiency is one that responds to the advent of measurement, with modern science restricting the field of knowledge so that it fits within the contours of what can be measured. In order to measure language, it had to be made into an entity made up of grammatical components, an object that human beings either have or do not have, to a greater or lesser degree.
But language is an activity, a product of complex social action (Becker, 1995; Maturana & Varela, 1984). Language is always a languaging, a verb, always in motion and in relationship to life and its context. As such, language is immeasurable, an ongoing process that defies measurement.
Yet it is the first definition of language as an object that is used in education. Through measurements of what is objectified as language, reflecting the language of white monolingual middle-class people, the “others” are rendered “limited.” And thus, many Latinx bilingual students are labeled as “Limited English Proficient,” or as “English Language Learners.” Note well what I am saying, which is worth repeating. It turns out that Latinx bilingual children are “invented” through these measurements as “limited” and “learners” of a language that actually makes up their bilingualism. The translanguaging of Latinx bilinguals, a more complex and dynamic way of doing language, of languaging with many different interlocutors, is then reduced to a limitation and a deficiency, a lack of proficiency. This in turn makes it possible to create categories of children—those who can be educated, and those who have to first learn “English,” in ways that are simply not theirs.
Latinx bilingual children labeled “English learners” are then seen and listened to through absences, through what they do not have, through what are seen as their limitations. Their emergent bilingualism is negated. Instead of being recognized for what they do with language, with their complex translanguaging, they are penalized for not “having” a language that has been constructed precisely to leave out their own language. The limitation is not that of the children; it is of an educational system that uses invalid measures to rob some of them of rich instruction and enrichment programs in the arts. It is a limitation of an educational system that then reduces instruction for these children to remediating what they are said not to have. Instruction becomes a way to make these bilingual children reach an English language “standard,” that will remain out of reach for them because it requires them to “have” something that has been defined a priori as simply not theirs.
The so-called objective measures of language proficiency have served to amplify categories of limitation, so that more Latinx bilingual children qualify for remedial instruction. Instead of opening up a more generous space where all children can receive an enriching education, more and more Latinx bilingual children fall short of standards that were never meant to include them.
Language educators often adhere to two principles that are accepted as universal:
Curriculum for language teaching follows a scope and sequence that responds mostly to the language use and development of monolingual middle-class children. But most Latinx bilingual children are simultaneous bilinguals, which means that they are developing their bilingualism at home, usually from the time they are born, as they interact with siblings and family and community members. And yet, the teaching of, for example, English as a second language to Latinx bilinguals labeled “English learners” proceeds as if they have little practice with English, although many have heard it and have used it from the time they learned to talk. For some, now labeled “Long Term English Language Learners” because of faulty notions of language proficiency, English may be the only language they speak.
When teaching Latinx bilingual students who have recently arrived in the United States, the scope and sequence followed in English as a second language programs also treats the language as an object, a series of phonological, morphological, syntactical and lexical elements that can be taught through skill and drill. And although the curriculum of bilingual education programs breaks from this focus on teaching and learning language as an object, the creation and growth of dual language programs where white English-speaking monolingual students participate has meant that a sequence based on a tradition of “foreign language” learning is now given priority. This means that Latinx bilingual children are asked (at least officially, even if it does not happen in reality) to never use “Spanish” during “English” instruction, and never use “English” during “Spanish” instruction. This demeans even further the bilingual community’s use of translanguaging. As such, many dual language education programs have become simply language education programs that ignore and punish with even more fury the bilingualism of the Latinx community.
Everything that we have done in the past to “remediate” the language of Latinx bilinguals has failed us. It is time to unlearn these understandings that we have held dear.
But then, what is it that we must relearn? How can we then teach Latinx bilingual children with loving care that is not simply an emotion, but an action? The answer has to do with teaching Latinx bilingual children lovingly about the difficult histories that have surrounded language. The answer has to do with incorporating the knowledge-practice from both sides of the line, not just from the powerful side of the line.
Educators of Latinx bilingual students must pose two questions of their teaching:
Instead of teaching with a goal of helping Latinx bilingual children meet externally-imposed criteria, educators must ask themselves:
Living with the coronavirus crisis might help give educators the courage to act differently when they return to classrooms. To heal we will need to understand the difficult histories of how the crisis evolved. This might give us the courage we need to help children understand the role that language in schools has played in the systemic and unjust suffering of Latinx bilingual children. As language educators, we must relearn, as we reflect during this time of coronavirus. Only by shifting gears will we ensure that Latinx bilingual children resignify their lives and education with dignity.
Anzaldúa, G. (1987). Borderlands/La frontera: The new mestiza. Aunt Lute Books.
Becker, A. L. (1995). Beyond translation: Essays toward a modern philosophy. University of Michigan Press.
García, O., & Li Wei. (2014). Translanguaging: Language, bilingualism and education. Palgrave Macmillan Pivot.
Maturana, H., & Varela, F. (1984). El árbol del conocimiento: Las bases biológicas del entendimiento humano. Lumen/Editorial Universitaria.
Otheguy, R., García, O., & Reid, W. (2015). Clarifying translanguaging and deconstructing named languages: A perspective from linguistics. Applied Linguistics Review, 6(3), 281-307. https://doi.org/10.1515/applirev-2015-0014
Otheguy, R., García, O., & Reid, W. (2018). A translanguaging view of the linguistic system of bilinguals. Applied Linguistics Review, 10(4), 625-651. https://doi.org/10.1515/applirev-2018-0020
Santos, B. de S. (2007). Beyond abyssal thinking: From global lines to ecologies of knowledges. Review (Fernand Braudel Center), 30(1), 45-89. https://www.jstor.org/stable/40241677
Santos, B. de S. (2014). Epistemologies of the South: Justice against epistemicide. Routledge.
Schleppegrell, M. J. (2012). Systemic functional linguistics: Exploring meaning in language. In J. P. Gee & M. Handford (Eds.), The Routledge handbook of discourse analysis (pp. 21-34). Routledge.
Snow, C. E., & Uccelli, P. (2009). The challenge of academic language. In D. R. Olson & N. Torrance (Eds.), The Cambridge handbook of literacy (pp. 112-133). Cambridge University Press.
Uccelli, P., Barr, C., Dobbs, C., Galloway, E. P., Meneses, A., & Sanchez, E. (2015). Core academic language skills (CALS): An expanded operational construct and a novel instrument to chart school-relevant language proficiency in preadolescent and adolescent learners. Applied Psycholinguistics, 36(5), 1077-1109. https://doi.org/10.1017/S014271641400006X
The COVID-19 pandemic has forced adult basic skills and ESOL programs to offer instruction at a distance. The uncertainty of the future means programs must rethink sustainable alternatives to traditional classroom programming. In this way, the pandemic has forced a change that might just reshape adult learning—potentially making it more flexible and personalized in the days to come.
Key words: COVID-19, adult ESOL, online learning, Adult Basic Education
The COVID-19 pandemic has forced teachers to rethink the way they support learners and provide instruction. This is a global challenge: according to data from UNESCO (n.d.), the coronavirus has instantly forced 1.9 billion students and teachers worldwide, effectively 70% of total enrolled learners and their educators, online. Since mid-March 2020, the Ed Tech Center @ World Education has been supporting adult basic skills programs, as they have moved their instruction online, by sharing research, resources, and innovative strategies we’ve seen percolating in Adult Basic Education (ABE) and English for Speakers of Other Languages (ESOL) “classes” across the United States. As programs have risen to the challenge to meet the needs of their learners, the growth in distance education we’ve observed has been unprecedented.
This rapid expansion of distance education in many states is built on a foundation of incremental growth in adult basic skills and ESOL programs over the past several years. Federal enrollment data for IDEAL Consortium member states, states that have prioritized development of distance education programs and collaborate as a community of practice to do so, show that in FY 2014-2015, 12,820 distance learners took more than 50% of their coursework online in these states. In FY 2018-2019, that number grew to nearly 30,000.
This is promising, to be sure; yet those learners are only a fraction of the total number of students enrolled in adult basic skills and ESOL programs, and those enrolled represent only a fraction of adults in this country who have basic skills and literacy needs or lack a high school diploma. This reality is mapped nicely in a new Barbara Bush Foundation resource which shows the Programme for the International Assessment of Adult Competencies (PIAAC) adult literacy data at the county level. Figure 1 shows a screenshot of the interactive map on the foundation’s website, which illustrates areas of the United States with the lowest scores on the most recent PIAAC literacy assessment and then layers on data showing the connection between literacy levels and factors that negatively impact well-being.
How is this related to the current pandemic? The map shows there is clearly great need for programming to support adult literacy in the United States. Even before the pandemic, the programs in place were not meeting all of the need. Leveraging technology can extend and enrich learning to make it more available to a greater pool of adult learners (Rosin et al., 2017; Vanek et al., 2019). Despite the steady progress establishing distance education, there have always been programs that have struggled to make distance and online learning a priority—perhaps because of concerns about access to technology or the challenge of helping students and teachers feel comfortable working together remotely. These are still valid concerns, yet today, because of the pandemic, reticence about moving online is no longer an option. Programs that had previously been uncertain about distance education have now had no choice but to figure out how to make it work. Though it has not been easy, many have made the shift and are now offering at least some instruction at a distance, and much of that using online resources and technology.
As I’ve watched all this unfold, I’ve been reminded of a book called Disrupting Class by Clayton Christensen. Somewhat dated now (written in 2011), the book is built on a concept called disruptive innovation, a theory that explains how a new strategy, process, or tool disrupts existing structures (Christensen et al., 2011). Applied to education, the theory describes how new technologies (or new uses of them) can disrupt existing learning structures, such as traditional classroom programming. The theory of disruptive innovation suggests that the trajectory of online learning could lead toward more relevant and more flexible learning experiences for more students—expanding opportunities that are more accessible and personalized, in contrast to the classroom-bound structures that don’t always work for learners because of time and place constraints (e.g., learners can’t make it to class because of work or family obligations).
The pandemic and the forced move toward more use of online technologies is a disruption that has required a shift to more flexible learning opportunities. Since mid-March, I’ve seen programs offering a range of online learning options and teachers across the United States move more to a facilitator role. The result has been increased student-centered learning made possible as instructors draw on multiple technologies and online resources to meet different students’ needs. As teachers and learners grow more comfortable working together online, instruction becomes more and more student-driven because learners can work independently, collaborate via technologies with their classmates, and access personally-relevant instructional content.
For example, adult ESOL instructors are creating opportunities for learning online outside of their scheduled Zoom class sessions. A common practice is to use a class website as a place to bank supplemental resources that they can assign according to the needs or interests of the learners who are present. The same site can be used to integrate Google docs and slides to support collaborative learning online. Another common example is use of such a website in tandem with a free online learning curriculum (e.g., USA Learns or We Speak NYC). The teacher can monitor learners’ work and assign supplemental resources according to what they observe in the curriculum. Other teachers are delivering micro-learning opportunities sharing media-rich content through WhatsApp. Specific examples of how teachers have used these strategies and the resources they created can be found in the EdTech Center’s archive of Distance Education Strategy Sessions.
Such instruction may seem far-fetched in some places, but I think, starting small and building on the forced innovation in place now during the pandemic, we can reshape the work of adult basic skills and ESOL programming to deliver more flexible, truly personalized, and relevant learning experiences for more adult learners than our programs have had the capacity to reach via the old structures.
You may have already taken the plunge and worked out how to move instruction online. For programs still feeling their way and hoping to expand their distance offerings beyond paper packets, here are some critical first steps, based on work published in the Ed Tech Center’s Tips for Distance Learning.
Collect all relevant contact information for your learners (e.g., cell phone numbers, email, home addresses) so you know how you will keep in contact with them. Be sure to find out which channels your students prefer; possibilities include phone calls, videoconferences, texting, apps (such as WhatsApp or Remind), or email.
What technology access do your teachers and students have (e.g., licenses to online products, web-based teacher-created curricula, devices, Internet)? Make a list of these assets and plan how you will communicate them to learners and staff. Consider steps you might take to prevent access-equity gaps from getting wider. What resources can you refer your learners to—things that leverage the access they have, like mobile phones? If you know that there is no access, you may consider packets with some essential learning materials.
The National Digital Inclusion Alliance (NDIA) has curated a list of special offers for access to broadband from Internet Service Providers across the United States. Consider starting a tablet or laptop lending program if you have the devices.
The Minnesota Department of Education, Adult Basic Education, has published guidance to help programs move instruction online. Updates can be found on the MNABE support website. If you are not from Minnesota, do check with your state professional development leaders to see what’s available.
There are a multitude of technical support and professional learning resources for teachers. ABE teachers in Minnesota should follow the COVID-19 resource site put together by ATLAS. The linked newsletters featuring teacher stories, ABE Voices Across the Distance, are very useful. The EdTech Center @ World Education also has a site, Tips for Distance Learning, which links to offers from curriculum developers and directories of free open education resources. The site also shares briefs describing essential distance education components, which are based on past IDEAL Consortium and EdTech Center research.
Learners need proactive support. Consider recording a webinar or creating a frequently asked questions (FAQ) page that maps out key steps for accessing the technology resources used in instruction. Make some screencasts with audio instructions to help learners navigate the online resources you hope they will use remotely. You can use free screencasting tools like Loom, Screencast-o-Matic, or Screencastify. Send links to the videos using communication channels or technologies you know are most easily accessed by learners. Many tutorials have already been developed, several in multiple languages, so check before spending time creating your own, and draw inspiration from existing community resources. For example, St. Paul ABE has posted this site with multilingual how-to videos. New York TESOL has created and posted videos about how to use Google Classroom in this YouTube playlist.
After providing some initial proactive support, you need some way for students to get help with educational technology if they cannot meet with a teacher. Consider a dedicated phone line or chat system (you could use WhatsApp). You might have office hours using the free version of Zoom or BigBlueButton. Have teachers practice with each other and practice using these with as many learners as you can while they are still with you in person.
Learning at home is going to be very difficult for many learners. They might be trying to work from home, raise children, or educate their children while trying to continue their studies with you. Even if they have technology skills, they may not have independent learning experience, so they may not know where to begin when it comes to making choices about scheduling time, choosing resources, or reaching out to you with questions. While this is a unique opportunity to push learners toward more digital literacy, be wary of the cognitive load required.
Introduce one technology at a time, preferably starting with one that learners already know how to use. Texting and the app WhatsApp are solid choices. (The image to the right is adapted from WhatsApp promotional materials.)
In 2019, 68.1 million United States mobile phone users accessed WhatsApp to communicate (Clement, 2020). Anecdotal reports and informal surveys done by teachers who have posted to the LINCS Integrating Technology Community show that many ABE learners are among this group. WhatsApp is great because students can communicate with a familiar technology through a familiar action—texting. Pew Research data from as far back as 2011 show that 76% of Americans texted (Smith, 2019).
WhatsApp (or Remind as an alternative) is an excellent means by which to communicate essential information. Because you can send video, audio recordings, images, links, and text notes, you can deliver instructional resources to learners easily. For a comprehensive guide to using WhatsApp for instruction, check out ABE expert and LINCS facilitator David Rosen’s continuously updated resource. As posts to the forum show more WhatsApp strategies, Rosen adds them to the document.
Though options are now falling into place for remote standardized testing (using, e.g., CASAS, TABE, and Best Plus), you may not yet be doing official pre- and posttesting to record level gains. You will still need to assess your learners to better understand what learning resources you might share with them. There are several strategies I’ve noticed in use by programs across the country during the pandemic.
The easiest way to pretest is to make use of placement assessments that may be integrated into an online curriculum your program may have access to. The benefit to doing this is you’ll have a range of assessments across the content areas you are teaching, and the learner will likely get a learning plan created for them within the product.
If your program does not have access to a licensed curriculum, you can still do an initial assessment for literacy level. Read Theory is a free app that gauges a learner’s reading level and then helps students improve their reading comprehension skills by moving through increasingly complicated passages.
For students who cannot pretest using a technology, you might consider an oral assessment. A simple phone call with a student can help you understand their speaking and listening proficiency. For more formal assessment, consider using the verbal skills proficiency assessments from CASAS or Best Plus if your program uses either one of them.
Because learners are likely struggling to balance supporting their children, working, and managing the stress of the pandemic, they may not be in the best mindset for making academic progress. This does not mean they should be dropped! Many of these learners may not have other sources of information and support at this time, so the goal for those who are not making progress is to stay connected. You might sustain the connection by providing vital information about how to stay safe or access support resources. The connections you can sustain now will make it more likely that these students will return when your program doors reopen. A very useful site for information about the pandemic is Switchboard, a resource hub for refugee service providers developed with the support of the Office of Refugee Resettlement (ORR). The site features multilingual videos, posters, and informational PDFs explaining COVID-19 and how to stay safe during the pandemic.
The distance education options in place now provide a glimpse of what adult ESOL classes in the United States might look like after the pandemic—a new model of more personalized blended learning. Even when program doors open, it may be with a requirement for social distancing. With fewer learners in the classrooms, more will need to be taught online. Using a blended approach, a teacher might support a classroom of students, but instead of all being in class at the same time, small groups of students would take turns being in the classroom. The teacher could work with each small group in person, in a classroom setting, on activities that expand on or prepare students for online learning that happens in between class meetings.
Some programs new to distance education during the pandemic have begun to circle back and revisit the initial strategies and processes they put into place; they’ve moved beyond the triage way of working and are looking to strengthen distance education with sustainable instructional practices and administrative processes. This work is being done with the view that we may not return to the old normal for quite some time, if at all. There is much in the works now that will support the forward momentum of distance education—more flexible National Reporting System policies, including one that allows for remote testing; a plethora of professional development opportunities; increased access to devices and broadband for many students across the country; more digital instruction resources; and, most importantly, students who have now had a glimpse of the flexibility and personalization afforded through distance education. We’re not likely to look back on this time fondly, but hopefully we’ll look back and see a time of incredible growth and innovation in the field of adult ESOL and literacy instruction.
Miranda Schornack, Michelle Benegas, & Amy O. Stolpestad
This article examines an assignment common in ESL methods courses—the English learner case study (or learner profile)—for dispositional development and explores how teacher educators can be more explicit and thorough in cultivating educator dispositions for working with English learners.
Key words: Teacher education, English learner case study, dispositions
The struggle to gain footing on the notion that “every student is my student”—that all teacher candidates1 perceive learning about working with English learners (ELs) as central to their work as teachers—is ongoing. The need for such work is critical as the current U.S. sociopolitical context is fraught with examples of problematic dispositions toward immigrant and language minoritized communities. In this article, we will share what we have learned from our collective 30 years of experience across five institutions of higher education (IHEs) working to foster the dispositions needed to work effectively and respectfully with ELs and advocate for further work in this area.
We analyze the presence of dispositions in the EL case study assignment, what we term one of the “high impact practices” (HIPs) in our English as a Second Language (ESL) methods courses for elementary and secondary teacher candidates. While dispositional work was often inherent in HIPs like the case study assignment, it was not given the full attention we believe dispositions deserve. To illustrate this, we use a local dispositions framework (MnEDS™ Research Group, 2017-2018) to examine the ways in which the case study assignment provides opportunities to develop dispositions. We selected the MnEDS™ framework because of our familiarity with it as a local resource, its powerful three-pronged conceptual foundation, and its unique rubric structure (these are articulated in the section below). We then call for IHEs to be more explicit and thorough in the cultivation of candidate dispositions for working with ELs. It is important to note that we have used the EL case study assignment in methods courses for teacher candidates pursuing language-centered credentials (e.g., ESL, world language) and those pursuing non-language-centered licenses (e.g., math, elementary). We believe the ideas we present have implications for all types of credentialing programs.
Dispositions is one of the three major constructs in educator development (Bransford et al., 2005). Unlike the two other constructs, knowledge and skills, respectively, dispositions “has failed to garner the same type of gravitas in the field” (Hill-Jackson & Lewis, 2010, p. 61). In this landscape, IHEs have come to define dispositions locally (Damon, 2007; Rose, 2013).
One such example is the University of Minnesota-Twin Cities, which, through the work of a research group composed of doctoral students, clinical and instructional staff and faculty, and educational researchers, created the Minnesota Educator Dispositions System™ (MnEDS™). The MnEDS™ Research Group (2017-2018, p. 1) defined dispositions as:
The commitments you make as a classroom teacher are evident in the pedagogical choices you make, the curriculum you write, your interactions with students, teachers, colleagues, families, and community members, and in the ways you carry yourself as an educator. We call these dispositions for teaching.
The three conceptual underpinnings of MnEDS™ are 1) dispositions are formative, and they can be coached and cultivated; 2) knowledge of a person’s dispositions is distributed across contexts and people, therefore dispositions development can only be done in dialogue with others; and 3) dispositions must be equity-oriented (MnEDS™ Research Group, 2017-2018).
From that conceptual framework, MnEDS™ identified eight disposition strands: assets, role of self, collaboration and communication, critical care, intentional professional choices, navigation: flexibility and adaptability, imagination and innovation, and advocacy (see Figure 1: MnEDS 8 Dispositional Strands, MnEDS™ Research Group, 2017-2018).
The MnEDS™ framework offers a rubric structure that is unique in two key ways. First, the rubrics are descriptive. Unlike numeric or progressive rubrics, the MnEDS™ rubrics—see Figure 2 below—name four different ways of expressing dispositions that are a part of an individual’s ongoing dispositions development.
In other words, the MnEDS™ framework expects a person to flow across the four descriptive categories across time and space, as the teaching and learning context shifts.
Second, the three descriptive columns on the right side of the vertical bold line represent three distinct ways of developing dispositions. Awareness signals the knowledge-base a person has regarding a particular disposition strand. Commitment reflects a person’s belief in the value of that disposition strand for teaching and learning. Enactment is when a person engages in a practice or behavior that takes up the disposition in a clear way. The MnEDS™ framework proposes that all three ways of developing dispositions are important and intertwined with one another, rather than developed linearly or in a defined progression.
In this paper we introduce HIPs to refer to course elements (e.g., activities, assignments) that resulted in palpable differences (Kubanyiova, 2019) in teacher candidates. HIPs are the activities and assignments that candidates reported as being particularly impactful and that we instructors observed as moments that shifted candidates’ perspectives. It is important to highlight that HIPs are less about candidates’ demonstration of technical skill or knowledge of content or theories of child/human development and more about candidates’ enactment of dispositions for working with language minoritized students. In other words, HIPs shift the focus from “What do I need to do to teach ELs effectively?” to “How do I need to be to teach ELs ethically?” One such HIP is the EL case study assignment.
We chose this particular HIP for analysis for two reasons. First, in our experiences as methods course instructors, the case study most robustly attends to candidate dispositions. Second, we have found that the case study is a common assignment across IHEs preparing candidates to work with language minoritized students, families, and communities.
The case study assignment requires candidates to work closely with one EL for an extended period of time. Theoretically, the close and meaningful interactions between a candidate and EL can foster not only the development of knowledge and skills but also dispositions. The following assignment analysis illustrates what we have learned about the opportunity to focus on candidate dispositions in the EL case study assignment. We provide contextual details intentionally, either to illuminate our analysis or to provide key clarifying information that would be useful to fellow instructors of ESL methods courses.
In our analysis, we examine five common components of the case study assignment: acknowledging funds of knowledge, reflecting on shifts in perspective, building relationships, analyzing instruction, and recognizing the teaching and learning context. For each of the five components, we illustrate where and how MnEDS™ dispositions were addressed. When appropriate, we offer a loving critique of the current version of the MnEDS™ framework. Our goal in providing critique is to demonstrate the need for teacher educators to be critical consumers of resources and to stimulate dialogue and ongoing research, implementation, and development of frameworks that cultivate educator dispositions for working with language minoritized students. Following the analysis, we discuss how we could improve our focus on candidate dispositions by being more explicit and thorough about the dispositional aspects of teaching in our assignments. It is worth noting that each author taught the EL case study assignment in a different context: the specific assignment descriptions varied, as did the licensure areas of the teacher candidates and whether candidates had a clinical placement. Therefore, we discuss how dispositions were typically part of case study assignments rather than highlighting any particular assignment description.
One component of a case study assignment is for candidates to learn more about the funds of knowledge (Moll et al., 1992) of ELs, their families, and communities. This aligns with MnEDS™ Strand 1: Assets and MnEDS™ Strand 3: Collaboration and Communication. Candidates were expected to have one-on-one interactions with their focal student and sometimes conduct an interview. Although one-on-one interactions are normal aspects of pre-service teacher work in clinical placements, an interview with one student—particularly a student who is a member of a social group that has been historically marginalized in the education system—is not. Therefore, the act of conducting an interview with a language-minoritized student may actually serve to further “other” them from the perspective of the pre-service teacher (Gitlin et al., 2003). It is also clear that the case study falls short in addressing engagement with families and collaboration with colleagues, which is integral to Strand 3. Common explanations for this are the limited time that candidates are in a clinical placement as well as their positionality as pre-service teachers. Further, where the candidate will fall on the developmental rubrics depends, in part, on whether the candidate is reporting on internal shifts in their perspective or demonstrating those shifts in new praxis. For instance, the second indicator in the awareness column for Strand 1 is “Desires to learn about students’ backgrounds and communities.” Candidates who write about their desire to learn about students’ backgrounds, without actually demonstrating how they’ve taken up their desire with real students, would be situated there. Alternatively, candidates could be situated in the enactment column if they “[use] critical inquiries about culture to build relationships and inform teaching and learning,” the third indicator there. 
The question for teacher educators becomes how candidates can demonstrate their dispositions, particularly when teacher educators have not directly observed what candidates report in written assignments. One challenge to using MnEDS™ is that the indicators in each column are not always aligned to the indicators in the same position in other columns. For example, MnEDS™ Strand 5: Intentional Professional Choices contains four indicators in the critical incidents and enactment columns but only three indicators in the awareness and commitment columns. Positioning the same number of indicators, in the same order in each column, could facilitate the use of the MnEDS™ descriptive rubrics.
Another common aspect of the case study assignment is for candidates to reflect on new learning or shifts in perspective that occurred while working closely with one EL. The new learning has often been related to perspectives on multilingualism, how mainstream teachers can responsibly work with ELs, newly developed empathy for learning a second language, and/or how prior opinions or biases have been challenged. This aspect of the case study aligns with MnEDS™ Strand 2: Role of Self. The first indicator across each of the four columns in the descriptive rubric is centered on personal biases. As an example, the language in the commitment column is “Critically reflects on the ways in which their personal biases, characteristics, and identities impact teaching and learning.” A key consideration when using MnEDS™ rubrics in the development of teacher candidates for working with language minoritized students, families, and communities is that the language of the rubrics might be too general to point to specific biases regarding language. Language biases can be challenging for candidates who are monolingual in a society driven by monolingual, English-only stances (de Jong & Gao, 2019), monoglossic language ideologies (Flores & Rosa, 2015), and English imperialism (Motha, 2014).
In the case study, candidates are expected to build a meaningful relationship with an EL, and this nods to MnEDS™ Strand 4: Critical Care. While such a relationship may have developed during the candidate’s clinical placement, it was not an explicit feature of the case study assignment, nor was it assessed. A number of students reported that they had a heightened understanding of their focal student’s lived experiences, as well as increased empathy for challenges that they faced. However, this outcome was not consistent across candidates and, similar to Strand 3: Collaboration and Communication, there was no carry-through to application. Using the language of the rubric, candidates did not “build students’ self-efficacy and achievement.” Strand 4 requires that candidates position themselves as a source of support, working in solidarity with their students, and this was not directly attended to or assessed in the case study assignment. One barrier to building a meaningful relationship with a focal learner is the limited time candidates spend with them. However, given the newly developed MnEDS™ framework, the assignment could be redesigned to better reflect a stance of critical care in working with ELs.
MnEDS™ Strand 5: Intentional Professional Choices asks the teacher candidate to participate in “ongoing professional learning and decision making that is ethical, based on multiple forms of evidence and feedback, and extends opportunities for professional growth and leadership” (MnEDS™ Research Group, 2017-2018). One goal of the case study is for candidates to observe instructional choices teachers made in order to attend to the teaching and learning needs of the focal student. The case study allowed for a rare but important look at how professional choices impacted a student’s development. However, the case study assignment was limited in that the candidate was not the one making the instructional pivot in order to respond to the student, but rather watching as another teacher did or did not do so. Viewed through the lens of Strand 5, the case study provided an opportunity to evaluate other teachers’ practices rather than the candidate’s own, so several dispositional qualities laid out in the rubric are entirely missed in the assignment. Further complicating this is the fact that some of the ways in which candidates demonstrate their strengths in Strand 5 are difficult to capture in a university-based course assignment, such as engagement in teacher leadership activities.
MnEDS™ Strand 6: Flexibility and Adaptability includes a candidate’s ability to understand the learning context and make changes as necessary in order to best meet the needs of students and their families. Under the commitment column of the rubric, the case study clearly provides candidates with an opportunity to “passively learn from students, colleagues, and like-minded people in communities as a means of finding a navigational compass,” but the assignment does not allow for the demonstration of enactment of this disposition because the candidate remains passive for the better part of the experience. True enactment would require instructional autonomy on the part of the candidate, which is constrained by the time limitations in the clinical placement context. For example, some candidates were able to complete the case study while student teaching, allowing for opportunities to work one-on-one with instructional materials, while others completed the assignment in schools where they were observers only. These systemic conditions influenced the degree to which the case study had the potential to address many of the criteria laid out in Strand 6.
While some areas for improvement of the MnEDS™ framework were suggested (e.g., consistent indicator language across descriptive rubric categories), it is evident that there is a critical need for such a tool if we seek to foster dispositional development in educators toward equitably serving ELs. Perhaps the most significant finding in this analysis is that a capstone assignment, such as the EL case study that is common across teacher education programs, lacked explicit attention to, application of, and assessment of dispositional development. Relying on student epiphany falls short in intentionality and assurance that needed dispositions are attended to. Further, our analysis revealed that even a capstone project like the case study assignment can be completed “fully” and still be largely theoretical—not bridging to a candidate’s praxis or enactment of dispositions. Using a framework such as MnEDS™ can bolster assignments in teacher education so that dispositions are addressed and assessed in intentional, applicable, and assessable ways.
Our analysis of a single HIP illuminated the ways in which we, as teacher educators, partially addressed equity-oriented dispositions. Excluding dispositions, or failing to attend to them in sufficient detail, is like removing one leg from a three-legged stool. Without dispositions for working with ELs, teacher knowledge about them and skills to serve them are incomplete. Moving forward, we are committed to being more explicit and thorough about the dispositional expectations of coursework. Ongoing and rigorous examination of our practices will allow the field of teacher education to evolve toward a more robust understanding of how we can cultivate and assess dispositions in teacher candidates. Analyses such as this one can lead us to such a place.
Bransford, J., Darling-Hammond, L., & LePage, P. (2005). Introduction. In L. Darling-Hammond & J. Bransford (Eds.), Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 1-39). Jossey-Bass.
Damon, W. (2007). Dispositions and teacher assessment: The need for a more rigorous definition. Journal of Teacher Education, 58(5), 365-369. https://doi.org/10.1177/0022487107308732
de Jong, E., & Gao, J. (2019). Taking a multilingual stance: A continuum of practices. MinneTESOL Journal, 35(1). http://minnetesoljournal.org/current-issue/mtj-2019-1/taking-a-multilingual-stance-a-continuum-of-practices/
Flores, N., & Rosa, J. (2015). Undoing appropriateness: Raciolinguistic ideologies and language diversity in education. Harvard Educational Review, 85(2), 149-171. https://doi.org/10.17763/0017-8055.85.2.149
Gitlin, A., Buendía, E., Crosland, K., & Doumbia, F. (2003). The production of margin and center: Welcoming-unwelcoming of immigrant students. American Educational Research Journal, 40(1), 91-122. https://doi.org/10.3102/00028312040001091
Hill-Jackson, V., & Lewis, C. W. (2010). Dispositions matter: Advancing habits of the mind for social justice. In V. Hill-Jackson, & C. W. Lewis (Eds.), Transforming teacher education: What went wrong with teacher training, and how we can fix it (pp. 61-92). Stylus Publishing, Inc.
Kubanyiova, M. (2019, May). The promise of “disturbing encounter” as meaningful language teacher education. Keynote address presented at the 11th International Language Teacher Education Conference. Minneapolis, MN.
MnEDS™ Research Group. (2017-2018). Minnesota educator dispositions system (MnEDS™): A framework for equity-oriented teaching. Retrieved May 15, 2020, from https://sites.google.com/a/umn.edu/umn-dispositions-assessment-framework/home
Moll, L., Amanti, C., Neff, D., & Gonzalez, N. (1992). Funds of knowledge for teaching: Using a qualitative approach to connect homes and classrooms. Theory into Practice, 31(2), 132-141. https://doi.org/10.1080/00405849209543534
Motha, S. (2014). Race, empire, and English language teaching. Teachers College Press.
Rose, S. (2013). How do teacher preparation programs promote desired dispositions in candidates? SAGE Open, 3(1), 1-8. http://doi.org/10.1177/2158244013480150
The gap between a teacher’s desire for classroom engagement and a student’s motivation can sometimes lead to a frustrating struggle for control in the classroom. Understanding what is happening beneath the surface in such situations can help teachers to select techniques that build relationships and increase the likelihood of engagement.
Key words: motivation, autonomy, relationships, strengths-based, ESL, participation, integrative
Teaching can be incredibly rewarding, but it doesn’t always feel that way. A 2019 PDK poll shows that 50% of teachers are thinking about quitting, with stress cited as a major factor, and Gallup found that 48% are actively looking for a way out of the field (Gewertz, 2019; McFeely, 2018). And it’s more than just daydreaming—9.5% of teachers actually do quit during their first year, and a further 40-50% leave within five years (Riggs, 2013). Low salaries and high workloads contribute to the problem, but there’s more to it than that. It also comes down to what happens in the classroom and how that affects teacher well-being (Shen et al., 2015). Working with students means working with people, and working with people isn’t easy.
When those people are English as a Second Language (ESL) students, the work can be especially complex. In addition to navigating the standard life stage hurdles common to all children and young adults, ESL students face acculturative stresses and questions of identity that can significantly impact their ability to learn (Berry, 1997; Ushioda & Dörnyei, 2009). Much of this happens below the surface, with struggling students often presenting as simply unmotivated or “difficult.” Although it can often feel challenging to engage with such students, developing a perspective that takes their struggles into account and using techniques that show understanding can help teachers to foster a positive classroom atmosphere that draws students into communication. In advocating for such an approach, this article draws on existing research in the areas of motivation and strengths-based systems theory, my own journey as an ESL instructor, findings on the importance of relationships in the ESL classroom, and counseling techniques that can help to build those relationships.
When it comes to ESL students who appear unmotivated or “difficult,” it helps to understand how motivation works. Research suggests that intrinsic motivation (internal personal interest) is the optimal learning fuel, resulting in higher rates of learning, achievement, attendance, and graduation (Froiland & Worrell, 2016). To build intrinsic motivation, however, students need to feel autonomous (Rudy et al., 2007; Ryan & Deci, 2011). The idea is that we are at our most motivated when we feel like we have the power (autonomy) to take an active hand in our own happiness. The downside of all this is that none of us are completely autonomous; we all depend to some degree on others in pursuing our goals. If that dependency overshadows our sense of autonomy, intrinsic motivation dissipates and performance declines (Ryan & Deci, 2011). When this happens, our reactivity to stress increases and the cognition necessary for memory and learning suffers (Hill et al., 2018; Yaribeygi et al., 2017).
ESL students living and studying in the L2 culture are an at-risk group when it comes to such losses. For starters, moving into a new culture with limited language skills entails a complex process of mental and behavioral adjustment that can undermine a student’s sense of autonomy (Berry, 1997; Rudy et al., 2007). Students who are unsure of how to get their needs met in the L2 environment may be dealing with lower levels of intrinsic motivation, greater stress reactivity, and a decreased ability to learn. The problem can be unknowingly compounded by teachers who attempt to supply motivation extrinsically (from outside forces) via coercive rewards and punishments. External inducements are only effective if the “norms, rules, and values” behind them have been internalized by the student; if they have not been internalized, the student may simply see the coercive tactics as yet another threat to their autonomy (Ryan & Deci, 2011). If all of this goes unaddressed, the student and the teacher are in danger of entering a cyclical process of demotivation. Students who present as unmotivated or “difficult” may cause teachers to feel stressed, and a teacher who displays high levels of stress further harms student motivation (Harmsen et al., 2018; Shen et al., 2015). The result can be a push-and-pull between student motivation and teacher desire for engagement that fossilizes into a pattern of frustration and obscures the benefits of the classroom system.
If looking at how motivation works can serve to highlight some of the dangers facing ESL students, strengths-based systems theory can illustrate some of the positives. In social work, systems theory holds that systems and the individuals who participate in them are mutually dependent. At its best, this mutual dependence helps us to meet our needs and to engage in positive growth (Hutchison, 2015; Rothery, 2016). When the benefits of mutual dependence are obscured, however, problems arise and needs go unmet.
To break a cyclical process of demotivation and restore healthy mutual dependence, it helps to address how we view the classroom. A control-based perspective that relies overmuch on extrinsic motivation via coercive punishments and rewards assumes student belief in its legitimacy, undervalues the benefits of intrinsic motivation, and puts too much responsibility for learning and growth on external agents like teachers (Ryan & Deci, 2011). In my own experience, a control-based perspective also tends to place undue stress on instructors because it sees engaged classrooms as depending entirely on flawless lesson design, the application of power, and a superhuman ability to motivate.
A strengths-based systems theory perspective takes a different view. It starts with the assumption that each student in the classroom has the strength to pursue their natural inclination towards growth and learning (Shulman, 2016; Simmons et al., 2016). It does not assume that students are always acting on this natural inclination. It simply assumes that the strength to do so exists within each student and that it is more likely to manifest given the right conditions. From this perspective, the job of the teacher is to facilitate access to concepts and “to recognize that resistant behavior has meaning” (Shulman, 2016, p. 124).
When I first began teaching ESL in 2005, I didn’t know anything about motivation or strengths-based systems theory. In fact, the first teacher training seminar I attended was at an institute in which the trainer gave the following advice—never admit mistakes. Airtight lesson plans would impress students and eliminate error. If I did make a mistake, admitting it would only undermine my authority as an instructor. It was horrible advice that presented teaching as being entirely about knowledge transmission and applying this advice put up a wall between myself and my students. When confronted with students who weren’t engaged or motivated, my primary tools were extrinsically-oriented power moves. I could threaten them with poor grades, I could punish them by sending them out of the room to speak to someone with more power, I could shame them in front of others, or I could blame them for not applying themselves. Mostly, I went home and wondered why I’d gotten into teaching.
Eventually, out of sheer desperation, I started telling stories. When my students were running out of energy, and when I’d either exhausted my power moves or gotten sick of using them, I told stories about growing up in rural New England. I told stories about my friends and me trying, and often failing, to evade the various dogs guarding the local farms so that we could steal fistfuls of rhubarb and raspberries. I told stories of how I would spend hours convincing my younger brother to join me in stunts that would leave both of us injured, and a story about a tree falling on a friend at his own birthday party. And that’s how it went—my students and I would exhaust whatever motivation they had, we’d exhaust my ability to coerce them into participating, I’d dig into my library of stories, and then we’d repeat the cycle. Each time I told stories, I’d feel like we were connecting, but then I’d worry about wasting classroom time and I’d try to be a “real” teacher again. I hadn’t found my groove in the classroom, but I could see that the storytelling was having a positive impact on my relationship with my students.
I was beginning to understand that extrinsically-based knowledge transmission isn’t the only way to frame teaching and that relationship matters in the classroom. To borrow a quote from Dr. Thomas Gordon’s (2003) Teacher Effectiveness Training, I was learning that “school isn’t cramming a lot of stuff into the heads of the students. It’s helping them get ready to grab ideas and concepts when they can and how they can” (p. 43). U.S.-based second language learners of all levels may come into the classroom freighted not only with everyday motivational factors like quantity of sleep and level of hunger, but also with different culturally-based educational schema and complex psychological acculturation processes that affect their ability to build intrinsic motivation (Berry, 1997; McCargar, 1993; Rudy et al., 2007; Ryan & Deci, 2011). Since most students have neither the training nor the vocabulary to identify all of the motivational factors affecting them, it often falls to the teacher to begin building a communicative learning atmosphere that “helps them get ready” to learn by giving them a safe space to express themselves and build a sense of control. In this context, positive teacher-student working relationships can have a significant bearing on student motivation (Daniels & Piayoff, 2015).
The teacher-student working relationship also matters because so much of what happens in the ESL classroom requires collaborative communication. Integrativeness, a key factor in second language motivation, has been defined as “a genuine interest in learning the second language in order to come closer to the other language community” (Gardner, 2001, as cited in Ortega, 2009, p. 170). Without that motivation to come closer to the language community, language learning decreases or halts entirely (Benson, 2001; Dörnyei, 1998; Guilloteaux & Dörnyei, 2008; Lam, 2009; Rees-Miller, 1993; Spratt et al., 2002; Zimmerman et al., 1992). If we think of the ESL classroom as being a system that establishes its own culture through discourse and which requires a certain level of acculturation, what happens there can have a significant effect on whether a student wants to “come closer” and communicate (Baek & Choi, 2002; Berry, 1997; Poole, 2005; Schmitz, 1997).
A teacher’s focus on control and the use of extrinsically-focused power moves like threatening, punishing, shaming, and blaming can turn the classroom into an environment that actually decreases intrinsic motivation and drives the student away from the language learning community (Ortega, 2009). It’s no coincidence that control-focused behaviors like threatening, punishing, shaming, and blaming are also labeled as elements of abusive relationships by the Duluth Domestic Abuse Intervention Program (DAIP, n.d.). The injudicious use of power, in short, is corrosive and unlikely to foster integrativeness.
When it comes to building an integrative classroom environment, I’ve found that using counseling techniques from the field of social work is an efficient way to build rapport. Skills like Tuning In, I-Messages, and Active Listening allow me to communicate the understanding and acceptance necessary for strengthening classroom relationships without adding an undue burden to classroom work.
The skill of Tuning In is a key element in this process because it asks teachers to “try to experience the client’s feelings” and to try “to get in touch with their own feelings” (Shulman, 2016, pp. 91, 93). Social categorization is the often unconscious process of categorizing people, and it’s important to acknowledge because how we categorize people determines how we treat them (Liberman et al., 2017). Tuning In is a way of slowing down this process and making it more conscious so that we categorize compassionately and with a sense of purpose. It does this through three stages: tuning in to general categories occupied by the student, tuning in to specific knowledge about the student, and tuning in to what’s happening in class (Shulman, 2016).
Tuning In to general categories involves building compassion by thinking about what it’s like to be an “ESL student” or an “adolescent.” When I was living abroad, taking language classes usually wasn’t the highlight of my day. And as a teenager, there were many days when I was physically present in the classroom but mentally elsewhere. Keeping this in mind helps me to be less judgmental when I have a student who isn’t participating—I can think about the general forces affecting them and I can then approach them in a more understanding manner.
After Tuning In to my students generally, I can then Tune In to my unmotivated or “difficult” students specifically. What do I know about them individually? Are there relationship issues? A substance abuse problem? When a student is distracted due to such issues, a punitive approach probably isn’t going to change their behavior because it doesn’t affect the root cause of the problem. As a university-level instructor, I’ve had students who are unable to concentrate in class because they’re heartbroken, hungover, or distracted by other issues. Coercing them into participating has never worked for me—their physical and emotional states don’t change because I want them to. Rather than fight a battle I’m not going to win, I prefer to use the moment to build our communicative relationship. If I can tell them it’s okay and that we’ll try again in the next class, I’m able to communicate care and understanding. This approach has worked for me—I’ve had conversations with students that contextualized their behaviors, brought us to greater understanding, and enhanced our ability to work together in the classroom. To me, it’s what makes teaching worth doing. While such a stance might seem overly permissive, research has shown it to be effective—data from the Children of Immigrants Longitudinal Study, which looked at 5,262 children with immigrant parents, have shown that supportive work in the classroom mitigates problems stemming from acculturation issues and negative developmental patterns (Haller et al., 2011).
Although it’s good to start with a student-centered approach, both generally and specifically, it’s also a good idea to Tune In to the class as an environment in order to find opportunities to make the atmosphere integrative. When I do this, it mostly involves me thinking about how the room feels or what activities do or don’t work. In regard to the way my classrooms feel, I find that speaking up in an otherwise silent room can be an extremely uncomfortable experience. To mitigate this, I play music at a moderate volume in all of my classes both because it takes the edge off what can otherwise be an oppressively quiet atmosphere and because, sooner or later, my students get sick of hearing my lo-fi hip hop playlists looping endlessly in the background. Eventually they start making song requests and asking to DJ from their own laptops, and then we have a communication point. In a similar fashion, I find that some classes can be reluctant to participate in mock negotiations or discussions. Instead of railing on about the need to participate, I create assignments that are intentionally onerous and, when my students object, I pretend that I’m on the fence and draw them into a discussion or negotiation about making the assignment easier. It doesn’t take them long to catch on to the fact that I’m always swayed in the end, and negotiation and discussion around work becomes a natural part of the class. Tuning In to the classroom environment and the structure of each class is about trying to find adjustments that contribute to a more comfortable and integrative learning environment.
Tuning In to the self is where teachers get in touch with their own feelings and think about how managing those feelings can either enhance or detract from the learning environment (Shulman, 2016). If I haven’t slept well or if something is bothering me, I’m far more likely to see problems than if I’m rested and happy. Teachers are human beings, as subject to exhaustion and frustration as anyone else. Trying to hide our feelings is impractical, both because we rarely hide our emotions as well as we think we do and because the research suggests that sharing helps to create a positive environment. It’s been found to foster trust, co-constructive interaction, and enhanced communicative competence while reducing acculturative stress (Cait, 2016; Hou et al., 2018; Shulman, 2016). Much like playing music or creating opportunities for negotiation, sharing feelings is a small tweak that can act as a communicative point.
I-Messages are another healthy way to create an integrative atmosphere because they can enhance clarity and minimize defensiveness by removing negative evaluation (Gordon, 2003). They consist, first and foremost, of a factual report on a specific student behavior devoid of negative editorializing. Saying “When I see you using your phone in class . . .” is better than “When you’re inconsiderate . . .” (p. 143). The second part of an I-Message addresses the “tangible or concrete [emphasis in the original] effect on the teacher” (p. 144). It could be something like “When I see you using your phone in class, I start thinking about how I’m going to have to repeat everything for you later.” The third part of an I-Message is where the teacher expresses the feelings associated with the tangible effect—“When I see you using your phone in class, I start thinking about how I’m going to have to repeat everything for you later and I get frustrated” (p. 145). I-Messages are honest, separate the problem from the person, and avoid the resentment that teachers can feel when students fail to pick up on their indirect messages.
When students do communicate, it’s important to listen carefully and authentically. Active listening skills involve paraphrasing what a student says to verify meaning, asking follow-up questions, and using non-verbal cues that show attention (Jones et al., 2019). The overall goal is to use the opportunity to learn more about the student’s world and develop connection. In a study involving 115 conversational pairs, researchers found that it’s worth the effort—active listening made participants feel more understood and enhanced their willingness to communicate (Weger et al., 2014).
Active listening, when done correctly, is not easy. The urge to advise or judge can be hard to suppress. When I first started teaching, my students would talk about their daily plans and I would advise them to study instead. My students eventually stopped telling me their plans and, even worse, they started telling me what I wanted to hear. They were learning not to take communicative risks and it was negatively affecting communication in the classroom. The problem was that my listening wasn’t student-centered, and I was challenging the legitimacy and quality of what I was hearing (Jones et al., 2019). Tuning In to the self and sharing feelings is important, but it’s also important to be mindful of the impact on the integrative atmosphere. Now, when my students talk about their plans, I listen actively both to show that I care and to show that it’s safe to communicate. If I want to comment, I use I-Messages and I respect the student’s right to have a different opinion. I save the judgment for grading, where it belongs, and I save my advice for when a student asks for it.
I’ve focused on motivation, strengths-based systems theory, research on relationships, and counseling techniques in this article because these are the things that have helped me the most as a teacher. I came very close to being one of the 40-50% of teachers who walk away within the first five years because I didn’t feel good about my work in the classroom. I didn’t like that my students and I always seemed to be at odds, and I didn’t want to spend my days coercing people into participating. Coming to the realization that teaching is less about knowledge transmission and more about cultivating the optimal conditions for people to pursue their natural inclination toward growth and learning changed how I felt about the classroom.
Before I bring this to a close, I want to note that the framework I’ve presented here does not preclude the judicious use of teacher power when needed. There is nothing wrong with failing a student who hasn’t done the work or asking a disruptive student to leave the room. Like any other teacher, I have done both of these things. The trick is to do it in the context of a caring atmosphere that objects to the behavior in question and not to the student as a person. Building that atmosphere is not an easy process—it takes mindfulness, self-regulation, and patience. Teachers are only people, and our ability to manifest these qualities is better on some days than it is on others. The good thing about building a caring and understanding learning environment is that it’s forgiving for all involved—students and teachers.
Baek, S. & Choi, H. (2002). The relationship between students’ perceptions of classroom environment and their academic achievement in Korea. Asia Pacific Education Review, 3(1), 125-135. https://doi.org/10.1007/BF03024926
Benson, P. (2001). Autonomy in language learning. Longman.
Berry, J. (1997). Immigration, acculturation and adaptation. Applied Psychology: An International Review, 46(1), 5-34. https://doi.org/10.1111/j.1464-0597.1997.tb01087.x
Cait, C. (2016). Relational theory. In N. Coady & P. Lehmann (Eds.), Theoretical perspectives for direct social work practice (pp. 185-202). Springer Publishing Company.
DAIP. (n.d.). Understanding the power and control wheel [Video]. DAIP. https://www.theduluthmodel.org/wheels/
Daniels, E., & Pirayoff, R. (2015). Relationships matter: Fostering motivation through interactions. Voices from the Middle, 23(1), 19-23. https://library.ncte.org/journals/vm/issues/v23-1
Dörnyei, Z. (1998). Motivation in second and foreign language learning. Language Teaching, 31(3), 117-135. https://doi.org/10.1017/S026144480001315X
Froiland, J. M., & Worrell, F. C. (2016). Intrinsic motivation, learning goals, engagement, and achievement in a diverse high school. Psychology in the Schools, 53(3), 321-336. https://doi.org/10.1002/pits.21901
Gewertz, C. (2019, August 21). “I am a fool to do this job”: Half of teachers say they’ve considered quitting. Education Week. https://www.edweek.org/ew/articles/2019/08/05/half-of-teachers-considered-quitting.html
Gordon, T. (2003). Teacher effectiveness training (2nd ed.). Three Rivers Press.
Guilloteaux, M. J., & Dörnyei, Z. (2008). Motivating language learners: A classroom-oriented investigation of the effects of motivational strategies on student motivation. TESOL Quarterly, 42(1), 55-77. https://doi.org/10.1002/j.1545-7249.2008.tb00207.x
Haller, W., Portes, A., & Lynch, S. M. (2011). Dreams fulfilled, dreams shattered: Determinants of segmented assimilation in the second generation. Social Forces, 89(3), 733-762. https://doi.org/10.1353/sof.2011.0003
Harmsen, R., Helms-Lorenz, M., Maulana, R., & Veen, K. (2018). The relationship between beginning teachers’ stress causes, stress responses, teaching behaviour and attrition. Teachers and Teaching, 24(6), 626-643. https://doi.org/10.1080/13540602.2018.1465404
Hill, P. L., Sin, N. L., Turiano, N. A., Burrow, A. L., & Almeida, D. M. (2018). Sense of purpose moderates the associations between daily stressors and daily well-being. Annals of Behavioral Medicine, 52(8), 724-729. https://doi.org/10.1093/abm/kax039
Hou, Y., Kim, S. Y., & Benner, A. D. (2018). Parent-Adolescent discrepancies in reports of parenting and adolescent outcomes in Mexican immigrant families. Journal of Youth and Adolescence, 47(2), 430-444. https://doi.org/10.1007/s10964-017-0717-1
Hutchison, E. (2015). Dimensions of human behavior (5th ed.). Sage Publications.
Jones, S. M., Bodie, G. D., & Hughes, S. D. (2019). The Impact of mindfulness on empathy, active listening, and perceived provisions of emotional support. Communication Research, 46(6), 838-865. https://doi.org/10.1177/0093650215626983
Lam, W. (2009). Examining the effects of metacognitive strategy instruction on ESL group discussions: A synthesis of approaches. Language Teaching Research, 13(2), 129-150. https://doi.org/10.1177/1362168809103445
Liberman, Z., Woodward, A. L., & Kinzler, K. D. (2017). The origins of social categorization. Trends in Cognitive Sciences, 21(7), 556-568. https://doi.org/10.1016/j.tics.2017.04.004
McCargar, D. (1993). Teacher and student role expectations: Cross-cultural differences and implications. The Modern Language Journal, 77(2), 192-207. https://doi.org/10.1111/j.1540-4781.1993.tb01963.x
McFeely, S. (2018, March 27). Why your best teachers are leaving and 4 ways to keep them. Gallup. https://www.gallup.com/education/237275/why-best-teachers-leaving-ways-keep.aspx
Ortega, L. (2009). Understanding second language acquisition. Hodder Education.
Poole, D. (2005). Cross-cultural variation in classroom turn-taking practices. In P. Bruthiaux, D. Atkinson, W. Eggington, W. Grabe, & V. Ramanathan (Eds.), Directions in applied linguistics (pp. 201-222). Multilingual Matters.
Rees-Miller, J. (1993). A critical appraisal of learner training: Theoretical bases and teaching implications. TESOL Quarterly, 27(4), 679-689. https://doi.org/10.2307/3587401
Riggs, L. (2013, October 18). Why do teachers quit? The Atlantic. https://www.theatlantic.com/education/archive/2013/10/why-do-teachers-quit/280699/
Rothery, M. (2016). Critical ecological systems theory. In N. Coady, & P. Lehmann (Eds.), Theoretical perspectives for direct social work practice (pp. 81-107). Springer Publishing Company.
Rudy, D., Sheldon, K. M., Awong, T., & Tan, H. H. (2007). Autonomy, culture, and well-being: The benefits of inclusive autonomy. Journal of Research in Personality, 41(5), 983–1007. https://doi.org/10.1016/j.jrp.2006.11.004
Ryan, R., & Deci, E. (2011). A self-determination theory perspective on social, institutional, cultural, and economic supports for autonomy and their importance for well-being. In V. Chirkov, R. Ryan, & K. Sheldon (Eds.), Human autonomy in cross-cultural context: Cross-cultural advancements in positive psychology (pp. 45-64). Springer Publishing Company.
Schmitz, P. (1997). Individual differences in acculturative stress reactions: Determinants of homesickness and psychosocial maladjustment. In M. Tilburg, & A. Vingerhoets (Eds.), Psychological aspects of geographical moves (pp. 91-103). Tilburg University Press.
Shen, B., McCaughtry, N., Martin, J., Garn, A., Kulik, N., & Fahlman, M. (2015). The relationship between teacher burnout and student motivation. British Journal of Educational Psychology, 85(4), 519-532. https://doi.org/10.1111/bjep.12089
Shulman, L. (2016). The skills of helping individuals, families, groups, and communities (8th ed.). Cengage Learning.
Simmons, A., Shapiro, V., Accomazzo, S., & Manthey, T. (2016). Strengths-based social work: A social work metatheory to guide the profession. In N. Coady, & P. Lehmann (Eds.), Theoretical perspectives for direct social work practice (pp. 131-154). Springer Publishing Company.
Spratt, M., Humphreys, G., & Chan, V. (2002). Autonomy and motivation: Which comes first? Language Teaching Research, 6(3), 245-266. https://doi.org/10.1191/1362168802lr106oa
Ushioda, E., & Dörnyei, Z. (2009). Motivation, language identity, and the L2 self. Multilingual Matters.
Weger, H., Bell, G., Minei, E., & Robinson, M. (2014). The relative effectiveness of active listening in initial interactions. International Journal of Listening, 28(1), 13-31. https://doi.org/10.1080/10904018.2013.813234
Yaribeygi, H., Panahi, Y., Sahraei, H., Johnston, T. P., & Sahebkar, A. (2017). The impact of stress on body function: A review. EXCLI Journal, 16, 1057-1072. https://doi.org/10.17179/excli2017-480
Zimmerman, B., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal, 29(3), 663-676. https://doi.org/10.3102/00028312029003663
This article describes a procedure for training second-language writing raters to use scoring rubrics, and presents ideas for practical adaptation or research projects associated with the training procedure.
Imagine a novice language teacher doing the following:
These descriptors are examples from a commonly used rubric for assessing second language writing (Jacobs et al., 1981), in which the example descriptors above represent only 5% of all the descriptors that a rater is supposed to be familiar with while reading and rating compositions.
In programs in which graduate teaching assistants (TAs) need to be trained over a short period of time to rate second language writers’ essays and make course placement decisions, rater training must ensure that the novice raters quickly become familiar with the descriptors used in a scoring rubric. Regardless of the specific rubrics used, raters are typically expected to demonstrate knowledge of pre-determined descriptors (such as some knowledge of subject, adequate range, limited development of thesis, mostly relevant to topic but lacks detail) and perform the task of applying this knowledge consistently. Such expectations may not be met easily. Joe, Harmes, and Hickerson (2011) show, for example, that lack of transparency in rating scale descriptors can be a factor influencing raters’ performance. At the same time, rater-related factors can also add to the challenge in achieving reliable outcomes in rating (see Barkaoui, 2011 for a comprehensive review).
In this article, first, I will briefly review studies on raters and rater training. Then, I will describe a gap in published studies on rater training. Finally, I will introduce a rater training procedure currently implemented at a four-year university in the U.S. Midwest, for two purposes:
Among several rater-related factors that could influence raters’ interpretation and application of rating scales, raters’ experience (novice vs. experienced) seems to be the most frequently researched factor (Barkaoui, 2011; Greer, 2013; Hamp-Lyons, 1989; Harsch & Martin, 2012; Joe et al., 2011; Weigle, 1994). Research findings on this issue are mixed, contrary to what might be assumed (i.e. the more experienced, the more proficient). In fact, the mixed findings might be a function of the complexity in the way raters’ experience interacts with non-rater factors. In Barkaoui (2011), for example, with regard to severity in rating, novice raters and experienced raters behaved more similarly when using analytic scales than they did when using holistic scales. In other words, raters’ experience impacts ratings differently depending on the type of rating scales used.
The intriguing nature of raters’ experience as a factor was documented in great detail in Joe et al. (2011), which explored rater cognition based on data collected through verbal protocols. In this study, eight faculty experts (experienced raters) and eight undergraduate students (inexperienced raters) participated in rating oral speech performances. Both groups were trained to use an analytic scoring rubric, which included 39 features comprising ten competency dimensions relevant to the construct of the speech performances to be evaluated. The study found that inexperienced raters started out paying attention to rubric features more consistently than did experienced raters, who were found to pay attention to construct-irrelevant features (i.e. features not listed in the rubric) at a higher rate than inexperienced raters did. Over time, however, inexperienced raters attended to rubric features less and less while attending to construct-irrelevant features (such as the use of note cards or a memorable thesis statement, which were not included in the rubric) more and more.
Such findings are quite alarming in that not all changes exhibited by raters as they become more experienced seem to be in the expected direction – i.e. experience is generally expected to be a positive factor. Some researchers even suggest that raters should not be selected based on teaching experience as it is not a significant factor (Royal-Dawson & Baird, 2009). Although research findings about rater experience might be mixed, many researchers have emphasized the importance of rater training in enhancing the quality of raters’ performance (Greer, 2013; Lovorn & Rezaei, 2011; Weigle, 1994). The following section will review studies on rater training.
One aim of most rater training is to monitor rater behavior associated with rater-related factors such as experience, rating style, or rating preferences, and then to provide feedback accordingly to achieve the ultimate goal of increasing inter-rater reliability (i.e. different raters performing similarly to one another). On the one hand, studies such as Pufpaff, Clarke, and Jones (2015) and Weigle (1994) reported that rater training did not improve inter-rater reliability. On the other hand, Weigle emphasized the point that “rater training cannot make raters into duplicates of each other, but it can make raters more self-consistent” (p. 32). This statement, then, naturally leads to the question as to what factors might contribute to developing rater self-consistency. Although there has not been much research that directly explored this question, several studies have reported the positive effects of rater training on rater performance in different aspects of the rating task.
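For readers who want to see the arithmetic behind these reliability notions, the sketch below computes two common summary statistics for a pair of raters: raw percent agreement and Cohen’s kappa, which corrects agreement for the level expected by chance. This is purely illustrative; the studies cited above used their own instruments and analyses, and the rater scores here are invented for the example.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of essays on which two raters assign identical scores."""
    assert len(r1) == len(r2)
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is the agreement expected by chance, based on
    each rater's marginal score distribution."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[s] / n) * (c2[s] / n) for s in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)  # undefined if p_e == 1 (degenerate case)

# Two hypothetical raters placing ten essays into levels 1-3
rater_a = [1, 2, 2, 3, 1, 2, 3, 3, 2, 1]
rater_b = [1, 2, 3, 3, 1, 2, 3, 2, 2, 1]
print(round(percent_agreement(rater_a, rater_b), 2))  # 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))       # 0.7
```

Rater self-consistency (intra-rater reliability) can be estimated the same way by comparing a rater’s scores against that same rater’s re-ratings of the same essays on a later occasion.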
In Greer (2013), novice raters practiced assessing ESL compositions following a training workbook, which included experienced raters’ feedback on the same compositions that the novice raters were evaluating. After the training, the novice raters reported increased confidence in their rating performance. In another study based on a two-month rater training program (Harsch & Martin, 2012), 13 novice raters completed rigorous weekly assignments consisting of tasks commonly included in rater training such as individual practice and group discussion, using over 1700 writing samples (whittled down from an initial set of over 6000 samples). Although the scope of the study is truly impressive, it is the depth of its rater training that makes it remarkable. As a part of their weekly assignments, for example, the novice raters were actively engaged in revising the wordings on the rating scale. In fact, researchers recommend engaging raters in the development of rating scales (Barkaoui, 2010; Stevens & Levi, 2005). Harsch and Martin (2012) concluded that rater agreement increased when a revised rating scale (i.e. revised based on the novice raters’ discussion and input during the training period) was used.
Because the task of rating is replete with a myriad of interacting factors that could influence the process and outcome of rating, research-guided rater training may be essential in most contexts. For example, when training raters, feedback should be provided immediately after rating has occurred (Knoch, 2011). Rater training should promote detailed and analytical understanding of the scoring rubric (Lovorn & Rezaei, 2011; Rezaei & Lovorn, 2010). An eye-movement study on raters’ use of a scoring rubric showed that even the physical layout of the rubric can affect raters’ attention to each category on the rubric (Winke & Lim, 2015). These are just a few examples of published studies that could guide the design of rater training.
In most published studies that either directly or indirectly report the outcome of rater training (Barkaoui, 2011; Joe et al., 2011; Knoch, 2011; Lovorn & Rezaei, 2011; Pufpaff et al., 2015; Weigle, 1994), it appears common to follow variations of the same approach, categorically speaking, in the way raters (whether novice or experienced) are initially introduced to a rating scale (whether holistic or analytic). First of all, surprisingly, many studies (Barkaoui, 2011; Knoch, 2011; Lovorn & Rezaei, 2011) do not provide sufficient detail regarding exactly how raters are introduced to the rating scales selected in their respective studies. Of the studies (Joe et al., 2011; Pufpaff et al., 2015; Weigle, 1994) that do provide some limited information regarding this part of rater training, the common approach seems to be Present and Clarify/Explain with respect to the descriptors on the rating scale.
This clearly is an example of what is defined as the hierarchical approach: “passing onto raters a predetermined view on how they are to interpret the scale wordings, using pre-assessed scripts (so called ‘master codes’) which are not to be discussed but to be accepted and internalised” (Martin & Harsch, 2012, p. 233). On the one hand, researchers have repeatedly identified the difficulty that raters experience when trying to understand rating scale descriptors (Barkaoui, 2010; Greer, 2013; Hamp-Lyons, 1989; Harsch & Martin, 2012; Joe et al., 2011). On the other hand, as described above, variations of a very top-down approach seem to prevail when it comes to training raters to become familiar with the wordings on the rating scale.
From this perspective, Harsch and Martin’s (2012) study, as reviewed in the preceding section, may be considered an exception in that the raters in their study were engaged in a series of in-depth tasks attending to, analyzing, and revising the descriptors on the scale. These researchers also emphasize the importance of “reaching consensus about how to interpret scripts with reference to scale descriptors” (p. 233). As mentioned earlier, the rater training in their study spanned a two-month period. Most readers would agree that a rater training program like that, while both impressive and exemplary, is anything but feasible in most real contexts. The reality of most rater training is likely to resemble the two-hour norming session referred to as a typical rater calibration procedure (Weigle, 1994, pp. 7-8).
Here is a question, then, that a concerned program administrator or teacher educator might ask: Do people use the top-down approach because they are constrained to the typical two-hour calibration procedure (or however many hours it might take, but not the luxury of two months)? Acknowledging the “time- and resource-intensive” nature of their approach, Harsch and Martin (2012, p. 244) recommend realistic adaptations using existing rating scale descriptors (i.e. not necessarily attempting to revise the descriptors as their raters did). It appears, then, that careful retooling of the top-down approach can help fill the considerable gap between a deeply engaging but rarely feasible approach to rater training and the more commonly practiced top-down approach. In the next section, I will introduce an authentic example of a rater training procedure characterized as a rater-centered bottom-up approach. (Readers can rest assured that this procedure will not require two months to try!)
The rater training procedure described here, as an example of a rater-centered bottom-up approach, has been implemented in an authentic test context. The following provides some background information about the context:
| Context | Description |
|---|---|
| Location | Four-year university in the U.S. Midwest |
| Purpose | Placement decisions for writing courses in English for Academic Purposes |
| Raters | Graduate teaching assistants in an MA-TESL program (1st–4th semester) |
| Target Texts | Academic essays written by second language writers |
| Scoring Rubric | Locally revised version of the Composition Profile by Jacobs et al. (1981) |
Table 1. Training Protocol (Rater-centered Bottom-up Approach)

Key: [I] Individually; [SG] Small Groups (3–4); [WG] Whole Group; [T] Trainer; M: Materials

| Description of Steps in the Training Protocol | Rationale for Each Step |
| --- | --- |
| 1. Activating existing knowledge & expectations about academic writing<br>[SG] Brainstorm & consolidate existing knowledge & expectations about academic writing<br>[WG] Discuss and summarize<br>M: Brainstorm sheets | Step 1 allows each rater to activate existing knowledge; allows each rater to generate his/her own language to describe features of writing; compiles & shares the entire group’s ideas. |
| 2. Evaluating a writing sample based on existing knowledge – without any rubric<br>[I] Read Essay #1; write any/all notable features, good & bad, one feature per sticky-note; place sticky-notes on worksheet; give a holistic score<br>[SG] Compare notes placed on individual worksheets; compare holistic scores<br>M: Essay #1; sticky-notes; worksheet | Step 2 allows each rater to apply existing knowledge; allows each rater to notice features in the writing with no constraints; exposes raters to writing features noticed by others. |
| 3. Familiarization with rating scale descriptors<br>[I] Read rating scale descriptors and criteria<br>[SG] Discuss & help each other understand concepts & terminology<br>[WG] Review & clarify concepts & terminology<br>M: Rating scale descriptors & criteria handout; handout on content-to-form continuum in writing | Step 3 introduces descriptor language to raters; helps raters conceptually align their own language with descriptor language; helps identify & clarify gaps between rater-generated language and descriptor language. |
| 4. Matching current knowledge with rating scale descriptors<br>[SG] Discuss each sticky-note; transfer & match each sticky-note with descriptors in the Descriptor Handout<br>[WG] Discuss & further clarify descriptors based on questions from SGs<br>M: Descriptor Handout (one copy for each SG) | Step 4 allows raters to map their own unconstrained observations onto descriptors; helps identify, discuss, & resolve writing features that are difficult to map onto descriptors; more importantly, helps raters understand descriptors with self-generated concrete examples. |
| 5. Practice using rating scale descriptors without scores<br>[I] Read Essay #2; use the Descriptor Handout to mark relevant descriptors<br>[SG] Compare individuals’ markings on the Descriptor Handout<br>M: Essay #2; Descriptor Handout | Step 5 allows raters to practice using the descriptors directly without scaffolding (i.e., no self-generated descriptive notes as with Essay #1); allows another chance to focus on the descriptors with no burden to score the essay numerically. |
| 6. Familiarization with the complete version of the rating scale (with score indicators)<br>[T] Introduce and explain the complete version of the rating scale (with score indicators)<br>[I] Based on markings on the Descriptor Handout (from Step 5) and using the rating scale (with score indicators), numerically score Essay #2<br>[SG] Compare scores for Essay #2<br>M: Complete version of the rating scale | Step 6 finally exposes raters to the actual rating scale with score indicators; helps raters perform the task of numerical scoring (not exactly the same as mapping observed writing features onto descriptors); helps them deal with two different subtasks (i.e., identifying matching descriptors vs. numerical scoring) with more clarity. |
| 7. Practice using the complete version of the rating scale with a familiar essay<br>[I] Re-read & score Essay #1 using the rating scale<br>[WG] Discuss the results & rationale of the rating scale<br>M: Complete version of the rating scale with score indicators | Step 7 allows raters to apply the rating scale in evaluating a familiar writing sample; provides raters with an opportunity to review & re-assess their own initial evaluation of Essay #1 (performed prior to the introduction of the rating scale). |
| 8. Remaining steps in the protocol | There are a few more steps in the protocol, which are beyond the focus of this article. Some of the remaining steps are similar to commonly practiced norming procedures, and some are specific to the local test context.1 |
As Table 1 shows, the training protocol follows a rater-centered bottom-up procedure, which affords the raters step-by-step scaffolding to develop an understanding of and the ability to apply the descriptors on the rating scale. The procedure promotes activating existing knowledge and acquiring new knowledge of technical concepts/terminologies through a sequence of small tasks rather than through top-down imposition of abstract descriptors onto the raters. For many novice raters, learning to use a rating scale with pre-determined descriptors includes an element of language acquisition. It is not a mere coincidence that, in many ways, the procedure introduced here resembles language learning activities based on the task-based language teaching (TBLT) approach, in which language acquisition occurs as a natural part of successful completion of communicative tasks (Van den Branden, 2006).
The rater training procedure introduced here also provides scaffolding for one of the subtasks of rating that presents a unique challenge for most raters, namely translating descriptors into numerical scores. Studies have shown that both novice raters (Greer, 2013) and experienced raters (Hamp-Lyons, 1989) find this subtask very difficult. In the procedure described in Table 1, raters are supported in dealing with this challenge in two ways: (1) the initial steps in the procedure focus on the descriptors without the ‘burden’ of matching them with numerical scores; and (2) the complete version of the rating scale, a locally revised version of the Composition Profile by Jacobs et al. (1981), presents numerical scores in subsets that match raters’ judgments based on descriptors.
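To make the second point concrete, here is a minimal sketch of how "numerical scores in subsets" can narrow a rater's choice once a descriptor-level judgment has been made. The band labels and score ranges below are invented for illustration and are not the actual values of the locally revised rubric described in this article.

```python
# Hypothetical sketch: once a rater matches an essay to a descriptor band,
# only the numerical scores consistent with that band are offered.
# Band names and ranges are illustrative, not the actual rubric's values.

SCORE_BANDS = {
    "content": {
        "excellent_to_very_good": range(30, 35),  # scores 30-34
        "good_to_average":        range(26, 30),  # scores 26-29
        "fair_to_poor":           range(22, 26),  # scores 22-25
        "very_poor":              range(17, 22),  # scores 17-21
    },
}

def score_subset(criterion: str, band: str) -> list[int]:
    """Return only the scores consistent with the rater's descriptor judgment,
    so the numerical decision is made within an already-narrowed subset."""
    return list(SCORE_BANDS[criterion][band])

print(score_subset("content", "good_to_average"))  # [26, 27, 28, 29]
```

The design point is simply that descriptor matching and numerical scoring are decoupled: the rater first makes a qualitative judgment, then chooses among a handful of numbers rather than the full scale.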
Unlike the impressive and ambitious two-month rater training described in Harsch and Martin (2012), the training procedure introduced in this article offers several realistic, practical advantages:
The last two items directly address the “time- and resource-intensive” challenge of Harsch and Martin’s (2012, p. 244) otherwise exemplary rater training model. Hopefully, these practical advantages will encourage many readers of this article to consider employing this training procedure.
Aside from the obvious practical advantages, the most critical advantage of this procedure, at least based on informal observations during several semesters of implementation, involves the change in dynamics and roles between the trainer and the raters in training. As the protocol shows, at each step, raters are actively engaged in small doable tasks either independently or in collaboration with peer raters. Because the steps are sequenced to promote learning-by-doing, the procedure does not require much top-down talk from the trainer.
When this new procedure was first implemented a few semesters ago, raters who had experienced the previous format resembling the Present and Clarify/Explain approach enthusiastically commented that the new procedure felt stress-free, engaging, and helpful. As the trainer in this instance, I too noticed unexpected changes when first implementing the new procedure. It felt as though I did not have to do anything during the procedure because the raters were doing all the work for themselves!
Although the rater-centered bottom-up training procedure is strongly recommended, one caveat is that it has not been empirically tested. Interested readers are therefore encouraged to field-test this procedure in their various test contexts; it can be modified to fit the needs and capabilities of each context. One example might be replacing the sticky-notes with a digital/online tool that helps raters generate, compile, and compare the features they observe in the writing sample they evaluate. This is an attractive idea, and it leads to the next point of this discussion, namely research possibilities.
The hands-on aspect of using sticky-notes is actually a very positive and valuable element of the procedure, and it helps raters ease into the sequence of tasks in the procedure. Its non-digital nature, however, has been an obstacle in converting the notes into analyzable data. These rater-generated notes can reveal interesting aspects of rater cognition. Designing empirical studies to capture such data to learn more about rater cognition would not only benefit the field of language education but also the field of education in general. In fact, in the special issue of the journal Educational Measurement: Issues and Practice devoted to rater cognition, Myford (2012) emphasizes that more research on rater cognition is needed.
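A digital version of the sticky-notes step could make rater-generated notes directly analyzable. The sketch below shows one possible data model for collecting and compiling such notes; the class, field names, and sample notes are all invented for illustration, not part of the procedure described above.

```python
# Hypothetical sketch of a digital replacement for the sticky-notes step:
# each rater records one observed feature per note, and notes are compiled
# so the group can compare what different raters noticed. The data model
# and field names are invented for illustration.

from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Note:
    rater: str
    essay: str
    feature: str   # rater-generated language, e.g. "run-on sentences"
    polarity: str  # "good" or "bad"

def compile_notes(notes: list[Note]) -> Counter:
    """Count how many notes flagged each (essay, feature, polarity) combination."""
    return Counter((n.essay, n.feature, n.polarity) for n in notes)

notes = [
    Note("R1", "Essay1", "clear thesis", "good"),
    Note("R2", "Essay1", "clear thesis", "good"),
    Note("R2", "Essay1", "run-on sentences", "bad"),
]
print(compile_notes(notes).most_common(1))
# [(('Essay1', 'clear thesis', 'good'), 2)]
```

Captured this way, the notes become a dataset: features noticed by many raters versus only one, or framed positively by some and negatively by others, are immediately visible, which is exactly the kind of rater-cognition evidence an empirical study could analyze.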
Other research questions worth exploring include obvious ones such as: Is the bottom-up rater training procedure more effective than the more commonly practiced top-down approach? Is there any difference between the two approaches in improving rater self-consistency (emphasized as the main benefit of rater training)? These are just a few examples, and readers are encouraged to pursue their own research questions associated with the rater-centered bottom-up rater training procedure introduced in this article.
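As one simple illustration of how the self-consistency question could be operationalized, the sketch below computes exact and adjacent (within one point) agreement rates for a single rater scoring the same essays on two occasions. The scores are invented, and real studies would use more robust statistics (e.g., many-facet Rasch measurement, as with FACETS in Weigle, 1994); this is only a sketch of the construct.

```python
# Hypothetical sketch: quantifying rater self-consistency (intra-rater
# reliability) as the proportion of essays a rater scores identically,
# or within one point, on two rating occasions. Data are invented.

def self_consistency(first: list[int], second: list[int]) -> tuple[float, float]:
    """Return (exact agreement rate, adjacent agreement rate within 1 point)."""
    pairs = list(zip(first, second))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# One rater's scores for the same five essays on two occasions (invented)
exact, adjacent = self_consistency([3, 4, 2, 5, 4], [3, 4, 3, 5, 2])
print(exact, adjacent)  # 0.6 0.8
```

Comparing such agreement rates for raters trained under the bottom-up versus top-down approaches would be one straightforward way to test the claims made here.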
This article started by asking readers to imagine a novice rater performing a presumably learnable, but indeed tremendously challenging, task of using a rating scale to make decisions about learners’ proficiency. Anecdotal evidence and informal observations suggest that rater training, for both novice and experienced raters, need not be like that – top-down, opaque, and anxiety-inducing. Instead, a rater-centered bottom-up approach can make the process more transparent and positively engaging. However, for this statement to be generalizable, we need empirical evidence, and this research topic is open to any interested readers.
Barkaoui, K. (2011). Do ESL essay raters’ evaluation criteria change with experience? A mixed-methods, cross-sectional study. TESOL Quarterly, 44, 31-57.
Greer, B. (2013). Assisting novice raters in addressing the in-between scores when rating writing. (Master’s thesis). Retrieved from BYU ScholarsArchive.
Hamp-Lyons, L. (1989). Raters respond to rhetoric in writing. In H. W. Dechert & M. Raupach (Eds.), Interlingual processes (pp. 229-244). Tübingen: Gunter Narr.
Harsch, C. & Martin, G. (2012). Adapting CEF-descriptors for rating purposes: Validation by a combined rater training and scale revision approach. Assessing Writing, 17, 228-250.
Jacobs, H., Zinkgraf, S., Wormuth, D., Hartfiel, V., & Hughey, J. (1981). Testing ESL composition: A practical approach. Rowley, MA: Newbury House.
Joe, J. N., Harmes, J. C., & Hickerson, C. A. (2011). Using verbal reports to explore rater perceptual processes in scoring: a mixed methods application to oral communication assessment. Assessment in Education: Principles, Policy & Practice, 18, 239-258.
Knoch, U. (2011). Investigating the effectiveness of individualized feedback to rating behavior – a longitudinal study. Language Testing, 28, 179-200.
Lovorn, M. G. & Rezaei, A. R. (2011). Assessing the assessment: Rubrics training for pre-service and new in-service teachers. Practical Assessment, Research & Evaluation, 16, 1-18.
Myford, C. M. (2012). Rater cognition research: Some possible directions for the future. Educational Measurement: Issues and Practice, 31, 48-49.
Pufpaff, L. A., Clarke, L., & Jones, R. E. (2015). The effects of rater training on inter-rater agreement. Mid-Western Educational Researcher, 27, 117-141.
Rezaei, A. R. & Lovorn, M. (2010). Reliability and validity of rubrics for assessment through writing. Assessing Writing, 15, 18-39.
Royal-Dawson, L. & Baird, J. (2009). Is teaching experience necessary for reliable scoring of extended English questions? Educational Measurement: Issues and Practice, 28, 2-8.
Stevens, D. D. & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing, LLC.
Van den Branden, K. (Ed.). (2006). Task-based language education: From theory to practice. Cambridge, UK: Cambridge University Press.
Weigle, S. C. (1994). Using FACETS to model rater training effects. Paper presented at the Language Testing Research Colloquium (Washington, DC).
Winke, P. & Lim, H. (2015). ESL essay raters’ cognitive processes in applying the Jacobs et al. rubric: An eye-movement study. Assessing Writing, 25, 38-54.