Testing Benefits

 Mariana Castro shows what language proficiency assessments have to offer educators of English learners

If you are an educator, chances are that you have had or will have a student who is learning English as an additional language. Some of these multilingual students are eligible for language support services if their English proficiency limits their access to learning academic content. Districts typically have specific policies for identifying these students. Once identified, these students are required by federal law to take an annual language proficiency assessment to confirm their eligibility for additional support until they are considered English proficient. The language proficiency assessments used to monitor eligibility and language growth over time vary by state. Some states, like California, New York, and Texas, have developed their own assessments. However, most states join consortia, like ELPA 21 or WIDA, for enhanced support.
Language proficiency assessments like the ones described here are considered summative assessments; in other words, they provide information on learning that has already occurred and are typically used to meet policy requirements. However, while the main purpose of these assessments is accountability, the data from them can be used to enhance the language development of students. Here are some ideas for how to use the data from language proficiency assessments:
Identify/Monitor Language Goals for Your Students
Data from language proficiency assessments can help identify specific areas of focus for schools or districts. Whether you work as part of a leadership team or a professional learning community, collect data for all of your students in each language domain available through your assessment (listening, speaking, reading, and writing), as well as any composite scores, such as literacy, comprehension, and overall scores (a minimal aggregation sketch follows the reminders below).
Important reminders:
Do not use the information from your language proficiency assessment as the sole data point when making decisions. Always try to collect other data and triangulate all of your data points.
Remember that language proficiency assessments show the performance of students at one point in time, so they do not reflect the average performance of individual students. However, when aggregated, they can provide one perspective on how groups of students are performing.
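For teams that receive score files electronically, the kind of aggregation described above can be as simple as averaging each domain across a group of students. The following is a minimal sketch in Python; the student records, field names, and score values are hypothetical and do not reflect any assessment vendor's actual export format.

# A minimal sketch (hypothetical data) of aggregating domain scores
# for a group of students to spot areas of relative strength and need.
from statistics import mean

# Hypothetical records: one dictionary per student, with a scale score per domain.
students = [
    {"name": "A", "listening": 350, "speaking": 310, "reading": 340, "writing": 300},
    {"name": "B", "listening": 380, "speaking": 345, "reading": 360, "writing": 335},
    {"name": "C", "listening": 320, "speaking": 300, "reading": 310, "writing": 290},
]

# Average each domain across the group.
for domain in ["listening", "speaking", "reading", "writing"]:
    group_mean = mean(s[domain] for s in students)
    print(f"{domain:>9}: group mean = {group_mean:.1f}")

Group means like these are only a starting point; as the reminders above note, they should be triangulated with other data before any decision is made.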
Monitor Language Development Over Time
Just as language proficiency assessments provide information about the language performance of a group of students at one point in time, collecting the same data over time lets you look for patterns in the language development of your students.
Considerations in Using Data from Language Proficiency Assessments to Monitor Growth:
Use the right score: Some scores from language proficiency assessments are better suited than others for identifying language growth over time. Raw scores, for example, are not appropriate when comparing scores from two different test forms or students, because a raw score typically does not take into account the difficulty of the items on the test. In some of these assessments, like ACCESS for ELLs, proficiency levels are interpretations of the scale scores that account for the grade level of the student; therefore, those levels may not be appropriate when looking at growth over time. The best score to use from ACCESS for ELLs is the scale score, because its calculation accounts for the difficulty of the items but is not specific to grade level (see the growth sketch after this list).
Use multiple scores: Triangulate data from multiple sources related to language growth to get more comprehensive and useful information about your students’ language use, including classroom observations across various contexts and situations.
Involve others: Make sure that you draw on language use not only from classroom observations but also from informal spaces, extracurricular activities, and home, whenever possible and available.
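As a purely illustrative example of the monitoring described above, the sketch below tracks year-over-year changes in hypothetical scale scores (scale scores rather than raw scores or grade-dependent proficiency levels, per the first consideration). The students, years, and numbers are invented; a real analysis would pull them from your assessment's score reports.

# A minimal sketch (hypothetical data) of monitoring growth over time
# using scale scores, which account for item difficulty.
yearly_scale_scores = {
    "Student A": {2014: 310, 2015: 335, 2016: 352},
    "Student B": {2014: 290, 2015: 298, 2016: 330},
}

for student, scores in yearly_scale_scores.items():
    years = sorted(scores)
    # Difference between consecutive years: a simple growth indicator.
    deltas = [scores[b] - scores[a] for a, b in zip(years, years[1:])]
    print(student, "year-over-year growth:", deltas)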
Using Data from Language Proficiency Assessments to Guide Instruction
While data from language proficiency assessments provides good information for setting goals over time or for large groups of students, it can also be used in other ways to guide teaching and learning. Enhancing the use of academic language across different contexts gives students better access to academic content and increased opportunities to participate meaningfully in teaching and learning. For children and youth who are multilingual, and for whom English may represent a barrier to demonstrating what they know and can do, language proficiency assessments are an additional tool. They provide opportunities for students and their educators to focus on the development of language, serving as models, sources of data, and catalysts for a more intentional education.
Mariana Castro, PhD, is director of academic language and literacy initiatives at the WIDA Consortium, Wisconsin Center for Education Research, University of Wisconsin-Madison. For more information about the uses of ACCESS for ELLs, visit www.wida.us.

Studies Highlight Success of Head Start

New studies from UC Berkeley, Georgetown University, and the Hamilton Project at Brookings reveal that Head Start, the nation's oldest and largest early education program, which began in 1965, is more effective than previously believed. The program has repeatedly been criticized for its lack of medium- and long-term impact on standardized tests, grade point averages, honors courses, grade retention, special education placement, absenteeism, and suspensions for subgroups of children defined by gender, race/ethnicity, and English language learner (ELL) status.

September 2016

In Our September Issue:

Stressing Classy Communication Margo Gottlieb and Gisela Ernst-Slavit know that academic language is important for all students and essential for English language learners

Testing Benefits Mariana Castro shows what language proficiency assessments have to offer educators of English Learners

Helping Students Find their Voices Adrienne Almeida examines the unique challenges that ELL students face and the impact these challenges have on their social-emotional and academic health

Curing Initiative Fatigue Stacy Hurst and Laura Axtell explain how to kick off the new school year with buy-in from everyone

On the Road to Multilingualism Bilingual rapper GüeroLoco takes us on his passionate journey to promote language learning and allow students to reap the benefits

Citizens of the World Beth Marshall believes that awareness of global citizenry is the true goal of language education

 

 

 

How Language Affects Color Vision

How much does the word blue affect the color blue? More than you may have thought…

While it may be surprising, humans’ ability to detect color doesn’t begin until between three and five months of age. The older we get, the better we become not only at telling colors apart but also at naming them. These two abilities aren’t as unrelated as one might think: once we have a name for a color, we are better able to distinguish it from other colors. This means that depending on the language one speaks, the way they see color may differ. View the infographic below for more information, and check out the podcast for further discussion of this interesting link between color, vision, and language.

Deaf Artist Creates Visual Art about ASL

Artist Christine Sun Kim was born deaf and throughout her life thought that sound was only for hearing people. When she began making art that visually represents ASL through themes of music, everything changed. In some pieces, she visually represents the movements made in a sign, like the sign for ‘all day,’ which looks like the outline of a hemisphere.

Watch Kim talk about ASL, the power of interpreters, and her interaction with sound and language as a Deaf person below.

 

Blending to Test

Julie Damron and Jennifer Quinlan assess student outcomes in the blended classroom

Located in Provo, Utah, Brigham Young University (BYU) is a private institution with one of the nation’s largest language teaching programs—70% of students speak a second language, and 32% of students take world language courses. Over 55 languages are regularly taught on campus, with over 40 more available based on student needs. As programs continue to expand, unique needs arise, such as more classroom space, more flexible course scheduling, and academically meaningful study abroad and internship experiences. In response to some of these needs, several departments are developing or expanding their online and blended course options. Teaching a language effectively online requires instructional design that reflects extensive scaffolding; careful, relevant implementation of technology and learning resources; and knowledge of how to assess online language teaching and learning and its impact on language education. Assessing language-learning outcomes in an online environment and documenting students’ learning and progress in the blended classroom can be challenging. This article explores the creation, implementation, and assessment of a blended Korean language class at Brigham Young University and compares assessment methods and student attainment of learning outcomes in the same course administered via a traditional classroom and an online class.

Method
We compared data from three beginning Korean classes: on-campus face-to-face (F2F), blended (online and F2F interaction), and online (no F2F element). All three classes were held in the fall and winter of 2014, with 37 students in the blended section, 30 students in the F2F section, and 26 students in the online section.

Learning outcomes
The learning outcomes for all three classes were the same:
-Read (with limited comprehension) and write proficiently.
-Discuss topics such as family, school, months of the year, hobbies, and vacation plans.
-Interact linguistically on a limited basis using middle and high language.

Instruction and assessment
All three sections of the course were taught by the same professor, with three different TAs. All three courses used the same textbook and workbook and performed similar speaking and listening activities, assignments, and assessments.

Meeting times
The F2F class met five days a week on campus with the professor and the TA. All work was completed in the classroom or at home in a hard-copy workbook. Using a flipped-classroom model, the blended class met together four days a week, with additional material (lectures, slides, quizzes, tests, and chat rooms) online for a fifth day of self-study. The online class was delivered via a series of lessons with synchronous and asynchronous meetings through a web browser and no face-to-face contact; course access was not limited by time or place.

Evaluation
At the end of each semester, we examined student success in overall course grade, quiz scores, chapter tests, midterm exam, and final exam. We also looked at student minutes online in comparison to minutes in the classroom and examined any relationship between minutes online and final course grade. Finally, we compared positive and negative student comments in all three sections of the course.

Course design
The faculty and an instructional designer evaluated which significant course elements could be delivered online versus face to face; we did not develop a web-facilitated version of the course. The Conversation Café is an online forum moderated by a TA. Students can drop in at any time during open hours (there are set hours five days per week) to practice speaking and applying concepts from class. The Conversation Café was available to students in the blended and online sections; it was not available in the F2F section. TAs reported that students used the Conversation Café to seek tutoring and assistance on specific items, as well as to practice free and unscripted dialogue. While the textbook content of the course included many scripted dialogues for practice, the main benefit the Conversation Café seemed to provide was an element of spontaneous oral production.

Overview of Findings

Results of the study revealed the following:
-Time spent online had a positive correlation with overall grade. This may come as no surprise, but students who logged more minutes in the online course material earned higher final course grades than their counterparts who spent less time online (see the correlation sketch after these findings). This finding is not truly representative, however, as a student can be “logged in” but not actively engaged with the online content. We have not found a learning management system (LMS) which can discriminate between actively interacting with online course material (e.g., reading, scrolling, answering questions) and simply accessing material (e.g., logging in and then stepping away from the desk).
-Students in the traditional classroom appeared to spend significantly more observable time with class material (tests, quizzes, slides, etc.). We emphasize observable time, as it is difficult to know how much of F2F classroom time every student actually spends engaged. Likewise, it is difficult to know how much time blended/online students might be spending studying and reviewing without being logged into the online material.
-Course grades for each class were similar. The differences in quizzes, midterm exams, final exams, and overall course grades among the three sections were less than 2%. However, chapter-test grades varied by 7%, with the lower scores in the blended and online courses. We are exploring the factors which may contribute to this variance.
-Student evaluations were slightly lower for blended classes. Throughout the semester, the faculty asked students if they wanted a fifth day of face-to-face instruction for extra review or in-person work. Consistently, their response was no. However, in end-of-course evaluations, students indicated they wanted more interaction with their professor. They also noted some negative responses to the LMS and glitches with some of the online course elements. This was the first time the faculty had run a blended section, making it a first for students as well; there was a steep learning curve for both instructors and students.
There was positive and negative feedback regarding each of the sections. Some of the feedback has already been identified. Further, while some students may feel more comfortable with the familiarity of face-to-face instruction, evaluations reflected that students valued the time/place flexibility offered by the blended and online sections. While they seemed to seek more in-person teacher interaction (as indicated in student end-of-course surveys), which is clearly afforded by the F2F section, they also valued the ability to access online material repeatedly and at their convenience. For example, blended students revisited course material on average twice as many times as it was presented in the F2F class. In the F2F section, students did not have access to any course materials online. Finally, while there were some technical glitches in the course which received negative feedback, students gave positive feedback regarding the ease and convenience of taking quizzes online.
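As an illustration of the first finding above, the relationship between logged minutes and final grade can be summarized with a correlation coefficient. The sketch below uses invented numbers, not the study's data, and assumes Python 3.10 or later for statistics.correlation.

# A minimal sketch (invented data) of correlating minutes logged online
# with final course grades, as in the first finding above.
from statistics import correlation  # available in Python 3.10+

minutes_online = [420, 610, 350, 880, 500, 720]
final_grades = [78, 85, 72, 93, 80, 88]

# Pearson's r: positive values suggest more time online tracks with higher grades.
r = correlation(minutes_online, final_grades)
print(f"Pearson r = {r:.2f}")

As the finding itself cautions, logged time is a noisy proxy for engagement, so a positive r should not be read as evidence that logging in causes higher grades.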

Limitations/further research
There were a few unanticipated findings and limitations in this research. First, we had little control over who took the online course; if students were heritage or native speakers of Korean, that may have skewed the findings. Second, we found that test scores in the online course dropped significantly when tests became proctored. Third, we found that students preferred not meeting with the professor five days a week during the semester but then wrote negative comments on end-of-semester evaluations about not seeing the professor enough. Finally, we became aware of the extent to which online students were “binge studying,” or accessing significant amounts of course material immediately prior to an assignment deadline. Based on our findings, we are planning a second phase of research to identify more discrete data points and validate the results of this study. We are also implementing an ongoing measurement of student satisfaction with the course experience and a structure for student support when they encounter technical issues. Additionally, further research will explore the effects of the Conversation Café on oral proficiency, student engagement, sense of community, and mastery of learning outcomes. We anticipate comparing these elements among students in each set of classes (F2F, blended, online); the impact of the café on overall oral proficiency was not measured in this study. As part of the normal course evaluation and refinement that BYU conducts for blended and online courses, we will also examine the difficulty and discrimination of assessment items and compare student performance on discrete elements within assessments. Our intent is to identify whether students score better or worse on specific items within each assessment even though questions are identical in all three sections.
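For readers unfamiliar with the item-level analysis mentioned above, the sketch below shows one classical approach to item difficulty and discrimination. The response matrix is invented, and the discrimination measure is a simple item-total correlation rather than a corrected point-biserial, so this is only a rough illustration of the technique, not the analysis BYU will run.

# A minimal sketch (invented data) of classical item analysis.
# Rows are students, columns are items; 1 = correct, 0 = incorrect.
from statistics import correlation  # available in Python 3.10+

responses = [
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

totals = [sum(row) for row in responses]  # each student's total score

for i in range(len(responses[0])):
    item = [row[i] for row in responses]
    difficulty = sum(item) / len(item)  # proportion answering correctly
    discrimination = correlation(item, totals)  # item-total correlation
    print(f"item {i + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")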
Conclusion
In developing the blended and online course design and implementation strategies, we explored learning theories (e.g., Bandura, Vygotsky, Gagne), pedagogical approaches to student learning objectives, industry reviews of blended and online instruction, and elements of classroom formats (face-to-face, blended, online). This information helped guide the instructional design of the courses and informed techniques for interaction and student engagement, instruction, and assessment. While we anticipated that student scores in the blended classroom would not be as high as those in the traditional F2F classroom, we found that successful student learning in the blended language classroom was possible.
Dr. Julie Damron is associate professor and associate section head of Asian and Near Eastern Languages and Jennifer Quinlan, MFS, is academic product consultant for world languages and a second-language acquisition PhD candidate at Brigham Young University, Provo, Utah.

References
ADFL. “Guidelines on the Administration of Foreign Language Departments.” www.adfl.org/resources
Allen, I., and Seaman, J. (2013). Changing Course: Ten years of tracking online education in the United States. Retrieved April 6, 2015.
UCF. “Benefits of Blended Learning.” blended.online.ucf.edu/about/benefits-of-blended-learning
Clayton Christensen Institute for Disruptive Innovation. (January 1, 2012). “Blended Learning Model Definitions.” www.christenseninstitute.org/blended-learning-definitions-and-models/
Center for Digital Education. (2012). “Realizing the Full Potential of Blended Learning.” echo360.com/sites/default/files/CDE12%20STRATEGY%20Echo360-V.pdf
Elvers, G., Polzella, D., and Graetz, K. (2003). “Procrastination in Online Courses: Performance and attitudinal differences.” Teaching of Psychology, 30(2), 159-162. www.anitacrawley.net/Articles/elversAttitudinalDifference.pdf
Hill, P. (February 26, 2013). “The Most Thorough Summary (to date) of MOOC Completion Rates.” mfeldstein.com/the-most-thorough-summary-to-date-of-mooc-completion-rates
Ho, A., and Lu, L. (2006). “Testing the Reluctant Professor’s Hypothesis: Evaluating a blended-learning approach to distance education.” Journal of Public Affairs Education, 12(1), 81-102. www.jstor.org/stable/pdf/40215727.pdf?acceptTC=true
“Is Blended Learning the Best of Both Worlds?” (January 17, 2013). onlinelearninginsights.wordpress.com/2013/01/17/is-blended-learning-the-best-of-both-worlds
Rovai, A., and Jordan, H. (2004). “Blended Learning and Sense of Community: A comparative analysis with traditional and fully online graduate courses.” The International Review of Research in Open and Distributed Learning, 5(2). www.irrodl.org/index.php/irrodl/article/viewArticle/192/274
“The Student View of Blended Learning.” (January 1, 2011). www.ecsu.edu
Suppes, P., and Morningstar, M. (1969). “Computer-Assisted Instruction.” Science, 166, 343-350. suppes-corpus.stanford.edu
Sheehy, K. (January 8, 2013). “Online Course Enrollment Climbs for Tenth Straight Year.” www.usnews.com/education/online-education/articles/2013/01/08/online-course-enrollment-climbs-for-10th-straight-year
“Understanding the Spacing Effect.” www.knowledgefactor.com

Ten Spanish Words That Come From Africa

African culture and language have had a massive influence on Latin America, South America, the Caribbean, and other Spanish-speaking regions, not only culturally (from samba to carnivals) but also linguistically. Language Magazine rounded up ten Spanish words that come from African roots.

Marimba: a cognate that refers to a beating in the southern cone of South America.

Mucama: used for maid or servant in Uruguay, Argentina, and Brazil.

Guineo: used in Puerto Rico and the Dominican Republic for different types of bananas.

Congo: used in Latin America as a term for a black man/woman, presumably stemming from the country Congo.

Ñame: a Spanish word for yam.

Cachimbo/a: used for pipe in Latin America, as a term for a poor man/woman in the Caribbean, and for soldier in the Andes.

Merengue: used to refer to the music and dance, along with meaning mess in the southern cone and wimp in South America.

Mandinga: used for devil, evil spirit, and goblin in Latin America, black in the Caribbean, and effeminate in the southern cone of South America.

Mondongo: a Spanish word that means guts, often used as a word for tripe.

Chévere: used as a term for fantastic or great in Latin America.

Original reference: A History of Afro-Hispanic Language by John M. Lipski of Pennsylvania State University, published by Cambridge University Press.

Bilingual: That’s HOT!

Forget dieting or styling your hair: learn another language!

Americans and Brits find multilingualism attractive in different ways, according to Babbel’s Perception of Languages Survey. Babbel is a German-based online language-learning platform that trains its more than one million paying users in major European languages and Indonesian.

About 1,500 Americans and 1,500 Britons were surveyed on their perceptions of the languages offered by Babbel, which naturally led to Eurocentric answers for most questions, but there was one unquestionable result: being multilingual makes people more attractive. Nearly three-fourths of Americans (71%) believe speaking more than one language makes a person seem more attractive, while nearly two-thirds (64%) of Brits agree. On both sides of the Atlantic, nine out of ten people admit they would learn a language in the pursuit of love, and about 50% admit to having fantasized about a romance with a “foreigner.”

Californians Favor English Plus Another Language

California voters both prioritize the learning of English and recognize the value of speaking more than one language, according to a new study of voter attitudes about bilingual education by the Institute of Governmental Studies (IGS) at the University of California, Berkeley.

The study used an online poll to ask a series of questions about language policy and bilingual education, which will once again be on the California ballot this fall.

Tools for School

A bumper crop of resources to help make the new academic year a success
