CAL Resource Guides Online
Introduction
Digests
ERIC/CLL Publications
Publications From Other Sources
Listservs
Web Sites
Conference Information Online
ERIC Documents
As the ability to speak more than one language becomes more important, so too does the need to assess the language abilities of second language learners. In the classroom, assessment can be seen as an ongoing process in which the teacher uses various tools to measure the progress of the learner. Among those tools are portfolios, self-assessment, and, of course, tests. If assessment is a movie, then a test is a freeze frame: it gives a picture of the learner's language at a particular point in time. Used properly, all of these tools help the teacher develop a full picture of the learner's progress.
Testing has traditionally been the most widely used assessment tool in the classroom, and in many classrooms, it still is. Testing also has applications outside of the classroom: foreign language programs test students for placement, colleges and universities test students for credit, and employers test the abilities of prospective employees. In any testing situation, it is important to consider which of the four skills (speaking, listening, reading, writing) needs to be assessed, who will be taking the test, and for what purposes the test results will be used. Clearly, a test that is appropriate in one situation may be inappropriate in another: a test designed to measure the reading abilities of elementary school learners will not be appropriate for college placement. Thus, when choosing a test, it is important to define the testing situation and then to find or develop a test that fits it.
It is also important to know the reliability and validity of a test, especially if it will be used for high-stakes purposes, such as entrance into a college or university. Reliability is the consistency of a test: the extent to which it produces the same results across items, raters, or administrations. Validity is the extent to which a test measures what it claims to measure. Large-scale standardized tests must meet more stringent reliability and validity requirements than classroom tests, and many books, articles, and research projects have been devoted to these issues.
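To make the reliability side of this distinction concrete, one widely used index of a test's internal consistency is Cronbach's alpha, which compares the variance of individual item scores with the variance of test takers' total scores. The short sketch below is illustrative only: the item scores and the `cronbach_alpha` helper are invented for this example and are not drawn from any test described in this guide.

```python
# Cronbach's alpha: a common index of a test's internal-consistency
# reliability (how consistently its items measure the same ability).
# The item scores below are invented purely for illustration.

def cronbach_alpha(item_scores):
    """item_scores: one inner list per test item, each containing
    one score per test taker (same order of test takers in every list)."""
    k = len(item_scores)        # number of items
    n = len(item_scores[0])     # number of test takers

    def variance(xs):
        # Sample variance (divides by n - 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of the variances of the individual items.
    item_var_sum = sum(variance(item) for item in item_scores)

    # Variance of each test taker's total score across all items.
    totals = [sum(item[p] for item in item_scores) for p in range(n)]
    total_var = variance(totals)

    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Five test takers, four items (e.g., four reading comprehension questions
# each scored 1-5 by a rater).
scores = [
    [2, 4, 3, 5, 4],
    [3, 4, 3, 5, 5],
    [2, 5, 4, 4, 4],
    [3, 5, 3, 5, 4],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # prints 0.91
```

An alpha near 1.0 suggests the items are consistently measuring the same underlying ability; substantially lower values are generally taken as a warning sign for a test intended for high-stakes use.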
This resource guide provides information on all aspects of assessment, from large-scale norm-referenced tests to classroom-based assessments.
ERIC/CLL is grateful to Glenn Fulcher, English Language Institute, University of Surrey (UK); Yasuyo Sawaki, Applied Linguistics Department, University of California Los Angeles; and Dorry Kenyon, Center for Applied Linguistics, for their valuable assistance in compiling this Resource Guide Online.
The following publications, conferences, Web sites, and listservs provide information about second language proficiency assessment. An annotated bibliography of ERIC documents related to proficiency assessment is also included. This resource guide is intended as a general starting point; it is not an exhaustive list of resources on the topic.
Digests are brief overviews of topics in education. ERIC/CLL has prepared many timely digests on topics related to language teaching and learning. The following ERIC/CLL titles are related to the field of second language assessment.
ACTFL Speaking Proficiency Guidelines
Alternative Assessment and Second Language Study: What and Why?
Considerations in Developing and Using Computer-Adaptive Tests to Assess Second Language Proficiency
Simulated Oral Proficiency Interviews: Recent Developments
Current Trends in Foreign Language Assessment appeared in K–12 Foreign Language Education. The ERIC Review, Volume 6, Number 1 (Fall 1998). This issue of The ERIC Review gives a broad overview of the status of K–12 foreign language education in the United States. Readers will find information about program types, enrollment trends, national standards, assessment, professional development, less commonly taught languages, heritage language education, and American Sign Language. Resources listed include organizations, books, journals and newsletters, and electronic resources.
The Foreign Language National Assessment of Educational Progress (FL NAEP): Questions and Answers appeared in the October 2000 issue of ERIC/CLL Language Link.
K–8 Foreign Language Assessment: A Bibliography describes foreign language assessment instruments currently in use in elementary and middle schools across the country and lists a wealth of resources related to foreign language assessment.
Many excellent journals, books, and reference materials have been published that give an overview of the field of second language proficiency assessment. The items listed below are not intended to be exhaustive; they are a sample of the major publications on this topic.
Language Testing publishes articles in the fields of second and foreign language testing, native language testing, and assessment of language disability. This journal is available from Turpin Distribution service.
The Studies in Language Testing series from the University of Cambridge Local Examinations Syndicate and Cambridge University Press offers six volumes that examine issues in assessment, including a multilingual glossary of language testing and assessment terms.
TOEFL Research Report Series provides the results of research studies conducted under the direction of the Test of English as a Foreign Language (TOEFL) Committee of Examiners.
Alderson, J.C., Clapham, C., & Wall, D. (1995). Language test construction and evaluation. Cambridge: Cambridge University Press.
Bachman, L.F. (1990). Fundamental considerations in language testing. Oxford: Oxford University Press.
Bachman, L.F., & Palmer, A.S. (1996). Language testing in practice. Oxford: Oxford University Press.
Banerjee, J., Clapham, C., Clapham, P., & Wall, D. (1999). Language testing update: ILTA language testing bibliography 1990-1999 (1st ed.). Lancaster, UK: The International Language Testing Association (ILTA).
Brown, J.D. (1996). Testing in language programs. Glenview, IL: Addison-Wesley Longman.
Clapham, C.M. & Corson, D. (Eds.). (1998). Language testing and assessment: Encyclopedia of language and education. (Vol. 7). Dordrecht, Netherlands: Kluwer.
Genesee, F., & Upshur, J.A. (1996). Classroom-based evaluation in second language education. Cambridge: Cambridge University Press.
Hughes, A. (1989). Testing for language teachers. Cambridge: Cambridge University Press.
McNamara, T. (2000). Language testing. Oxford Introductions to Language Series. Oxford: Oxford University Press.
Spolsky, B. (Ed.) (1999). Concise encyclopedia of educational linguistics. New York: Elsevier. A compilation of 232 articles dealing with language and education topics, this encyclopedia includes eight articles about language testing by experts in the field.
The Pennsylvania State University hosts the LTEST-L listserv, dedicated to language testing research and practice. To subscribe, send the message
SUBSCRIBE LTEST-L YOURFIRSTNAME YOURLASTNAME
to
listserv@lists.psu.edu.
AERA Division D hosts a listserv dealing with assessment issues. To subscribe, send the message
SUB AERA-D YOURFIRSTNAME YOURLASTNAME
to
listserv@asu.edu. To read the list without subscribing, go to http://lists.asu.edu/archives/aera-d.html.
The National Council on Measurement in Education (NCME) hosts a general listserv as well as a listserv for graduate students. To subscribe to NCME's general listserv, send the message
SUBSCRIBE NCME
to
majordomo@ncme.org.
To subscribe to the graduate student listserv, send the message
SUBSCRIBE GRAD
to
majordomo@ncme.org.
The International Language Testing Association (ILTA) is an independent international association of assessment professionals. ILTA seeks to promote the improvement of language testing throughout the world through workshops, conferences, publications, and professional services.
The American Council on the Teaching of Foreign Languages (ACTFL), a national professional association for foreign language teachers, has been instrumental in the development of proficiency assessment. The ACTFL Proficiency Guidelines for Speaking have recently been updated and can be downloaded at this site.
The Association of Language Testers in Europe (ALTE) provides definitions of proficiency levels as well as descriptions of tests developed by participating organizations.
The Center for Applied Linguistics (CAL) offers tests such as the Simulated Oral Proficiency Interview (SOPI), the Basic English Skills Test (BEST), and the Student Oral Proficiency Assessment (SOPA), and provides an online Foreign Language Assessment Resource Guide.
The Center for Equity and Excellence in Education Test Database, maintained by the George Washington University and ERIC/AE, includes almost 200 tests for use with limited English proficient students.
The Educational Testing Service (ETS) offers information about the Test of English as a Foreign Language (TOEFL) as well as an online magazine.
The ERIC Clearinghouse on Assessment and Evaluation (ERIC/AE) offers a number of resources, including an assessment library and a test locator.
The National Center for Research on Evaluation, Standards and Student Testing (CRESST) offers articles available for downloading, reports, newsletters, and CRESST products.
The American Psychological Association's Testing and Assessment Web page contains information for test developers, including ordering information for the updated Standards for Educational and Psychological Testing.
The Buros Institute of Mental Measurement offers several articles on testing issues.
The Center for the Advancement of Language Learning (CALL) offers links to testing sites and conference information.
The Center for Advanced Research on Language Acquisition (CARLA) at the University of Minnesota has developed proficiency tests in French, German, and Spanish, including computer-delivered reading, writing, and listening assessments.
The National Assessment Governing Board (NAGB) is responsible for overseeing the National Assessment of Educational Progress (NAEP). A new national assessment for Spanish is scheduled to be administered to secondary school students in 2003. Download the framework, the Foreign Language National Assessment of Educational Progress (pre-publication edition), at this site.
The National Capital Language Resource Center (NCLRC) is one of nine federally-funded language resource centers. Assessment-related products include a Foreign Language Test Database. Services include training in the administration and scoring of oral proficiency interviews.
The National Foreign Language Resource Center at the University of Hawai'i is developing computer-based tests for the less commonly taught languages.
The National K–12 Foreign Language Resource Center focuses on preparing teachers in the administration and interpretation of foreign language performance assessments, the use of effective teaching strategies, and the use of new technologies.
The University of California, Los Angeles (UCLA) Department of Applied Linguistics offers information about its Web-based language test development project (WebLAS).
The University of Cambridge Local Examinations Syndicate (UCLES) produces English as a foreign language (EFL) exams, accredits examiners, and ensures that the exams are graded by qualified English language experts.
AERA Division D is dedicated to measurement and research methodology.
The Asian Centre for Language Assessment Research (ACLAR) site has information on the many projects being carried out by the center, which is based at The Hong Kong Polytechnic University.
The Consortium for Equity and Standards in Testing is devoted to exploring "how educational standards, assessments, and tests can be used more fairly." Their site includes spotlights on testing issues and downloadable documents discussing various aspects of assessment.
The Language Acquisition Resource Center (LARC) at San Diego State University produces a digital version of the Video Oral Communications Instrument (VOCI).
The National Association of Test Directors (NATD) site includes articles on general testing issues for K–12, as well as information about previous years' conferences and links to other testing sites.
The National Council on Measurement in Education (NCME) offers several publications, including an online newsletter. They also have an extensive job databank.
TESOL Arabia Testing Assessment and Evaluation Special Interest Group includes an on-line introduction to classroom test development.
The Resources in Language Testing WWW Page includes an extensive list of links to and reviews of sites related to language testing, as well as a searchable database of articles from the journal Language Testing.
The University of Surrey: Dissertations in Language Testing page is a list of dissertations on language testing and assessment completed at the university since 1994.
The Language Testing Page of the National Foreign Language Resource Center (NFLRC) at the University of Hawaii offers materials relating to the assessment of less commonly taught languages, as well as the assessment of cross-cultural pragmatics.
The Foreign Language Test Database from the National Capital Language Resource Center (NCLRC) offers an online searchable database of tests that measure foreign language proficiency.
The online Foreign Language Assessment Resource Guide is part of an ongoing performance initiative of the Center for Applied Linguistics and the National K–12 Foreign Language Resource Center. This searchable database includes descriptions of language assessments that are currently being used in elementary, middle, and secondary school foreign language programs around the country.
The Cutting Edge CALL page has several demos that show what can be done with computer-designed tests.
The Language Testing page of Drs. Kenji Kitao and S. Kathleen Kitao has links to resources and articles on the web.
Research into Language Testing at the University of Duisburg is still under construction, but contains information about C-tests as well as an extensive bibliography.
The ACTFL Proficiency Guidelines for Speaking have recently been updated and can be downloaded from The American Council on the Teaching of Foreign Languages (ACTFL).
The Interagency Language Roundtable Scale grew out of the Foreign Service Institute Rating scale. This site includes a brief history of the scale, an annotated bibliography, and descriptions of the different levels.
The International Second Language Proficiency Ratings (ISLPR) (formerly the Australian Second Language Proficiency Ratings) site has descriptions of the proficiency levels along with information on registering for an ISLPR test (available only in Australia) and some references.
Brigham Young University offers information about its Computerized Adaptive Placement Exams (CAPE) in various languages.
The DIALANG Project, currently in the piloting phase, is developing Internet-based tests for 14 European languages. The Web site contains links to information about the first and second phases (in German, French, and English) of the project.
The Test of English as a Foreign Language (TOEFL) Web site provides information about TOEFL programs and services.
The Language Testing Research Colloquium will be held in St. Louis in 2001. The International Language Testing Association Web site provides further information, as well as an online submission form and program books from previous colloquia.
The American Council on the Teaching of Foreign Languages (ACTFL) holds an annual conference. Included are several pre- and post-convention workshops dealing with assessment issues.
Information on the 2001 Asian Language Assessment Research Forum (ALARF) in Hong Kong will be available on the Asian Centre for Language Assessment Research (ACLAR) Web site in December 2000.
The Southern California Association for Language Assessment Research (SCALAR) offers information about regional assessment conferences throughout the U.S.
Cambridge University Press publishes a list of links to second language learning conferences at the Web site of their journal Studies in Second Language Acquisition.
The University of Sydney Language Centre maintains a comprehensive worldwide conference list.
Conference Schedules for Linguists, Translators, Interpreters, and Teachers of Languages provides information about quarterly and annual events.
The American Educational Research Association (AERA) will hold its annual meeting in Seattle, Washington, April 10-14, 2001.
The National Council on Measurement in Education (NCME) will hold its 2001 meeting in Seattle, Washington, April 11-13.
The ERIC database of educational documents includes numerous books, papers, reports, journal articles, and other documents of interest. A sample of materials available on this topic appears below. Information on obtaining these materials appears at the end of the list.
To conduct your own search of the ERIC database, visit an ERIC Center or conduct your own search on the Web.
ED436093
Essentials of English Language Testing
Kitao, S. Kathleen; Kitao, Kenji
162p.
ISBN: 4-268-00327-4
Availability: Eichosha Co., Ltd., Kusaka Building, 2-28 Kanda Jimbocho, Chiyoda-ku, Tokyo 101-0051, Japan. Tel: 03-3263-1641; Fax: 03-3263-6174
This book covers essential problems and basic concepts of English language testing. There are 20 chapters in four sections. Section 1, "Background," includes: (1) "Why Teachers Test"; (2) "Some Definitions"; (3) "Validity and Reliability"; and (4) "What You Should Learn about Standardized Tests." Section 2, "Preparing To Write Tests," includes: (5) "Test Specifications"; (6) "Writing a Good Test"; and (7) "Some Common Problems with Test Items." Section 3, "Writing Tests," includes: (8) "Testing Reading"; (9) "Testing Listening"; (10) "Testing Writing"; (11) "Testing Speaking"; (12) "Testing Grammar"; (13) "Testing Vocabulary"; and (14) "Testing Communicative Competence." Section 4, "The Results," includes: (15) "Scoring Tests"; (16) "Graphs and Descriptive Statistics"; (17) "Item Analysis"; (18) "Computer Programs for Analyzing Test Results"; (19) "Evaluating Test Results"; and (20) "Reporting the Results of Examinations." Each chapter also includes a list of key words, an introduction, a conclusion, and either an application exercise or discussion questions. The two appendixes present types of language testing resources on the Internet and a list of language testing resources on the Internet. (SM)
Descriptors: Communication Skills; Computer Uses in Education; Data Analysis; *English (Second Language); Evaluation Methods; Grammar; Internet; Listening Skills; Reading Ability; Reading Skills; Scoring; Second Language Learning; Speech Skills; Standardized Tests; *Student Evaluation; Test Construction; Test Content; Test Reliability; Test Results; Test Validity; *Testing; Vocabulary Skills; Writing Evaluation; Writing Skills
ED431322
Assessing Language Proficiency Levels
Hoffman-Marr, Annette
63p.
Notes: Ed.D. Thesis, California State University, Hayward.
A study compared two methods for assessing English proficiency of limited-English-proficient school-age children, for purposes of placement in bilingual and monolingual education programs. The two instruments used were the Idea Oral Language Proficiency Test (forms C and D) (IPT), which assesses oral English proficiency, and the Woodcock-Munoz Language Survey-English (WMLS), which tests cognitive academic language proficiency. Subjects were 20 language-minority children, 11 in kindergarten and 9 in third grade. Tests were individually administered. Results indicate no significant difference in English oral language proficiency scores on the two tests for either grade level. However, overall, more students were designated at lower proficiency levels on the WMLS than on the IPT, suggesting that IPT scores may place more students in bilingual classrooms than in monolingual classrooms. Implications for placement, instruction, and provision of services are discussed briefly. Contains 26 references. (MSE)
Descriptors: *Bilingual Education; Comparative Analysis; Elementary Secondary Education; English for Academic Purposes; *English (Second Language); Language Minorities; *Language Proficiency; *Language Tests; *Limited English Speaking; Oral Language; Speech Skills; *Student Placement; Testing; Thinking Skills
ED430390
Native Speakers' Perception of the Nature of the OPI Communicative Speech Event.
Johnson, Marysia
30p.
1999
A study investigated the Educational Testing Service's claim about the conversational nature of the Oral Proficiency Interview (OPI) from the perspective of native speakers of the target second language. Eight subjects listened to 16 randomly selected OPI communicative speech events, and their perceptions were measured using a semantic differential instrument. Analysis of the data indicates that, with few exceptions, native speakers did not differ in their judgments of the nature of the OPI communicative speech event; they found that the OPI does not test speaking ability in the real-life context of a conversation as it claims to. The OPI tests speaking ability in the context of two interview types: a very formal interview that has many features of a survey research interview, based on the behaviorist theory of stimulus and response, and a more conversational type of interview that has many features of a sociologic interview. It is concluded that the findings raise questions about the validity of the OPI testing instrument. (Author/MSE)
Descriptors: Attitudes; *Interviews; *Language Tests; *Native Speakers; *Oral Language; *Second Languages; Speech Skills; Standardized Tests; Test Construction; Test Validity; *Verbal Tests
Identifiers: *Oral Proficiency Testing
EJ579764
Systematic Effects in the Rating of Second-Language Speaking Ability: Test Method and Learner Discourse
Upshur, John A.; Turner, Carolyn E.
Source: Language Testing, v16 n1 p82-111 Jan 1999
1999
ISSN: 0265-5322
Research on two approaches to assessment of second-language performance--second-language acquisition and language testing--is examined and compared with regard to systematic effects on language tests. Findings incidental to a test development project are then presented. It is concluded that a full account of performance testing requires a paradigm incorporating relationships not specified in research on either approach. (Author/MSE)
Descriptors: Discourse Analysis; *Language Tests; *Performance Tests; Second Language Learning; *Second Languages; Speech Skills; Test Format; *Verbal Tests
ED428575
Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency. Studies in Bilingualism, Volume 14.
Young, Richard, Ed.; He, Agnes Weiyun, Ed.
406p.
1998
ISBN: 1-55619-548-6
ISSN: 0928-1533
Availability: John Benjamins Publishing Co., P.O. Box 27519, Philadelphia, PA 19118-0519; Tel: 800-562-5666; e-mail: service@benjamins.com; Web site: http://www.benjamins.nl/Sbp/index.html
Papers on second language oral proficiency testing include: "Language Proficiency Interviews: A Discourse Approach" (Agnes Weiyun He, Richard Young); "Re-Analyzing the OPI: How Much Does It Look Like Natural Conversation?" (Marysia Johnson, Andrea Tyler); "Evaluating Learner Interactional Skills: Conversation at the Micro Level" (Heidi Riggenbach); "What Happens When There's No One To Talk To? Spanish Foreign Language Discourse in Simulated Oral Proficiency Interviews" (Dale April Koike); "Answering Questions in LPIs: A Case Study" (Agnes Weiyun He); "Framing the Language Proficiency Interview as a Speech Event: Native and Non-native Speakers' Questions" (Carol Lynn Moder, Gene B. Halleck); "Miscommunication in Language Proficiency Interviews of First-Year German Students: A Comparison with Natural Conversation" (Maria M. Egbert); "Knowledge Structures in Oral Proficiency Interviews for International Teaching Assistants" (Bernard Mohan); "The Use of Communication Strategies in Language Proficiency Interviews" (Yumiko Yoshida-Morise); "Meaning Negotiation in the Hungarian Oral Proficiency Examination of English" (Lucy Katona); "Maintaining American Face in the Korean Oral Exam: Reflections on the Power of Cross-Cultural Context" (Catherine E. Davies); "Confirmation Sequences as Interactional Resources in Korean Language Proficiency Interviews" (Kyu-hyun Kim, Kyung-hee Suh); "Divergent Frame Interpretations in Oral Proficiency Interview Interaction" (Steven Ross); and "'Let Them Eat Cake!' or How To Avoid Losing Your Head in Cross-Cultural Conversations" (Richard Young, Gene B. Halleck). (MSE)
Descriptors: Communication Problems; Cultural Context; *Discourse Analysis; English (Second Language); Foreign Countries; Foreign Students; German; Higher Education; Interaction; Intercultural Communication; Interpersonal Communication; Interviews; Introductory Courses; Korean; *Language Proficiency; *Language Tests; *Oral Language; *Questioning Techniques; *Second Languages; Simulation; Sociocultural Patterns; Spanish; Teaching Assistants; Testing
Identifiers: *Oral Proficiency Testing; Questions
ED427522
Prochievement Testing of Speaking: Matching Instructor Expectations, Learner Proficiency Level, and Task Type
Pino, Barbara Gonzalez
Source: Texas Papers in Foreign Language Education, v3 n3 p119-33 Fall 1998
17p.
1998
ISSN: 0898-8471
Previous literature on classroom testing of second language speech skills provides several models of both task types and rubrics for rating, and suggestions regarding procedures for testing speaking with large numbers of learners. However, there is no clear, widely disseminated consensus in the profession on the appropriate paradigm to guide the testing and rating of learner performance in a new language, either from second language acquisition research or from the best practices of successful teachers. While there is similarity of descriptors from one rubric to another in professional publications, these statements are at best subjective. Thus, the rating of learners' performance rests heavily on individual instructors' interpretations of those descriptors. An initial investigation of instructor assumptions regarding student performance on speaking tests in one program identified several areas of discrepancy in instructor testing and rating practice. It is argued that faculty as a group must delineate more clearly their specific expectations by level for a number of rated features. The concerns identified in this study coincide with those discussed recently in the literature, suggesting that other programs might benefit from similar self-analysis. The instructor questionnaire is appended. Contains 17 references. (MSE)
Descriptors: College Instruction; *Evaluation Criteria; Higher Education; Interrater Reliability; *Language Proficiency; Language Research; Language Teachers; *Language Tests; Oral Language; Second Language Learning; *Second Languages; *Spanish; Speech Skills; Surveys; Teacher Attitudes; *Teacher Expectations of Students; Test Items
ED425645
Assessment in ESL & Bilingual Education: A Hot Topics Paper
Hargett, Gary R.
Northwest Regional Educational Lab., Portland, OR. Comprehensive Center, Region X.
37p.
August 1998
The purposes and methods of testing in bilingual and English-as-a-Second-Language (ESL) education are discussed. Different instruments, including specific published tests, are listed and described briefly. They include language proficiency assessments, achievement tests, and assessments in special education. Introductory sections address topics surrounding the testing itself, including the need to understand the purposes of the testing and of a specific test, the information needed about a student, specific uses of that information, when it is appropriate to test the student, and defining second language proficiency. (Contains 20 references.) (MSE)
Descriptors: Academic Standards; *Bilingual Education; Classroom Observation Techniques; Cloze Procedure; Educational Trends; *English (Second Language); Evaluation Criteria; Evaluation Methods; Intelligence Tests; Language Proficiency; *Language Tests; *Limited English Speaking; Oral Language; Reading Tests; Second Language Programs; Special Education; Speech Skills; Student Evaluation; Test Construction; Test Format; Test Items; *Testing; Trend Analysis; Vocabulary Development; Writing Exercises; Writing Tests
Identifiers: *Placement Tests
ED423665
Differences in N/NN Teachers' Evaluation of Japanese Students' English Speaking Ability
Nakamura, Yuji
Source: Cross Currents, v19 n2 p161-65 Win 1992
8p.
1992
A survey of 32 Japanese and 44 native English-speaking teachers of English as a Second Language investigated how the two groups evaluate the English speech skills of Japanese students. A 59-item questionnaire was designed to elicit comparative information on definition of oral proficiency, criteria (including newer ones derived from instruction focusing on communicative competence) used to assess oral skills, and the relative importance attached to these criteria. Results suggest significant differences overall between Japanese and native English speakers' standards in two main assessment categories, fluency and discourse factors, although no significant differences appeared within subcategories of these criterion groups. The questionnaire is appended. (MSE)
Descriptors: Comparative Analysis; *English (Second Language); Foreign Countries; Interrater Reliability; Japanese; *Language Proficiency; *Language Teachers; *Language Tests; *Native Speakers; Oral Language; Questionnaires; Second Language Instruction; Speech Skills; Surveys
Identifiers: Japanese People
EJ566367
Self-Assessment in Second Language Testing: A Meta-Analysis and Analysis of Experiential Factors
Ross, Steven
Source: Language Testing, v15 n1 p1-20 Mar 1998
1998
ISSN: 0265-5322
Summarizes research on self-assessment in second-language testing using a meta-analysis on 60 correlations reported in second-language-testing literature. Self-assessments and teacher assessments of recently instructed English-as-a-Second-Language learners' functional English skills revealed differential validities for self-assessment and teacher assessment depending on the extent of learners' experience with the self-assessed skill. (SM)
Descriptors: *English (Second Language); Evaluation Methods; Language Proficiency; *Language Tests; Listening Skills; Reading Skills; Second Language Learning; *Self Evaluation (Individuals); Speech Skills; Student Evaluation
ED417599
Involving Factors of Fairness in Language Testing
Nakamura, Yuji
Source: Journal of Communication Studies, n7 p3-21 Sep 1997
21p.
September 25, 1997
Notes: Based on a paper presented at the Annual Meeting of the Language Testing Research Colloquium (19th, Orlando, FL, October 6-9, 1997).
This study investigated the effects of three aspects of language testing (test task, familiarity with an interviewer, and test method) on both tester and tested. Data were drawn from several previous studies by the researcher. Concerning test task, data were analyzed for the type of topic students wanted most to talk about or preferred not to talk about, and whether they had similar preferences for Japanese and English tests. Concerning the interviewer factor, data were analyzed for whether the interviewer was a classroom teacher, whether teacher and interviewer could share a common conversation topic, and whether the interviewers were interested in topics the students respond to. Student preferences for oral test method, direct or semi-direct, and for the type of interaction used to elicit speech were also analyzed. Results indicate that at different proficiency levels, students perform differently on direct and semi-direct tests, and that interviewers' choice of test questions influenced student performances and may even have influenced raters' ratings. Implications for fairness in testing are considered. Contains 18 references. (MSE)
Descriptors: Behavior Patterns; Comparative Analysis; *English (Second Language); Interrater Reliability; *Interviews; Language Laboratories; *Language Tests; Rating Scales; Second Language Instruction; Student Attitudes; Surveys; *Test Bias; Test Format; Test Items; *Testing; *Verbal Tests
ED417598
A Study of Raters' Scoring Tendency of Speaking Ability through Verbal
Report Methods and Questionnaire Analysis
Nakamura, Yuji
Source: Journal of Communication Studies, n5 p3-17 Sep 1996
17p.
September 25, 1996
To find ways to improve rater reliability of a tape-mediated speaking test for Japanese university students of English as a Second Language, two studies (1) examined how raters actually made their choices on rating sheets of students' speaking ability; (2) determined what criteria teachers think they use and actually use in rating students' speaking ability; and (3) drew implications for improving rater reliability. In the first study, subjects were six native English speakers and three
Japanese teachers of English. They answered a 16-item questionnaire about rating
techniques and criteria, then four months later, rated 30 student tapes, answered the
questionnaire again, and gave a retrospective self-report about the internal process
of evaluating student skills. One of the subjects also gave a self-report concurrently with rating the tapes. In the second study, six audio- and videotape raters analyzed
their rating tendencies through verbal reports, introspective and retrospective self-reports, and interviews. Results of the two studies and implications for teaching
and testing are reported. The instruments (questionnaires, retro/introspective
questions, and interview questions) are appended. Contains 3 references. (MSE)
Descriptors: *English (Second Language); *Evaluation Criteria; Foreign Countries;
Interrater Reliability; Language Research; *Language Tests; *Rating Scales;
*Scoring; Second Language Instruction; Speech Skills; Student Evaluation; Surveys;
Testing; *Verbal Tests
Identifiers: Japan
ED403744
Speech Style, Gender, and Oral Proficiency Interview Performance
O'Sullivan, Barry; Porter, Don
14p.
1996
Notes: Paper presented at the Annual Meeting of the Southeast Asian Ministers of
Education Organization Regional Language Center Seminar (31st, Singapore, 1996).
This study investigated: (1) whether there is a gender effect in the speech of
Japanese learners of English as a Second Language; (2) whether the effect is positive
if the interlocutor is female; (3) whether there are associations of gender effect
with specific features of speech; and (4) in which linguistic features of learner
speech the gender effect is most evident. Subjects were six female and six male Japanese university students (average age approximately 20 years), observed by three male and three female native speakers of English. Each Japanese student was
interviewed twice, once by a man and once by a woman, and observed by a native
English speaker of the same gender. Interviews were analyzed for specific speech
characteristics (e.g., use of fillers, rephrasing, minimal response, repetition) and
scored for accent, grammar, vocabulary, fluency, and comprehension. Results for the
24 interviews indicate a significant difference in scores awarded by different
interviewer/observer pairs. Despite a high degree of agreement within pairs, and
consistency in interviewer speech styles, in all but one interview the students
scored higher when interviewed by a woman, particularly in grammar and fluency.
However, male and female interviewers did show different patterns in linguistic
features. Contains 15 references. (MSE)
Descriptors: College Students; English; *English (Second Language); Foreign
Countries; Higher Education; Interviews; *Japanese; *Language Patterns; *Language
Styles; *Language Tests; Language Usage; Native Speakers; Oral Language; *Sex
Differences; Speech Skills; Testing
Identifiers: *Oral Proficiency Testing
ED398261
Testing Speaking
Kitao, S. Kathleen; Kitao, Kenji
7p.
1996
Speaking a second language is probably the most difficult skill to test, in that it involves a combination of skills that may have no correlation with one another and that do not lend themselves to objective testing. In addition, what can be
understood is a function of the listener's background and ability as well as those of
the speaker. Another difficulty is separating the listening skill from the speaking
skill. In spite of the difficulties in testing speaking, it can be very beneficial
in that it encourages the teaching of speaking in class. Reading aloud,
conversational exchanges, and tests using visual material as stimuli are common test
items for testing speaking. Oral interviews, role play tests, and group or pair
activities are also useful. One of the great difficulties in testing speaking is assessment and scoring. If possible, the speaking tasks should be recorded and
the scoring done from the tape. Aspects of speaking that might be considered in the
assessment scale are grammar, pronunciation, fluency, content, organization, and
vocabulary. Even though methods of testing speaking are not perfect, they are worth
the effort for their effects on teaching and classroom instruction. (SLD)
Descriptors: Communicative Competence (Languages); Educational Assessment; Foreign
Countries; *Language Fluency; Language Proficiency; Language Tests; Language Usage;
*Scoring; *Second Language Learning; *Speech Skills; *Test Construction; Test
Items; Test Use
ED390269
Reliable & Valid Testing of Productive Vocabulary: Speaking Vocabulary Test
(SVT)
Templin, Stephen A.
23p.
1995
Notes: Paper presented at the Annual Meeting of the Teachers of English to Speakers
of Other Languages (29th, Long Beach, CA, March 26-April 1, 1995).
This study investigated the testing of speaking vocabulary in English as a Second
Language (ESL) at a university in Hawaii. A Speaking Vocabulary Test (SVT) was
developed and piloted with college students. Test-takers (n=37) were divided into
three groups: native English-speaking freshmen and sophomores; non-native English-
speaking freshmen, sophomores, juniors, and seniors; and non-native English-speaking
students enrolled in an intensive English program preparatory to mainstream
university classes. Results indicate that the test is reliable and valid: students' scores were consistent, showing a high level of correlation; two evaluators' scoring of the same ten randomly selected tests showed high correlation; and the three student groups had significantly different scores, ranking in descending order as native speakers, non-native university students, and non-native intensive English program students. Anecdotal information on student response to the
test is also offered. A brief bibliography, the test, and test evaluator guidelines
are appended. (MSE)
Descriptors: College Students; Comparative Analysis; *English (Second Language);
Higher Education; *Language Tests; Limited English Speaking; Native Speakers;
Second Language Instruction; *Second Languages; *Speech Skills; Statistical
Analysis; Test Reliability; Test Validity; *Vocabulary; Vocabulary Development
ED369258
The Role of Cohesion in Communicative Competence as Exemplified in Oral
Proficiency Testing
Pavlou, Pavlos
21p.
March 1994
Notes: Paper presented at the Language Testing Research Colloquium (Washington, DC,
1994).
This study investigated the role of cohesion in oral proficiency testing, analyzing the cohesion of oral reports given by 16 Cypriot high school students of English as a foreign language (EFL) using a modification of M. A. K. Halliday and R. Hasan's rating scale. It then compared the objective cohesion ratings with impressionistic grades of cohesion assigned by three experienced raters.
Although the results of the study were not conclusive, it was found that the only
objective cohesive device that correlated with raters' grades for cohesion was the
use of referential pronouns. The strengths and weaknesses of the objective rating
scale are discussed. Seven appendixes reproduce models of communicative competence, the test format, objective rating scales, data summaries, and correlation matrices. (MDM)
Descriptors: *Communicative Competence (Languages); Comparative Analysis; *English
(Second Language); Foreign Countries; High School Students; High Schools;
*Language Proficiency; *Language Tests; Oral Language; Rating Scales; *Second
Language Instruction; Speech Skills; *Test Reliability
Identifiers: Cyprus; *Oral Proficiency Testing
ED365157
Methods for Teaching Learning Strategies in the Foreign Language Classroom
and Assessment of Language Skills for Instruction. Final Report
Chamot, Anna Uhl; And Others
357p.
December 1993
Two studies are reported. The first investigated the feasibility of integrating
learning strategy instruction into high school beginning and intermediate level
Russian and Spanish classes. The second study assisted teachers and students of
Japanese, Russian, and Spanish to implement informal assessment activities in their
classrooms. A literature review examines previous research on learning strategies
and the "good language learner," motivation, alternative assessment, and whether or
not learning strategies can be taught. The two studies are then described, detailing
the subjects, sites, instruments, and procedures for the three languages involved in
the three years of the studies. Both studies were conducted in the Washington, D.C.
area in three public school districts and one private school, with the collaboration
of two Japanese teachers, four Russian teachers, and seven Spanish teachers.
Instructional materials designed to teach learning strategies explicitly were
developed for and integrated into the Spanish and Russian curricula. Teachers were
provided with guidelines for instruction in vocabulary learning, listening and
reading comprehension, speaking, self-regulated learning, and problem-solving. Major
accomplishments included identification of relevant strategies, successful classroom
implementation, development of instruments to assess the effectiveness of
instruction, and increases in student self-confidence and language skills.
Substantial related materials are appended. (MSE)
Descriptors: Achievement Gains; Classroom Techniques; Diagnostic Tests; High
Schools; Independent Study; Instructional Materials; Introductory Courses;
Japanese; *Language Tests; *Learning Strategies; Listening Comprehension; Material
Development; Problem Solving; Reading Comprehension; Russian; Second Language
Instruction; *Second Language Learning; Skill Development; Spanish; Speech Skills;
*Student Placement; Teacher Education; Test Construction; Vocabulary Development
EJ472765
Relationships among Second Language Proficiency, Foreign Language Aptitude,
and Intelligence: A Structural Equation Modeling Approach
Sasaki, Miyuki
Source: Language Learning, v43 n3 p313-44 Sep 1993
1993
ISSN: 0023-8333
Investigates relationships among measures of second-language proficiency (SLP), foreign-language aptitude, verbal intelligence, and reasoning in 160 Japanese college students studying English. The factor structure of several different SLP test scores was examined, as was the relationship between a general SLP factor and a hypothetical general cognitive factor assumed to influence foreign-language aptitude. (55 references) (JL)
Descriptors: College Students; *English (Second Language); Factor Analysis; Foreign
Countries; *Intelligence; *Language Aptitude; *Language Proficiency; *Language
Tests; Models; Scores; *Second Language Learning; Verbal Communication
Identifiers: Japan
EJ470864
Assessment of Oral Skills: A Comparison of Scores Obtained through Audio
Recording to Those Obtained through Face-to-Face Evaluation
Nambiar, Mohana K; Goon, Cecilia
Source: RELC Journal: A Journal of Language Teaching and Research in Southeast
Asia, v24 n1 p15-31 Jun 1993
1993
ISSN: 0033-6882
The oral performance of a sample of 87 undergraduates was evaluated first in a face-
to-face setting and then through audio recordings of that same performance. Results
confirm that effectiveness in oral communication clearly is not dependent on words
and sounds alone but that paralinguistic and extralinguistic data also play a
significant role. (eight references) (Author/LB)
Descriptors: *Audiolingual Skills; College Students; Comparative Analysis; Foreign Countries; Higher Education; *Oral Language; *Paralinguistics; *Second Language Learning; Student Evaluation; Tape Recordings; *Verbal Communication
The full text of most materials in the ERIC database with an "ED" followed by six digits is available through the ERIC Document Reproduction Service (EDRS) in microfiche, by email, or in paper copy. Approximately 80% of ERIC documents from 1993 to the present are available for online ordering and electronic delivery through the EDRS Web site. You can also read ERIC documents on microfiche for free at many libraries that maintain ERIC microfiche collections. To find an ERIC center near you, contact our User Services staff.
The full text of journal articles may be available from one or more of the following sources:
To obtain articles from journals that do not permit reprints and that are not available from your library, write directly to the publisher. Addresses of publishers are listed in the front of each issue of Current Index to Journals in Education and can now be accessed online through the CIJE Source Journal Index.
If you would like additional information about this or any topic related to language education or linguistics, contact our User Services Staff.