Invited Plenaries

Alan Davies Lecture


Abstract
Language Assessment Literacies and the language testing community: A mid-life identity crisis?
Ofra Inbar-Lourie, Tel Aviv University
The conceptual framework and dilemmas accompanying Language Assessment Literacy (LAL) study and research can be traced, to a large extent, to the scholarly legacy that Alan Davies offered the language testing community as a leading theorist and researcher. One of his notable contributions in this regard is the attempt to define, along with other scholars, the fundamentals of the field in the Dictionary of Language Testing (Davies, Brown, Elder, Hill, Lumley & McNamara, 1999), and later in a review of language testing textbooks (Davies, 2008). In this more recent publication, Davies first described the components of language testing knowledge emerging from the books. He then identified a move towards professionalism amongst language testers, while wondering whether this trend might lead to reliance on in-house resources alone, resulting in the insulation of language testing from other “potentially rewarding disciplines”.
In this presentation I will probe further the nature of language assessment literacy from this latter perspective, making a case against insulation in defining the knowledge base that diverse protagonist groups require for conducting assessment, analyzing assessment data, and making decisions based on it. I will argue for adopting a loose, pluralistic, descriptive language assessment literacies framework (rather than a prescriptive, monolithic literacy approach), in which existing paradigms are integrated with other areas of expertise and embedded in particular settings, thus allowing for the creation of local scripts. Additionally, I will argue that the current focus on LAL as the conference theme allows for reflection on identity issues within the language testing community as it approaches its 40th anniversary.

Davies, A., Brown, A., Elder, C., Hill, K., Lumley, T., & McNamara, T. (1999). Dictionary of language testing. Cambridge: Cambridge University Press and UCLES.
Davies, A. (2008). Textbook trends in teaching language testing. Language Testing, 25(3), 327–347.

Bio

Ofra Inbar-Lourie trained at Tel Aviv University, where she was awarded a B.A. in American and English Literature, an M.A. in Education, and a Ph.D. in Language Education. She taught EFL teacher education and language assessment at Beit Berl Academic College of Education, where she also served as head of the English department. She was also an EFL inspector for the Ministry of Education, and coordinated the M.A. TESOL program for overseas students at Tel Aviv University. Currently she lectures on assessment issues, language policy, and curriculum design in the Multilingual Education program in the School of Education at Tel Aviv University, and since 2011 she has chaired the Teacher Education Unit in the School of Education.


Her professional experience includes serving on local and international committees dealing with language policy, language planning, language testing, and teacher education. She has researched, published, and presented locally and internationally on a range of language education issues, including language assessment and language assessment literacy, language policy, language teachers (particularly with regard to native- and non-native-speaking backgrounds), young language learners, and, more recently, English Medium Instruction.

Samuel J. Messick Memorial Lecture Award


How Would Messick Validate 21st-Century Language Assessments?
Stephen G. Sireci, University of Massachusetts Amherst, USA
Through his writings, Samuel Messick made clear that validation involves a comprehensive evaluation of the empirical evidence and theory supporting the use of test scores for their intended purposes. His perspective persists today, embodied in the Standards for Educational and Psychological Testing (American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 2014), which stipulate five sources of validity evidence “that might be used in evaluating the validity of a proposed interpretation of test scores for a particular use” (p. 13). In this presentation, I illustrate the link between Messick’s writings and the AERA et al. (2014) Standards, and show how the validation framework proposed by the Standards can be used to support the validity of the intended uses of scores from contemporary language assessments.
Contemporary language assessments are used for many different purposes in education, industry, and everyday life. These purposes range from very high stakes, such as placement into instructional programs in elementary, middle, and secondary school (e.g., English language proficiency assessments for public school students in the United States), to very low stakes, such as self-assessment of second language proficiency. A 21st-century framework for validating language assessments (a) clearly articulates the intended purposes of a test, (b) specifies appropriate and inappropriate uses of the test, (c) identifies the sources of validity evidence needed to support intended purposes and evaluate unintended consequences, and (d) synthesizes the validity evidence in a comprehensive and comprehensible validity argument.
In this presentation I will give examples of how validation frameworks consistent with the AERA et al. (2014) Standards and with Dr. Messick’s validity theory can be developed for language assessments. By following Dr. Messick’s guidance, and the guidance provided by the 60-year history of the Standards, we will not only develop strong validity arguments to support the use of language tests in specific situations, but we will also develop better language tests.

Bio

Stephen G. Sireci, Ph.D.
Professor of Educational Policy, Research, and Administration
Director, Center for Educational Assessment
University of Massachusetts Amherst, USA
Dr. Sireci is Professor and Director of the Center for Educational Assessment in the College of Education at the University of Massachusetts Amherst. He earned his Ph.D. in psychometrics from Fordham University and his master’s and bachelor’s degrees in psychology from Loyola College in Maryland. Before joining UMass, he was Senior Psychometrician at the GED Testing Service, Psychometrician for the Uniform CPA Exam, and Research Supervisor of Testing for the Newark, NJ, Board of Education. He is known for his research in evaluating test fairness, particularly issues related to content validity, test bias, cross-lingual assessment, standard setting, and sensitivity review. He is the author of over 100 publications and conference papers, and is the primary architect of the Massachusetts Adult Proficiency Tests, the primary assessment of reading and math skills used in adult education programs in Massachusetts.

He currently serves or has served on several advisory boards, including the National Board for Professional Teaching Standards Assessment Certification Advisory Panel, the Texas Technical Advisory Committee, and the Puerto Rico Technical Advisory Committee. He is a Fellow of the American Educational Research Association and a Fellow of Division 5 of the American Psychological Association. Formerly, he was President of the Northeastern Educational Research Association (NERA), Co-Editor of the International Journal of Testing, a Senior Scientist for the Gallup Organization, and a member of the Board of Directors of the National Council on Measurement in Education.

In 2003 he received the School of Education’s Outstanding Teacher Award; in 2007, the Chancellor’s Medal, the highest faculty honor at UMass; and in 2012, the Conti Faculty Fellowship from UMass. He has also received the Thomas Donlon Award for Distinguished Mentoring and the Leo Doherty Award for Outstanding Service from NERA. Professor Sireci reviews articles for over a dozen professional journals and serves on the editorial boards of Applied Measurement in Education, Educational Assessment, Educational and Psychological Measurement, the European Journal of Psychological Assessment, and Psicothema.

