Effectiveness of online testing in general English university course from teacher and student perspectives
Olena O. Kucherova
PhD in Philological Sciences, Associate Professor of the English Language Department, National University of “Kyiv-Mohyla Academy”, Kyiv, Ukraine
Iryna O. Ushakova
PhD in Pedagogical Sciences, Senior Lecturer of the English Language Department, National University of “Kyiv-Mohyla Academy”, Kyiv, Ukraine
Abstract
With most educational institutions switching to distance education due to the COVID-19 pandemic, the effectiveness of assessment is a major concern. The research aimed to analyze how the Moodle learning management system (LMS) can be applied to online language testing, to study the effectiveness of online testing, and to compare students' and teachers' attitudes towards online testing in a General English university course. Online tests were administered as a synchronous component of distance learning to 857 first-year bachelor's degree students of the National University of “Kyiv-Mohyla Academy” (Ukraine) by 20 teachers during the 2020-2021 academic year. A mixed research design was employed, which involved collecting data with an online questionnaire completed anonymously by students and teachers in Microsoft Forms; Excel spreadsheets were used for the subsequent analysis. A quantitative descriptive study was conducted to evaluate the students' and teachers' satisfaction with online testing. The expert evaluation method was prioritized to determine the effectiveness of the online tests against the specified criteria and indicators, based on the judgments of 7 experienced teachers who are competent in test design. In addition, Pearson's correlation coefficient was calculated to compare the results of an online test and an oral exam. The qualitative research method allowed us to analyse and interpret data from the experimental learning. Based on the results of our study, we can conclude that different types of Moodle LMS questions can be successfully applied to online language testing as part of course assessment at the university level. The paper argues that online language testing can be effective and relevant to course objectives from both students' and teachers' perspectives, with positive washback on education. The results of the study can be employed by university teachers for language course design in both distance and blended learning.
Keywords: online testing; test effectiveness; language skills assessment; formative assessment; Moodle.
INTRODUCTION
The global pandemic caused by COVID-19 has significantly affected the main areas of human activity: medicine, science, business, tourism, culture, and education. Under total lockdown, remote management and online education technologies have become widespread, and people's homes have transformed into temporary offices for parents and classrooms for students. All these external circumstances inspired teachers to adopt new ways of communication with students and organization of learning, teaching, and assessment. University professors and teaching staff started exploring various tools to digitize both study sessions and examinations. Without access to traditional examination methods, they were required to find new effective and relevant ways to assess students online. As a result, teachers had to go beyond traditional forms of teaching and assessment, implementing new digital tools and methods which, in turn, became decisive in the design and evaluation of the effectiveness of online testing.
The problem statement. The National University of “Kyiv-Mohyla Academy” has been providing distance education since March 2020, using a university-wide online platform called 'DistEdu', which is based on Moodle LMS. One silver lining is that the platform had been implemented and tested before the quarantine, although not all of its features and tools were fully employed, because it was used as an extra resource for teaching. The unexpected lockdown triggered the creation of an online course by each teacher, which greatly simplified and unified the teaching process at the university.
Asynchronous teaching has been extensively encouraged at the university, which means that teachers and students engage with the course content at their convenience. Using DistEdu, teachers may upload media, start discussion boards, and assign reading or writing as asynchronous forms of teaching and learning. Teachers can also run online tests as a synchronous form of teaching and learning. The teachers guide the students, provide them with feedback, and assess their progress. The functionality of the DistEdu platform made it possible to move all study materials online and make them available to students quite quickly. It was the time to look at everything from different perspectives, and offline classes quickly moved to online Zoom and MS Teams classes, opening the way for teachers to use extra tools and software, such as Quizlet, Padlet, Kahoot, Jamboard, Classtime, or Miro, to digitalize teaching and assessment.
It is worth mentioning that the 2020-2021 academic year was unprecedented compared with the previous one in that the teachers who worked with first-year students had never met them in an offline classroom before, and the students had never had face-to-face communication with each other either, which created a certain psychological barrier and caused some difficulties for adequate assessment. Furthermore, the question of students' psychological health is becoming ever more vital. For these reasons, online testing in the 2020/2021 academic year was characterized by a specific educational environment, permanent restrictions, and information overload caused by the COVID-19 pandemic.
The analysis of the recent studies and publications. Undoubtedly, the topics of distance education and employing web platforms and e-learning tools for online teaching and assessment are not completely new; they have been studied extensively by methodologists for several decades, but before the lockdown they were considered mostly in the context of blended learning or as extra possibilities for on-site testing, for example, in a university computer lab. The rapid digital transformation of higher education in connection with COVID-19 has posed challenges to faculty members, forcing them to search for crisis-response methods and opportunities for distance education [1]. In this regard, Y. Krylova-Grek and M. P. Shyshkina [2] study the online learning trends of higher education and suggest strategies for how to organize distance learning. Different aspects of technology integration into higher education have been investigated by S. Naidu [3], and D. A. McFarlane [4] provided guiding principles for online teachers, emphasising the main problems and challenges present in the online classroom. M. Decuypere et al. [5] and M. Arshad [6] overview the key features of digital education platforms to enhance education.
Due to the pandemic and the global move to distance education, the issue of effective online assessment has come to the fore. On this basis, F. J. Garcia-Penalvo et al. [7] conclude that ways of assessment in Higher Education should be reconsidered. Accordingly, B. O'Sullivan et al. [8] focus on language test preparation strategies, and E. Stradiotova et al. [9] compare online testing of language skills with on-site testing. One of the topical issues of online testing in an un-proctored environment is academic dishonesty and student cheating. I. Arnold [10], W. Bloemers et al. [11], D. L. King and C. J. Case [12], J. G. Nguyen et al. [13], and D. Steger et al. [14] draw attention to the impact of cheating during testing on final term grades and suggest ways to minimize cheating in online assessments.
In spring 2020, based on experts' exploration of assessment in universities and colleges around the world as well as of the impact of technology on education and assessment, a report was published with the aim of helping address some of the concerns about assessment and highlighting some opportunities [15]. According to the report, assessment should be 1) authentic, designed to prepare students to use the technology they will use in their future careers; 2) accessible, designed with an accessibility-first principle; 3) appropriately automated, with a balance of automated and human marking for maximum benefit to students; 4) continuous, exploring the opportunities for continuous assessment in order to improve the learning experience; 5) secure, adopting authorship detection, biometric authentication for identification, and remote proctoring. The abovementioned report also set out the main goals for assessment, arguing that assessment should become more 1) relevant to contemporary needs; 2) adaptable in terms of addressing the students' and providers' needs; 3) trustworthy.
The research goal. Taking into consideration the urgency of online testing as a learning and assessment tool, this research aims to study the effectiveness of online testing in a General English university course from teacher and student perspectives. To achieve the goal, the following tasks were formulated: 1) to explore the opportunities provided by Moodle LMS to implement online testing in General English course design for both small-scale and large-scale tests; 2) to study the students' and teachers' attitudes toward online testing and its washback on education; 3) to compare the results of an online test and an oral exam (as a combination of innovative and traditional forms of assessment).
THE THEORETICAL BACKGROUND
Being fundamentally important for teaching and learning, high-quality assessment contributes markedly to achieving learning outcomes and enhancing students' satisfaction. Following C. A. Chapelle et al. [16, p. 294], assessment is understood as the process of analysing the ability of learners to use the language based on the information collected. In this regard, the main dichotomies suggested for the investigation of assessment are informal versus formal assessment and formative versus summative assessment [17].
In recent years we have observed a shift towards the assessment-for-learning approach rather than assessment of learning [18], [19]. With assessment for learning, students take responsibility for improving their performance in the course. Among recent approaches to assessment, learning-oriented assessment [20] and dynamic assessment [21] should be mentioned. Built on the idea of formative assessment, they both put learning at the center of the assessment process and “capture all assessment as a vehicle for learning” [16, p. 307], with the primary purpose of promoting learning. In this regard, tests as a subset of assessment are viewed as “powerful tools to promote lasting learning in their own right” [22, p. 1864].
Depending on what language tests intend to measure, they can be classified into the ability approach (indirect) and the performance approach (direct) [16, p. 297]. Ability testing measures grammatical ability, vocabulary knowledge, reading comprehension, and listening comprehension, while performance testing measures writing and speaking. J. D. Brown and T. Hudson employ the term criterion-referenced test to refer to “any test that is primarily designed to describe the performance of examinees in terms of the amount that they know of a specific domain of knowledge or set of objectives” [23, p. 5]. To put it differently, criterion-referenced tests are designed to provide feedback on specific course objectives.
J. Lewkowicz and C. Leung define classroom-based assessment as “any teacher-led classroom activity designed to find out about students' performance on curriculum tasks that would yield information regarding their understanding as well as their need for further support and scaffolding with reference to their situated learning needs” [24, p. 48]. Classroom-based assessment for language courses centers on communicative performance and correlates with the Common European Framework of Reference for Languages (CEFR). The CEFR, initially meant as a guide for comparing objectives internationally, provides a common foundation for six general levels of language proficiency. It uses descriptions of communicative language activities that learners perform, such as comprehension, production, interaction, and mediation, which draw on linguistic and general competences [25]. Thus, the CEFR, emphasising that evaluation is inextricably connected to teaching and learning, provides principles for the development of language curricula to support teaching, as well as assessment tools.
As it is stated in CEFR, linguistic competence includes active and passive vocabulary (lexical competence); knowledge of the rules and structures and the ability to use them correctly (grammatical competence); the organization of meaning (semantic competence); hearing and producing sounds (phonological competence); the ability to spell correctly (orthographical competence); the ability to read from a written text, pronouncing correctly (orthoepic competence) [25, p. 129]. That is, grammatical accuracy, vocabulary range, and vocabulary control are essential components of `linguistic competence'. As mentioned above, the fact that some expressions can be used to convey vastly different messages adds to the complexity of language [26, p. 12]. Thus, language tests mainly aim to check grammatical and vocabulary range and accuracy.
In this paper, assessment is understood as an umbrella term that includes formal testing and other types of qualitative assessment, such as informal observation of students' language use, portfolios, and reflection journals. In J. D. Brown's words, a test is “a method of measuring a person's ability, knowledge, or performance in a given domain” [17, p. 3]. As a method or an instrument, it is a set of techniques that require a certain type of performance from the student, and this performance or competence within a particular domain can be measured by a test.
RESEARCH METHODS
The study employed a mixed research design. To achieve the purpose of our research, we used a quantitative research method that involved collecting data by means of an online questionnaire completed anonymously by students and teachers in Microsoft Forms; Excel spreadsheets were used for the subsequent analysis. The data analysis was conducted with the use of descriptive statistics, which allowed us to summarise and interpret the data obtained. Qualitative data were obtained by analysing individual feedback, which allowed the researchers to compare the responses from students and teachers. In order to determine the effectiveness of online language testing in our study, we also employed the method of expert evaluations and Pearson's correlation coefficient.
Participants
The participants of the study were 857 first-year bachelor's degree students in a wide variety of majors (aged 17-18 years) and 20 teachers (aged 28-56 years) of the National University of “Kyiv-Mohyla Academy” (Ukraine). The research also involved seven experts, who are experienced teachers (with 15-25 years of teaching) from the same university. The study was carried out in the 2020-2021 academic year (autumn and spring semesters) in the General English course. The students took part in the experiment as part of their coursework, since they had to pass their examinations in the lockdown-forced distance mode.
Instruments
The Likert scale was chosen to measure the students' and teachers' attitudes towards online testing at the end of the second semester and for expert evaluation since this instrument is widely used by researchers in education for that kind of purpose. In addition, Guttman scaling was chosen to find out the most favorable statement in an online questionnaire completed by students and teachers. Pearson's correlation coefficient [27] was chosen to compare the results of the online test with the oral exam to assess its effectiveness.
THE RESULTS AND DISCUSSION
Applying Moodle LMS for online language testing
Testing has always been an important element of the learning and teaching process in the General English course. The course lasts for two semesters, during which two main types of tests are used: 1) three Current Tests each semester (six for the course) and 2) one Final Test at the end of each semester (two for the course). The main purpose of Current Tests is to determine what the students need help with. In contrast, the main purpose of Final Tests is to determine whether the learning goals have been met. The tests are prepared by the teachers and are concerned with measuring what has been learned as part of the course, in other words, 'achievement assessment' [16, p. 294]. A salient feature of these tests is that they are taken on the same day by all students of the course.
These tests were quite extensive (100 questions), aimed at checking the lexical and grammatical material learned in the most comprehensive and objective way, first for one semester and then for the whole course. That is why the paper-based Final Tests had a complex structure, containing exercises of various types which are usually used in preparation for international exams in English and adhere to modern methodological requirements for indirect test items and discrete-point testing. While preparing the Final Tests, teachers of the English Language Department always focus on modern didactic principles and the generally accepted criteria for evaluating the effectiveness of any test, which are validity and reliability [28, p. 381]. Moodle LMS can meet all the above criteria. Thus, we were able to adapt and create quizzes of different types to check the studied lexical and grammatical material: 1) drag the words from the list below into the correct place in the text, 2) arrange the words into a sentence, or 3) rearrange the jumbled sentences into a logical text (drag and drop into text question type in Moodle LMS); 4) choose the correct answer to complete the sentences, 5) find a mistake in the underlined parts of the sentence, or 6) choose the odd word in each line (multiple choice question type); 7) find the words for the following definitions (matching question type) (Appendix A).
J. D. Brown outlines different assessment types and classifies them into four basic categories: receptive-response, productive-response, personal-response, and individualized-response [29]. The scholar goes on to suggest 12 assessment types: true-false items, matching items, multiple-choice items, fill-in items, short-answer items, performance assessment, conference assessment, portfolio assessment, self/peer assessment, continuous assessment, differentiated assessment, and dynamic assessment. These assessment types fall into two main groups: those that focus on single language points and those that represent how to collect assessment data.
During the research, the assessment categories and assessment types proposed by J. D. Brown [29] were applied to Moodle LMS capacities. Table 1 demonstrates the assessment options that can be employed in online testing; an illustrative example of authoring such items follows the table. Items for large-scale tests (Final Test) should be relatively quick and easy to administer and objective to score. The selected-response category is suitable for testing the receptive skills of reading and listening and knowledge of grammar and vocabulary. The constructed-response category is fundamentally different from the selected-response category, as students are expected to produce written language ranging from single words (as in fill-in items) to longer answers (as in short-answer items) and a paragraph or essay (as in performance assessment). Fill-in items, which require students to write in the missing word or words, are relatively easy to score and quick to administer, and hence suitable for large-scale tests (Final Test). Short-answer items, which require students to write a few words, and performance assessment, which requires students to write a paragraph or essay, are more difficult to administer and less objective to score compared to the items mentioned earlier. As a result, they are suitable for small-scale tests (Current Tests).
Table 1
Assessment options for online testing

| Category | Assessment type | Moodle question type | Small-scale tests | Large-scale tests |
| Selected-response | True-false items | True/False; Select missing words; Embedded answers (cloze) | X | X |
| Selected-response | Multiple-choice items | Multiple choice; Select missing words; Embedded answers (cloze) | X | X |
| Selected-response | Matching items | Matching; Drag and drop into text | X | X |
| Constructed-response | Fill-in items | Short answer; Embedded answers (cloze) | X | X |
| Constructed-response | Short-answer items | Short answer; Embedded answers (cloze); Essay | X | |
| Constructed-response | Performance assessment | Essay | X | |
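To illustrate how such items can be authored for Moodle, the minimal sketch below generates a few sample questions in Moodle's GIFT import format (one of the plain-text formats accepted by the Moodle question bank). The items themselves are invented for illustration only and are not taken from the course tests described here; the file name is arbitrary.

```python
# A minimal sketch: writing sample quiz items in Moodle's GIFT import format.
# The vocabulary/grammar items below are invented examples, not the actual
# test content used in the study.

gift_items = """
// Multiple choice question type (choose the correct answer)
::MC grammar item:: She said she {=had finished ~has finished ~finishes} the report the day before.

// Matching question type (find the words for the following definitions)
::Matching vocabulary item:: Match each word to its definition. {
   =reluctant -> unwilling to do something
   =abundant -> existing in large quantities
   =fragile -> easily broken or damaged
}

// Short answer (fill-in) question type
::Fill-in item:: The meeting was called {=off} because of the strike.
"""

# Write the items to a file that can be imported into the Moodle question bank
# (Question bank -> Import -> GIFT format) and then added to a quiz.
with open("sample_questions.gift", "w", encoding="utf-8") as f:
    f.write(gift_items)

print("Wrote", len(gift_items.splitlines()), "lines of GIFT-formatted items")
```

A file produced this way can be imported once and reused across Current and Final Tests, which is convenient when the same item bank feeds both small-scale and large-scale quizzes.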
Expert evaluation of the effectiveness of online language testing
It is worth emphasizing that while designing such a test we also followed the principle of balancing its elements: distributing the tasks by complexity and the time required and, accordingly, determining the number of points awarded for each task. We also fully agree with J. Harmer's assertion that "when we write test items, the first thing to do is to get fellow teachers to try them out" [28, p. 386]. In this regard, our colleagues scrutinized the test questions for methodological purposes such as clarity of terms and wording or correctness of the proposed answer options (for example, the presence or absence of cases with more than one correct answer). However, this year an important element of verification was the correct technical realization of the proposed questions, their visual design, and usability checking. For this purpose, a group of experts was formed, which included 7 highly qualified specialists with more than 15 years of teaching experience, who have creative thinking, a positive attitude to innovation, and are competent in test design. The work of this group consisted of two stages:
1) The experts analyzed, researched, and provided an expert evaluation of the tests in advance, before they were taken by students. At this stage, we used a questionnaire and an interview with the experts to obtain detailed opinions and identify possible shortcomings.
2) Expert opinion was then given by monitoring the online assessment process of their own students; some specific cognitive and psychological features of online testing were outlined. When the tests closed, students could see their final marks, the correct answers, and the feedback provided by teachers. Thus, test results were used as learning devices, and the washback effect was evaluated by the experts. During the second stage of the experts' work, we used conversations, meetings, and discussions.
According to the main learning outcomes of the course and the specifics of online tests, we defined the most relevant indicators (Table 2), which are based on the five main principles of language assessment presented by J. D. Brown: practicality, reliability (student-related reliability, rater reliability, test administration reliability, test reliability), validity (content validity, criterion-related validity, construct validity, consequential validity, face validity), authenticity, and washback [17, pp. 19-30]. The evaluation of these criteria was carried out as follows: depending on the degree of disclosure and/or the quality of implementation of each indicator in the tests, the experts evaluated them on a scale from 1 to 5, where 1 point indicated an extremely low level of implementation of these indicators and 5 the highest, respectively. We divided the average score of the obtained results into four levels: low (poor) - 1-3 points, medium (average) - 3.1-4 points, above average - 4.1-4.5 points, and high (excellent) - 4.6-5 points. The final average score is 4.7, which corresponds to a high level of effectiveness.
Table 2
Expert evaluation of the effectiveness of online testing
| Criteria | Indicators | Experts' average score |
| Practicality | acceptable price; appropriate time limit; relatively easy to administer; time-efficient evaluation procedure | 5 |
| Content Validity | correlation of the tasks and test goals (W. J. Popham's method); correlation of the test and course objectives; coverage of the studied grammar and vocabulary material | 4.5 |
| Face Validity | a well-constructed, expected format with familiar tasks; clear directions; acceptable presentation of test items (number of questions per page); a level of difficulty that presents a reasonable challenge | 5 |
| Test Reliability | adequate number of tasks; correctness of the assessment criteria for each task; free of misprints; stylistic accuracy, clarity and conciseness of the tasks; coherence of test items; test design usability; correctness of the proposed answer options | 4.5 |
| Authenticity | task language as natural as possible; contextualized items; relevant topics; approximately authentic tasks | 4.5 |
| Technical settings | only one attempt to take the test, without the possibility to go back to the previous item; shuffled questions within each section and shuffled answers; strictly limited time; JavaScript blocked | 4.5 |
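The scoring procedure described before Table 2 (per-criterion expert averages on a 1-5 scale, an overall average, and a four-level interpretation) can be sketched as follows; the per-criterion scores are those reported in the table.

```python
# A sketch of the scoring procedure: per-criterion expert scores on a 1-5 scale
# are averaged and mapped to the four effectiveness levels used in the study.
# The scores below are the averages reported in Table 2.

criterion_scores = {
    "Practicality": 5.0,
    "Content Validity": 4.5,
    "Face Validity": 5.0,
    "Test Reliability": 4.5,
    "Authenticity": 4.5,
    "Technical settings": 4.5,
}

def effectiveness_level(score: float) -> str:
    """Map an average score to the levels defined in the study."""
    if score <= 3.0:
        return "low (poor)"
    if score <= 4.0:
        return "medium (average)"
    if score <= 4.5:
        return "above average"
    return "high (excellent)"

average = sum(criterion_scores.values()) / len(criterion_scores)
print(f"Final average score: {average:.1f} -> {effectiveness_level(average)}")
# Prints: Final average score: 4.7 -> high (excellent)
```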
Based on the experts' observations at the second stage of their work (when students received their test results), we can conclude that beneficial washback was achieved, indicating students' progress and challenges as well as the teachers' success. The experts also noted that most students felt positive about online tests. We consider that such an encouraging attitude towards online testing can be explained by the fact that students stay in their comfort zone, in an un-proctored environment, and believe that this helps them get better marks.
Students' and teachers' attitude questionnaire and Pearson's correlation coefficient
Overall, the students were given 8 online tests, of which 6 are Current Tests and 2 are Final Tests. At the end of the second semester, both students and teachers were asked to do a questionnaire. The questionnaire included six questions for students: 1) How do you feel about online testing? With answer options: It is rather effective and worth spreading nowadays; It has a great potential; It is more stressful than paper tests; It isn't objective at all; I feel pretty negative. 2) Would you prefer online testing rather than conventional paper tests next year? With answer options: Yes; No; Maybe. 3) Do online tests cover the main aspects of grammar and vocabulary material studied? With answer options: Yes; No; Maybe. 4) How objective are the online tests in your opinion? On a scale of 1-10. 5) Did you meet any difficulties while online testing? With answer options: Yes or No. 6) What are the main difficulties while testing? This was an open question.
The questionnaire also included five questions for teachers: 1) What is your general opinion about online tests? With answer options: Positive; More positive than negative; More negative than positive; Negative; I have not decided yet. 2) May these tests be used as an effective and objective alternative to traditional paper tests to assess learners' grammar and vocabulary progress (considering that students pass their oral credit/exam)? With answer options: Yes; No; Maybe. 3) How objective are online tests in your opinion? On a scale of 1-10. 4) Online testing is one of the means of assessment aimed at providing paperless and eco-friendly lessons. Should we continue the same practice in the classroom? With answer options: Yes; No; Maybe. 5) What would you recommend improving in the context of online testing? The latter was an open question.
It should be noted that after taking the online tests (indirect testing) students also had a direct test: they took an online oral exam to demonstrate their speaking skills. Thus, integrating various types of assessment into the General English course provided a comprehensive evaluation of students' four basic language skills. The synergy of testing active and passive skills in complementary activities and quizzes lies in the different modes of testing: for example, grammar patterns are tested in spoken interaction, structural elements of essay writing in spoken production, and lexis in listening quizzes.
The results of the questionnaires completed by students and teachers at the end of the course demonstrate that online testing was found effective by both teachers (46%) and students (26%) and was seen as having enormous potential (51% of students). 60% of students and 62% of teachers would prefer online testing to conventional paper tests next year. 80% of students believe that online tests cover the main aspects of the grammar and vocabulary material studied. Students and teachers find online tests objective, giving average ratings of 7.58 and 7.08 (out of 10), respectively.
Nevertheless, online testing has also exposed a range of challenges. 12% of students found it more stressful than paper tests, and 37% of students mentioned that they encountered some problems during the test. Among the main difficulties mentioned were time management problems, a task order that was not always comfortable for all students, and an unstable internet connection. Some students wrote that they “can't estimate the time you have to spend on a task”, “it's really bad that you can't go back to the previous questions, so you're left without a chance to think twice”, and “sometimes I deal with internet connection problems, which makes me feel more stressed”. The findings are presented in Table 3.
Table 3
Students' and teachers' attitude to online testing
| | Students | Teachers |
| Express generally positive opinion about online testing | 26% | 46% |
| See great potential in online testing | 51% | 31% |
| Consider online tests an effective alternative to traditional paper tests | 80% | 62% |
| Experienced difficulty while doing/preparing online testing | 37% | 43% |
| Overall find online testing objective (average rating out of 10) | 7.58 | 7.08 |
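The questionnaire responses were collected in Microsoft Forms and tabulated in Excel spreadsheets; a hypothetical equivalent of that tabulation in Python is sketched below. The responses are invented for illustration and are not the study's data.

```python
# A hypothetical equivalent of the Excel tabulation used in the study:
# counting answer options for one questionnaire item and expressing them
# as percentages. The responses below are invented for illustration.
import pandas as pd

responses = pd.Series(
    ["Yes", "No", "Maybe", "Yes", "Yes", "Maybe", "Yes", "No", "Yes", "Yes"],
    name="Would you prefer online testing rather than conventional paper tests next year?",
)

percentages = responses.value_counts(normalize=True).mul(100).round(1)
print(percentages)
# For this invented sample: Yes 60.0, No 20.0, Maybe 20.0
```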
As the final stage of our research, we compared the results of the last online test, the Final Test, and the oral exam. Both covered the material learned during the whole course. For this, we used Pearson's correlation coefficient, which is “based on the assumption that the relation between the predictor and criterion is linear” [27, p. 283]. Pearson's correlation is represented as:

\[ \hat{\rho} = \frac{\rho_{xy}}{\sqrt{\rho_{xx}\,\rho_{yy}}} \]

where ρxy is the correlation between the predictor (online test) and the criterion (oral exam), and ρxx and ρyy are their reliabilities. The equation provides an estimate of the correlation between the scores of the online test (X) and the oral exam (Y). The result was calculated using the Excel PEARSON function. The calculated Pearson's correlation coefficient is r = 0.87, which indicates a strong positive correlation between the two variables (online test and oral exam).
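For readers who prefer a scriptable equivalent of the Excel PEARSON calculation, the sketch below computes Pearson's r in Python. The score pairs are invented for illustration; the study's actual data yielded r = 0.87.

```python
# An equivalent of the Excel PEARSON calculation, sketched in Python.
# The score pairs below are invented placeholders, not the study's data.
from statistics import correlation  # Pearson's r; available in Python 3.10+

online_test = [78, 85, 92, 61, 70, 88, 95, 55, 73, 81]   # Final Test scores (X)
oral_exam   = [75, 80, 90, 65, 68, 85, 97, 58, 70, 84]   # oral exam scores (Y)

r = correlation(online_test, oral_exam)
print(f"Pearson's r = {r:.2f}")
```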
CONCLUSIONS AND PROSPECTS FOR FURTHER RESEARCH
The results of the research clearly demonstrate that different types of Moodle LMS questions can be successfully applied to online testing as part of course assessment at the university level. Overall, online testing was found effective and relevant to course objectives by students and teachers. The expert evaluation also showed that online testing is effective, with positive washback on education. Moreover, the effectiveness of online testing was supported by the calculation of Pearson's correlation coefficient.
The technology enables students to take the test anywhere, from the comfort of their homes. The content of the test does not rely on memory alone. The need to digitalize assessment has brought not only big opportunities but also challenges. Some teachers may feel skeptical about open-book approaches due to the possibility of cheating and fraud. Visual proctoring has become a thing of the past, like rote learning.
In the process of our research, we identified the basic principles that helped to minimize cheating and reduce its impact on the overall assessment of students. First, there are certain technical settings (summarized in the sketch after the list below):
- students have only one attempt to take the test and they cannot go back to the previous question and correct it;
- the time is strictly limited, for example, 80 minutes;
- JavaScript is blocked, which prevents students from opening extra browser windows, online dictionaries, or grammar references on the device;
- tests are organized into sections with shuffled questions within each section and shuffled answers.
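The settings listed above can be summarized as in the sketch below. The option names paraphrase the corresponding Moodle quiz settings rather than quote exact field names, which vary slightly between Moodle versions.

```python
# An illustrative summary of the anti-cheating settings listed above.
# The keys paraphrase Moodle quiz options; they are not exact Moodle field names.
quiz_settings = {
    "attempts_allowed": 1,               # only one attempt per student
    "navigation_method": "sequential",   # no going back to previous questions
    "time_limit_minutes": 80,            # time is strictly limited
    "shuffle_questions_within_sections": True,
    "shuffle_answers_within_questions": True,
    "block_javascript_extras": True,     # extra browser windows, dictionaries, etc. disabled
}

for option, value in quiz_settings.items():
    print(f"{option}: {value}")
```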
According to our observations, the implementation of these simple requirements significantly reduces students' ability to use additional resources and consult each other while testing. This does not mean that we can prevent academic dishonesty entirely, but in this way we are able to minimize the risks. It can also be recommended to adhere to the methodological practices given below:
- At least part of the tasks should be based on specific and/or authentic materials that have been taught during the classes, so that it would be difficult to find the key on the Internet without attending the course.
- Summative assessment cannot be centered on online tests installed in Moodle LMS only. It is advisable to use direct testing - an oral exam - as the second part of the evaluation process, aimed at identifying the mistakes and knowledge gaps of students who might have been cheating and at correcting the course grade [30].
- Tests should be planned one per week or module to keep students on track without overloading them.
- Feedback should be provided after each test. This can be general feedback to the whole class or feedback built into the test.
In summary, online testing is an effective form of formal assessment, which supports learning and has a positive washback on education. Integrating various types of assessment into the General English course provided a comprehensive evaluation of students' basic language skills. Ultimately, there are some clear benefits of online tests: 1) they contribute to learner-focused learning; 2) they help scaffold students' learning across the term; 3) they give students timely feedback. With the assessment-for-learning approach, students take responsibility for their learning and, as a result, are more prepared for future working environments.
With this in view, online testing can be seen as a constantly modified puzzle - a matrix or set of programmed elements built on language material and adjustable to learning goals - that opens up further assessment opportunities. This positive experience may serve future methodological developments independent of lockdown restrictions, changing the balance between online and offline formal assessment.
Any exam requires a high level of attention and concentration from participants and can provoke or expose psychological problems that could be hidden behind the “curtain” of a turned-off camera during online classes on the Zoom/MS Teams platforms. That is why this issue still requires further examination and analysis. Future research also lies in investigating how to employ online testing in blended learning and the impact of distance education on psychological states. Studying students' engagement and anticipation while testing can help design more efficient tests.
REFERENCES (TRANSLATED AND TRANSLITERATED)
[1] O. B. Adedoyin and E. Soykan, “Covid-19 pandemic and online learning: the challenges and opportunities,” Interactive Learning Environments, pp. 1-13, 2020, doi: 10.1080/10494820.2020.1813180. (in English)
[2] Y. Krylova-Grek and M. P. Shyshkina, “Online learning at higher education institutions in Ukraine: achievements, challenges, and horizons,” Information Technologies and Learning Tools, vol. 85, no. 5, pp. 163-174, 2021, doi: 10.33407/itlt.v85i5.4660. (in English)
[3] S. Naidu, “Reimagining education futures to lead learning for tomorrow,” Distance education, vol. 42, no. 3, pp. 327-330, 2021, doi: 10.1080/01587919.2021.1956306. (in English)
[4] D. A. McFarlane, “Facilitating and dealing with learner differences in the online classroom,” European Journal of Educational Research, vol. 1 (1), pp. 1-12, 2012, doi: 10.12973/eu-jer.1.1.1. (in English)
[5] M. Decuypere, E. Grimaldi, and P. Landri, “Introduction: Critical studies of digital education platforms,” Critical Studies in Education, vol. 62, no. 1, pp. 1-16, 2021, doi: 10.1080/17508487.2020.1866050. (in English)
[6] M. Arshad, “Experience of using the Blackboard learning management system in Jazan University,” Information Technologies and Learning Tools, vol. 83, no. 3, pp. 79-99, 2021, doi: 10.33407/itlt.v83i3.4185. (in English)
[7] F. J. García-Peñalvo, A. Corell, V. Abella-García, and M. Grande-de-Prado, “Recommendations for Mandatory Online Assessment in Higher Education During the COVID-19 Pandemic,” in Radical Solutions for Education in a Crisis Context. Lecture Notes in Educational Technology, D. Burgos, A. Tlili, and A. Tabacco, Eds. Springer, Singapore, 2021, pp. 85-98, doi: 10.1007/978-981-15-7869-4_6. (in English)
[8] B. O'Sullivan, K. Dunn, and V. Berry, “Test preparation: an international comparison of test takers' preferences,” Assessment in Education: Principles, Policy & Practice, vol. 28, no. 1, pp. 13-36, 2021, doi: 10.1080/0969594X.2019.1637820. (in English)
[9] E. Stradiotova, I. Nemethova, and R. Stefancik, “Comparison of on-site testing with online testing during the COVID-19 pandemic,” Advanced education, vol. 8 (17), pp. 73-83, 2021, doi: 10.20535/24108286.229264. (in English)
[10] I. Arnold, “Cheating at online formative tests: Does it pay off?,” The Internet and Higher Education, vol. 29, pp. 98-106, 2016, doi: 10.1016/j.iheduc.2016.02.001. (in English)
[11] W. Bloemers, A. Oud, and K. van Dam, “Cheating on unproctored internet intelligence tests: Strategies and effects,” Personnel Assessment and Decisions, vol. 2 (1), pp. 21-29, 2016, doi: 10.25035/pad.2016.003. (in English)
[12] D. L. King and C. J. Case, “E-cheating: Incidence and trends among college students,” Issues in Information Systems, vol. 15 (1), pp. 20-27, 2014, doi: 10.48009/1_iis_2014_20-27. (in English)
[13] J. G. Nguyen, K. J. Keuseman, and J. J. Humston, “Minimize online cheating for online assessments during COVID-19 pandemic,” Journal of Chemical Education, vol. 97 (9), pp. 3429-3435, 2020, doi: 10.1021/acs.jchemed.0c00790. (in English)
[14] D. Steger, U. Schroeders, and O. Wilhelm, “Caught in the act: Predicting cheating in unproctored knowledge assessment,” Assessment, vol. 28 (3), pp. 1004-1017, 2021, doi: 10.1177/1073191120914970 (in English)
[15] “The future of assessment: five principles, five targets for 2025,” 2020. [Online]. Available: https://www.jisc.ac.uk/reports/the-future-of-assessment (in English)
[16] C. A. Chapelle, B. Kremmel, and G. Brindley, “Assessment,” in An Introduction to applied linguistics, N. Schmitt and M. P. H. Rogers, Eds. London; Routledge, 2020, pp. 294-316. (in English)
[17] D. Brown, Language assessment. Principles and classroom practices. Longman, 2004. (in English)
[18] R. Dann, “Assessment as learning: blurring the boundaries of assessment and learning for theory, policy and practice,” Assessment in Education: Principles, Policy & Practice, vol. 21, no. 2, pp. 149-166, 2014, doi: 10.1080/0969594X.2014.898128. (in English)
[19] M. Quinlan and E. Pitt, “Towards signature assessment and feedback practices: a taxonomy of discipline- specific elements of assessment for learning,” Assessment in Education: Principles, Policy & Practice, vol. 28, no. 2, pp. 191-207, 2021, doi: 10.1080/0969594X.2021.1930447. (in English)
[20] C. E. Turner and J. E. Purpura, “Learning-oriented assessment in second and foreign language classrooms,” in Handbook of second language assessment, D. Tsagari and J. Banerjee, Eds. Boston, MA: De Gruyter Mouton, 2016, pp. 255-271. (in English)
[21] J. Lantolf and M. Poehner, “Dynamic assessment in the classroom: Vygotskyan praxis for second language development,” Language teaching research, vol. 15, pp. 11-33, 2011. (in English)
[22] B. W. Yang, J. Razo, and A. M. Persky, “Using Testing as a Learning Tool,” American journal of pharmaceutical education, vol. 83(9), p. 1862-1872, 2019, doi: 10.5688/ajpe7324. (in English)
[23] J. D. Brown and T. Hudson, Criterion-referenced language testing. Cambridge: Cambridge University Press, 2002. (in English)
[24] J. Lewkowicz and C. Leung, “Classroom-based assessment,” Language Teaching, vol. 54(1), pp. 47-57, 2021, doi: 10.1017/S0261444820000506. (in English)
[25] Council of Europe. Common European framework of reference for languages: learning, teaching, assessment, 2020. [Online]. Available: https://rm.coe.int/common-european-framework-of-reference-for-languages-learning-teaching/16809ea0d4 (in English)
[26] E. Piccardo, “From communicative to action-oriented: a research pathway,” 2014. [Online]. Available: https://transformingfsl.ca/wp-content/uploads/2015/12/TAGGED_DOCUMENT_CSC605_Research_Guide_English_01.pdf (in English)
[27] D. L. Bandalos, Measurement theory and applications for the social sciences, New York, London: The Guilford Press, 2018. (in English)
[28] J. Harmer, The Practice of English Language Teaching (4th ed.), Pearson Longman ELT, 2007. (in English)
[29] J. D. Brown, “Assessment in ELT: Theoretical options and sound pedagogical choices,” in English language teaching today. Linking theory and practice, W. A. Renandya and H. P. Widodo, Eds. Springer, 2016, pp. 67-82. (in English)
[30] I. O. Ushakova, “Osoblyvosti vykorystannia systemy Moodle v umovakh dystantsiinoho vykladannia anhliiskoi movy dlia studentiv 1 roku navchannia [Features of using LMS Moodle in distance English teaching for the 1-year university students],” in Osvita 2.0: zbirnyk materialiv naukovo-praktychnoi konferentsii [Education 2.0. Proceedings of the scientific-practical conference], Sievierodonetsk: LDUVS imeni E. O. Didorenka, 2021, pp. 128-134. [Online]. Available: https://lduvs.edu.ua/wp-content/uploads/Docs/books/Osvita2.0.pdf (in Ukrainian)
Appendix A. Examples of the test items
Sources used for tasks preparation:
1. Exam English. Free Practice Tests for learners of English. Retrieved 21 November 2021 from https://www.examenglish.com/B1/b1_listening_restaurant.htm
2. Oxford Learner's Dictionaries. (2021). https://www.oxfordlearnersdictionaries.com
3. Cambridge Dictionary. (2021). https://dictionary.cambridge.org
4. Hewings, M. (2013). Advanced Grammar in Use. A self-study reference and practice book for advanced learners of English (3rd ed.). Cambridge University Press.