Workplace: Faculty of Educational Studies, Universiti Putra Malaysia, Selangor, Malaysia
E-mail: mohd7010@gmail.com
Website: https://orcid.org/0000-0002-6192-6697
Research Interests:
Biography
Mohammed Al Ajmi is a PhD student at Universiti Putra Malaysia (UPM) and holds a master’s degree in Educational Measurement from Sultan Qaboos University (SQU). He has served as Director of the Career Guidance and Counseling Department at the Ministry of Education in the Sultanate of Oman since 2019. He has contributed to the development and publication of several scales, including a professional interest scale and an entrepreneurial traits scale. He has extensive experience in statistical analysis under both classical test theory and item response theory, and is well versed in a range of statistical programs, including SPSS, BILOG-MG, MULTILOG, Mplus, AMOS, WINSTEPS, and ISDL.
By Mohammed Al Ajmi, Siti Salina Mustakim, Samsilah Roslan, Rashid Almehrizi
DOI: https://doi.org/10.5815/ijeme.2024.06.04, Pub. Date: 8 Dec. 2024
Employing Computerized Adaptive Testing (CAT) to assess verbal ability offers advantages over traditional tests by delivering greater measurement precision while reducing the testing burden. The CAT-Verbal Ability was developed from a large sample of 2,689 participants in Gulf countries, with careful item bank development that established unidimensionality and local independence and investigated differential item functioning (DIF). The item bank shows high content validity, is unidimensional and locally independent, and is free of DIF; these psychometric qualities were confirmed by CAT simulations based on real data. With just 14 items administered, the simulations showed a high degree of measurement accuracy (r = 0.73). The proposed CAT-Verbal Ability is a psychometrically sound instrument with acceptable marginal reliability, criterion-related validity, sensitivity, and specificity, making it an efficient assessment that reduces testing time and burden while preserving information integrity.
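For readers unfamiliar with how a CAT administers items, the Python sketch below illustrates one common approach: maximum-information item selection under a 2PL IRT model with an EAP ability update after each response, stopping after 14 items as in the abstract. The item parameters, bank size, and simulated examinee are hypothetical placeholders, not the CAT-Verbal Ability item bank or the authors' selection algorithm.

```python
# Minimal CAT simulation sketch under a 2PL IRT model (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibrated item bank: discrimination (a) and difficulty (b).
a = rng.uniform(0.8, 2.0, size=50)
b = rng.normal(0.0, 1.0, size=50)

def p_correct(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of each item at ability theta (2PL)."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def eap_estimate(responses, administered, grid=np.linspace(-4, 4, 81)):
    """Expected a posteriori ability estimate with a standard normal prior."""
    prior = np.exp(-0.5 * grid**2)
    like = np.ones_like(grid)
    for item, u in zip(administered, responses):
        p = p_correct(grid, a[item], b[item])
        like *= p if u else (1.0 - p)
    post = prior * like
    return np.sum(grid * post) / np.sum(post)

true_theta = 0.5                 # simulated examinee ability (hypothetical)
theta_hat, administered, responses = 0.0, [], []

for _ in range(14):              # fixed-length stopping rule of 14 items
    info = item_information(theta_hat, a, b)
    info[administered] = -np.inf                 # never reuse an item
    nxt = int(np.argmax(info))                   # maximum-information selection
    u = rng.random() < p_correct(true_theta, a[nxt], b[nxt])
    administered.append(nxt)
    responses.append(u)
    theta_hat = eap_estimate(responses, administered)

print(f"Estimated ability after 14 items: {theta_hat:.2f} (true {true_theta})")
```

In practice, operational CATs add exposure control and content-balancing constraints on top of this selection rule; the sketch keeps only the core information-driven loop.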
By Mohammed Al Ajmi, Tahra Al Ajmi
DOI: https://doi.org/10.5815/ijeme.2024.05.01, Pub. Date: 8 Oct. 2024
This quasi-experimental study investigates the effect of Collaborative Writing (CW) on the writing performance and perceptions of 36 Grade 9 female students in Oman. Rooted in the shortcomings of traditional product-oriented writing instruction, the study employs pre-/post-writing tests and a questionnaire to measure the effectiveness of CW. The findings reveal a statistically significant improvement in the experimental group's writing scores, underscoring the positive influence of CW across writing components. Students also express favorable attitudes towards CW, particularly towards editing and revising texts collaboratively. Despite these positive outcomes, challenges such as unequal work distribution within groups are identified. The study concludes by acknowledging its limitations and suggesting avenues for future research, emphasizing the potential of CW to enhance students' writing skills and foster constructive dialogue in language classrooms.
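As a rough illustration of how pre-/post-test gains in a quasi-experimental design can be compared, the sketch below runs a paired-samples t-test within the experimental group and an independent-samples t-test on gain scores between groups. All score values and group sizes are invented for the example and are not the study's data or its exact analysis.

```python
# Hedged illustration of a pre/post quasi-experimental comparison (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical writing scores (out of 20) for experimental and control groups.
exp_pre  = rng.normal(11, 2, 18)
exp_post = exp_pre + rng.normal(2.5, 1, 18)   # assumed gain after collaborative writing
ctl_pre  = rng.normal(11, 2, 18)
ctl_post = ctl_pre + rng.normal(0.5, 1, 18)   # assumed smaller gain under traditional instruction

# Within-group change: paired-samples t-test on the experimental group.
t_paired, p_paired = stats.ttest_rel(exp_post, exp_pre)

# Between-group comparison of gain scores: independent-samples t-test.
t_ind, p_ind = stats.ttest_ind(exp_post - exp_pre, ctl_post - ctl_pre)

print(f"Experimental pre vs post: t = {t_paired:.2f}, p = {p_paired:.4f}")
print(f"Gain scores, experimental vs control: t = {t_ind:.2f}, p = {p_ind:.4f}")
```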