On-Screen Marking: A Practical Guide

September 30, 2023



UNESCO hosted its first Digital Learning Week[i], where education professionals and policymakers gathered to focus on public digital learning platforms and generative AI. The discussions revolved around how these technologies could enhance humanistic education and contribute to Sustainable Development Goal 4 (SDG 4) for education. In the realm of digital learning and assessment, On-Screen Marking (OSM) holds significant potential to drive ongoing reforms and enhance the learning experience for all. OSM stands at the forefront of digital advancement in educational assessment, representing a pivotal component of the ongoing digitization of assessment[ii]. In simple terms, OSM entails the evaluation of scanned student work on a computer or tablet. In an era where traditional academic practices intersect with digital innovation, OSM offers a sophisticated approach to assessing students' competencies and knowledge. Ofqual's 2020 report on online and on-screen assessment in high-stakes, sessional qualifications[iii] highlights the impetus behind transitioning to OSM: aligning assessments with a digital society, enhancing assessment validity, and fostering advancements in education.

This article aims to delve into the intricacies of OSM, exploring its origins, benefits, and role in contemporary education.

Historical and Practical Sides to On-Screen Marking

The evolution of assessment methods is intrinsically tied to question types, ranging from selected response to constructed response, and to the assessment mode, spanning paper-based and computer-based formats. While computerized selected-response assessments eliminate manual marking, constructed-response assessments necessitate marking irrespective of the mode. These historical aspects also influence the choices made by jurisdictions seeking to modernize their assessment systems. Consider a high-stakes assessment body in a digitally young jurisdiction: the choice of assessment method and mode shapes the trajectory of OSM adoption. Resource-constrained contexts may prioritize selected-response items, potentially constraining the evaluation of higher-order thinking skills. Conversely, digitally mature jurisdictions might refine assessment methods to align with curricula. In parallel, educational institutions such as schools and universities could embrace comprehensive digitalization, integrating OSM strategically. In essence, the practical application of OSM is intricately entwined with a jurisdiction's digital readiness, available resources, and alignment with curricular goals. Grasping these scenarios is pivotal to harnessing the potential of OSM to elevate the landscape of assessment.

Advantages and Challenges of On-Screen Marking

A range of research studies justifies the use of OSM at different organizational levels, from schools and universities to large-scale assessment bodies, and several organizations have put guideline documents into practice. One that is particularly relevant for schools is the European Schools' (2017) Marking system of the European Schools: Guidelines for use[iv], which supports schools and teachers in implementing the marking system practically, helping them engage effectively through well-informed guidance. A recent study on student engagement[v] highlights the importance of supporting assessment literacy and staff development while addressing challenges in providing effective feedback. Additionally, the University of Oxford (2017) developed the Landscape Report: Research and Practice in E-exams[vi], which outlines a set of recommendations, including the importance of markers' IT proficiency in cultivating an on-screen marking mindset, which in turn enhances inter-rater reliability and fairness when marking typed responses. According to a blog post published on UNESCO's IIEP Learning Portal[vii], AI-assisted marking streamlines assessment by reducing teacher workloads and potential biases, while human supervision maintains fairness.

Advantages

Efficiency and Speed: OSM has gained popularity, especially during the pandemic, revolutionizing education and assessment. OSM speeds up marking, offering a more efficient alternative to traditional paper-based methods. This efficiency is crucial for timely and accurate assessments. OSM-generated data allows in-depth analysis of factors like response rates and time spent on each question, aiding educators in making informed decisions for better teaching.

Accuracy and Consistency: A standout advantage of OSM is its standardized marking process. By following predefined criteria, OSM ensures fairness and consistency in assessments. This approach reduces the subjectivity of manual marking, providing fairer results for students. OSM also enhances transparency by making the marking process traceable, boosting confidence in outcomes.

Environmental Benefits: OSM is eco-friendly because it cuts paper usage. Traditional paper-based marking generates paper waste and carries an environmental cost. OSM's paperless approach conserves resources and aligns with green practices, making it an environmentally responsible choice.

Accessibility and Flexibility: OSM is versatile, transcending geographical boundaries. Markers can work from anywhere with an internet connection, eliminating the need for physical presence. This accessibility empowers markers and educators, making OSM invaluable in remote learning contexts.

In essence, OSM transforms assessments by boosting efficiency, fairness, eco-friendliness, and accessibility, benefiting students, educators, and institutions.

Challenges

Training and Adaptation: Moving to OSM requires digital literacy. Training is crucial to ensure that markers use digital tools effectively and understand the nuances of e-marking.

Security Concerns: Data security is essential in OSM. The flexibility of remote marking can create vulnerabilities, such as leaked marked papers or unauthorized access. Diverse marker pools, anonymization, and continuous monitoring help protect student identity and marking quality.

In navigating the challenges associated with OSM, addressing technical intricacies, providing comprehensive training, and prioritizing data security become imperative steps towards the successful implementation of this transformative assessment method.

Implementation of On-Screen Marking

In this section, we'll delve into practical instances that illustrate how OSM is effectively integrated into the assessment cycle. A compelling case study originates from Vretta[viii], demonstrating how their e-marking system has revolutionized the management of student responses and marker teams across multiple assessment life cycles.

Vretta’s integrated marking system, equipped with authorized accounts, provides access to marker team dashboards and the designated sets of items for evaluation. Notably, lead markers play a pivotal role, wielding a range of functionalities such as:

  • Sending invitations to markers for assessing specific sets of exam items, thereby overseeing the marking process.

  • Allocating necessary training and validating the roles of markers.

  • Monitoring marker performance, both individually and in comparison to peers, with a keen focus on item-level marking.

  • Independently evaluating item marking and assessing the alignment of responses with established marking rubrics.

These versatile features operate within diverse contexts and yield valuable insights. They enable the analysis and evidence-based enhancement of both marker and item marking performances throughout the ongoing cycle of e-marking. This multifaceted toolset plays a vital role in refining the efficiency and effectiveness of the assessment process.
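
Vretta's internal implementation is not public, so purely as an illustrative sketch, the roles and workflow above could be modeled along these lines (all names and fields are hypothetical):

```python
from dataclasses import dataclass, field
from enum import Enum

class MarkerRole(Enum):
    MARKER = "marker"
    LEAD_MARKER = "lead_marker"

@dataclass
class Marker:
    marker_id: str
    role: MarkerRole
    trained: bool = False      # training allocated by the lead marker
    validated: bool = False    # role validated by the lead marker

@dataclass
class ItemTeam:
    """The set of markers invited to mark one exam item."""
    item_id: str
    markers: list = field(default_factory=list)

    def invite(self, marker: Marker) -> None:
        # Lead-marker action: invite a marker to this item's team.
        self.markers.append(marker)

    def eligible(self) -> list:
        # Only trained, validated markers may claim response batches.
        return [m for m in self.markers if m.trained and m.validated]
```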

Now that we've explored practical instances of integrating OSM into the assessment cycle, let's take a closer look at how the process unfolds in practice.

Workflow

To provide a bird's-eye view of the process, let’s analyze it from various angles. The practice of marking and scoring involves four key stages: Assessment Configuration, Administration, Marking, and Reporting. These stages work together to ensure accurate, consistent, and efficient assessment of student responses.

Assessment Configuration: This step involves identifying which items are to be marked by humans and which by machines. It also includes setting up scoring profiles, rubrics, and criteria. Additionally, score options are established to maintain consistency. Batch allocation policies are defined to manage the marking process. Adjustments can be made if needed.
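
As a minimal sketch of what such a configuration might look like in code (the structures and field names below are illustrative assumptions, not Vretta's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class ItemScoringConfig:
    item_id: str
    scored_by: str                                # "human" or "machine"
    rubric: dict = field(default_factory=dict)    # score point -> descriptor
    score_options: tuple = (0, 1, 2)              # valid scores, for consistency

@dataclass
class BatchPolicy:
    batch_size: int = 10        # responses a marker claims at a time
    max_per_hour: int = 60      # throttle per marker
    max_per_school: int = 5     # limit any marker's exposure to one school

items = [
    ItemScoringConfig("Q1", scored_by="machine"),  # e.g., multiple choice
    ItemScoringConfig("Q2", scored_by="human",
                      rubric={0: "no evidence", 1: "partial", 2: "complete"}),
]
policy = BatchPolicy()
```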

Administration: During administration, student responses are collected. For machine-scored items, predefined scoring configurations are used. This might include methods like using answer keys for multiple choice questions or algebraic evaluation. The platform includes real-time tools to review unexpected responses. Both digital and scanned responses are supported. A summary of scheduled sessions helps manage the distribution of human-scored items. There's also a feature to control the timing of response pooling for efficient scoring.
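
For instance, machine scoring against an answer key, or by algebraic evaluation, can be prototyped as below; the sketch uses the sympy library to test algebraic equivalence, and all item IDs and keys are invented for illustration:

```python
import sympy

ANSWER_KEY = {"Q1": "C"}             # multiple-choice key (invented)
ALGEBRAIC_KEY = {"Q3": "2*x + 6"}    # accepted equivalent form (invented)

def score_multiple_choice(item_id: str, response: str) -> int:
    # Exact match against the key, ignoring case and surrounding whitespace.
    return int(response.strip().upper() == ANSWER_KEY[item_id])

def score_algebraic(item_id: str, response: str) -> int:
    # Accept any expression algebraically equivalent to the key,
    # e.g. "2*(x + 3)" matches "2*x + 6".
    try:
        diff = sympy.simplify(
            sympy.sympify(response) - sympy.sympify(ALGEBRAIC_KEY[item_id]))
        return int(diff == 0)
    except (sympy.SympifyError, TypeError):
        return 0   # unparseable responses are routed for human review

print(score_multiple_choice("Q1", " c "))   # 1
print(score_algebraic("Q3", "2*(x + 3)"))   # 1
```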

Marking: The Marking phase involves creating marking windows, inviting markers and supervisors, and providing training materials and examples. After successfully completing the training session, markers claim batches of responses based on allocation policies and access both scanned and online responses anonymously. They can flag responses and access materials within the scoring view. Markers' productivity and accuracy are tracked. Supervisors oversee markers, reviewing flagged responses, applying second or third reads, addressing scan issues, and managing unclaimed responses. The back-reading view allows supervisors to ensure accuracy. Key features are organized into categories:

  • Response Types, Assignment, and Double Marking: The marking system supports various response types, such as multiple choice and text insertion. Items can be assigned directly or grouped for markers. Responses are randomly assigned to markers, with options to control the number of responses per hour and per school. Double marking is used for objectivity, involving two or more markers assessing the same question. Real-time reliability reports ensure accuracy; a minimal sketch of one such agreement metric follows this list.

  • Response Privacy, Notation, and Tracking: Student privacy is upheld with anonymous online responses and AI-driven cropping of scanned responses. Notation guides are available for each item. Progress reports are accessible in the item summary view. The scoring back-reading view displays markers and their history. A customizable marker attestation ensures confidentiality. Timestamps track marking time for each response batch.

  • Viewing Options and Commenting: Markers can view notation guides on the same screen or separately. Comment boxes are available for markers, while marking leaders can access comments. Markers' productivity and accuracy are tracked through a marker report. The item marker dashboard helps manage markers, with features like messaging and batch management.
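
To make the double-marking idea concrete, the sketch below computes Cohen's kappa, a standard chance-corrected agreement statistic, for two markers' scores; the scores are invented, and the source does not specify which reliability statistic the platform's reports actually use:

```python
from collections import Counter

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected agreement between two markers on the same responses."""
    assert len(scores_a) == len(scores_b) and scores_a
    n = len(scores_a)
    # Observed agreement: the proportion of responses scored identically.
    observed = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # Expected agreement: what two markers would agree on by chance alone.
    freq_a, freq_b = Counter(scores_a), Counter(scores_b)
    categories = set(scores_a) | set(scores_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

marker_1 = [2, 1, 0, 2, 1, 2, 0, 1]
marker_2 = [2, 1, 1, 2, 1, 2, 0, 0]
print(f"kappa = {cohens_kappa(marker_1, marker_2):.2f}")   # kappa = 0.62
```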

Reporting: In the Reporting stage, student results are determined by applying decision rules to human-scored and machine-scored responses, ensuring accuracy and consistency.
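
Decision rules vary by program; a minimal illustrative rule for combining double-marked human scores might look like the following, where the tolerance and the median-of-three resolution are assumptions:

```python
def resolve_item_score(scores, tolerance=1):
    """Decision rule for a double-marked item: adjacent scores are averaged,
    while larger discrepancies require a third read."""
    first, second = scores[:2]
    if abs(first - second) <= tolerance:
        return (first + second) / 2
    if len(scores) >= 3:                 # a third read has been applied
        return sorted(scores[:3])[1]     # take the median of the three reads
    raise ValueError("Discrepancy exceeds tolerance: escalate for a third read")

print(resolve_item_score([2, 1]))      # 1.5 (within tolerance, averaged)
print(resolve_item_score([2, 0, 1]))   # 1 (median after the third read)
```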

Throughout these stages, the process ensures secure and private student information while delivering efficient and accurate evaluation of their responses.

Artificial Intelligence in On-Screen Marking

As discussed in previous articles about emerging technologies and trends[ix], there is increasing pressure to integrate digitalization and enhance the assessment services currently provided. This demand is also fueled by the need for digital skills to drive transformational reforms and advance the field. The primary catalyst for these developments is artificial intelligence (AI). It is fascinating to observe how these technologies will continue to impact the assessment cycle, particularly OSM at its forefront. While AI is already being applied, its applications are set to become more intensive and widespread.

Here are some examples of how these technologies can revolutionize different phases of the assessment process:

Assessment Design and Authoring: AI enables OSM to accommodate more complex item types and interactions in assessments. Adaptive assessments that adjust difficulty based on individual student performance are created with AI assistance, enhancing the assessment experience. Machine Learning (ML) algorithms analyze extensive educational data to recommend optimal item formats, aligning with the interest in AI's role in education and assessment.
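
Adaptive logic can be as simple as a staircase rule. The toy sketch below raises or lowers item difficulty after each response; real adaptive assessments typically rely on more sophisticated approaches such as IRT-based item selection:

```python
def next_difficulty(current, correct, step=1, lo=1, hi=5):
    """One-up/one-down staircase: a correct answer raises difficulty,
    an incorrect one lowers it, clamped to the item bank's range."""
    proposed = current + step if correct else current - step
    return min(hi, max(lo, proposed))

difficulty = 3
for answered_correctly in [True, True, False, True]:
    difficulty = next_difficulty(difficulty, answered_correctly)
    print(difficulty)   # 4, 5, 4, 5
```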

Scanning and Data Capture: AI transforms OSM by automating the scanning and data extraction process. ML models accurately identify and classify handwritten and typed responses, reducing the need for manual data entry. This leads to faster and more reliable data capture, thereby enhancing the efficiency of assessment management.
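
As one illustration, data capture from scans can be prototyped with an off-the-shelf OCR engine. The sketch below assumes the Tesseract engine is installed and uses the pytesseract and Pillow packages, flagging low-confidence pages for human review:

```python
from PIL import Image
import pytesseract   # requires the Tesseract OCR engine to be installed

def capture_response(scan_path: str, min_confidence: int = 60):
    """Extract text from a scanned response; flag pages where nothing
    could be read confidently so a human can transcribe them."""
    image = Image.open(scan_path)
    data = pytesseract.image_to_data(image, output_type=pytesseract.Output.DICT)
    words = [w for w, c in zip(data["text"], data["conf"])
             if w.strip() and int(c) >= min_confidence]
    flagged = not words   # nothing read confidently: route to a human
    return " ".join(words), flagged
```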

Marking and Feedback: AI enhances OSM through AI-powered marking assistance. ML algorithms pre-score responses, helping markers by identifying potential discrepancies for human review. Natural Language Processing (NLP) provides automated feedback on written answers, offering valuable insights for student improvement.
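
A simple way to pre-score written answers is to compare each response against marker-approved exemplars. The sketch below uses TF-IDF cosine similarity from scikit-learn; the exemplar answers are invented, and production systems generally use far more capable language models:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Exemplar answers at each score point, agreed by lead markers (invented data).
exemplars = {
    2: "Photosynthesis converts light energy into chemical energy in glucose.",
    1: "Plants use sunlight to make food.",
    0: "Plants grow in soil.",
}

def pre_score(response: str) -> int:
    """Suggest the score of the most similar exemplar; a marker makes the final call."""
    texts = list(exemplars.values()) + [response]
    tfidf = TfidfVectorizer().fit_transform(texts)
    sims = cosine_similarity(tfidf[-1], tfidf[:-1]).ravel()
    return list(exemplars.keys())[sims.argmax()]

print(pre_score("Plants turn sunlight into chemical energy stored as glucose."))
```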

Data Analysis and Reporting: The infusion of AI in OSM generates intricate data patterns that offer deeper insights. Advanced analytics predicts student performance trends, pinpointing strengths and weaknesses. This information can guide institutions on utilizing AI-driven insights to enhance teaching strategies and student outcomes.
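
For example, classical test theory statistics such as item difficulty and discrimination can be computed directly from OSM score data. The sketch below uses invented scores and requires Python 3.10+ for statistics.correlation:

```python
import statistics

def item_stats(item_scores, total_scores):
    """Classical test theory: difficulty is the mean proportion of the item's
    maximum score achieved; discrimination is the item-total correlation."""
    max_score = max(item_scores) or 1        # guard against an all-zero item
    difficulty = statistics.mean(s / max_score for s in item_scores)
    discrimination = statistics.correlation(item_scores, total_scores)
    return difficulty, discrimination

item = [1, 0, 1, 1, 0, 1, 1, 0]              # dichotomous item scores
totals = [42, 20, 38, 45, 25, 40, 35, 18]    # each student's total score
d, r = item_stats(item, totals)
print(f"difficulty={d:.2f}, discrimination={r:.2f}")
```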

Quality Assurance and Security: AI strengthens the quality control of assessments in OSM. Intelligent algorithms validate marker consistency and identify potential errors, ensuring accurate and reliable marking. AI-powered plagiarism detectors flag instances of academic misconduct, aligning with the sector-wide interest in maintaining assessment integrity.
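
Plagiarism detection spans many techniques; one minimal building block is n-gram overlap between responses, sketched below with an arbitrarily chosen threshold:

```python
def ngrams(text: str, n: int = 5) -> set:
    # Build the set of word 5-grams appearing in a response.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity_flag(resp_a: str, resp_b: str, threshold: float = 0.25) -> bool:
    """Flag a pair of responses whose 5-gram Jaccard overlap is suspiciously
    high; flagged pairs would go to a human investigator, not be auto-penalized."""
    a, b = ngrams(resp_a), ngrams(resp_b)
    if not a or not b:
        return False
    jaccard = len(a & b) / len(a | b)
    return jaccard >= threshold
```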

As AI continues pushing the boundaries of assessment technology, its integration into OSM is set to reshape how assessments are conceived, executed, and evaluated. The dynamic interplay between AI applications and the evolution of OSM presents an exciting area for exploration.

Conclusion

On-Screen Marking emerges as a trailblazing force, merging digital innovation with pedagogical practices in the ever-evolving landscape of educational assessment. It encompasses efficiency, fairness, eco-consciousness, and accessibility, heralding a new era of assessment methodology. The adoption of OSM varies based on historical context, regional digital readiness, and educational systems. The fusion of artificial intelligence (AI) with OSM enhances assessment design and development, increases test security, and strengthens academic integrity in the assessment process. Practical implementations, as seen in Vretta's case study, empower lead markers and markers, ensuring accuracy and insightful analysis. The convergence of AI and OSM promises transformative assessment methodology, propelling education into a realm of adaptable assessments and quality enhancement. In essence, OSM guides education at the intersection of technology and pedagogy, illuminating a path toward an enlightened assessment landscape.

About the Author

Vali Huseyn is an educational assessment specialist with extensive experience in enhancing key phases of the assessment lifecycle, including item authoring (item banking), registration, administration, scoring, data analysis, and reporting. His work involves strategic collaboration with a range of assessment technology providers, certification authorities, and research institutions, contributing to the advancement of the assessment community. At The State Examination Centre of Azerbaijan, Vali played a crucial role in modernizing local large-scale assessments. As the Head of Strategic Partnerships and Project Management Unit Lead, he co-implemented several regional development projects focused on learning and assessment within the Post-Soviet region.

Feel free to connect with Vali on LinkedIn (https://www.linkedin.com/in/valihuseyn/) to learn more about the best practices in transitioning to an online assessment environment.


References

[i] UNESCO’s Digital Learning Week: https://www.unesco.org/en/weeks/digital-learning#:~:text=UNESCO's%20first%20Digital%20Learning%20Week,at%20UNESCO%20Headquarters%20in%20Paris
[ii] Digitalization Review: A Guide to Modernizing the Assessment Life Cycle: https://www.vretta.com/blog/digitalization-review/ 
[iii] Ofqual’s 2020 report on Online and on-screen assessment in high stakes, sessional qualifications: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/943382/Barriers_to_online_111220.pdf 
[iv] Marking system of the European Schools: Guidelines for use: https://www.eursc.eu/BasicTexts/2017-05-D-29-en-9.pdf
[v] Can assessment-specific marking criteria and electronic comment libraries increase student engagement with assessment and feedback? https://www.tandfonline.com/doi/full/10.1080/02602938.2021.1986468
[vi] Landscape Report: Research and Practice in E-exams: https://www.ctl.ox.ac.uk/files/e-examslandscapereportv1-1pdf
[vii] UNESCO’s IIEP Learning Portal: https://learningportal.iiep.unesco.org/en
[viii] Vretta’s website: https://www.vretta.com/ 
[ix] The Learning Cosmos: Technology’s Role in the Evolution of Educational Assessment: https://www.vretta.com/blog/the-learning-cosmos/