This guide forms part of our assessment resources. We have created this guide for those who manage the assessment processes from an academic and administrative perspective, as well as suppliers and IT staff who manage supporting EMA systems.

Common problems

Variation in approach across the institution

Research from our 2014 EMA landscape review showed that responsibility for assessment and feedback policy and procedure is often devolved to local level within institutions. This means that large institutions rarely have a single, institution-wide business process for a given function.

Different faculties, schools, departments and programmes each have their own way of doing things.

But this variation can prevent you from achieving the efficiencies and benefits possible through EMA technology. Your organisation is likely to need a series of time-consuming and cumbersome workarounds to adapt your information systems to many different ways of carrying out the same activity.

Technology 'bolted on'

Participants in our research frequently commented on the extent to which new technologies are 'bolted on' to old processes without people really taking the time to stand back and consider what the process is really intended to achieve.

In some cases poor process design is due to lack of time and appropriate skills. During our assessment and feedback programme, academic staff voiced a concern that they often find themselves on a ‘treadmill’ due to poorly designed processes. Their workload is such that they can’t pause to think about doing things differently.

Academic staff also recognise that they don’t have the skills to undertake process review and redesign without more specialist support, yet they know that they can’t improve their pedagogy without better designed processes.

Organisational myths

In other cases a significant part of the problem is the persistence of organisational myths surrounding policy and process. The tendency to do things the way they have always been done is perpetuated by a belief that this is somehow enshrined in policy.

But academics are often surprised to find that many characteristics of the process are matters of historic choice rather than regulatory issues. They are often surprised at how few regulations there actually are or how easy it is to make changes to perceived barriers in the regulatory frameworks.

Variation in how staff apply assessment policy across an institution is often down to such myths about what actually constitutes the policy in the first place.

Impact on student experience

Different approaches to carrying out the same task impact not only staff workload but also the student experience. For example, one institution with the capacity to accept e-submission of all written assignments noted the following variations:

One faculty accepted e-submission for postgraduates only but then printed out the assignments for marking

Some course teams were happy to accept and mark submissions electronically, but students were still required to submit a paper copy to meet the requirements of the coursework receipting system

One department required students to hand in a hard copy for marking and also an electronic copy to be submitted through the plagiarism detection system

Implementing processes

Many institutions struggle to get the most out of their information systems due to the variety and complexity of their business processes. But meeting the necessary quality standards associated with EMA processes doesn't need to be complex. The principles underlying our approach include:

Ensuring tasks are carried out by the right people, whether a learner, administrator or academic staff member

Automating any necessary routine administrative tasks. Digital technologies open up possibilities for streamlining the workload associated with vast quantities of paper

Ensuring your processes have a sound academic rationale.

We can help you identify what a process needs to achieve, what skill set is needed and where information systems can help. What we can't do is design standard workflows that will suit your institution.

Workflows and processes

By workflows we mean how a process is carried out - ie, the order of tasks, who has responsibility and how information flows between people and systems. Such details will depend on how your organisation is structured and what information systems you use.

You may have different workflows for different assessment situations which is fine as long as there are valid reasons for each variant.
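As a minimal sketch of what we mean by a workflow, it can be written down as an ordered list of tasks, each with a responsible role and a supporting system. The Python below is purely illustrative - the step names, roles and systems are hypothetical, not a recommended workflow:

```python
from dataclasses import dataclass

@dataclass
class WorkflowStep:
    task: str        # what is done
    role: str        # who is responsible
    system: str      # where the information lives

# One hypothetical variant of a submission workflow; names are illustrative.
essay_workflow = [
    WorkflowStep("submit assignment", "student", "VLE"),
    WorkflowStep("validate file", "EMA system", "VLE"),
    WorkflowStep("issue receipt", "EMA system", "VLE"),
    WorkflowStep("mark and give feedback", "academic", "EMA system"),
]

def roles_involved(workflow):
    """Return the distinct roles, in order of first appearance."""
    seen = []
    for step in workflow:
        if step.role not in seen:
            seen.append(step.role)
    return seen

print(roles_involved(essay_workflow))  # ['student', 'EMA system', 'academic']
```

Writing even a rough version of this down for each variant makes it easier to ask whether the differences between variants are justified.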

Improving EMA processes

We suggest you compare our 10 step model and the detailed breakdowns against practice in your area of responsibility and ask yourself the following questions:

Are you doing additional tasks - if so, why?

Are the tasks being done by the right people eg, do you have academic staff undertaking administrative duties that do not require academic judgement?

Do you have systems that could carry out some of the tasks you are doing manually?

Do you have multiple ways of performing the same task - if so, why?

You can find out more about improving your processes in our guide to process improvement. It includes some simple techniques for analysing issues and ideas for change.

In particular some of the techniques you might consider are: flow charts with swim lanes, rich pictures and Responsibility, Authority, Expertise and Work (RAEW) analysis. Read more about this at the end of the process mapping guide.

You can also find out how some universities have benefited from so-called Lean approaches in our guide 'What is a process?'.

Submission, marking and feedback process

Ten step process

There are ten core tasks in the process of submission, marking and feedback. These correspond to stages four to seven of the assessment and feedback lifecycle and are the most problematic stages according to our research.

We have chosen to show 11 boxes because 'apply penalty or mitigation' is a single task that might occur at different times, depending on whether the penalty is for late submission or for academic misconduct.

By making good use of EMA systems you can cut out much of the manual intervention that consumes resource, introduces error and adds little value to the learning experience.

Below is our overview of the submission, marking and feedback process.

This model covers all types of summative assessment where tutors give a mark as well as qualitative feedback. It also covers iterative processes where students might:

Undertake formative checking of their own work using text matching tools to review academic integrity

Undertake self or peer review

Be required to show evidence that they have engaged with their feedback before their mark is released.

This is a high level overview. There is no right or wrong way to draw a process map - you need to choose the level that is right for you. For example you might choose to break down some of the sub processes further. Find out more in our guide to process mapping.

Submission process - a detailed view

Below is an example of the submission process broken down into more detail. We have identified the role of the EMA system to show which tasks you can automate.

The letters on the diagram (A, B, C etc) point to hints and tips about managing each task - these can be found in the footnotes below, which correspond to the stages on the diagram.

Footnotes

Submit assignment

Tips for institutions

For this process to work seamlessly you will need accurate data on programmes and modules and their assignments and deadlines as well as accurate information about each student cohort and their module enrolments.

You may need to integrate data from your student record system and course information system prior to this stage.

System requirements

Ability to accept bulk upload of submissions in order to manage data flows at peak times

Ability to handle submission of multiple drafts and final submission for same assignment.
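As an illustration of why that data integration matters, the sketch below (Python, with hypothetical record layouts) cross-references students, module enrolments and assignment deadlines to work out who is expected to submit what:

```python
# Hypothetical records; real systems would draw these from the student
# record system and the course information system respectively.
students = [
    {"id": "S1", "name": "Asha"},
    {"id": "S2", "name": "Ben"},
]
enrolments = [
    {"student_id": "S1", "module": "HIST101"},
    {"student_id": "S2", "module": "HIST101"},
    {"student_id": "S1", "module": "ENGL205"},
]
assignments = [
    {"module": "HIST101", "assignment": "Essay 1", "deadline": "2024-03-01"},
]

def expected_submissions(students, enrolments, assignments):
    """Cross-reference cohort data to list who owes which assignment."""
    by_id = {s["id"]: s for s in students}
    expected = []
    for a in assignments:
        for e in enrolments:
            if e["module"] == a["module"]:
                expected.append((by_id[e["student_id"]]["name"],
                                 a["assignment"], a["deadline"]))
    return expected

print(expected_submissions(students, enrolments, assignments))
# [('Asha', 'Essay 1', '2024-03-01'), ('Ben', 'Essay 1', '2024-03-01')]
```

If the cohort or deadline data is wrong at this stage, every downstream task inherits the error, which is why the integration needs to happen before submission opens.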

Is it a valid assignment?

Tips for institutions

This may require human intervention, as a file of a valid type (eg, a Word document or PDF) may still contain unreadable content.

System requirements (4.6)

Ability to validate that the submission is actually a file that can be opened and marked.
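One common approach to this kind of validation is to check a file's leading 'magic bytes' rather than trusting its extension. The Python sketch below is illustrative only and covers just two signatures:

```python
# Minimal sketch: a file with a valid extension may still be unreadable,
# so check the leading "magic bytes" rather than trusting the filename.
MAGIC = {
    b"%PDF": "pdf",
    b"PK\x03\x04": "docx/zip container",
}

def sniff_format(first_bytes: bytes):
    """Return a best-guess format from a file's opening bytes, or None."""
    for magic, label in MAGIC.items():
        if first_bytes.startswith(magic):
            return label
    return None

print(sniff_format(b"%PDF-1.7 ..."))      # 'pdf'
print(sniff_format(b"\x00\x00garbage"))   # None
```

Even a check like this cannot guarantee the file is markable, which is why the human fallback above remains necessary.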

Provide receipt

System requirements (4.7)

Receipt acknowledging successful submission.

Is it on time?

System requirements (4.8)

Ability to set notifications (alerts) of submissions.

Is this due to system failure?

Tips for institutions

Separation of the physical act of submission and receipting from the rest of the workflow will alleviate stress for students and allow you to deal with system issues. You will however need a clear set of policies and procedures to be followed in the event of system failure.

Ability to handle repeating assessments without attendance (REPWOA) ie, submitting an assessment that was set in a previous period.

Apply penalty/mitigation

Tips for institutions

You may be dealing with situations where extenuating circumstances are:

Known in advance eg, declared disability

Claimed as part of the assignment submission

Claimed separately.

Manage anonymity (if required)

System requirements (4.16-4.20)

Ability to present anonymised view to certain roles

Ability to identify submissions by ID number (either through integration with student record system or by providing field for student to enter ID number as identifier)

Ability to decouple lifting of anonymity and the return of grades/feedback

Ability to flag students with declared disability

Ability to alter parameters during the process eg, turn on anonymity once submissions have started.
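A minimal sketch of the first requirement, assuming a simple role model (the role names and record fields below are hypothetical):

```python
# Illustrative sketch: present an anonymised view of a submission to
# marking roles while administrators retain the full record.
ANONYMISED_ROLES = {"marker", "second_marker"}

def submission_view(submission: dict, role: str) -> dict:
    """Replace identifying fields with a placeholder for marking roles."""
    if role in ANONYMISED_ROLES:
        view = dict(submission)  # copy, so the stored record is untouched
        view["student_name"] = "ANONYMISED"
        return view
    return submission

record = {"student_id": "S123", "student_name": "Asha", "file": "essay.pdf"}
print(submission_view(record, "marker")["student_name"])         # 'ANONYMISED'
print(submission_view(record, "administrator")["student_name"])  # 'Asha'
```

Keeping the identity in the stored record but hiding it in the view is what makes it possible to lift anonymity later without re-matching submissions to students.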

Footnotes

Manage allocation of work to markers

Tips for institutions

Consider whether staff development and greater transparency around marking and feedback can reduce the need for second marking.

System requirements (4.16-4.20, 5.1-5.8)

Ability to present anonymised view to certain roles

Ability to identify submissions by ID number (either through integration with student record system or by providing field for student to enter ID number as identifier)

Ability to decouple lifting of anonymity and the return of grades/feedback

Ability to flag students with declared disability

Ability to alter parameters during the process eg turn on anonymity once submissions have started

Ability to deal with multiple markers for one assignment/create marking sets

Ability to add markers if risk of deadlines not being met

Ability to distribute assignments for marking either automatically or manually

Notifications for markers eg, available for marking/deadlines/late submissions

Second marking: (a) separate and dedicated spaces for first and second marker to record comments and different grades; (b) ability to reconcile grades while an audit trail is retained; (c) ability to display separate or reconciled grades as required to students

Ability to deliver either blind or open second marking

Security against overwriting another marker's comments and/or mark

Ability to handle either parallel or sequential second marking.
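The automatic-distribution requirement can be as simple as a round-robin allocation. The Python sketch below is illustrative and ignores the workload caps, conflicts of interest and second-marking rules a real EMA system would also need to honour:

```python
from itertools import cycle

def allocate(submissions, markers):
    """Distribute submissions across markers round-robin."""
    allocation = {m: [] for m in markers}
    for submission, marker in zip(submissions, cycle(markers)):
        allocation[marker].append(submission)
    return allocation

print(allocate(["A", "B", "C", "D", "E"], ["Dr X", "Dr Y"]))
# {'Dr X': ['A', 'C', 'E'], 'Dr Y': ['B', 'D']}
```

The value of automating even this simple step is that the allocation is recorded centrally, which supports the notifications and audit requirements listed above.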

Review originality report

System requirements (5.9-5.12, 5.24)

Ability to determine when assignment has been submitted for text matching check

Ability to support checking of multiple drafts as well as final submission

NB. In some cases eg, creative arts, markers may need to see earlier drafts alongside the final one to understand the evolution of the work

Ability to request to view papers from other institutions where collusion is being investigated eg, ability to nominate an email address where external requests for student papers can all be directed.

Marking and feedback

System requirements (5.13-5.23)

Ability to support online or offline marking.

Ability to support marking on a variety of devices.

Ability to support access needs of markers eg, resizing, colour, interoperation with assistive technologies such as speech recognition software, design and configuration to support enlarged screen, all pages view or one at a time view

Ability to support multiple markers eg, second marker, self-assessment or peer reviewers

Possibility of markers using range of tools and paper marking

Possibility of markers using range of tools and paper marking

Tips for institutions

Where a range of approaches is used, recording marks in digital format can help avoid transcription errors. Storing marks digitally can also make collation and profiling easier, and an online audit trail simplifies quality assurance processes.

Record feedback

Tips for institutions

Make use of time saving tools such as comment banks to avoid repeating frequently used comments. Try giving audio feedback if you type slowly - this can save you time and appear more meaningful and personal to students.

System requirements (5.25)

Ability to support feedback delivered via different media including uploading feedback created using an alternative tool.
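A comment bank can be as simple as a lookup from short codes to stored comments, combined with an individual note. The sketch below (Python, with invented codes and wording) shows the idea:

```python
# Minimal comment-bank sketch: frequently used feedback is stored once
# and reused by short code, so markers avoid retyping it.
COMMENT_BANK = {
    "ref1": "Reference list does not follow the required citation style.",
    "arg2": "The argument here needs supporting evidence.",
}

def expand(codes, personal_note=""):
    """Assemble feedback from bank codes plus an individual comment."""
    parts = [COMMENT_BANK[c] for c in codes]
    if personal_note:
        parts.append(personal_note)
    return " ".join(parts)

print(expand(["arg2"], "Strong conclusion, though."))
# The argument here needs supporting evidence. Strong conclusion, though.
```

Combining banked comments with a personal note keeps the time saving without making the feedback feel generic.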

Record provisional marks

Tips for institutions

Problems often occur because academics simply do not trust that they can edit central systems as needed, and prefer to keep marks elsewhere 'under their control' until all adjustments have been made and the marks have been verified. In some cases marks go unrecorded until late in the process due to a fear that Freedom of Information requests about unconfirmed marks could be problematic.

System requirements (5.28-5.34)

Ability to handle large cohorts by multiple markers eg, for moderator to view grades across different markers to ensure consistency in marking

Ability to identify anomalous marks

Ability to provide read only access to all papers or to a selected sample

Security against overwriting markers' comments and marks

Ability to record moderator's comments and suggestions away from student view

Ability to deliver anonymity for moderation purposes after return of marks and feedback.
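A simple screen for anomalous marks is to flag anything far from the cohort mean. The Python sketch below uses a two-standard-deviation threshold, which is an illustrative choice rather than a recommended policy:

```python
from statistics import mean, stdev

def anomalous_marks(marks, threshold=2.0):
    """Flag marks more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(marks), stdev(marks)
    if sigma == 0:
        return []
    return [m for m in marks if abs(m - mu) / sigma > threshold]

cohort = [62, 58, 65, 60, 63, 61, 59, 12]
print(anomalous_marks(cohort))  # [12]
```

A flagged mark is a prompt for the moderator to look at the script, not evidence of a marking error in itself.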

Record changes to marks as a result of moderation

System requirements (6.5)

Ability to record a mark before/after and the reason for adjustment.

External quality assurance

System requirements (6.6-6.8)

Ease/security of access for external examiners bearing in mind not all external assessors will be staff of a college or university

Ability to grant external examiners view-only access, to identify a moderation sample and/or to restrict access to only some submissions

Ability to download evidence of moderation and second marking comments (eg, pdf or Excel) for potential offline review by external examiner.

Return marks and feedback to students

Tips for institutions

Ensure students can see how marks are arrived at in relation to the criteria, so that they understand the criteria better in future

Ensure that the transfer of marks between different systems does not cause issues such as double rounding of numeric marks giving an inaccurate result

There is evidence that students make better use of feedback when it is available electronically and they are informed that it is ready for viewing.
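The double-rounding tip is easy to demonstrate. In the sketch below (Python's decimal module, rounding half up), a mark of 64.45 rounds to 64 when rounded once, but to 65 when one system rounds it to one decimal place and the next rounds that result to an integer:

```python
from decimal import Decimal, ROUND_HALF_UP

mark = Decimal("64.45")

# Rounding straight to an integer mark:
once = mark.quantize(Decimal("1"), rounding=ROUND_HALF_UP)            # 64

# Rounding to one decimal place in one system, then to an integer in
# the next, shifts the result upward:
intermediate = mark.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)  # 64.5
twice = intermediate.quantize(Decimal("1"), rounding=ROUND_HALF_UP)   # 65

print(once, twice)  # 64 65
```

Transferring marks at full precision and rounding only once, at the final point of use, avoids the problem.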

System requirements (7.1-7.8, 8.1-8.5)

Integration with the student information system so that the transfer of marks for use at formal ratification events, eg, exam boards, can be automated

Ability to handle marks and feedback separately

Ability to inform students that feedback is ready to be viewed

Ability to release marks and feedback to members of a cohort at different times where extensions have been granted

Ability to return feedback/grades by groups

Ability to extract feedback

Ability to collate feedback by marker, assignment or module

Lock down of marker comments after a given date or ability to identify and track any further comments made (by user ID, date) after marks and feedback have been returned to students.

Possibility that students need to evidence engagement

Tips for institutions

Disaggregation of marks and feedback can allow feedback to be released while marks are still undergoing moderation, making feedback more timely. There is evidence that students pay more attention to feedback when they see it separately from their mark.

If your system doesn't support disaggregation of marks and feedback, a workaround may be to leave the mark field blank and email marks later, once students have looked at the feedback.

Record exam board changes

System requirements (6.5)

Ability to record a mark before/after and the reason for adjustment.

Selecting EMA systems

Because the assessment and feedback lifecycle covers so many different functions, most institutions need a range of systems to support all of their activities. The key areas covered by information systems are usually:

Course and module information including assessment details

Student records including marks, feedback and final grades

Submission of assignments and e-portfolios

Marking and feedback

Academic integrity checking

Online testing and examinations.

Integrating systems

Ideally the systems that support the functions listed should be able to exchange data readily.

Currently, interoperability between systems remains a key problem area. In practice the emphasis is still on creating a set of interfaces to move data between systems on a point-to-point basis. This is complex to achieve and brings a maintenance overhead: whenever a particular system is changed, a series of interfaces must be rewritten to update the links to all of the other systems.

The expectation is that modern IT systems should have good application programming interfaces (APIs) ie, a set of routines, protocols, and tools that describe each component of the system (data or function). These allow the various components to act together as building blocks so that systems can work together in a plug and play architecture.
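A quick way to see why point-to-point integration scales badly: with n systems each talking directly to every other, the number of pairwise interfaces grows as n(n-1)/2, whereas a common API contract needs only one adapter per system. A small illustrative calculation:

```python
def point_to_point_links(n_systems: int) -> int:
    """Pairwise interfaces when every system integrates directly with every other."""
    return n_systems * (n_systems - 1) // 2

def api_adapters(n_systems: int) -> int:
    """Adapters needed when every system exposes one common API contract."""
    return n_systems

for n in (4, 6, 8):
    print(f"{n} systems: {point_to_point_links(n)} point-to-point links "
          f"vs {api_adapters(n)} adapters")
```

Even at the modest scale of a typical EMA estate, the maintenance burden of the point-to-point approach grows much faster than the number of systems.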

The systems are not the only problem. System integration often throws up a host of issues around institutional business processes, workflows, data definitions and data quality. This is why we have tackled the two topics in tandem. You need to ensure your data and processes aren’t an obstacle to making best use of your existing systems, or to effective implementation of new and better systems.

System requirements

Through our working group of around 30 universities and the membership of UCISA, we identified institutions’ core requirements for information systems to support assessment and feedback practice – we consider these to be a minimum set of requirements.

The requirements are available in a downloadable format that maps to the assessment and feedback lifecycle and has supporting user stories to illustrate why the functionality is necessary.

Using the requirements specification with suppliers

We have publicised the specification through both our own and UCISA’s channels and invite suppliers to highlight which of the requirements their product supports.

We have published suppliers’ responses on our EMA blog and we invite customers of those suppliers to use the blog for comment and discussion.

The idea is that by sharing knowledge about effective use of a particular product and how to integrate it, institutions can maximise their existing investments and make good choices about new systems.

We suggest you ask your suppliers to:

Consider our specifications when preparing product roadmaps

Update their response to the specifications when they launch new product versions

Respond to customer discussion on our blog to help the community develop a better understanding of their product.

Using the requirements specification for HEIs

You can use the requirements specification to develop an invitation to tender (ITT) for a new system. This will not only save you work but also give you confidence that major suppliers will be familiar with the requirements, so you have a better chance of getting accurate and meaningful responses.

You can easily tailor the list to your particular specification and requirements.

It could also help you to evaluate the success of your current processes and systems.

Further resources

Here are some further resources and examples of how different universities have approached the mapping of assessment and feedback processes to help them review practice:

The University of Oxford‘s department of continuing education used a swim lane approach, with all actors identified, developed in Excel

Case study: nationwide process review - digital exams in Norway

In Norway a project to review the process of managing digital exams in universities has created good practice guidance that is relevant to the UK HE community.

The approach taken is similar to our own. It concentrates on essential tasks for all universities and identification of the role responsible for each task rather than the sequencing of workflows that varies between institutions.

A series of process maps clearly defines the steps universities need to take to move from current practice to a desired future state that cuts out unnecessary manual actions. The maps are also an excellent example of using the Archi modelling tool, originally developed by Jisc, which is now one of the world's most popular tools to support process analysis.

As a result of this work Norway now has a model IT architecture for digital assessment that is platform independent and based on recognised international standards.

Further resources

For more guidance on choosing new technologies to meet your needs, see our guide to selecting technologies. This will take you through managing a selection project, defining your requirements and conducting supplier evaluation.