This section is crucial and represents the foundation of what comes next in the design stage. Here we focus on discovering and analysing the ‘nitty gritty’ of your own situation in relation to e-assessment. This is where you explore your college systems and develop your own abilities to use them, as well as any external tools that you need.

A really good tip, borrowed from the world of art and design, is to keep a sketchbook / notebook / log to jot down ideas and questions. This may be paper, a Word document or some other digital record, although many find the immediacy of pen and paper best.

You need to gather this information and develop practical skills in this section in order to understand your local context and discover any limitations you need to work around. This section is all about developing an enquiring, critical and systematic approach. By ‘critical’ we especially mean not taking things for granted or believing what you are told at face value – this is especially important in relation to learning technology and to popular expectations and stereotypes about students. You really do need to try things out here first before moving on to the design phase.

Analyse Tips

  1. Make sure you and your colleagues are really familiar with the technologies you are planning to use – aim to be as self-sufficient as you can and to need only minimal support from central services (they are under pressure too!). Set yourselves targets for what you need to learn to deliver a particular e-assessment and then do a ‘test run’ using a test student account – so you see what the students will see.
  2. Explain what you want to do to your central support services and ask for their help early on – do not leave it to the last minute. It is wise to plan 6–12 months ahead of going live with a new e-assessment. Discuss your ideas early on with any learning technologists or support staff you have access to. They can help clarify your ideas and let you know how they can help. Look out for any staff development events or drop-in sessions to get ideas and find out what others are doing, and remember to look beyond your college as well.[1]
  3. Make sure you are familiar with the way your system (the VLE, e-Portfolio etc.) records and manages marks – they can be a bit tricky. Know how to export the marks in different electronic formats so they can be imported into the student records system (to avoid the risks of manual re-entry or the inconvenience of conversion back to paper).
  4. One common problem with recurring e-assessments (like essay submissions or online tests) is that, because of the long intervals between them, lecturers often forget how to reset them in the system – using the controls for dates, times, access conditions etc. A good tip is to create a help guide (ideally a central service job) that sets out the steps needed in detail and make it available online. Another good tip is for each lecturer and department to work collaboratively to develop a ‘preflight checklist’ of things to do to maintain their e-assessments at the start of each term – ideally in coordination with any central support service.
  5. Remember that the type of digital literacy required of your students will mean knowing how to use college systems that are often quite complex, ‘clunky’ and old fashioned relative to what students are used to in social media. To be fair to college systems and educational software generally, they are doing a very different job to the well funded and developed commercial social media products. So, being a whizz on Facebook does not mean being any good at using a college VLE. Don’t believe the hype that all youth are automatically tech[2] experts! This article about American students’ tech skills does quite a good job of busting this particularly pernicious stereotype of young people. Are your students ready? Do not assume they have the skills to use the college online systems and equipment needed to access your online learning resources and activities or your e-assessment resources. Ask them, and use the UHI skills checklists (see the Downloads section of the project website) to assess students’ digital literacy in using college systems.
  1. Be aware that college VLE or e-Portfolio tools will often have limited functionality and display differently on mobile devices – find out what yours look like on Android and Apple tablets and phones (involve your local IT / Learning Technology department). Do this early on.
  2. One of the first useful e-assessment tasks you can do is to set up a simple MCQ diagnostic test in the college VLE to assess whether students have the skills to use college software (VLE, e-Portfolio etc.), what devices they use to access content outside college and how they access the internet. This provides a useful baseline for your planning, and it is well worth suggesting it is incorporated into standard induction procedures.
  3. If you are planning to conduct summative e-assessments that require invigilation (also known as proctoring in the USA and elsewhere) you are likely to need access to college facilities (e.g. classrooms with computers). You need to arrange access early on and plan your invigilation arrangements. You must do a test run (it can be short) with your students beforehand to familiarise them with the system they will be using. Some colleges have set up purpose-built and equipped e-assessment centres.[3]
  4. Make sure you are familiar with the internal quality management system at your college – in the Scottish system this is usually called ‘Internal Verification’. This records and examines any changes to courses – especially assessment. So make sure you record these changes and get them approved; think about using the design template introduced in the next section of this toolkit.
  5. Prepare for the external quality management procedures that your college is subject to. In the Scottish system this is called ‘External Verification’ and is carried out by subject experts appointed by the SQA. Again, think about using the design template introduced in the next section of this toolkit.
  6. If you are using social media or other commercial non-college services in connection with e-assessment (or indeed learning in general) you will need to consider your own personal and your employer’s legal responsibilities in relation to data protection, privacy, copyright, child protection etc. You will find some useful information about this in the Design section of this guide under the heading ‘Checklist for Social Media e-Assessment tools – Leaving the Reservation’.
  7. When you are thinking about developing an e-assessment it makes sense to target an area that will return real benefits (not some marginal case) – so think in terms of reaching large numbers of students, making off-campus submissions possible, or reducing existing problems and bottlenecks (such as marking loads and late feedback to students). Start with a formative assessment exercise to iron out problems before moving on to any summative high-stakes assessment. Make sure you develop an understanding of the bigger picture in your college (your context) and how other factors will impact on your work.
  1. Training 1: If you are providing training to teachers in the use of the in-house college systems (VLE, e-Portfolio etc.), be aware that the poor usability of aspects of these systems can sometimes cause stress, a lack of confidence and a consequent loss of engagement and motivation (this is true of students also). To counter this, make sure that you are fully competent in your own use of the systems. Provide detailed step-by-step help guides for the teachers to use under their ‘own steam’; the ones provided by UCL for Moodle are excellent, and the official Moodle documentation site is also a must. In addition, Moodle has its own YouTube channel and a collection of online training videos that present training in short ‘chunks’ about aspects of the system.
  2. Training 2: Do not assume basic IT competence when providing training to staff (the same is true of students); start with a basic check of the competences needed to undertake the training task in hand. You can then remediate or alter your training to fit. Going slow at the start like this lays the foundation for effective training, and access to the detailed guides already described helps teachers become more independent. Manage teachers’ expectations at the start and stress the need to get the basics right and their own responsibility to become adept with the systems.
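
One of the tips above suggests a simple MCQ diagnostic test. In Moodle-based VLEs a low-effort route is to write the questions in the plain-text GIFT import format and load them into the question bank (Question bank ▸ Import ▸ GIFT format). Below is a minimal sketch in Python; the question wording and the output filename are illustrative assumptions only, not part of any standard induction test:

```python
# Build a small diagnostic quiz in Moodle's plain-text GIFT import format.
# The questions below are illustrative examples only - write your own to
# match the skills your course actually requires.

questions = [
    ("Submitting work",
     "Which of these is the usual way to submit work to a VLE assignment?",
     "Upload the file to the assignment's submission area",
     ["Email it to yourself", "Print it and hand it in",
      "Post it on social media"]),
    ("Renaming a file",
     "You need to rename an essay before submitting it. Where do you do this?",
     "In the file manager (e.g. File Explorer or Finder)",
     ["Inside the VLE gradebook", "In the web browser settings"]),
]

def to_gift(title, stem, correct, wrong):
    """Format one multiple-choice question as a GIFT entry."""
    options = "=" + correct + " " + " ".join("~" + w for w in wrong)
    return "::%s:: %s { %s }" % (title, stem, options)

gift_text = "\n\n".join(to_gift(*q) for q in questions)

# Write a file that Moodle's question import page can read.
with open("diagnostic_quiz.gift.txt", "w", encoding="utf-8") as f:
    f.write(gift_text)
```

Note that GIFT treats characters such as `~ = # { } :` as special within question text, so real questions containing them need escaping with a backslash.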

Analyse Checklist

  1. Do you know how to use the tools involved? Have you completed a test e-assessment exercise as a student by using a test student account to do the assessment from a student’s point of view? Have you marked the test student assessment as a teacher and recorded the marks in the system? Do you know how to extract marks from the system to pass into the student records system?
  2. Have you developed or got access to detailed guidance on how to reset your e-assessments for a new term? Do you have your ‘preflight checklist’ for the start of each term?
  3. Have you checked your students’ skills in relation to using college systems? If you expect your students to work online outside college, have you checked what personal devices they use and what their access to the internet is like? Remember to inform students about college-based access to the internet and computers, e.g. the library and study centres.
  4. Have you asked your central support services to check what your college systems look like on portable devices and any limitations in functionality? Have you tried this yourself?
  5. If you are using non-college services have you checked out the legal situation?
  6. Do you have answers to the questions below about understanding your context?

Understanding your own context – prompts for analysis

You will find the outputs of the BOLT project produced by Borders College useful. Your context may just involve you as an individual, a department, a faculty or the whole college. These questions are prompts to help you develop your own picture – it will likely change as you continue working in this area.

Your Students

What are their characteristics – in general? Are they ready for some independent learning? Or do they expect to be closely supported? Will this affect your assessment planning? Are they able to use the college systems effectively? Remember students are one of the most under-utilised resources in education. The REAP project contains useful guidance on involving them in assessment practice – such as peer assessment and peer teaching. Remember to think about assessment for learning, not just measuring it.

Subject area

Does your subject area have any characteristics that make it more or less likely that you will use e-assessment? Often ideas come after a period of thinking about it – ideally talk to others. For instance, at Glasgow the use of e-portfolios for assessment is growing in areas like construction trades. If your subject requires students to write reports and essays, then online submission is a natural progression from paper essays. The use of objective tests / MCQs has wide potential for application – but it does require some thought and experimentation and, of course, considerable up-front investment of your time. Check out the project case studies for examples of solutions people have developed – what worked and what didn’t.

Are there any existing problem areas in your current assessment practice that you would like to use technology to improve, for instance, a high marking load or late feedback to students? The assessment design template that we introduce in the design section of this toolkit can help you record and share your ideas in a simple and consistent way – and you can customize it to suit your own needs.

Teaching Culture

What are the attitudes and values of the lecturers that you work with? As the Jisc guide to Networked Learning observes, the introduction of technology can highlight personal ideas, values and philosophies about teaching and learning in quite unexpected ways (see the illustration from the Jisc guide in section 1, ‘Getting Started’, about contextual factors). Here’s a real example from a workshop we attended at one of the partner colleges:

“We took a long hard look at ourselves and our teaching. We realised that we had become stale and that we were teaching on the same programme as each other but in isolation – in our own little silos. We were teaching theory first then doing the practical work so the students had no context for the theory. We weren’t happy and neither were our students.

We decided to change the way we worked. Instead of teaching in this disjointed way, we worked together to redesign the curriculum. We moved from teaching by numbers to teaching through projects. This meant changing everything, especially the assessment, as we had now merged 4 units together and were doing the assessment for them through the project work. This was much better: the theory was taught in a practical context and could be applied immediately. The students saw the point of the theory and did not have to wait weeks to use theory they had been taught previously in an abstract manner. This meant getting the unit re-verified. The result? Students are much happier and are getting excellent results, and the staff are happier too.”


Technology

Not surprisingly, technology is a major factor in the successful use of e-assessment. So, as we say elsewhere, your No.1 priority is to find out what technology your college has and learn how to use it. It is especially important to do this early in the process and not leave it to the last minute. If you are using a technology that only works well in certain web browsers (as is the case with many VLEs) you need to know which browsers those are and whether they are supported in your college. Make no assumptions in these matters: find out for yourself and test the technologies regularly. Obviously this is a lot easier if you are working as part of a collaborative group or team and have some technical support.

Central IT services often have policies restricting what technologies they will allow on college machines and offer support for – browsers, plugins and versions of programs such as Microsoft Office, etc. This is why it is wise to start with a pilot exercise that targets formative assessments in order to find out about your local technology and administrative context. Try to find out when upgrades and changes to the IT systems are planned by the college; if there is no policy of communicating this kind of information routinely, ask your IT department. Make a point of telling your college IT service when your assessments are scheduled and ask them to alert you to any changes during that period. It obviously makes sense to cultivate good relations with your IT department and find someone you can talk to there. Many central IT departments are still coming to terms with e-learning technology being part of their support remit, and actual arrangements on the ground may still be under negotiation. If things do go wrong due to unannounced systems changes etc., having a clear electronic ‘paper trail’ of consultation and notification will help make clear where responsibility lies.

As we indicate elsewhere, it is important to find out what your students’ skill levels are in relation to using college systems (and those of your teaching colleagues) and take any remedial measures early on.

Learning Technology Support

Leading on from the previous section, if you have access to learning technologists you can ask for their support. This can be especially important when setting up and testing an assessment. Aim to make yourself self-sufficient over time.

Administration systems

Your college administrative systems may be a mixture of different paper and electronic systems, and you need to see how this affects your e-assessments over their whole lifecycle. Paper assessments have well-established processes that do not require much engagement on your part. But with e-assessments you are much more likely to need to be able to trace the flow of information and be prepared to intervene. Again, this is a good reason for doing pilot exercises to iron these things out.

Quality Systems

In Scotland this will centre on the internal and external verification procedures in relation to SQA qualifications. Your college will have established internal verification (IV) procedures that are there to manage and account for any changes to teaching and assessment. Often this will primarily take the form of paper-based records, although increasingly colleges are moving to using online systems that include simple shared network drive folders or tools like SharePoint, Drupal or even ‘private’ areas within the VLE system that are used solely for administrative functions.

The crucial thing is to record your reasons for changing an assessment and to indicate where and how verifiers can see and examine the evidence of learning produced by your students. We have produced a simple and adaptable design template that should help with this process. Before we leave this area, we suggest it is really worthwhile to agree a naming convention for common items such as module, unit and programme qualification, as well as test, MCQ, assignment, dropbox etc. Develop a common structure and layout for online learning resources in the VLE and for the location of assessments. It is worth having an agreed glossary for these terms, getting staff to stick to it, and publishing it in the VLE for the students to refer to as well. It is also worth locating the assessments for a unit consistently in the same place in the online course (it is confusing for lecturers and students alike when they appear all over the place!). These all seem like small things but together they can make your students’ (and your own) experience much easier.
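
A naming convention is also easy to police with a small script. The sketch below checks item names against one possible convention (COURSE-Uxx-TYPE-description); the pattern and the list of assessment types are illustrative assumptions, not a standard – agree your own locally:

```python
import re

# A hypothetical naming convention for assessment items in the VLE:
#   <COURSE>-U<unit number>-<TYPE>-<short description>
# e.g. "HNC1-U03-MCQ-health and safety basics"
ALLOWED_TYPES = {"MCQ", "ESSAY", "DROPBOX", "TEST", "PORTFOLIO"}
PATTERN = re.compile(r"^([A-Z0-9]+)-U(\d{2})-([A-Z]+)-(.+)$")

def check_name(name):
    """Return (ok, message) for one assessment item name."""
    match = PATTERN.match(name)
    if not match:
        return False, "does not match COURSE-Uxx-TYPE-description"
    if match.group(3) not in ALLOWED_TYPES:
        return False, "unknown assessment type %r" % match.group(3)
    return True, "ok"

# Report on a couple of sample names.
for name in ["HNC1-U03-MCQ-health and safety basics", "essay week 2"]:
    ok, message = check_name(name)
    print("%-40s %s" % (name, message))
```

Publishing the convention in the agreed glossary, alongside a checker like this, makes it much easier for staff to stick to it.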

Institutional Factors Mini Checklist

These factors are much more general and in some cases intangible, but they can also be the most important. Here are some things to consider:

  1. Strategy: is there a clear and agreed strategy for the use of learning technology and e-assessment? Is there an implementation plan? (it is not uncommon to have a strategy but no plan for implementing it). Is progress monitored / audited? Are there resources allocated to support this?
  2. Academic leadership: We shall be picking up this theme in our Collaborative Frameworks section later. Is there clear ownership of pedagogic and educational matters at the college, or is it scattered across several units? Is there a unit that deals with this, and is it led by a senior teaching academic?
  3. Morale: What is staff morale like? This can have a big bearing on the appetite for change and openness to trying new things.
  4. Support: Are there adequate resources to support staff in e-learning / e-assessment, in terms of IT infrastructure and learning technologists? Is there access to training and development inside and outside the college?
  5. Finances: The state of the college finances will have a major impact on the other factors and on planning, especially on staffing levels for teaching, learning technology support, equipment and IT infrastructure.

Some Typical Obstacles

Pain Points

Jisc have produced a really useful review of this area, Electronic Management of Assessment (EMA): a landscape review. In a section called ‘Pain Points’ they look at the way different factors are affecting the take-up of e-assessment:

  • Teaching Models
  • Technology
  • Process (administration etc.)
  • Culture (ways of working, reporting, attitudes values etc.)

Their conclusions and observations closely mirror our experience and include:

“The interplay between all of the factors is complex: it is evident that the existing commercial and open source systems do not effectively support all of the existing processes but there are equally some cases where process improvement could clearly be achieved. Similarly, we heard some quite harsh comments about institutional culture but it is clear that experiences with immature or unreliable technologies can turn neutral (or even slightly positive) early adopters into resisters.”

“Staff resistance and attempting to change a long embedded culture are some of the most difficult issues and we have been met with some knee-jerk and excessive reactions.”

Jisc EMA Report

The lack of integration between the VLE and the administration systems is particularly problematic, and is often compounded by the different lines of responsibility, control and resourcing for the VLE, IT and admin systems in many institutions.

Student Skills and Attitudes

Another key barrier is existing student skills and attitudes, together with the preparation and orientation that students may need to undertake e-assessment. We could describe this as the digital literacies needed by students to use the college e-assessment systems effectively. The Heart of Worcester college has produced some award-winning development resources to support students. This often comes as a surprise to people new to adopting e-assessment; the assumption is usually that the skills problems will be with the lecturers. Despite the considerable commercially biased hype that exaggerates the digital abilities of young people[4], the actual research continually paints a very different picture[5]. Jisc have produced a guide to developing student literacies that includes the concept of the ‘7 elements of digital literacy’, illustrated in the diagram below:

[Figure: The Jisc 7 elements of digital literacy. Licence: CC BY-NC-ND]

As one student observed at City of Glasgow College, ‘just because you are good on PlayStation or Facebook does not mean you can use the VLE!’ In our project many lecturers reported problems with students’ IT skills at a basic level when they were required to do things like rename files and upload them to a submission. This was also identified as a problem at UHI (University of the Highlands and Islands), widely regarded as being at the forefront of e-learning in Scotland, who have produced checklists of the basic skills needed to participate, with links to remedial support resources; these are available in the Further Information section of the project website.

Staff Skills and Attitudes

The Jisc ETNA surveys of teaching staff skills in FE in Scotland suggest that there is growing confidence in using computers and common ‘office’ applications. But the surveys also show markedly less experience and confidence in using online web-based systems like VLEs and e-Portfolios. This is backed up by the work of the Borders College BOLT project:

“Research suggests that most academics are not using new technologies for learning and teaching, nor for organising their own research.” (ref: New Media Consortium Horizon Report 2013)

In our project this observation certainly fitted with our experience: we found that lecturers were not confident in using the VLE or e-portfolio systems. The situation is further complicated by the number of online systems lecturers might have to master. The Jisc Electronic Management of Assessment (EMA) report observes:

“The key systems are generally:

  • Student record system: as the home of definitive grading information.
  • VLE: used for feedback and marking.
  • Dedicated assessment platforms: with the submission, originality checking, feedback and marking functionality in the Turnitin product suite being widely used.
  • e-Portfolio.

[But] Lack of systems integration means that we do not have an end-to-end EMA experience. Students and staff have a disjointed experience and require much more guidance than should be needed …

Despite the relatively limited nature of the core product set, the key integration points between these technologies remain problematic and a source of considerable manual intervention. The sheer amount of administrative effort required to transfer data between systems is a real problem. Returning marks from the VLE to the student information system is a distant hope.”

So, we still have a long way to go in terms of systems integration and data management, with multiple paper and electronic systems still in use, although Jisc are currently researching a feedback hub to improve matters[6].
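
Until better integration arrives, moving marks between systems is often a manual CSV exercise, which a short script can at least make repeatable. In the sketch below every column name is an assumption – inspect the real export from your VLE and the real import specification for your records system before relying on anything like this:

```python
import csv
import io

# Hypothetical layouts: a real VLE gradebook export and a real student
# records import will each have their own column names - check both.
vle_export = """Email address,First name,Surname,Assignment: Essay 1 (Real)
jsmith@example.ac.uk,Jane,Smith,67.50
bjones@example.ac.uk,Bill,Jones,54.00
"""

# Re-shape each exported row into the fields the records system expects.
records_rows = []
for row in csv.DictReader(io.StringIO(vle_export)):
    records_rows.append({
        "student_email": row["Email address"],
        "mark": round(float(row["Assignment: Essay 1 (Real)"])),
    })

# Write the import file (here to an in-memory buffer for illustration).
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["student_email", "mark"])
writer.writeheader()
writer.writerows(records_rows)
print(out.getvalue())
```

Keeping such a script, with its column names visible at the top, also contributes to the electronic ‘paper trail’ discussed earlier.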

Usability Factors

The tools provided in-house by colleges for lecturers and students may be old in terms of technology and design and can suffer from usability issues. To be fair, the tasks that the tools support are in some cases quite complex themselves – particularly on the teaching side. That said, the usability of the tools is an issue and does affect the take-up of these technologies. Once an institution has adopted these tools there is little appetite for changing them, which in turn reduces the incentive for improvement amongst suppliers, so this situation is unlikely to change quickly. This is why you need to use the tools yourself and experiment with them to get to know them and their limitations. In general the usability issues affect teachers much more than students, because of the complexity of setting up assessments – with dates, times and access conditions being particularly problematic, partly due to the terminology used.

It is particularly important to be able to see the systems from a student’s point of view. Some software products have a ‘student view’ function (the one in Blackboard is particularly good), but it is also sensible to have some test student accounts that allow you to step through your assessments exactly as a student would and to record student data in the system.

Beginning to Develop Creative and Systematic Solutions

At this stage you should have a good idea of your own working context, and you should have gathered quite a bit of information and analysed it. Most importantly, you should have a clearer idea of the limitations that you face. This might all seem a bit overwhelming, but we reckon identifying these factors early on will provide a solid foundation for progress and reduce wasted time and frustration later on.

At this stage you should be well on the way to developing a critical and systematic approach to working in this area and seeking to interpret your findings in order to develop your own analyses of what may be possible and what may be useful to your students and yourself. So, that covers the systematic component needed for developing effective solutions. In the next section we start to explore the creative dimension of developing effective e-assessments.

[1] The College Development Network, Jisc, ALT and the various user groups all hold events – see the Further Information section.



[4] Jisc have produced a useful guide about developing student digital literacy:


