With over 60 presentations from across the globe, this year’s Day and Online Conferences are set to be the largest yet. Make sure you don’t miss the unique experience that is eAssessment Scotland – the UK’s largest conference dedicated to exploring the best examples of eAssessment in the world today.
Day Conference (23rd August)
Karl Leydecker, VP (Learning and Teaching), University of Dundee
Assessment in Open Spaces
Catherine Cronin, National University of Ireland
Building Online Maths and Stats Assessments Using Numbas
Bill Foster & Christian Perfect, Newcastle University
Using MCQs in e-Assessment
Professor Phil Race (Independent author and workshop facilitator, also Visiting Professor, University of Plymouth)
Making Assessment Mobile
Lorraine Berry, OneFile
a. Edinburgh College Increases their Use of e-Assessment Using Questionmark
Ivan Forward, Questionmark and Gavin Lang, Edinburgh College
b. One-size Does Not Fit All! Flexible Approaches to Institutional Adoption of eSubmission and eFeedback
Laura Hollinshead, University of Derby
a. Video Assessment of Psychomotor Skills in Online Learning: Professional Fitness Practice
Professor Lon Kilgore, University of the West of Scotland
b. Digital Signatures
Thomas Davenport, Aston University
TBA, JISC TechDis
b. Can Your Assessment Platform Do This?
Sue Milne, Mary McVey & Niall Barr, University of Glasgow & Paul Neve, Kingston University
Simon Williams, Chair of the eLearning Alliance
Structured Chaos, Learning Webs – and Assessment
Helen Keegan, University of Salford
Open Badges: Assessment for Learning
Steve Sidaway, MyKnowledgeMap and Doug Belshaw, Mozilla Foundation
Who Asks the Questions? Peerwise for the Vets
Denis Duret, University of Liverpool
a. What’s So Special about eAssessment?
Professor Sally Brown, Independent Consultant
b. What IF We Made Learning More Interactive?
Jon Hilton (Medical Student), St George’s University of London
a. Mahara Groups for Assessment
Domi Sinclair, University College London
b. Out with the Old and in with the New: How Using ‘Old’ Clickers in New Contexts Meets the Demands of Techno Savvy Students
Karen Brogan & Fraser McLeish, Glasgow Caledonian University
a. Certainty-Based Marking: Benefits of a Switch to CBM in Self-tests & Exams
Tony Gardner-Medwin, University College London
b. Using Appropriate Technology in Assessments for Learning
Dr. Barry Ryan, Dublin Institute of Technology
Anyone Need a Can Opener?
Fiona Leteney, Bupa
Erica Morris, Higher Education Academy
Many educators are adopting increasingly open practices, challenging conventional forms of learning, teaching and assessment. Social and participatory media are playing an ever greater role.
The focus of this presentation is on educators and students – our digital identities and pedagogical relationships in online spaces, particularly with respect to assessment and feedback. How do we use the affordances of social and participatory media in designing learning activities, structuring assessment, and enabling feedback? How can open practices be used to maximise student voice and choice, particularly with respect to assessment? And how can we, as educators, optimally engage with students in open online spaces, in what Joi Ito calls “networks of distributed creativity”?
Catherine will explore these questions in the context of current practices of open education, both within and outside higher education – drawing on research, highlighting examples, and inviting discussion.
Social and open platforms and practices allow learners and educators to break through traditional institutional boundaries, leading to increasing opportunities for international collaboration and the development of personal learning networks. Many educators are developing and adopting pedagogies that make the most of these new spaces, connecting their learners with peers and experts across the globe.
These ‘learning webs’ can lead to – and even thrive on – structured chaos. However, in a consumerised HE culture where many learners seem increasingly assessment-driven, focusing more on the grade achieved than on the learning that takes place, there are significant tensions when dealing with assessment and feedback in new learning spaces.
In this presentation, I will highlight the need to provide opportunities for learners to move BEYOND assessment, focusing on curiosity and opportunity in a networked world.
On 26th April 2013 the landscape for online learning changed forever! A project code-named ‘Tin Can’ came to fruition with the launch of a new e-learning standard. Officially known as the Experience API, the standard is stewarded by the US Department of Defense, but is still affectionately known as the Tin Can API.
E-learning standards have been around since the late 1990s and have not always lived up to expectations, so why is there so much hype now? Well, I’ve been in the e-learning industry since 2000 and I can safely say I’ve never been so positive and excited about the future of our industry. Since joining Bupa 18 months ago, I’ve recognised some of the potential opportunities that these new standards give us.
At the e-Assessment Scotland conference I’m looking forward to sharing:
- The Bupa 2020 vision, how our learning technologies will enable us to achieve it
- What on earth is Tin Can and how are Bupa involved
- The amazing art of the possible for eAssessment and how Bupa intend to take advantage of the new world
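At its core, the Experience API records learning events as simple actor–verb–object statements sent to a Learning Record Store. As a rough illustration of that idea (the field names follow the public xAPI specification, but the learner, activity and score below are invented for this sketch), a statement might be assembled like this:

```python
import json

# A minimal xAPI ("Tin Can") statement: actor - verb - object.
# The learner, activity ID and score are invented for illustration.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/first-aid-module",
        "definition": {"name": {"en-US": "First Aid Module"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
}

# A Learning Record Store (LRS) would receive this as JSON over HTTP.
print(json.dumps(statement, indent=2))
```

Because any activity that can emit such a statement can be tracked, learning no longer has to happen inside a SCORM package in a VLE to be recorded.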
Description to follow.
Numbas is a freely available web-based e-assessment system which makes it easy to create, produce and deliver powerful, reusable and flexible online maths and stats testing. The system has been developed by a team at Newcastle University in response to widespread frustrations with “off the shelf” VLE integration of maths notation in question authoring and students’ answers and a desire for more sophisticated randomisation of questions.
Free to use and open-source with a user-friendly interface, Numbas allows you to build tests easily from freely available questions or to author your own questions and tests which include randomised content throughout and LaTeX rendering of mathematical notation. Tests provide instant feedback and review to the student, including rendering of the student’s answers in mathematical notation while they type, and a practice mode which lets students regenerate randomised questions for further practice without starting a new session.
Tests can be delivered as stand-alone web pages or through a number of VLEs, including Blackboard and Moodle, which can also track student progress and results. See the Numbas website for a more complete introduction.
You will learn how to use the Numbas question and test database to create your own mathematics and statistics tests from the more than 600 questions available. Once created, these can be made instantly available to your users.
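The key idea behind randomised questions is that each attempt regenerates the question from fresh parameters. Sketched here in Python rather than in Numbas’s own authoring tools, and with an invented example question, the pattern looks like this:

```python
import random

def make_linear_question(rng=random):
    """Generate one randomised 'solve ax + b = 0' question.

    Each call draws fresh parameters, so a student can regenerate
    the question for further practice (the core idea behind
    randomised e-assessment; this example question is invented).
    """
    a = rng.randint(2, 9)
    b = rng.randint(1, 9)
    prompt = f"Solve {a}x + {b} = 0 for x."
    answer = -b / a
    return prompt, answer

prompt, answer = make_linear_question()
print(prompt)
```

In a real Numbas question the randomised variables would also feed the marking and the worked-solution feedback, so every regenerated variant is fully marked and explained.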
Who is this of interest to?
- Teachers, instructors, academics and support staff who think assessing numeracy, maths and stats skills online should be easy for instructors and productive for students, whether for practice, formative or invigilated summative assessment.
- Anyone who has tried and been frustrated with maths notation in other e-assessment tools.
- Those wishing to convert static maths and stats problems into dynamic randomly generated question sets.
- Anyone wanting to find out more about Numbas to evaluate its suitability compared to other systems.
While it is relatively straightforward to design multiple choice questions online for feedback purposes, it is a much harder task to use such approaches where the results count towards summative assessment, not least because of the need to consider item validity and reliability in terms of question facility and discrimination index values. There is also the difficulty of authenticity and security if the assessment is asynchronous.
The workshop will use the experience of participants to generate a checklist for the design of multiple choice questions for use in online assessment. Participants may enjoy self-assessing their multiple-choice technique using a ‘content-free’ test which originated from Dundee many years ago.
What’s the future for assessment? Paper portfolios are rapidly becoming a thing of the past with eportfolios becoming the standard!
This workshop demonstrates OneFile Nomad — the most advanced mobile app available for learners and assessors. With Nomad you can complete assessments anywhere. Even without an internet connection. We’ll show you how to capture electronic evidence, assign criteria, and sign off assessments using a touch screen.
In this session, you’ll see how to:
- Complete an assessment on a mobile or tablet
- Sync progress back to an eportfolio
- Enrol your learners with our new app – OneFile Ignite
We’ll also cover:
- Assessor eportfolio reporting
- Return on Investment
Capture offline and sync online… it’s simple!
This session will give an overview of Mozilla Open Badges, a new way to credential both informal and formal learning. Run by Steve Sidaway from MyKnowledgeMap and Doug Belshaw from the non-profit Mozilla Foundation, participants will come away knowing more about Open Badges, how they can support assessment for learning, and how they can implement them in their organisation/institution.
Examples will be drawn from medical education and teacher training looking at how badging of achievement can help encourage professional learning competence, confidence and learning behaviours.
The session will benefit those in charge of e-Assessment systems, Learning Technologists and lecturers, as well as those interested in new forms of accreditation. No prior knowledge is required or expected.
One of the hardest things about the assessment process is writing the questions for your students. Even the humblest multiple choice question requires you to have a sound understanding of the subject (when it comes to explaining the answer in the feedback!). Making sure that the question is valid, reliable and will challenge your students means that writing a good question is ultimately a test of your own knowledge, skills and experience.
So why not give the task to students?
Peerwise is a free online service that facilitates the process of students writing and sharing their own questions. Denis Duret has been using Peerwise with veterinary students at the University of Liverpool and will share his experiences of the tool and of how to improve the quantity and quality of the questions students create. During the workshop, you will have the opportunity to author your own questions within the Peerwise platform.
Description to follow.
Edinburgh College works to meet the education and training needs of its 20,000 students, local industries and wider community. Since incorporating Questionmark into their learning system in November 2009, Edinburgh College has authored over 5,000 questions with well over 50% of their e-Assessments authored and delivered using Questionmark.
Delivering diagnostic, summative and formative assessments, Gavin Lang, the college’s Qualifications and Standards Manager, will explain the various ways the system has helped to demonstrate a learner’s success, using features such as the Questionmark Moodle Connector and Questionmark Secure.
How do you get academic staff at an entire institution using electronic submission and feedback? This presentation will provide an overview of the approach taken by the Learning Technology Team as the University of Derby went through a phased rollout of electronic submission and feedback from 2011 to 2013. The university now requires that, where possible, all student assignments be submitted electronically and supplied with electronic feedback. This includes text-based work, presentations, portfolios, images, and a variety of other types of assessed work.
Supporting these changes can be challenging. The presentation will focus on how staff can be supported centrally through this transition, whilst still recognising the diverse range of needs across the university, and in particular how important it is to help staff develop practices which support their assessment design, their students and themselves. It will explain how a flexible approach to staff development helped to tackle the issues staff faced when adopting these new practices.
The types of issues include:
- Understanding which type of submission point to set up (we have more than one to support different file types and forms of feedback)
- Moving towards unfamiliar practices and systems
- Providing electronic feedback for work that was not submitted electronically
- Understanding the Turnitin originality report
- Assisting staff with accessibility needs to develop practices which support them and their students
As well as a description of the approaches taken, the presentation will include an overview of their advantages and disadvantages. It will also highlight how some ‘electronic submission myths’ developed during this process and how the team is still tackling these as part of its ongoing support. Attendees will hopefully benefit from understanding the experiences of the Learning Technology Team and be able to reflect on how these methods might work at their own institution.
Internet-based educational opportunities have grown exponentially since the creation of the first web-based course (1984) and the first fully accredited online university (1996). In the USA, over 6,100,000 students studied at least one wholly online course in 2011, up from approximately 1.5 million in 2002. Further, nearly two thirds of all higher education institutions claim online education as a central part of their long-term institutional strategy (Allen and Seaman, 2011). This level of demand and support has created significant pressure on academics to transform traditional classroom courses into internet-based courses.
While certain disciplines and curricula are well suited to and well represented in the growing global inventory of online courses, disciplines and courses with significant psychomotor skills and learning objectives are under-represented. Such courses are generally considered inappropriate for online learning, with the innate difficulty of student assessment given as the rationale. Specific to this presentation, online courses that teach future fitness professionals how to carry out instructional tasks are limited in number. Those that exist provide video instruction and assess student knowledge, generally quite successfully, but do little to assess tangible professional skills, the central element of students’ proposed employment.
The most common means of assessing skills is in the form of laboratory reports submitted online after a distance-based activity with a subject. While psychomotor skills are presented to the student and practised with this method, an important element of assessment is missing: direct visual assessment of student abilities by an instructor. This presentation discusses the development of, and demonstrates the use of, an alternative to face-to-face psychomotor assessment: student-generated video recordings of themselves performing specific psychomotor skills (demonstrating and teaching), done as part of a formal assessment strategy within a fully online course for future commercial fitness professionals.
As electronic submission of coursework and exams becomes more common it is important that institutions use an appropriate level of security to manage the lifecycle of the submitted work.
This presentation describes a system that applies a digital signature to work as it is received. It gives the student a receipt and identifies whether the work was submitted on time or late. The system applies additional signatures to the work as it passes through each stage of collection, marking, moderation, exam boards and issue of the final marks. Each signature binds the previous one, creating a non-repudiable audit trail that can verify, for each item of work at each stage from receipt to results, what happened, when it happened, and whether the work has been changed.
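The chained-signature idea can be sketched as follows. This is a simplified illustration, not the Aston system: it uses keyed HMACs in place of full public-key signatures, and the stage names and keys are invented. The essential property, each stage signing the work together with the previous signature, is the same.

```python
import hashlib
import hmac

# In practice each stage would hold its own private signing key;
# these HMAC keys are stand-ins for illustration only.
STAGE_KEYS = {
    "receipt": b"receipt-key",
    "marking": b"marking-key",
    "moderation": b"moderation-key",
}

def sign_stage(stage, work_bytes, previous_sig):
    """Sign the work hash together with the previous signature,
    so each stage's signature binds the one before it."""
    msg = previous_sig + hashlib.sha256(work_bytes).digest()
    return hmac.new(STAGE_KEYS[stage], msg, hashlib.sha256).digest()

def audit_trail(work_bytes, stages):
    """Build the chain of signatures as the work passes each stage."""
    sig, trail = b"", []
    for stage in stages:
        sig = sign_stage(stage, work_bytes, sig)
        trail.append((stage, sig))
    return trail

def verify(work_bytes, trail):
    """Recompute the chain; any change to the work, the order of
    stages, or an earlier signature breaks every later signature."""
    sig = b""
    for stage, recorded in trail:
        sig = sign_stage(stage, work_bytes, sig)
        if not hmac.compare_digest(sig, recorded):
            return False
    return True

trail = audit_trail(b"essay v1", ["receipt", "marking", "moderation"])
assert verify(b"essay v1", trail)
assert not verify(b"essay v2", trail)  # tampered work is detected
```

Because each signature covers its predecessor, no stage can be repudiated or re-ordered after the fact without invalidating everything downstream.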
Description to follow.
In this presentation, we demonstrate working examples of innovative styles of question, which cannot be produced or delivered by the vast majority of current assessment platforms. The tools we are using can be used in Moodle 2, Blackboard 9, Sakai and many other VLEs. The questions have the potential to be useful across a variety of disciplines, and we shall ask participants how they would use them in their teaching.
We show, first of all, a scenario question with several different but related inputs. This can be as elaborate as the author wishes, limited mainly by their willingness to devise feedback relating to the various answers given by students.
Incorporating graphical elements into the mix, we have a question which asks the user to trace routes on a map, although this may not be the use to which you would put it!
Sequencing questions are used in Life Sciences for topics such as the life cycles of organisms, or antibody reactions; but they have the potential to be easily applicable in many other disciplines. A series of options are provided, some of which may not be relevant to the topic. The student must put the options that are required into the correct order. Marks are gained for selecting the correct options and lost for selecting irrelevant options. Additionally, marks are gained for identifying the correct order.
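A marking rule of the shape just described might be sketched as follows. The weights are invented for illustration, not taken from the Glasgow tools or any particular platform:

```python
def score_sequence(chosen, correct_order, distractors,
                   pick_mark=1, distractor_penalty=1, order_bonus=2):
    """Score a sequencing question: marks for choosing relevant
    options, a penalty for choosing irrelevant ones, and a bonus
    when all relevant options are chosen in the correct order.
    All weights here are illustrative assumptions."""
    relevant = [c for c in chosen if c in correct_order]
    score = pick_mark * len(relevant)
    score -= distractor_penalty * sum(1 for c in chosen if c in distractors)
    # Order bonus: the relevant picks, in the order chosen, must be
    # complete and match the correct sequence.
    if relevant == correct_order:
        score += order_bonus
    return max(score, 0)

life_cycle = ["egg", "larva", "pupa", "adult"]
print(score_sequence(["egg", "larva", "pupa", "adult"],
                     life_cycle, distractors=["seed"]))  # full marks: 6
```

A student who picks the right stages in the wrong order would still earn the selection marks but forfeit the order bonus, matching the two-part marking described above.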
We shall demonstrate an editor which can create some, but not all, of the examples, and suggest how the other questions might be accommodated in the editor, which is, of course, usable for any discipline.
We invite participants to design new input types, create new components for the editor or simply to join our community of users of adventurous assessment.
With the advent of MOOCs and other innovative curriculum formats, an under-addressed area is how to assess learning in FE, HE and other contexts. Amid much excitement about how eAssessment can solve a number of technical issues, particularly rapid turnaround of marks and feedback, and reducing the drudgery of hand-written comments, the underlying issues remain as much open to discussion as when using traditional assessment formats.
This session will dip a toe into exploring some key questions including:
- What kinds of assignments lend themselves to good eAssessment?
- Can they cope with a wide variety of assessment methods including group assessment, live assignments, practical work and so on?
- How can we really ensure that learning has taken place?
- What kinds of activities provide authentic eAssessment experiences that accurately measure learning rather than being proxy measures that assess what is easy to assess?
- How can we assure the quality of eAssessment?
- How can we satisfy ourselves on such issues as plagiarism and assessee identity?
Problem-Based Learning (PBL) places the medical student in the role of a doctor. At each twist in the story, the student is presented with a list of choices, and I believe deciding what goes on that list is more important than actually picking a choice from that list.
Logical clinical reasoning is one of the most important skills a foundation doctor can possess, and PBL does a lot to prepare students for that role. At present, however, PBL cases give the student a menu of choices at each twist in the story… but what if you wanted something that’s not on the menu?
Using a set of open source tools, I’ve developed an Interactive Fiction (IF) approach which allows medical students to direct the scenario through a series of natural language instructions typed into the interface. Delegates signing up for this session will have the opportunity to access a demonstration exercise before the conference.
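The interactive-fiction idea of matching free-typed instructions against a set of known actions can be sketched very simply. This toy is not the St George’s exercise or its tools; the actions and responses below are invented:

```python
# Toy interactive-fiction handler: free-typed instructions are
# matched against known clinical actions by verb and noun keywords.
# Actions and responses are invented for illustration.
ACTIONS = {
    ("check", "airway"): "The airway is clear.",
    ("take", "history"): "The patient reports chest pain for two hours.",
    ("order", "ecg"): "The ECG shows ST elevation in the anterior leads.",
}

def respond(command):
    """Return the scenario's response to a typed instruction,
    or a fallback when no known action matches."""
    words = command.lower().split()
    for (verb, noun), response in ACTIONS.items():
        if verb in words and noun in words:
            return response
    return "You can't do that here. Try another action."

print(respond("order an ECG"))
```

Unlike a fixed multiple-choice menu, the student must first decide *what* to do; deciding what belongs on the list is exactly the reasoning step the abstract argues matters most.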
This presentation will explore how Mahara groups can be used in many different ways for e-assessment. It will touch upon the various methods including; individual or group submissions, peer-assessment and content generation.
The many tools in Mahara make it well suited to group work, including the ability to collaborate on pages, a shared files area and discussion forums. Mahara groups are more than a collaborative space, though: they can also be used for the private submission of individual work to a course group. Integration with the institutional Virtual Learning Environment (VLE) also makes Mahara a place to create submissions that are then submitted centrally to the VLE.
The presentation will utilise some case studies and examples to demonstrate how these methods might be used in teaching and learning practice. Finally there will be a brief discussion of the benefits and disadvantages of using Mahara for group work.
Today’s students are “techno savvy” and technology-equipped, with increased expectations for technology in the classroom. With this in mind, and with 400 ‘old’ clickers in the School of Health and Life Sciences, the Learning Technologist Team devised and offered a new service to encourage the use of this simple but very effective piece of feedback technology.
The presentation will look at how the service was developed and how the clickers were utilised to good effect in new contexts. The various implementations include:
- How flipped teaching was introduced in a Level One Diagnostic Imaging module using clickers as a means of formative assessment and the positive impact this has had.
- How the TurningPoint/Powerpoint clicker slides were voiced over, converted to web based HTML5 and mounted in Blackboard for revision purposes.
- How Physiotherapy delivered the National Student Survey.
- How NHS 24 used clickers at a recent lecture to 200 final year student nurses.
- How a clicker quiz was developed for Ophthalmic students from Peerwise, a peer assessment program in which students create and rate their own questions.
- How clicker technology will be used in Glasgow and Oman simultaneously, with Physiotherapy students responding via clickers, an app or a web URL.
- How a built-in feedback wall can be used in a similar way to Textwall/Padlet (Wallwisher).
Vote with your feet and come along to see how the old clicker dog of instant feedback has learned new tricks.
CBM is a mark scheme for right/wrong answers that motivates students to give an honest judgement of their degree of certainty that each answer is correct. Its use in London (www.ucl.ac.uk/lapt), mainly so far with medical and science students, has provided data supporting three valuable conclusions:
- CBM is readily accepted by students as a fair and stimulating mark scheme and they rapidly learn to gain reward from both the proper acknowledgement of areas of uncertainty and the identification where possible of reasons for confidence. This encourages reflective learning.
- Use of CBM in exams enhances reliability and predictive power in relation to student ability, without compromising the provision of conventional standard setting and analytic data.
- CBM can be readily implemented with existing question banks, using appropriate tools for study use or for exams, and with any of the common right/wrong question formats.
The presentation will discuss the mechanics, cautions and benefits of switching to CBM. It is important to know what you do and don’t know. We shouldn’t reward lucky guesses on a par with knowledge, and confident misconceptions need to be recognised as barriers, worse than ignorance, to study and performance.
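The arithmetic of the scheme is simple. A sketch, using the weights published for the UCL LAPT scheme as I understand them (certainty levels 1/2/3 score 1/2/3 when correct and 0/-2/-6 when wrong):

```python
# Certainty-Based Marking weights as published for the UCL LAPT
# scheme (to the best of my understanding): certainty 1/2/3 scores
# 1/2/3 when correct and 0/-2/-6 when wrong.
REWARD = {1: 1, 2: 2, 3: 3}
PENALTY = {1: 0, 2: -2, 3: -6}

def cbm_mark(correct, certainty):
    """Mark one right/wrong answer given a certainty level of 1-3."""
    if certainty not in REWARD:
        raise ValueError("certainty must be 1, 2 or 3")
    return REWARD[certainty] if correct else PENALTY[certainty]

# A confident wrong answer costs more than admitted uncertainty:
assert cbm_mark(True, 3) == 3
assert cbm_mark(False, 3) == -6
assert cbm_mark(False, 1) == 0
```

With these weights the expected mark at certainty levels 1, 2 and 3 is p, 4p − 2 and 9p − 6 for a probability p of being correct, so level 2 pays only above p = 2/3 and level 3 only above p = 0.8. Honest self-assessment is therefore the mark-maximising strategy, which is what makes the scheme motivating rather than punitive.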
The session will describe the integration of several technologies (including PeerWise, Clickers, Online MCQs) into a First Year Foundation Organic Chemistry module with the specific aims of enhancing the use of technology in both formative and summative assessment, fostering a community of self-learning, and providing the students with real and virtual spaces to engage with each other and the content both synchronously and asynchronously.
The session will be a mixture of presentation, workshop and personal reflection. The presentation will be non-linear (using Prezi software); it will provide an initial overview of the pedagogy described and examples of the presenter’s research in this area, and will act as a springboard for audience participation. Examples of engaging during class and ‘assessment for learning’ e-tivities will be described and used in a semi-workshop mode to involve the audience.
The session will conclude with a look towards the presenter’s future use of technology in assessment (e.g. network maps, eportfolios, reflective writing), followed by audience opinion, questions and answers.