Quizzes in Quercus
My Experience
To add to Instructors' Insights on Crowdmark, I have used Quizzes in Quercus (Classic, not New Quizzes) weekly since September 2018 with between 150 and 850 students in ECO220Y1Y, and, prior to that, quizzes in the Blackboard portal since 2015. This page shares some of what I have learned over six years of experience and compares and contrasts Quizzes in Quercus with Crowdmark.
Are You New to Quizzes?
If you are new to quizzes, you can start with a 4-minute video from Canvas. Further, the Faculty of Applied Science and Engineering at the University of Toronto has a great guide: FASE's Education Technology Office: Quercus Quizzes. There is also a quick guide under Online Assessments on the A&S Keep Teaching site, and a more detailed guide on the A&S IIT page Creating Quizzes. You can also see the Quizzes section of the Canvas Instructor Guide (detailed instructions with screenshots).
Fortunately, there are now more resources (see the previous paragraph) than when my TAs and I first figured this out. Also, remember that you can use your Quercus Sandbox (enrolling a TA as a student) to test out features.
Case Study: ECO220Y
This case study covers most practical issues with quizzes: question types, marking, versions, setup, accessibility, and academic integrity. It may help to start as students do: see Section 11.2.2 on pages 5 to 6 of the 2019/20 syllabus. (I also use Quercus Quizzes for the DACM component of the marking scheme on page 3, which consists of five Quercus quizzes spaced over the academic year.)
Students can typically start a quiz at a time of their choosing within a multiple-day window but have a fixed time limit (e.g. 60 minutes) to finish once they click the Begin the Quiz button. I find a timer is important to focus attention and to make sure that students start the quiz prepared: having completed the reading, attended the lecture, and solved the ungraded homework.
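As an aside for those who script course setup: these same settings can be set programmatically through the Canvas REST API. Below is a minimal sketch, assuming a hypothetical course ID and API token; this is my illustration of the endpoint, not part of my actual workflow.

```python
# Minimal sketch: create a timed quiz with a multi-day availability window
# via the Canvas REST API (POST /api/v1/courses/:course_id/quizzes).
# BASE_URL, TOKEN, and COURSE_ID are placeholders for your own values.
import requests

BASE_URL = "https://q.utoronto.ca"   # your Canvas instance
TOKEN = "YOUR_API_TOKEN"             # generated under Account > Settings
COURSE_ID = 12345                    # hypothetical course ID

resp = requests.post(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/quizzes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"quiz": {
        "title": "Weekly Quiz 1",
        "quiz_type": "assignment",
        "time_limit": 60,                       # minutes once the student begins
        "unlock_at": "2020-09-14T09:00:00Z",    # start of the multi-day window
        "due_at": "2020-09-17T23:59:00Z",
        "lock_at": "2020-09-17T23:59:00Z",      # end of the window
        "published": False,                     # publish later, once fully set up
    }},
)
resp.raise_for_status()
print(resp.json()["id"])
```

Setting unlock_at and lock_at a few days apart while keeping time_limit at 60 reproduces the pattern described above: students start at a time of their choosing within the window but face a fixed duration once they begin.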
Further, I use multiple versions of each question. For example, in a weekly quiz with eight questions, I typically have four to eight versions of each question. This means that any collaboration needs to be substantive, not copying. For example, with eight questions each with six versions, there are 1,679,616 possible unique quizzes (6 to the power 8). Each student gets one random draw from those possibilities.
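The count of unique quizzes is simply the product of the number of versions per question. A quick check of the arithmetic above:

```python
# Eight questions with six versions each: the number of unique quizzes
# is the product of versions across question positions.
from math import prod

versions_per_question = [6] * 8      # six versions for each of eight questions
print(prod(versions_per_question))   # 1679616
```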
I use machine-marked questions: for example, multiple choice (with any number of alternatives), numerical answers (with a margin of error), and fill-in-the-blank. The Canvas Guide titled “How do I create a quiz with individual questions?” gives the full list of options, and the A&S IIT page Creating Quizzes has a nice, detailed walk-through of all question types (with screenshots). Given the nature of the course, I use the numerical answer format the most. These questions are easier to write than multiple choice because you do not need to solve the question all of the wrong ways to come up with plausible alternatives. With numerical answer questions, you do need to think carefully about rounding and the margin of error, and clearly communicate precision expectations to students.
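Conceptually, a numerical answer with a margin of error is just an absolute-tolerance check. Here is a minimal sketch of that logic (my illustration, not Canvas's actual code), which also shows why the margin and the rounding instructions need to match:

```python
def accept_numeric(response: float, correct: float, margin: float) -> bool:
    """Accept a numeric response within +/- margin of the correct value."""
    return abs(response - correct) <= margin

# If students are told to round to two decimals, a margin of 0.005
# tolerates their rounding without accepting genuinely wrong answers.
assert accept_numeric(3.14, 3.14159, margin=0.005)
assert not accept_numeric(3.15, 3.14159, margin=0.005)
```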
Open-ended (essay or file-upload) questions can be manually graded with SpeedGrader, but it is cumbersome to do anything more than assign a score (i.e. it is clunky to add comments), and it does not work well for large courses with a team of marking TAs. That said, I know of respected colleagues who still find SpeedGrader useful in large courses.
The questions hold up extremely well and do not lose value over time. I refresh about 5 to 15 percent of my question bank each year, but that is primarily to keep the questions well aligned as the course evolves, not because the questions get out there and the marks start rising. This is likely helped by the fact that I have many versions of each question: I started with 3 or 4 versions of each, but that has since grown to 6 and, in a few cases, 10. In contrast, if you attempt to use questions from a textbook publisher's test bank rather than your own original questions, you should expect that a searchable database of those questions is already public.
Of course, you may worry about cheating on online quizzes with machine-marked questions. Consistently over five years, we have found that weekly online quizzes are a surprisingly good predictor of invigilated work. (For those interested in the correlation analysis, see Quercus Quizzes: Correlations with Invigilated Work.) This is despite the fact that we explicitly allow reasonable collaboration among students in the course (i.e. working with other people in the course is not cheating on Quercus quizzes). You can see my definition of reasonable collaboration in Section 11.2.2 on pages 5 to 6 of the 2019/20 ECO220Y syllabus. (Interestingly, when we've asked students in anonymous surveys, more than half prefer not to collaborate.) Further, these correlations have not been weakening across academic years, even though many of the question versions are reused. Of course, these quizzes are spread over many weeks and are not high-stakes. The high correlations are consistent with the conclusion that this weekly, uninvigilated, machine-marked work can give students useful and frequent feedback on how they are progressing. It is perhaps surprising that these are such excellent predictors of invigilated and entirely human-marked term tests that focus on reasoning via equations, figures, and considerable writing (delivered via Crowdmark Administered Assessments up until March 2020). Again, see Quercus Quizzes: Correlations with Invigilated Work.
In terms of logistics, I prepare questions and answers in a plain-text document and e-mail them to my Head TA, who inputs them and sets up the parameters (points, time limit, due date, etc.). How much work the input step is depends on the number of questions, the style of questions, and whether you want randomization. See Quercus Quiz: Instructions for Head TA.
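To give a concrete picture, here is a hypothetical excerpt of what such a plain-text hand-off might look like; this is my illustration, not my actual template, and the format is whatever you and your Head TA agree on:

```
Quiz 3 (8 questions, 1 point each, 60-minute limit, window: Mon 9:00 - Thu 23:59)

Q1 (numerical answer, margin of error 0.005, students round to 2 decimals)
  v1: Given x-bar = 4.2, s = 1.5, n = 36, compute the standard error.  ANS: 0.25
  v2: Given x-bar = 5.1, s = 2.4, n = 64, compute the standard error.  ANS: 0.30
  ...
```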
I also have the Head TA handle accommodations. To learn about those, you can start with the CTSI page titled Online Tests and Exams and the UTSC guide Adding Extra Time to Student Quiz Attempts. You can also see https://q.utoronto.ca/courses/85981/pages/quiz-availability-and-extra-time. It is counter-intuitive that you have to "Publish" your quiz before you can program accommodations. (See Once I publish a timed quiz, how can I give my students extra time?) However, you can set the availability window to be in the future. Hence, even though you have to publish before you've actually finished setting up the quiz, your students cannot access or see your unfinished quiz. (Obviously, remind your Head TA to be super careful in setting the availability dates and times.) Again, see Quercus Quiz: Instructions for Head TA.
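If your Head TA is comfortable with scripting, extra-time accommodations can also be granted in bulk through the Canvas REST API's quiz extensions endpoint. Below is a minimal sketch; the IDs, token, and student list are placeholders, and, as in the UI, the quiz must already be published before extensions can be set.

```python
# Minimal sketch: grant extra time (e.g. 30 extra minutes on a 60-minute
# quiz) to a list of students via the Canvas Quiz Extensions endpoint:
# POST /api/v1/courses/:course_id/quizzes/:quiz_id/extensions
import requests

BASE_URL = "https://q.utoronto.ca"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID, QUIZ_ID = 12345, 67890    # hypothetical IDs
ACCOMMODATED = [111, 222, 333]       # hypothetical Canvas user IDs

payload = {"quiz_extensions": [
    {"user_id": uid, "extra_time": 30}   # extra_time is in minutes
    for uid in ACCOMMODATED
]}
resp = requests.post(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/extensions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```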
Quercus Quizzes versus Crowdmark
I view Quizzes in Quercus and Crowdmark as both complements and substitutes, but more as complements. Overall, I definitely need both.
Two advantages of Crowdmark are the ease of adding meaningful comments on students' work and managing a TA team for open-ended questions that require human marking.
Quercus quizzes are best if you can translate what you want to ask into machine-marked questions. Of course, this precludes having students answer with equations, graphs/diagrams, or writing. The speed of returning work (given machine marking) makes Quercus quizzes highly effective when you want to give students fast feedback, and it is ideal for weekly assessments without breaking your TA budget. Opinions about the pedagogy of machine-marked questions vary widely. However, they can test higher-order reasoning: I never use them for mere recall. Well-designed and meaningful questions do require instructor skill and experience. Hence, while I believe there are strong pedagogical benefits, I am certain there are considerable costs to using these every week. Quercus quizzes are especially helpful in larger-enrollment courses, where the design and set-up costs are spread over many students and where you have considerable TA hours and can delegate much of the back-end work (so that you can focus on crafting questions).
For smaller courses, it is likely easier to simply ask open-ended questions that are human-marked. In my 400-level course (capped at 35 students), I efficiently marked weekly participation (via Crowdmark) myself.
Another advantage of Quercus Quizzes over Crowdmark is the multitude of easy-to-implement settings to deter unwanted collaboration among students, such as shuffling answers for multiple-choice questions, creating multiple versions of questions, showing one question at a time, and disallowing backtracking. (Quercus Quizzes does NOT really allow randomizing the question order if you also plan to write multiple versions of each question, as sketched below. The only way to get Quizzes to fully randomize question order is to put ALL questions into one giant question group and have it pick X out of the Y questions. This is not ideal: unlike some other learning management systems, Canvas has no simple option to randomize question order, nor can you create question groups within question groups. New Quizzes apparently has a check box to randomize question order, but New Quizzes is untested at U of T to my knowledge.) While you can do multiple versions of tests with Crowdmark, it takes more doing: see Multiple Versions for Assigned Assessments (remote). Also, note that as of July 27, 2020, Crowdmark introduced timers -- see Creating a Timed Assessment -- which is a key feature that has always been available in Quercus Quizzes.
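The versions-per-question pattern itself maps cleanly onto question groups: one group per question position, each picking one of its versions at random. Here is a minimal sketch using the Canvas Quiz Question Groups endpoint, with placeholder IDs; this illustrates the structure rather than my Head TA's actual (manual) workflow.

```python
# Minimal sketch: one question group per question position, each picking
# one version at random, via
# POST /api/v1/courses/:course_id/quizzes/:quiz_id/groups.
import requests

BASE_URL = "https://q.utoronto.ca"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID, QUIZ_ID = 12345, 67890    # hypothetical IDs
headers = {"Authorization": f"Bearer {TOKEN}"}

for i in range(1, 9):                # eight question positions
    resp = requests.post(
        f"{BASE_URL}/api/v1/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/groups",
        headers=headers,
        json={"quiz_groups": [{
            "name": f"Question {i}",
            "pick_count": 1,         # draw one version per student
            "question_points": 1,
        }]},
    )
    resp.raise_for_status()

# The versions of each question are then added to the matching group
# (e.g. via the Quiz Questions endpoint), giving a fixed question order
# but a random version at each position -- hence no order randomization
# once you commit to version groups.
```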
Issues with Quercus Quizzes
Watch out for the Grade Posting Policy (the defaults are NOT what you would guess).
An important technical limitation of Quercus quizzes is that if you make a mistake with some types of machine-marked questions, you have to manually fix it using SpeedGrader. See What options can I use to regrade a quiz in a course? Contrary to any reasonable expectation, if you fix a mistake in a numerical answer question, it will not adjust students' marks. We had the same problem with the old Blackboard portal system. (New Quizzes supposedly fixes this BUT has other limitations, and we have no experience with it in Economics.) The best solution that I've come up with is to have a highly trusted TA double-check every question and to reuse good questions year over year. You can import questions from a previous year in Quercus, so if you are sure they were error-free, they will continue to be error-free.
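If fixing each submission by hand in SpeedGrader is too slow, score adjustments can also be scripted through the Canvas endpoint for updating quiz submission scores, whose fudge_points field adjusts a submission's total. A minimal sketch under assumed IDs and an assumed list of affected submissions; verify the behaviour in a sandbox before touching live marks.

```python
# Minimal sketch: add make-up points to affected quiz submissions via
# PUT /api/v1/courses/:course_id/quizzes/:quiz_id/submissions/:id
# using fudge_points. Test in a sandbox course first.
import requests

BASE_URL = "https://q.utoronto.ca"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID, QUIZ_ID = 12345, 67890    # hypothetical IDs
headers = {"Authorization": f"Bearer {TOKEN}"}

# Hypothetical: (submission ID, attempt number) pairs for students who
# answered the mis-keyed version and deserve the point back.
affected = [(987001, 1), (987002, 1)]

for sub_id, attempt in affected:
    resp = requests.put(
        f"{BASE_URL}/api/v1/courses/{COURSE_ID}/quizzes/{QUIZ_ID}"
        f"/submissions/{sub_id}",
        headers=headers,
        json={"quiz_submissions": [
            {"attempt": attempt, "fudge_points": 1}  # +1 point make-up
        ]},
    )
    resp.raise_for_status()
```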
While Quercus Quizzes are generally very robust -- even in very large courses using them for the first time -- there is one other warning: be careful about including images. Your students may need to hit refresh (multiple times) to get images to load, update their browser, and/or switch browsers. Obviously, if answering the question requires that students can actually see the image, this is a serious issue. For unknown reasons, instances of students having trouble viewing images went from extremely rare, to a handful per quiz starting in February 2020, to a small (but not tiny) number in Summer 2020. Hence, you'll need to communicate the following to your students if you wish to include images in your Quercus Quiz questions:
- Do not use Safari; use Chrome or Firefox. (If you must use Safari, uncheck the "Website tracking" box that prevents cross-site tracking, but be warned that Safari is the browser most likely to have problems loading quiz images.)
- Regardless of the browser, make sure the most recent version is installed.
- Clear the cache and cookies.
- Adjust settings on browser extensions and ad blockers if those prevent images from loading.
- Hit refresh (possibly multiple times) if the image still does not load despite taking all of the above steps.