We're seeking a two-page (or one-page, double-sided), 8 ½” x 11” full-color flyer. It is essentially an article presented in a visually enticing layout, much like a magazine spread. The content is a series of excerpts and quotes from a panel discussion at a recent conference.
The content is not completely final yet; draft content is included below. It will be finalized on Monday, Nov. 12 or Tuesday morning, Nov. 13. It’s VERY important for designers to know that because the content is not yet final, the design they provide must be flexible and may change somewhat based on the exact length of the final content. We will select a winner based on design preference, and the winner will then be required to make edits based on the final content.
• The piece will be distributed digitally as a PDF; pages will likely be viewed one at a time, not side-by-side.
• Final deliverable is due by Thursday morning, November 15, so this is a VERY short turnaround project.
• Final deliverables: print-ready PDF file, web-ready PDF file, and original source file (PSD, INDD, etc.)
• Content is pasted below, but is also in the attached Word doc.
Please contact me with any questions at all. Thank you for considering!
• Any text in [ ] is for instructional purposes and should NOT be included in the design.
A. [Top section]
a. [Header] Integrity in Online Testing
b. [Subheader] A Discussion about Challenges & Solutions
B. [Photo - Header area should include a graphic or photo of some kind. Could be stock photo of student(s) – see important guidelines below if using stock photos.]
C. [Intro paragraph]
A recent conference session was a forum for a fascinating discussion about technology, budgets, and policies tied to integrity in online testing. Key moments from the discussion – edited for brevity and clarity – are captured here.
D. [Panelists – name, institution, title. Could include headshot photos for each. If design includes headshots, we will provide later; you can mock them in for now.]
Jeremy Bond, Central Michigan University
Interim Director of eLearning
James Frazee, San Diego State University
Sr. Academic Technology Officer + Director of Instructional Technology Services
Jackie Crouch, University of Colorado – Colorado Springs
Instructional Technologist, Canvas Trainer, & Quality Matters Cert. Peer Reviewer
Arie Sowers, Respondus, Inc.
Senior Product Specialist
E. [Section 1 header] How our institution got started with online testing
[Section 1 content]
Jackie: We first moved to online testing within the nursing department, about nine years ago. There’s so much content covered in those classes that devoting three class periods to giving paper tests was problematic… But it was the weather that ultimately forced the issue – a snow day, actually. I got a call from an instructor saying “I was supposed to give a test in class today. Can you help me get it online?” And I said, “Sure, no problem.”
Jeremy: Even though students were doing coursework online, they were taking exams the old-fashioned way. This means they were either visiting the university testing center on the main campus, or students were finding their own proctors at other institutions, libraries, etc. … Then our testing center closed – a “budget inefficiency” – and students were kind of objecting to the idea of having to go and be proctored elsewhere… They were doing all their studying online, but had to drive three towns over to take a test while a stranger watched them.
F. [Section 2 header] Choosing an online proctoring solution
[Section 2 content]
Jeremy: We started to take a hard look at what was available to us… What we were finding is that even in the rare cases where proctors were witnessing behavior they thought was suspicious, too many of these cases ended with the faculty member – who is ultimately responsible for the decision – saying “Well, I didn’t see the cheating myself,” and then letting the student off. Some of the solutions we looked at were replicating this model, where a stranger is proctoring the student. We dismissed some options that just seemed to replicate this problem…
The big outcome with Respondus Monitor was that it restored what I call the natural arrangement of things. The educator has everything needed to observe and make decisions about whether cheating has occurred. No, the video and data aren’t in real time, but that’s actually better in a number of ways. And budget considerations were also a key factor for us. We found Respondus Monitor costs were easily managed.
James: Simplicity and convenience were central in our decision. It needed to, first and foremost, be integrated within the learning management system. We didn’t want instructors or students to have to leave the learning management system… San Diego State had already been using Respondus LockDown Browser, which prevents students from doing screen captures or going to other websites. It keeps them literally locked into the exam. So, it was a natural progression for us to begin using Respondus Monitor.
Jackie: We were also already using LockDown Browser for proctored exams on campus. Then, one of our colleges with a fully online degree program required students to arrange for proctored exams to meet accreditation guidelines. We had students across the nation and in other parts of the world taking these courses. Students having to arrange for a proctor was problematic – plus the cost, which was coming in at about $30 per test for each student… So we made the decision to utilize Respondus Monitor for that program. It worked very well, the department was happy, and soon word about the tool began to spread around campus.
G. [Section 3 header] Policies surrounding online proctoring
[Section 3 content]
James: For online classes, we have a policy that students are notified of any specialized software requirements. However, it’s important not to hold our online courses to a higher standard than our face-to-face courses. So we don’t have a policy on academic integrity as it relates specifically to online or other modalities. Online courses are covered under our standard policy on cheating and academic integrity.
Jackie: Faculty have the freedom to choose which of our offered technologies they want to leverage. When they approach us, we ask, how does that support your course objectives, and what is your end game? And then we make a recommendation, and work with them closely.
Jeremy: We’re in the same situation – use of technology tools is faculty choice, and there's a great deal of autonomy. But some academic departments decide to mandate specific technology requirements, such as using Respondus Monitor. That approach offers a consistent, standardized system for measuring assessment outcomes and monitoring integrity situations. It also puts a focus on consistency of the student experience, so they aren’t going from course to course and having a very different experience in how they're tested and how everything functions.
H. [Section 4 header] Budgeting, and who pays for it
[Section 4 content]
Arie: For online proctoring, it generally comes down to whether the institution pays for it, or the student. The higher the cost, the more often it gets passed along to the student.
James: In California, our mission is all about access. Anything you put between students and their access to resources will be a challenge if there’s a cost associated with it. Any fees we charge have to go through our Campus Fee Advisory Committee, which is made up primarily of students. They’re not very hip on new fees. That’s what drove us in this particular direction.
James: We fund Respondus Monitor out of our own ITS budget, just so faculty have more options… The cost was affordable – kind of a drop in the bucket compared to other enterprise-class technologies on our campus.
Jeremy: Funding of this is not nearly as challenging as funding just about any other technology I can think of… Ultimately, we were able to centralize it… It’s paid for and treated as an enterprise solution. Fortunately, it's not such a significant line item that it gets revisited in a way that others might.
Jackie: For us it comes out of student tech fees, since it is an enterprise solution. Again, as Jeremy said, it is a nominal cost compared to most technology tools we are using… To illustrate this, we pulled numbers for the 2017–2018 year [at University of Colorado, Colorado Springs]. As you can see, we had 171 courses using Respondus Monitor. Again, its use is voluntary for faculty. We used about 3,200 seats. There were 17,000 exam sessions, which works out to about 5.3 tests per seat. On a cost basis, it came in at about 46 cents per test session. That’s very cost effective for the institution, and faculty are very pleased with the solution itself.
[See attached Word doc for an image that can be used in the design near the paragraph directly above.]
[Image caption] Slide from panel session. Source: Jackie Crouch, UC Colorado Springs
Respondus logo [used small – no larger than 1.5” wide]