COVID-19
Contact
Tracing

September - December 2020

Covid-19 tracing application for students & faculty at Northeastern University

Project Timeline

01. Defining the Problem

The Problem

In 2020, COVID-19 transformed the way we worked, learned, and interacted with others. During this time, we wanted to help those on Northeastern’s campus feel safe by asking the question,

“How can we collectively take action to contain the spread of COVID-19 in the communities we belong to?”

Northeastern University implemented several strategies to protect its members, including mandatory testing every three days. These protocols helped mitigate the spread of COVID-19, but there were still improvements to be made. In particular, the process of notifying a person's close contacts after a positive test was not automated, accurate, or streamlined.

The Solution

Our COVID-19 contact tracing app, designed for students, faculty, and staff at Northeastern University, aims to deliver faster, more accurate information to those who have been in close contact with someone who tested positive for COVID-19.

The solution was designed to let individuals self-report test results and, using recent location data, notify everyone who may have been exposed to the virus.

In addition to automatically notifying users who have possibly been in contact with the COVID-19 virus, our app also allows users to complete tasks that are part of Northeastern’s routine testing procedures, such as scheduling a test and completing a Wellness Check. The app includes informational pages detailing Northeastern’s COVID protocols, FAQs, as well as a dashboard of COVID statistics at Northeastern.
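The exposure-matching idea described above can be sketched in code. This is a minimal illustration, not the app's actual implementation: the `Visit` record, field names, and the one-hour overlap window are all assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical check-in record; all names here are illustrative,
# not taken from the app's real data model.
@dataclass(frozen=True)
class Visit:
    user: str
    location: str
    time: datetime

def find_exposed(visits, positive_user, window=timedelta(hours=1)):
    """Return users who shared a location with `positive_user`
    within `window` of one of that user's visits."""
    positive_visits = [v for v in visits if v.user == positive_user]
    exposed = set()
    for pv in positive_visits:
        for v in visits:
            if (v.user != positive_user
                    and v.location == pv.location
                    and abs(v.time - pv.time) <= window):
                exposed.add(v.user)
    return exposed

# Example: Bob overlaps with Alice at the same location within an hour;
# Carol was elsewhere, so only Bob is flagged.
visits = [
    Visit("alice", "library", datetime(2020, 10, 1, 10, 0)),
    Visit("bob",   "library", datetime(2020, 10, 1, 10, 30)),
    Visit("carol", "dining",  datetime(2020, 10, 1, 10, 0)),
]
print(find_exposed(visits, "alice"))  # {'bob'}
```

A real deployment would also need to handle privacy (users wanted their identity kept out of alerts) and far larger data volumes, but the core matching logic reduces to this location-and-time overlap check.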

User Research

We found that, at the time, Iceland was using an app called Rakning C-19 for contact tracing within the country. The goal of Rakning C-19 was to “analyze individuals’ travel and trace their movements against those of other people when cases of infection or suspected infection arise”. The app uses GPS to track users' locations and identify whether people who test positive may be spreading the disease.

"In May of 2020, 38% of Iceland’s population had downloaded the app." - MIT Technology Review

Using Iceland’s Rakning C-19 application, we conducted usability tests with eight potential users. We observed how they used the app, what information they clicked on, what they ignored, and which features were most interesting and relevant to them. From our notes, we identified six common themes:

The Key Insights

Most users wanted their identity to remain private if the app needed to alert their contacts.

Users wanted to see more information and resources if they tested positive or had a positive contact.

Users wanted to be able to complete Wellness Checks and schedule tests in one app, but had different opinions on whether they wanted to see additional COVID-related news and stats as well.

None of our users had used a contact tracing app before, and they had mixed responses about whether they would actually use one in the Northeastern community.

It was difficult to tell how much information users wanted from the app: some wanted something quick and concise, while others would appreciate additional data, information, or news.

Most users were okay with sharing their location data in the context of contact tracing, but one user was uncomfortable with Northeastern having their location data.

App Requirements

02. Iterative Design

Sketches

We developed sketches based on the requirements defined by our user testing. During our discussion, we critiqued each other's designs. We found many similarities and some unique differences among the sketches. Collaborating in this way was pivotal to the ideation process, as each of us had unintentionally focused on a different area of the app. For example,

  • I explored the process of reporting and reverting a positive COVID test.
  • Angelina worked through the confirmation process after reporting a positive test.
  • Sean focused on ensuring that help/contact information was easily accessible.
  • Holly experimented with the structure and navigation.

The ideation process allowed us to address our biases. For example, my sketches featured a way for users to “reset” their positive status once they recovered from the virus, which led the rest of the group to ask, “What happens after users are no longer positive?”

Each of us offered solutions to this problem; we ultimately shifted towards making the COVID status of a user more dependent on test results.

User Testing Results
  • Added a status to the homepage and moved the report button into the status page.
  • Changed the layout of the other options and removed the announcements page.
  • Changed the acknowledgments to one checkmark (positive confirmation page).
  • Changed the report button to say “I tested positive.”
  • Added a “tested negative” option to reverse the positive result.
  • Made the “What’s next” pages clearer (no confirmation required), except for scheduling a test.
Low-fidelity Wireframes
Formative Heuristic Evaluation Takeaways
  • Standardize language throughout the app
  • Create and follow a style guide
  • Add additional feedback to necessary actions
  • Ensure all task flows mimic each other
  • Restructure the layout and look of the homepage

03. Building the Solution

Mid-fidelity Wireframes
System Design
High-Fidelity Prototype

The final version of the homepage has three final forms, one for each status: Negative, Possible Contact, and Positive.

We found this was the best method to present the user status while also allowing the user to easily report test results, complete wellness checks, and access additional resources.
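The three-status homepage can be thought of as a small state machine driven by test results and exposure notifications. The sketch below is an assumed model of the transition rules, based on the design decisions described above (a negative test reverses a positive result; status depends on test results); the event names and function are illustrative, not the app's actual code.

```python
from enum import Enum

class Status(Enum):
    NEGATIVE = "Negative"
    POSSIBLE_CONTACT = "Possible Contact"
    POSITIVE = "Positive"

def next_status(current, event):
    """Assumed transition rules: a positive test always sets POSITIVE;
    a negative test clears any status back to NEGATIVE; an exposure
    notification only escalates a NEGATIVE user to POSSIBLE_CONTACT."""
    if event == "positive_test":
        return Status.POSITIVE
    if event == "negative_test":
        return Status.NEGATIVE
    if event == "exposure" and current == Status.NEGATIVE:
        return Status.POSSIBLE_CONTACT
    return current

# A negative user who receives an exposure alert becomes a possible contact;
# a later negative test restores the Negative homepage.
print(next_status(Status.NEGATIVE, "exposure"))          # Status.POSSIBLE_CONTACT
print(next_status(Status.POSSIBLE_CONTACT, "negative_test"))  # Status.NEGATIVE
```

Modeling the homepage this way keeps the UI a pure function of status: each of the three states maps to one homepage form.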

username: user@northeastern.edu       password: 1234

Summative Evaluation

Usability Metrics

Eight students participated in usability testing. We collected metrics on task-based efficiency, task-based effectiveness, and overall user satisfaction. The metrics were calculated using the online platform Qualtrics.

Task-based efficiency: To measure task-based efficiency, we measured time-on-task per participant, as well as provided the users an ASQ survey at the end of each task to subjectively measure task difficulty.

Task-based effectiveness: To measure task-based effectiveness, we measured the number of tasks completed per participant and the number of mistakes made while completing the designated task per participant.

Overall satisfaction: Lastly, to measure overall satisfaction, we provided users with a 10-question SUS survey after they completed all 14 tasks. The questionnaire targeted the user’s overall experience with the app.

In total, we used five different metrics to analyze the usability of our application. At the end of the usability test, we asked the users a set of open-ended questions to gain more insight into the user experience.
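For reference, the SUS questionnaire mentioned above has a standard scoring formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. A small sketch of that calculation (the function name is ours, not Qualtrics'):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert
    responses, ordered item 1 through item 10. Odd items (index 0, 2, ...)
    contribute (r - 1); even items contribute (5 - r); the sum is
    scaled by 2.5 onto a 0-100 range."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# A perfectly positive respondent (5 on odd items, 1 on even) scores 100;
# all-neutral responses (3s) score 50.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
print(sus_score([3] * 10))                         # 50.0
```

Scores above roughly 68 are conventionally considered above-average usability, which is why SUS works well as a single summary metric alongside the per-task measures.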

Results & Improvements 


Based on the collected metrics, surveys, and user insights, we concluded that our overall design is well organized and easy to navigate. Overall, participants expressed positive feedback and enjoyed the app's aesthetics, flows, and organization.

  • Make the ‘View next steps’ link on both the positive and potential contact homepages easier to find.
  • Make the ‘learn what a safe test is’ subtask easier to find so users clearly understand what a safe test is and where it will take place.
  • Add more feedback to the ‘reschedule test’ function. Users noted they were unsure whether they were rescheduling a test or scheduling a second one. More feedback and signifiers should be added to close the gap between user expectations and execution.
  • Users expressed confusion about the navigational flow of the Next Steps section: the nav-wizard disappears after confirmation but reappears within Next Steps.
  • Users found the lack of a ‘Home’ button on the FAQ page inconsistent with the rest of the app and hesitated before navigating home through the hamburger menu.