CodeJam — Helping graders grade, comment, and collaborate on code, all in one space: UX case study

Let me take you through the process of designing a solution to the tedious task of grading code.

Kevin Park
6 min read · Dec 29, 2021


Project Contributors: Kevin Park, Ava DeBartolomeis, Paul Vermette, & Nina Xie.

Timeline: Aug 2021 – Dec 2021

About CodeJam

CodeJam is a design project that offers teaching assistants (TAs) an innovative and efficient way to grade and collaborate on student code submissions. From commenting directly on code to video-calling other TAs, CodeJam makes complex grading simple and easy to do.

This case study focuses on teaching assistants who grade assignments in computer science courses at Cornell University.

The Problem

Being a TA and grading hundreds of student submissions isn't easy: it takes hours and hours, and TAs often don't have the right tools or knowledge to grade efficiently and fairly.

In particular, being a computer science TA requires downloading additional programs and files and opening copious numbers of tabs and windows that clutter the desktop, making the grading process frustrating.

Result

Working on a team where designs and decisions were first made individually had its pros and cons. Despite some challenges with design consistency and product functionality, the team was able to create a product that served users' goals and needs.

Timeline

This project took 10 weeks to complete. The design process was divided into four phases:

Interview Plan, Personas, & Research

After creating our interview scripts and plans, we built a persona to represent the target audience we were designing for.

Persona

Our persona took the pseudonym Matthew James:

Photo by Ben Parker on Unsplash

User Research

We recruited 10 computer science TAs at Cornell University who fit our persona. The goal of the study was to better understand the current undergraduate TA experience. More specifically, we interviewed them with four goals in mind:

  1. Identify user goals and motivations
  2. Understand user activity/behavior patterns
  3. Recognize and classify current problems as well as high points
  4. Determine current products used to share content

Design Ideation

We identified and analyzed five existing solutions to our problem space.

Sketches

Drawing on the solution space analysis, the team brainstormed possible concepts and solutions. Among the collective shared sketches, two main overlapping themes emerged:

  1. Enabling collaborative grading
  2. Providing a collective space for all grading tools necessary

The team went forward with a platform that consolidated all the necessary grading tools into one place, minimizing the need for multiple windows.

Moreover, the team wanted to focus on opportunities for collaborative grading, an area that the solution space did not address. The selected sketches (on the left) demonstrate some of the overlaps in the sketch concepts, which the team leveraged for the final design idea.

Design Concept

Three user tasks were created to demonstrate the differentiating features of CodeJam. For each task, a storyboard was created to visually represent a scenario a user might experience while using CodeJam.

Storyboards

Along with each task, sketches of the potential UI for CodeJam were also made.

UI Sketches

Mid/Hi-fi Screens — Grading, Code commenting, Referencing

After discussion, the team decided to revise the tasks and add a fourth to gain a more detailed understanding of usability.

A run-through of the four tasks was demonstrated by one of the team members, Ava DeBartolomeis.

Task 1: Grading an assignment with comments

Task 2: Requesting in-line help from other TAs

Task 3: Referencing other past submissions

Task 4: Video chatting with other TAs for help

Voiced & made by Ava DeBartolomeis

Heuristic Evaluation & User Testing

For the heuristic evaluation, we recruited four computer science TAs and asked them to complete the four tasks in order to assess whether our designs complied with usability principles, or heuristics.

Analyzing our Findings from User Testing

All users were able to complete the given tasks; however, our metrics showed that users had trouble overcoming the app's learning curve. This is demonstrated by the lower completion rates (learnability metric) recorded during our usability testing sessions. Many of the users' errors were cosmetic and minor, which meant design improvements would be feasible for the team.

Design Improvements

Three design improvements were made to alleviate the pain points discovered in user testing:

  1. Redesigned feature entry points: the icons for Request Help and Reference weren't intuitive enough.
  2. Removed the "Auto-calculate" button: users didn't understand its functionality.
  3. Implemented a progress bar: users were unsure whether they had finished grading a specific assignment.

Final Hi-fi Screens

Outcome

The way this project structured collaborative work was both challenging and compelling. Because designs, analyses, and ideations were first done individually, it was sometimes difficult to maintain coherence and design consistency. However, team members were vocal and proactive in discussing their ideas, which led to strong designs. It was a pleasure working on this project and ultimately creating a product that is backed by research.

Special thanks to my team members for all the hard work and time they put in. I couldn’t have done it without them!

To learn more about this project and see more detailed work, contact me @ kp462@cornell.edu
