DESIGN LAB

Human Computer Interaction research on improving peer feedback in online education

DURATION: Apr 2015 - Present

ROLE: Research Assistant, UX / UI Designer

TOOLS: Illustrator, InDesign, Experimental Design, InvisionApp

COLLABORATORS: Catherine Hicks, Vineet Pandey, Rachel Chen

DELIVERABLES: Final Presentation | IDP

I previously conducted Human-Computer Interaction research at the Design Lab with Dr. Catherine Hicks and Vineet Pandey, in a lab headed by Don Norman, Jim Hollan, and Scott Klemmer. We explored how to improve peer feedback, collaborating with the PeerStudio team at Stanford. I created artifacts, helped with experimental design, coded qualitative data, read and found relevant papers, and helped create interfaces for their experiments.

Our future goals include further experimental design work with Oppia (a Google-affiliated project).

Poster on Peer Feedback

presented at the UCSD CSE undergraduate research poster session

the challenge

Previous research has shown that peer feedback is often unreliable. How can we make it better, so that it becomes a viable solution to the problems of online classes?

In massive open online courses (MOOCs), there are often far too many students per instructor, making grading and feedback from teachers impossible. However, peer feedback is often seen as unreliable, and the quality of feedback from peers is typically lower than that from teachers.



our approach and process

Context scaffolding means training users to think more like experts and giving them more context about the situation.

It can help novices pay more attention to the fundamental principles integral to a work's success rather than to its nonessential parts.

We identified two types of features that users pay attention to when giving feedback:

Deep Features: parts of the assignment that rest on the fundamental principles integral to the work's success, such as the argument and logic of an essay.

Surface Features: cosmetic, nonessential, individual choices, such as grammar and word choice.

Guiding novices to focus more on deep features, as experts do, reduces their biases and is potentially a way to make peer feedback more reliable and trustworthy.



the outcome

We conducted an experiment with two conditions: in one, users received context scaffolding through training to critique resumes like an expert; in the other, users were simply asked to give peer feedback without training.

We found that with context scaffolding, users gave more diverse feedback (addressing both surface and deep features), and even more positive feedback.

Users without context scaffolding focused more on surface features.

Take a look at our poster! :)

idea by Catherine Hicks and Vineet Pandey, presented with Rachel Chen

PeerStudio Interface Design

the challenge

Interfaces created for research purposes must achieve the experiment's goals within its constraints and gather the correct data, while keeping the experience seamless and non-distracting.

Rachel and I wanted to make sure users could reach their goal of getting their resumes critiqued, that experimenters gathered the correct data from what users were doing, and that subjects were learning about critiquing. We had to build in the constraints the experimenters wanted, to ensure the quality of the feedback and of the experiment, while keeping the experience delightful.



our approach and process

We first created paper wireframes separately, then came together to discuss the pros and cons of each and decide what direction we wanted to head in. I also created a user flow to show the process a user would go through during the experiment.



the outcome

Rachel and I created a high-fidelity prototype in Illustrator, then brought it into InvisionApp to show the interactions and the flow a user would go through.

click here for our InvisionApp prototype

Hearing from Brilliant Minds in HCI Research

This experience was extremely valuable to me: I got to see how psychological research can be applied in HCI, and I was surrounded by so many intelligent and amazing people. Listening to people consider different ideas, share their thoughts, and present research on current topics is always great, especially when hearing about the complexities and different dimensions of so many topics in HCI. At the Design Lab I also learned to take initiative over my own growth, learning, and goals: I can ask for a change in direction if what I am doing no longer fits my overall goal, or if I wish to do something different. The brilliant minds there have also taught me to stay curious and to keep asking questions (and the right ones!). And ultimately, never be afraid to fail or to get feedback on your work; your ego may hurt a bit at first, but it is ultimately beneficial to hear what different people think of your work.
