Design challenge

How do we survive misinformation?

Design Challenge

Health, (Mis)trust & Misinformation

View Slides
Duration

September — October 2021


Team

Dimitri Knight
Yongqian Lee
Rungsiri Upradit
Peter Pham

Role

I served in project management, visual design, and user research roles. I developed the visual identity for this project and guided parts of our literature review, focusing on understanding the perspective of UC Berkeley students, their pain points, and the context of their problems amid a global misinformation crisis. During our presentation, I focused on telling the story of this problem's worldwide context and defining areas of opportunity to help our target audience.

Skills

User Research

Tools

Figma
Photoshop
Illustrator

Overview

Product Vision

What did we imagine?

We sought to understand how UC Berkeley students respond to public health information and fake news, and how they make health decisions during a pandemic. We hypothesized that students with varying levels of media engagement might judge, understand, and react to science-based, factual information differently. We suspected that social media engagement would play a significant role in how a student makes sense of the world from the outset.

Area of Opportunity

What can we do moving forward?

We want to educate users about how their social media feeds are curated for them. Our team discovered that students can be unaware of how they consume information and why they are drawn to particular types of it. UC Berkeley students tend to be more aware of how their social media apps function, but they still struggle to educate family and friends about fake news. Fake news takes advantage of the social media user experience, which is optimized for engagement, virality, and ultimately profit. It is worth exploring how social media apps might function if users could anonymously social-proof pieces of content by assigning them a trust score.
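As a thought experiment rather than something we built, the minimal sketch below shows how anonymous trust ratings might be aggregated into a per-post trust score; the Post class, its field names, and the five-rating minimum are hypothetical choices made purely for illustration.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Post:
    """A piece of shared content with anonymous trust ratings (0.0 = untrustworthy, 1.0 = trustworthy)."""
    url: str
    ratings: list[float] = field(default_factory=list)

    def add_anonymous_rating(self, score: float) -> None:
        # Ratings are stored without any user identifier, keeping the social proof anonymous.
        self.ratings.append(min(max(score, 0.0), 1.0))

    def trust_score(self, min_ratings: int = 5) -> float | None:
        # Only surface a score once enough people have weighed in, so a single
        # rating cannot masquerade as community consensus.
        if len(self.ratings) < min_ratings:
            return None
        return round(mean(self.ratings), 2)

post = Post(url="https://example.com/article")
for rating in (0.9, 0.8, 0.85, 0.2, 0.9):
    post.add_anonymous_rating(rating)
print(post.trust_score())  # 0.73
```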

Outcomes

What was our impact?

Our team presented our findings to the Fung Fellowship and to representatives from the UC Berkeley School of Public Health and Facebook. Our research was especially relevant to students in the Fung Fellowship, and we received feedback that our findings were prescient and that our area of opportunity was non-partisan, pragmatic, and worthwhile.

NEW YORK TIMES: U.S. NOW HAS MORE KNOWN CASES THAN ANY OTHER COUNTRY

Context

Background

What's going on?


People are falling victim to misinformation.

Health misinformation is not a new phenomenon; it has followed us throughout history. What has changed, of course, is the internet, which gives more people access to more information than ever before. Not all of that information is equally good, and the internet has only accelerated misinformation's spread and amplified its consequences.

Problem

What needs fixing?

The proliferation of social media and the recent politicization of the COVID-19 pandemic have obscured science-based truth. The devastating impact of health misinformation can be seen in millions of deaths, economic recession, and emotional havoc worldwide. Everyone is searching for solutions, but many are unable to filter truth from fiction.

Contemporary thinkers have no shortage of ideas for addressing the problem, but they can all be boiled down to two fundamental approaches: censorship or education. To the extent that members of the public are capable of critical thinking, perhaps these are the only viable solutions. But how can we get people to think for themselves? Everyone has their own theories.

Opportunity

Where's the value?

The COVID-19 pandemic has demonstrated the acute fragility of our public health system and its communication. The answer lies in the collective memory of citizens: in a world that makes less and less sense, people need access to better tools for defending themselves against pseudoscience and obfuscation.

This design challenge will discuss the factors that led to the spread of misinformation, how they relate to disease outbreaks, and finally, what we need to do to prevent further harm.

How might we enable UC Berkeley students with varying levels of media engagement to identify and trust science-based health information?

Research

Research Methodology

How did we try to understand this challenge?

We explored the available literature extensively for this challenge, using our insights to inform interviews with our target audience. From those interviews, we developed personas for each interviewee, capturing their diverse perspectives and drawing out themes, key insights, and areas of opportunity to guide future research, prototyping, and ultimately solutions.

NEW YORK TIMES: SURGEON GENERAL ASSAILS TECH COMPANIES OVER MISINFORMATION
Who do we trust for information?
Americans' Trust in Media Dips to Second Lowest on Record
36% of Americans trust mass media
57-point gap in Republicans' and Democrats' media confidence
“A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.”
If information is packaged in a partisan manner to make us feel more comfortable, validated, and intelligent, then we will keep consuming it regardless of its truthfulness. In effect, we fall victim to confirmation bias. Media companies, seeking profit, are incentivized to exploit these cognitive biases to increase engagement. Often, to maintain engagement and a positive user experience, media companies avoid pushing their viewers into cognitive dissonance.
“A supercomputer ... can literally simulate 3 billion variations of a message, an advertisement, a photo … that’s not free speech, that’s free manipulation.”
With more than 3 billion users, social media is the most common means of receiving and processing information.
  • Social media is a hypernovel medium for information sharing: it's fast, accessible, and highly leveraged.
  • Social media algorithms mirror our most reactive selves, catering to what engages us most.
  • These algorithms identify and accentuate outrageous, viral content to get us to engage and react.
  • Engagement leads to more time spent on social media, more time spent experiencing advertisements, and ultimately more profit.
  • Fake news is partisan, sensational, and controversial: perfect content for engaging us through current social media algorithms.
Conflicting Messages

During the critical days of COVID-19 mitigation, U.S. public health institutions gave conflicting messages, which left the public unprotected and skeptical.

U.S. public health messaging was in stark contrast to measures taken in China following the initial outbreak.

Why weren't we told to wear masks in the beginning?
"Well, the reason for that is that we were concerned the public health community, and many people were saying this, were concerned that it was at a time when personal protective equipment, including the N95 masks and the surgical masks, were in very short supply."

Key Insights

Research summary
01
Journalism is increasingly seen by the public as partisan and lacking credibility
02
Media consumption increasingly relies on fast thinking rather than slow thinking
03
Social media algorithms take advantage of our cognitive biases
04
U.S. public health institutions damaged their credibility
How do UC Berkeley students experience mistrust & misinformation?

Interviews

Research methodology
We investigated UC Berkeley students' media diets, interactions, motivations, and pain points.

We conducted in-person interviews with three UC Berkeley students. Each interview took about 45 minutes, and we encouraged our interviewees to speak from their personal experiences. For our interviews, we had the following goals:

Goal 1

We want to know how we can help people identify and trust science-based truth over misinformation regarding public health online.

Goal 2

Because misinformation, disinformation, and fake news are so prevalent on social media, we want to use design and technology to positively impact how people interact with health-related information online.

Goal 3

We want to learn about user behavior and how people respond to certain types of news they consider to be trustworthy or untrustworthy.

Interview Questions
Question 1

Where did you grow up? Where do you live now?

Question 2

What platforms or media outlets do you use to access trustworthy news online? What platforms do you consider untrustworthy?

Question 3

On those platforms, what are the sources? For example, is it user-generated content or an official media source?

Question 4

Do you access news from your phone or laptop? How many times do you go online to read the news on a daily basis? What times of the day? Where?

Question 5

Do you have any bad experiences with reading health information online? What happened?

Question 6

Where do you read information regarding the COVID-19 vaccine? Why do you prefer this platform over others?

Question 7

What are the sources? User-generated content? Official media?

Question 8

Do you share trustworthy information about COVID-19? Do you repost, comment, like, and/or share information offline with friends or family?

Question 9

How do you respond to information you do not trust? For example, do you cross-check it with other sources? Every time? Who do you ask?

Question 10

Misinformation, disinformation, and fake news are prevalent on social media. What is your recommendation for academic researchers and high-tech companies to solve this problem?

Personas

Research methodology

Our personas were inspired by our anonymous interviewees, their key insights, and their personal stories. We share each persona's background, motivations, pain points, frequent news sources, and news activity.


What were the main takeaways?

Affinity Map

Research methodology

We used affinity mapping to externalize and meaningfully cluster observations and insights from our interviews, keeping us grounded in data as we designed, analyzed, and synthesized.

Interview Summary

Research summary & Synthesis
Tension: Fast vs. slow thinking
Key insight: Checking news via notifications correlated with shallow engagement

Tension: Short vs. lengthy consumption
Key insight: Accessible media types reduce information into bites

Tension: Institutions vs. dissent
Key insight: It is difficult to parse valid dissent from misinformation

Tension: Verification vs. credibility
Key insight: There are few checks and balances on fact-checkers
If information were ranked by trustworthiness, then UC Berkeley students could easily filter science-based truth from fiction.

Ideate

Areas of Opportunity

Research Synthesis
01

UC Berkeley students want to stay informed with science-based facts without having to read dense scientific literature.

SOLUTION

We can create guidelines for news sources on how to share key findings, statistics, and guidance succinctly.

02

UC Berkeley students want to share trustworthy sources with family and friends

SOLUTION

We can create an interface that notifies users of a source's trustworthiness before they share it.

03

UC Berkeley students have trouble identifying trustworthy sources

SOLUTION

We can reorient social media algorithms and notifications to prioritize trusted accounts (see the sketch after this list).
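None of this was prototyped, but as a minimal sketch of what "prioritizing trusted accounts" could mean in practice, the code below re-ranks a feed by blending an engagement score with a source-trust score; the FeedItem fields, the trust_weight parameter, and the example accounts are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    account: str
    engagement: float    # normalized 0..1, what engagement-driven feeds optimize for today
    source_trust: float  # normalized 0..1, e.g. from fact-checkers or community ratings

def rerank(feed: list[FeedItem], trust_weight: float = 0.7) -> list[FeedItem]:
    """Blend engagement with source trust so trusted accounts rise to the top.

    trust_weight = 0 reproduces a purely engagement-driven feed;
    trust_weight = 1 ranks purely by trustworthiness.
    """
    def score(item: FeedItem) -> float:
        return trust_weight * item.source_trust + (1 - trust_weight) * item.engagement
    return sorted(feed, key=score, reverse=True)

feed = [
    FeedItem("viral_rumor_page", engagement=0.95, source_trust=0.10),
    FeedItem("cdc_official", engagement=0.40, source_trust=0.95),
    FeedItem("campus_news", engagement=0.60, source_trust=0.80),
]
print([item.account for item in rerank(feed)])
# ['cdc_official', 'campus_news', 'viral_rumor_page']
```

A single trust_weight slider also makes the trade-off explicit: the same mechanism that maximizes engagement today could instead be tuned toward trustworthiness.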

Reflection

How did I feel when working on the design challenge?

I felt overwhelmed, excited, confused, and encouraged. This was the first time I had gone through a rigorous, end-to-end design challenge. Although I've been a graphic and interface designer for years, I had never followed a regimented design process with multiple other people; I usually relied on intuition or sought inspiration in other people's end products.

What would I have done differently?

I would have spent much more time reflecting on our synthesis, scrutinizing our areas of opportunity for feasibility, clarity, and validity. I would also have liked to frame the areas of opportunity in terms of desired impact on the target population, and to make my design presentation more concise, really delivering on our synthesis rather than giving too much context about our interviewees.

What did I learn about myself from this challenge?

Temperamentally, I act like a bulldozer and am biased toward action. The design process requires one to be action-oriented but also reflective about the project's direction, ensuring that all information and perspectives are accounted for. I've learned to ask questions rigorously and to constantly assess my approach to solving problems. Throughout the design process, I want to ensure that our assumptions about our audience and solutions are grounded in the facts as we understand them. Further, I'd like to actively try to disprove our understanding, ensuring that our assumptions are rigorously grounded.

Literature reviews, interview analysis, and affinity diagramming were challenging at first, but I feel confident in my ability to use these research methods in the future.

What did I learn about my team from this challenge?

I'm very thankful that our team was incredibly hardworking, insightful, and organized. I felt anxious at times because a lot of leadership was delegated to me, and frankly, I didn't fully know what I was doing a lot of the time. Other team members excelled in their areas of expertise.
