Privacy-preserving UX research
For the past six months, we’ve been working on Signalboost as part of Mozilla’s Data Futures Lab. As part of that work, I conducted user research to better understand the unique intersection our project and users operate in.
For those unfamiliar, Signalboost is a messaging application that provides encrypted broadcasts and hotlines to activists, organizers, and other vulnerable populations without relying on SMS or exposing personal phone numbers.
People have used it to:
- Run a hotline to monitor election suppression efforts and disinformation by hostile foreign governments
- Broadcast alerts and live updates about Black Lives Matter marches
- Coordinate supplies and trainings for community mutual aid efforts in the face of Covid-19
Because of the nature of the work our users do, privacy is a fundamental value of ours. While many companies try to collect as much user data as possible to maximize engagement or to sell that data later, here at Signalboost we have the opposite problem: we are a nonprofit that doesn’t rely on our users’ data for profit, so we want to collect as little of it as possible.
It’s a tricky balance: collecting as little data on our users as possible while still deeply understanding their needs and what motivates them. Here’s how I went about user research while navigating that.
Finding the right people to interview
The first step in any user research process is to develop a research question. After all, the research process is time-consuming and expensive for a small, bootstrapped team, so we wanted to be very intentional about who we talked to.
After consulting with the team, we decided that the goal for our research was to understand who our users are, specifically by better understanding the following:
- their backgrounds and what motivates them to engage in social justice organizing work
- what they perceive as threats to their safety and security online
- what tools they use for digital communication
Then we developed a screener, which is a form/questionnaire that helps you find the most appropriate participants for your user study. Simply Secure has an excellent guide on building a screener.
We kept ours pretty simple:
We then selected participants based on their availability and on the variety of their answers to the “What do you use Signalboost for?” question.
Privacy-oriented communication and a data retention policy at every stage
At each stage of the process, we chose privacy-oriented (read: encrypted) communication platforms that were either ephemeral or offered the option to delete data, which we did after 30 days. The platforms we chose were the following:
- Signalboost for putting out a call for interested users
- Formstack for screening users on fit and availability
- Signal Messenger for communicating with individuals
- Jitsi Meet for conducting interviews
A trusted advisor of ours recommended Formstack, and we evaluated the platform against the following criteria:
- Does it have the functionality that we need, i.e. a form with multiple types of fields that can be easily shared via URL?
- Does it have an option to encrypt submissions to a password? This is important because it means that the company hosting the form (Formstack) cannot view the contents of the submissions even if they wanted to.
- Do they have a privacy policy that outlines what data they collect and what they use it for?
- Does it have the ability to permanently delete submissions?
Based on the submissions to Formstack, we were able to schedule video chat interviews over a two-week period.
Practicing consent during the interview process
Finally, we got to talk to actual people! In these interactions, where users are sharing potentially sensitive information, it’s important to recognize the power dynamics between interviewer and interviewee and practice consent. A great resource to learn more about consent in the research process is the Consentful Tech Project.
One of the amazing things about practicing consent is that it actually makes your interviews better: the more care you put into the process, the more comfortable participants will feel sharing what they really think.
Some ways we tried to put that into practice during this process:
User bill of rights
Before each interview I sent the interviewee a Bill of Rights, adapted from one from Simply Secure:
I also went over the bill of rights briefly at the beginning of the interview in order to ask for verbal consent.
Asking permission to record, and informing interviewees of data retention policy
It should go without saying that it’s really important to ask participants for active consent before doing any type of recording! As a team of one leading this process, I found it difficult to be both an active interviewer/listener and take detailed notes at the same time, so I asked participants if I could record audio of our conversation. I told them I would delete the recordings after 30 days, set a calendar reminder, and did so.
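If you wanted to automate that retention policy rather than rely on a calendar reminder, a small script could handle it. Here’s a minimal sketch in Python; it’s purely illustrative (not part of our actual workflow), and the `recordings/` directory name is an assumption:

```python
# Illustrative sketch: purge interview recordings older than a 30-day
# retention window, instead of relying on a calendar reminder.
from pathlib import Path
import time

RETENTION_DAYS = 30
RECORDINGS_DIR = Path("recordings")  # assumed location of interview audio files

# Anything last modified before this timestamp is past the retention window.
cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

for recording in RECORDINGS_DIR.glob("*"):
    # Only delete regular files whose last-modified time predates the cutoff.
    if recording.is_file() and recording.stat().st_mtime < cutoff:
        recording.unlink()
        print(f"Deleted {recording.name} (past the {RETENTION_DAYS}-day window)")
```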
Be clear about the purpose of the interviews and what will be shared, with whom
It’s a good idea to be transparent with participants about why you are conducting this research. In our case, I told them about our goals, and asked them for consent to share anonymized contents of the interview notes with my team. I also asked them for consent to share high level, anonymized anecdotes about their use cases publicly.
This is also a good opportunity to remind participants that they are being asked for feedback and not being evaluated on how well they know how to use the tool.
Paying users for their time
In the nonprofit world, finding the funds to pay for any research can be a pain, but compensation helps recognize that participants are providing a lot of value by giving their time and sharing their stories. Being privacy-conscious also means giving participants a choice in how they would like to receive compensation: all of ours chose either Venmo or PayPal.
Persona development
There were many benefits to conducting this type of user research (including getting our team inspired by the stories about what people are using our tool for) but one of the most impactful outcomes was in helping us develop user personas.
"Personas are a tool that represent the needs, thoughts, and goals of the target user"
- Tor UX
Personas help summarize the diverse needs of many users into a handful of accessible archetypes. They are a powerful tool when practicing user-centered design: now, as a team, we often ask ourselves, “How would this feature make Zeya’s life easier?”
Here’s an example of one of our personas. Zeya is based on several South Asian activists working in the human rights and internet freedom field whom I interviewed over the course of this project.
Credits
- Our process was heavily influenced by Shirin Mori’s Usability Case Study, and they also provided invaluable feedback on the UX process and our personas. <3
- Simply Secure was an incredible knowledge base for designing elements of the secure user research process from the ground up: https://simplysecure.org/knowledge-base/
- The Second Muse needfinding framework was massively helpful in understanding how to shape this process: https://www.secondmuse.com/internet-freedom/
- Our personas are heavily inspired by Tor’s: https://gitlab.torproject.org/tpo/ux/research/-/blob/master/persona/jelani.pdf