To Leave or Not to Leave: A Configurational Approach to Understanding Digital Service Users' Responses to Privacy Violations Through Secondary Use
Christina Wagner, Manuel Trenz, Chee-Wee Tan, and Daniel Veit
This study investigates how users respond when their personal information, collected by a digital service, is used for a secondary purpose by an external party—a practice known as External Secondary Use (ESU). Using a qualitative comparative analysis (QCA), the research identifies specific combinations of user perceptions and emotions that lead to different protective behaviors, such as restricting data collection or ceasing to use the service.
Problem
Digital services frequently reuse user data in ways that consumers don't expect, leading to perceptions of privacy violations. It is unclear what specific factors and emotional responses drive a user to either limit their engagement with a service or abandon it completely. This study addresses this gap by examining the complex interplay of factors that determine a user's reaction to such privacy breaches.
Outcome
- Users are likely to restrict their information sharing but continue using a service when they feel anxiety, believe the data sharing is an ongoing issue, and the violation is related to web ads.
- Users are more likely to stop using a service entirely when they feel angry about the privacy violation.
- The decision to leave a service is often triggered by more severe incidents, such as receiving unsolicited contact, combined with a strong sense of personal ability to act (self-efficacy) or having their privacy expectations disconfirmed.
- The study provides distinct 'recipes' of conditions that lead to specific user actions, helping businesses understand the nuanced triggers behind user responses to their data practices.
Host: Welcome to A.I.S. Insights, powered by Living Knowledge. In today's digital world, we trade our personal data for services every day. But what happens when that data is used in ways we never agreed to?
Host: Today, we're diving into a study titled "To Leave or Not to Leave: A Configurational Approach to Understanding Digital Service Users' Responses to Privacy Violations Through Secondary Use". It investigates how users respond when their information, collected by one service, is used for a totally different purpose by an outside company.
Host: To help us unpack this, we have our analyst, Alex Ian Sutherland. Alex, welcome.
Expert: Great to be here, Anna.
Host: So, let's start with the big problem here. We all know companies use our data, but this study looks at something more specific, right?
Expert: Exactly. The study calls it External Secondary Use, or ESU. This is when you give your data to Company A for one reason, and they share it with Company B, who then uses it for a completely different reason. Think of signing up for a social media app, and then suddenly getting unsolicited phone calls from a telemarketer who got your number.
Host: That sounds unsettling. And the problem for businesses is they don't really know what the final straw is for a user, do they?
Expert: Precisely. It's a black box. What specific mix of factors and emotions pushes a user from being merely annoyed to deleting their account entirely? That's the gap this study addresses. It's trying to understand the complex recipe that leads to a user's reaction.
Host: So how did the researchers figure this out? It sounds incredibly complex.
Expert: They used a fascinating method called Qualitative Comparative Analysis. Instead of looking at single factors in isolation, it looks for combinations of conditions that lead to a specific outcome. Think of it like finding a recipe for a cake. You need the right amount of flour, sugar, *and* eggs in the right combination to get a perfect result.
Host: So they were looking for the 'recipes' that cause a user to either restrict their data or leave a service completely?
Expert: That's the perfect analogy. They analyzed 57 real-world cases where people felt their privacy was violated and looked for these consistent patterns, these recipes of user perceptions, emotions, and the type of incident that occurred.
Host: I love that. So let's talk about the results. What were some of the key recipes they found?
Expert: They found some very clear and distinct pathways. First, there's the outcome where users restrict their data—like changing privacy settings—but continue using the service. This typically happens when the user feels anxiety, believes the data sharing is an ongoing issue, and the violation itself is just seeing targeted web ads.
Host: So, if I see an ad for something I just talked about, I might get a little worried and check my settings, but I'm probably not deleting the app.
Expert: Exactly. You feel anxious, but it's not a huge shock. The recipe for leaving a service entirely is very different. The single most important ingredient they found was anger. When anxiety turns into real anger, that's the tipping point.
Host: And what triggers that anger?
Expert: The study found it's often more severe incidents. It's not about seeing an ad, but about receiving unsolicited contact—like those spam phone calls or emails. When that happens, and it's combined with a user who feels they have the power to act, what the study calls 'high self-efficacy', they are very likely to leave.
Host: So feeling empowered to delete your account, combined with anger from a serious violation, is the recipe for disaster for a company.
Expert: Yes, that or when the user's basic expectations of privacy were completely shattered. If they truly trusted a service not to share their data in that way, the sense of betrayal, combined with anger, also sends them straight for the exit.
Host: This is the most important part for our listeners, Alex. What are the key business takeaways from this? How can leaders apply these insights?
Expert: The biggest takeaway is that a one-size-fits-all response to privacy issues is a huge mistake. Businesses need to understand the context. Seeing a weird ad creates anxiety; getting a spam call creates anger. You can't treat them the same.
Host: So you need to tailor your response based on the severity and the likely emotion.
Expert: Absolutely. My second point would be to recognize that unsolicited contact is a red line. The study makes it clear that sharing data that leads to a user being directly contacted is far more damaging than sharing it for advertising. Businesses must be incredibly careful about who they partner with.
Host: That makes sense. What else?
Expert: Monitor user emotions. Anger is the key predictor of customer churn. Companies should actively look for expressions of anger in support tickets, app reviews, and on social media when privacy issues arise. Responding to user anxiety with a simple FAQ might work, but responding to anger requires a public apology, a clear change in policy, and direct action.
Host: And finally, you mentioned that empowered users are more likely to leave.
Expert: Yes, and that's critical. As people become more aware of privacy laws like GDPR and how to manage their data, companies can no longer rely on users just sticking around out of convenience. The only defense is proactive transparency. Be crystal clear about your data practices upfront to manage expectations *before* a violation ever happens.
Host: So, to summarize: it's not just that a privacy violation happens, but the specific combination of the incident, like web ads versus a phone call, and the user's emotional response—anxiety versus anger—that dictates whether they stay or go.
Host: For businesses, this means understanding these different 'recipes' for user behavior is absolutely crucial for building trust and, ultimately, for retaining customers.
Host: Alex, this has been incredibly insightful. Thank you for breaking that down for us.
Expert: My pleasure, Anna.
Host: And thank you for tuning into A.I.S. Insights, powered by Living Knowledge.
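For readers who want a concrete feel for the configurational logic discussed in the episode, the sketch below shows a crisp-set, truth-table-style analysis in Python: cases are grouped by their exact combination of conditions, and a combination counts as a 'recipe' for an outcome if enough of its cases show that outcome. The cases, condition names, and the 0.8 consistency cut-off are illustrative assumptions, not the study's data or calibration, and the logical minimization step of a full QCA is omitted.

```python
from collections import defaultdict

# Hypothetical cases: binary ("crisp-set") conditions plus the observed outcome
# (1 = the user discontinued the service, 0 = the user stayed).
# These rows are invented for illustration only; they are not the study's data.
cases = [
    {"anger": 1, "anxiety": 0, "unsolicited_contact": 1, "self_efficacy": 1, "discontinued": 1},
    {"anger": 1, "anxiety": 0, "unsolicited_contact": 1, "self_efficacy": 1, "discontinued": 1},
    {"anger": 0, "anxiety": 1, "unsolicited_contact": 0, "self_efficacy": 0, "discontinued": 0},
    {"anger": 0, "anxiety": 1, "unsolicited_contact": 0, "self_efficacy": 1, "discontinued": 0},
    {"anger": 1, "anxiety": 1, "unsolicited_contact": 1, "self_efficacy": 0, "discontinued": 1},
]

CONDITIONS = ("anger", "anxiety", "unsolicited_contact", "self_efficacy")
CONSISTENCY_THRESHOLD = 0.8  # arbitrary cut-off chosen for this sketch

# Build a truth table: group cases by their exact combination of conditions.
truth_table = defaultdict(list)
for case in cases:
    configuration = tuple(case[c] for c in CONDITIONS)
    truth_table[configuration].append(case["discontinued"])

# A configuration counts as a 'recipe' for leaving if the share of its cases
# showing the outcome (its consistency) reaches the threshold.
for configuration, outcomes in truth_table.items():
    consistency = sum(outcomes) / len(outcomes)
    if consistency >= CONSISTENCY_THRESHOLD:
        recipe = dict(zip(CONDITIONS, configuration))
        print(f"recipe for leaving: {recipe} (consistency {consistency:.2f}, n={len(outcomes)})")
```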
Privacy Violation, Secondary Use, Qualitative Comparative Analysis, QCA, User Behavior, Digital Services, Data Privacy