When Optimization Becomes Manipulation

It was a seemingly innocent moment. This morning, I downloaded a new app to my iPhone. Like countless times before, I tapped the app icon, eager to explore its features. Instead of welcoming me to the experience, the first screen demanded I sign up for a "free trial," prominently displaying a yearly subscription option labeled "BEST VALUE."

A familiar feeling of disappointment washed over me. I knew the app offered a completely free version, yet here it was, pushing me toward a subscription before I'd even experienced a single feature. My eyes searched for an escape route, eventually finding a barely visible gray "X" cleverly placed in the top left corner, not the right where most users would expect it, and deliberately designed to blend into the background.

This wasn't a design oversight. It was calculated manipulation.



The Human Behind the Screen

Just yesterday, I presented to a group of data leaders at one of our 33 Sticks clients, discussing this very phenomenon: what happens when data loses its human context. The timing couldn't have been more perfect; this morning's experience was the exact illustration of my message.

"When we remove the human element," I had told them, "when we stop seeing people behind the data, we start to do a lot of bad things."

A friend once remarked to me that "unintentional evil is still evil," referencing how social media platforms might not have been designed with malicious intent, yet their engagement-obsessed algorithms often create harmful outcomes. Sometimes it's not evil but simply unethical: disrespecting humans, breaking trust, or manipulating behavior.

This app's dark pattern tells the same story. Someone made deliberate choices to hide the free option, hoping users would initiate trials they might forget to cancel. Someone designed that camouflaged X, knowing its invisibility would drive conversion metrics higher. Someone removed the human from the equation.



From Metrics to Manipulation

There's a slippery slope in our industry that few are willing to acknowledge. It begins with innocent-sounding phrases: "We're driving engagement." Or "We're creating more efficient conversion funnels." Or "We're maximizing our marketing campaign conversion rates."

But without the anchor of human empathy, optimization inevitably becomes manipulation.

I've witnessed this transformation countless times. Teams start with genuine intentions to improve experiences, then gradually drift toward dark patterns: urgency banners creating false scarcity, endless scrolls designed for addiction rather than value, and misleading interfaces that trick rather than guide.

We're designing for dopamine hits rather than dignity. We're optimizing for quarterly revenue reports rather than human relationships. We're creating experiences that serve our metrics, not our users.



The Moment of Truth

As I stared at that hidden X, I couldn't help but imagine the meeting where this design was approved. Did anyone speak up? Did anyone ask, "Are we respecting our users with this design?" Or was the conversation solely about conversion rates and trial sign-ups?

This is the moment of truth that faces all of us in the digital industry. When metrics and human dignity come into conflict, which will we choose?

Too often, we convince ourselves that small deceptions are simply "good business." We rationalize that everyone else uses these tactics, so we must too. We tell ourselves users are savvy enough to see through our manipulations, even as we deliberately design to exploit their cognitive biases and attention limitations.



Reclaiming Empathy in Data

What would data-driven experiences look like if we refused to separate data from human context? What if every dashboard included not just conversion rates but also measures of user dignity, clarity, and genuine satisfaction?

The path forward isn't mysterious, but it does require courage:

  • Start by seeing the whole person, not just their behavioral data. When analyzing user journeys, ask not just "Did they convert?" but "Did they understand their choices?" and "Did they feel respected by this interaction?"

  • Reframe success metrics to include both business outcomes and human experience. A "successful" experience isn't just one that extracts value from users; it's one that creates mutual value exchange where users feel they've been treated fairly.

  • Create space for ethical conversations in design and development processes. Simple questions like "Would we be comfortable explaining this design choice directly to our users?" can prevent many dark patterns from emerging.

  • Build a culture where team members feel empowered to raise concerns when designs drift toward manipulation. The most valuable person in any organization is the one who asks, "Is this right?" when everyone else is asking, "Will this work?"



The Courage to Choose Differently

As I finally navigated past that deceptive subscription screen this morning, I wondered about the people who created it. Did they feel pride in their work? Did they go home and tell friends about the clever ways they'd boosted conversion rates by hiding the free option? Or did they feel that subtle disconnect that comes from knowing your work doesn't align with your values?

For data professionals, designers, product managers, and executives, this is our daily choice. Will we use our expertise to manipulate, or will we commit to serving real humans with dignity and respect?

I've seen companies that choose the harder, more empathetic path. They're transparent about pricing. They make cancellations as simple as sign-ups. They explain clearly what data they collect and why. They design for clarity rather than confusion.

Interestingly, these companies often outperform their manipulative competitors in the long run. Why? Because trust creates sustainable relationships, while manipulation yields only temporary gains.



The Question That Remains

As I meet with data teams across the industry, I'm repeatedly struck by a troubling reality: many professionals know their design choices are unethical, yet they stay silent. Perhaps they don't feel they have the platform to push back against leadership decisions. Perhaps they worry about job security. Perhaps they've simply grown numb to these practices.

So I'll end with the question I asked in that client meeting yesterday, the same question that nagged at me when I encountered that hidden X this morning:

When are we going to start pushing back against unethical choices that devalue the humans using our products?

The answer begins with each of us, in the small choices we make every day between metrics and humanity, between manipulation and empathy, between optimization that exploits and optimization that serves.

The data itself isn't the problem. The problem emerges when we forget what that data represents: human beings with dignity who deserve our respect.

Jason Thompson

Jason Thompson is the CEO and co-founder of 33 Sticks, a boutique analytics company focused on helping businesses make human-centered decisions through data. He regularly speaks on topics related to data literacy and ethical analytics practices and is the co-author of the analytics children's book 'A is for Analytics'.

https://www.hippieceolife.com/