The Espresso Paradox: How Automation Might Be Diluting Our Analytical Expertise

As a data professional who has spent decades in analytics, i've found myself drawing parallels between two long-standing passions: data analysis and espresso making.

The connection might seem a stretch at first glance, but as i'll explain, there's a profound lesson here about expertise, automation, and the future of analytical thinking, especially given the current excitement around Generative AI.

33 Sticks Data Correlation Small Batch

Origin: Uganda
Bean: AAA
Roast: Dark
Roaster: Kip Rolfe, Owner of Coast 2 Coast Coffee Roasters located in Los Angeles, California

The Art Behind the Cup

i'm what you might call a coffee connoisseur. My journey into the world of coffee spans multiple decades, exploring various methods from roasting to preparation. My home hosts an arsenal of brewing equipment: a cold brewer, a stovetop percolator, an AeroPress, a Hario V60 pour-over, and my daily workhorse, the Breville Barista Express espresso machine.

It's this espresso machine that became the catalyst for my epiphany about data analysis in the Generative AI era.

Making espresso seems straightforward at first, but it conceals remarkable complexity. For those who view coffee as merely a caffeinated necessity, the nuances might seem trivial. But for someone who appreciates a truly exceptional cup, especially when investing in high-quality, small-batch beans, the process requires methodical attention to many, many variables.

Consider what goes into a single shot:

Bean selection and dose weight are critical. While recipes might suggest 18 grams, i've learned through experience that the optimal amount might be 19 grams for one bean and as little as 16 for another. The proper weight depends on roast level, bean origin, farm practices, and processing methods: knowledge that comes only through experience, analysis, and meticulous data capture.

The grind size selection is equally crucial. My mid-level grinder offers about 40 different settings, and selecting the right one requires understanding the bean's origin, processing method (natural, washed, honey, etc.), roast darkness, and even how many days have passed since roasting. These variables all impact the ideal grind size for that specific bean.

Then comes the leveling and tamping of the grounds. If you've miscalculated the grind or applied incorrect pressure, the extraction will fail. Too dense, and water won't flow, "choking the puck," as baristas say. Too loose, and water races through, under-extracting the coffee. What we seek is the Goldilocks zone of perfect extraction.

Water temperature matters too, with variations of just a few degrees between 198°F and 205°F drastically changing the outcome. And then there's the extraction ratio: in my case, typically 18 grams of ground coffee in and 36 grams of espresso out, a 1:2 ratio, ideally extracted over about 30 seconds.
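
For the data-minded reader, that recipe reduces to a simple check. Here's a minimal Python sketch of the mental arithmetic i run on every shot, testing whether the measured yield and time land near the 1:2 ratio and 30-second target described above. The tolerance values are illustrative assumptions, not canonical numbers.

```python
# Illustrative shot check: dose in, beverage out, extraction time.
# Targets mirror the 18 g in / 36 g out / ~30 s recipe above;
# the tolerances are assumptions for the sake of the example.

TARGET_RATIO = 2.0      # beverage out / dose in (a 1:2 espresso ratio)
RATIO_TOLERANCE = 0.15  # acceptable deviation from the target ratio
TIME_WINDOW = (25, 35)  # seconds; a rough "Goldilocks" extraction window


def diagnose_shot(dose_g: float, yield_g: float, seconds: float) -> str:
    """Return a rough diagnosis of a single espresso shot."""
    ratio = yield_g / dose_g
    if seconds < TIME_WINDOW[0]:
        return f"ran fast ({seconds:.0f}s): likely under-extracted, grind finer"
    if seconds > TIME_WINDOW[1]:
        return f"ran slow ({seconds:.0f}s): likely over-extracted, grind coarser"
    if abs(ratio - TARGET_RATIO) > RATIO_TOLERANCE:
        return f"ratio off (1:{ratio:.2f}): adjust the dose or the shot's stop point"
    return f"on target: 1:{ratio:.2f} in {seconds:.0f}s"


print(diagnose_shot(18.0, 36.5, 29))  # on target: 1:2.03 in 29s
print(diagnose_shot(18.0, 36.0, 22))  # ran fast: grind finer
```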

What makes this process educational is that i perform each step manually. i understand intimately how each variable impacts the final outcome. When something goes wrong, when a shot comes out too weak, too strong, under-extracted, or over-extracted, i can identify precisely which part of the process needs adjustment. This knowledge comes from years of experimentation, countless successes and failures, and rigorous data capture and analysis.
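
That "rigorous data capture" doesn't need to be elaborate. As a sketch, a per-shot log might look like the following Python record; the field names and sample values are my own illustration, not a standard schema.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class ShotLog:
    """One row of espresso data, mirroring the variables discussed above."""
    bean: str            # origin and roaster, e.g. "Uganda AAA, Coast 2 Coast"
    roast_date: date     # days since roast affect the ideal grind
    dose_g: float        # grams of ground coffee in
    grind_setting: int   # grinder setting (my grinder offers about 40)
    water_temp_f: float  # brew temperature, typically 198-205°F
    yield_g: float       # grams of espresso out
    seconds: float       # extraction time
    notes: str = ""      # tasting notes: sour, bitter, balanced, ...


# A hypothetical entry after one pull:
shot = ShotLog("Uganda AAA", date(2025, 1, 10), 18.0, 14, 201.0, 36.0, 30.0,
               notes="balanced, slight chocolate")
```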


The Paradox of Automation

Now, let's contrast my approach with that of a friend who purchased a higher-end Breville machine on which most of the process is automated: bean weighing, grinding, tamping, extraction time, and water temperature.

For someone without experience pulling shots manually, a disappointing result becomes nearly impossible to diagnose. Not only does the novice lack the experience to identify the problem, but they also can't see inside the closed system of automation to understand what decisions the machine is making.

How does the machine select grind size? How does it determine tamping pressure or extraction time? Is it using my preferred 18:36 ratio, or is it making its own adjustments? We simply don't know, as these parameters remain hidden from view.

The machine may be making educated guesses based on brew head pressure and extraction time, but does it truly understand the bean's origin, roast level, oil content, or age? i doubt it. And without the foundational knowledge that comes from manual experience, users might simply trust that whatever comes out of this automated machine must be correct, without any baseline for comparison.

Even for those who can recognize suboptimal results, the automated process provides no framework for improvement. Without hands-on experience, how can one know whether to adjust the grind, the tamp, the temperature, or some other variable?


The Analytics Parallel

This brings me to analytics and the growing role of Generative AI in our field. My concern is that as we increasingly rely on Generative AI to perform analysis for us, we're robbing young professionals, and even seasoned ones, of crucial learning experiences.

Just as we might turn over espresso-making to an automated machine, we're beginning to offload analytical thinking to Generative AI systems. And unless individuals are extraordinarily curious and committed to understanding the underlying processes, most will likely take this as an opportunity to disengage from learning.

Push a button, out comes an espresso. Push a button, out comes analysis.

When we skip the learning process, we lose the ability to discern good analysis from bad. We place blind trust in systems without the foundational knowledge to evaluate their output critically. And even when we suspect something is amiss, we lack the experiential knowledge to identify where the problem lies.

Is it in the data collection? The processing methodology? The mathematical approach? The data pivoting? The visualization technique? Without having performed these steps ourselves, repeatedly, with both successes and failures, we simply don't know.


The Cost of Convenience

This automation-driven shortcut to expertise comes at a significant cost. It makes us fundamentally lazy, diminishes our critical thinking skills, and ultimately leaves us no choice but to place unfettered, unquestioning trust in Generative AI systems.

If we replace all our experienced analysts, those who have metaphorically "pulled their espresso manually," with automated systems, who remains to question whether the output is correct? Who maintains the institutional knowledge of how analysis is actually performed?

The most dangerous outcome isn't just suboptimal analysis; it's that we might not even recognize it as suboptimal. Without baseline knowledge developed through experience, we lose our ability to discriminate between insightful analysis and algorithmic guesswork.


Finding Balance

i'm not suggesting we reject automation or Generative AI-assisted analytics; far from it. These tools offer tremendous potential to scale our capabilities and tackle previously insurmountable challenges. But we must approach them as extensions of human expertise, not replacements for it.

Perhaps the solution lies in a blended approach, ensuring that every analyst, regardless of career stage, experiences the full analytical process manually before graduating to automated tools. Maybe we need to establish "analytical apprenticeships" where professionals learn by doing, making mistakes, and developing the intuitive understanding that can only come from hands-on experience.

Like the barista who understands every step of espresso-making before using an automated machine, the analysts of tomorrow need to build foundational knowledge before leveraging Generative AI shortcuts. Only then can they use these powerful tools with the discernment and critical eye necessary to ensure quality results.



We currently face a fundamental choice. We can embrace convenience at the cost of understanding, or we can commit to maintaining human expertise alongside technological advancement.

The latter path is undoubtedly more challenging, requiring investment in training, patience with learning curves, and tolerance for the inefficiencies inherent in human development. But it's also the only path that preserves our capacity for true analytical thinking, the kind that questions assumptions, challenges outputs, and drives genuine insight.

In both espresso-making and data analysis, the magic happens not when we automate away complexity but when we master it. The perfect shot, like the perfect analysis, comes from understanding each variable, each step, each potential pitfall in the process.



Written by: jason thompson

Jason is the CEO and co-founder of 33 Sticks, a boutique analytics company focused on helping businesses make human-centered decisions through data. He regularly speaks on topics related to data literacy and ethical analytics practices and is the co-author of the analytics children’s book ‘A is for Analytics’.


https://www.hippieceolife.com/