As Analysts, We Must Approach Data with Skepticism and Critical Thinking

Data is everywhere. We must remain curious.

If we don’t remain curious about data and how it’s being used, or misused, it becomes far too easy to accept statistics at face value. When we’re bombarded with reports and summaries from seemingly credible sources, we often fail to question the assumptions, biases, and methodologies behind those findings. As professionals who work with data, however, it is our responsibility to think critically and stay skeptical, especially when data is used to push a specific agenda or narrative.

Take, for example, a recent LinkedIn post that made sweeping claims about the advantages of in-office work, drawing on both personal experience and unspecified data. The author suggests that relationships are stronger in person and that this leads to better outcomes when promotions and layoffs are decided. While this seems to make intuitive sense based on "basic human psychology," it’s a perfect example of why analysts need to be cautious. Such claims, however compelling, are often supported by data that isn’t transparent, is oversimplified, or is presented in a way that reinforces existing biases.

Transparency in Data: The Foundation of Credibility

One of the first red flags we should watch for when analyzing data is a lack of transparency. In the case of the study cited in the LinkedIn post, the dataset isn’t available for independent verification. Instead, the study offers only summary statistics, broad overviews that tell part of the story but leave critical details hidden from view.

Why does this matter? Without access to the underlying data, it’s impossible to evaluate the robustness of the analysis. Were the data collection methods sound? Were important variables controlled for, or were outliers excluded? These questions remain unanswered when we can’t scrutinize the data for ourselves. For analysts, transparency is everything. It allows us to verify findings, understand the context in which the data was collected, and assess whether the conclusions drawn are justified or misleading.

When summary statistics are all that’s available, we should be wary. Conclusions derived from partial information can be easily skewed, either intentionally or unintentionally, to serve a particular agenda.
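To make that risk concrete, here is a minimal sketch in Python, using entirely made-up salary figures rather than anything from the study, showing how two very different datasets can share the same headline average. If all you’re handed is the summary, you can’t tell the two situations apart:

```python
import statistics

# Hypothetical salary samples in $K: illustrative only, not from any real study.
team_a = [88, 90, 92, 94, 96, 98, 100, 102, 104, 106]  # tightly clustered around the mean
team_b = [60, 62, 64, 66, 68, 70, 72, 74, 200, 234]    # a low-paid majority plus two high outliers

for name, salaries in [("Team A", team_a), ("Team B", team_b)]:
    print(
        f"{name}: mean={statistics.mean(salaries):.0f}K, "
        f"median={statistics.median(salaries):.0f}K, "
        f"stdev={statistics.stdev(salaries):.1f}K"
    )
```

Both groups report the same average of 97K, yet the medians and spreads reveal completely different realities. A report that shares only the average hides exactly the detail a careful reader would need.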


The Danger of Oversimplified Assumptions

Another critical issue with many workplace studies, like the one implied in the LinkedIn post, is the oversimplification of key assumptions. For example, the study’s authors reportedly used a 50-mile radius to define remote workers. This may seem like a straightforward way to differentiate between in-office and remote employees, but it ignores the growing complexity of modern work arrangements.

Many employees operate within hybrid models, splitting their time between working remotely and commuting to the office. A 50-mile classification doesn’t capture the full range of experiences for someone who spends part of the week working from home and the other part commuting for team meetings or specific projects. It’s a blunt instrument being used to measure a nuanced reality.

By relying on such a narrow definition, the study potentially misrepresents the dynamics of remote and hybrid work, making its conclusions less reliable. As data analysts, it’s our job to question these assumptions and consider how they might distort the findings. When studies reduce complex human behaviors to binary categories, we lose out on the richness of the data that could lead to more actionable and nuanced insights.
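As a hedged illustration, with hypothetical labels and thresholds rather than the study’s actual methodology, here is how a distance-only rule can mislabel people whose real behavior is hybrid:

```python
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    miles_from_office: float
    office_days_per_week: int  # how often they actually commute in

def label_by_radius(emp: Employee, radius_miles: float = 50) -> str:
    """The blunt rule: distance alone decides the label."""
    return "remote" if emp.miles_from_office > radius_miles else "in-office"

def label_by_behavior(emp: Employee) -> str:
    """A still-simplified rule that at least looks at actual attendance."""
    if emp.office_days_per_week == 0:
        return "fully remote"
    if emp.office_days_per_week >= 4:
        return "mostly in-office"
    return "hybrid"

workers = [
    Employee("lives 5 miles away, never commutes", 5, 0),
    Employee("lives 80 miles away, in office 3 days a week", 80, 3),
]

for w in workers:
    print(f"{w.name}: radius rule -> {label_by_radius(w)}, behavior rule -> {label_by_behavior(w)}")
```

The worker who lives five miles away but never commutes gets counted as in-office, while the one eighty miles out who shows up three days a week gets counted as remote. Any conclusions built on those labels inherit the distortion.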


Aggregated Data Masks Critical Insights

One of the biggest pitfalls in data analysis is the use of aggregate statistics, which, while useful for painting a broad picture, often mask critical insights hidden within the data. Averages and summary percentages tend to flatten the nuances that could provide much deeper understanding.

For example, how does the supposed benefit of in-person relationships differ when broken down by age, experience, or industry? Does an early-career professional experience the same value from being in the office as a seasoned executive? Does a tech worker who spends most of their time on individual coding projects really benefit from in-person collaboration, or would they be just as productive (if not more so) working remotely?

Without breaking the data into more granular categories, we miss these important distinctions. For job seekers and employers alike, these details are far more actionable than broad statistics about workplace dynamics. Yet, many studies prioritize generalized conclusions that hide these critical insights, limiting their practical value.
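Here is a small sketch with fabricated numbers, not drawn from any real study, showing how an aggregate comparison can point one way while the subgroup breakdown tells a more nuanced, even contradictory, story:

```python
from statistics import mean

# Fabricated records purely for illustration, not from any real study.
# Each record: (career_stage, work_mode, promotion_rate)
records = [
    ("early-career", "in-office", 0.18), ("early-career", "remote", 0.10),
    ("early-career", "in-office", 0.20), ("early-career", "remote", 0.12),
    ("senior", "in-office", 0.11), ("senior", "remote", 0.15),
    ("senior", "in-office", 0.09), ("senior", "remote", 0.17),
]

def avg_rate(rows):
    return mean(r[2] for r in rows)

# The headline comparison: one aggregate number per work mode.
for mode in ("in-office", "remote"):
    rows = [r for r in records if r[1] == mode]
    print(f"overall {mode}: {avg_rate(rows):.3f}")

# The disaggregated view: the picture changes by career stage.
for stage in ("early-career", "senior"):
    for mode in ("in-office", "remote"):
        rows = [r for r in records if r[0] == stage and r[1] == mode]
        print(f"{stage} {mode}: {avg_rate(rows):.3f}")
```

In this toy example the overall average favors in-office work, yet the breakdown shows the effect reversing for senior employees. Without the disaggregated view, that distinction simply disappears.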


The Role of Bias in Data Interpretation

Perhaps one of the most significant yet overlooked issues in data interpretation is bias, both in how the data is presented and how we as analysts or consumers of data interpret it. Data is rarely neutral. It’s shaped by the people who collect it, the methods they choose, and the narratives they want to support.

In the case of the LinkedIn post, the author’s argument is framed in a way that anticipates and dismisses dissent. By labeling remote work supporters as the "this isn’t fair crowd," the post immediately frames any opposition as emotional and irrational. This is a classic example of how data can be used to reinforce confirmation bias. The data supporting the superiority of in-person work is accepted at face value, while alternative perspectives are marginalized or ignored.

As analysts, we need to remain hyper-aware of these biases, not only in the data we analyze but also in our own interpretations. Confirmation bias can easily lead us to cherry-pick the data that supports our preconceived notions while ignoring or dismissing evidence that contradicts them.

In this case, there’s a clear preference for in-person work (even as the post’s author claims their own team is “remote”), and data is being wielded to back up that perspective. But what about the data showing that remote work can increase productivity, employee satisfaction, and work-life balance? Those perspectives are often left out of the conversation when the narrative is so strongly skewed toward reinforcing traditional work models.


Understanding the Incentives Behind Data

Finally, it’s essential to consider who benefits from the data being presented. In this instance, the study’s authors profit from selling employment-related data. The more controversial the topic, such as remote work versus in-person work, the more engagement their reports are likely to generate. This kind of engagement drives social media buzz, increases brand visibility, and ultimately leads to higher sales.

When we encounter data that seems to support a divisive or hot-button issue, we need to ask ourselves: Who gains from this narrative? In this case, it’s not just about understanding the data itself but also recognizing the incentives behind its presentation. Data is a tool, and like any tool, it can be used to build or to manipulate.


Critical Thinking: The Analyst’s Responsibility

As data professionals, our role goes beyond crunching numbers. We must cultivate a skeptical mindset that questions the assumptions, methods, and motivations behind the data we analyze. This critical thinking is essential not just for ensuring the accuracy of our own work but also for protecting against the manipulation of data by others.

When we see claims backed by data, whether in favor of in-person work, remote work, or any other topic, we need to dig deeper.

We must ask:

  • Is the data fully accessible and transparent?

  • What assumptions were made, and how might they limit or skew the results?

  • Are the findings based on aggregated data that oversimplifies a complex reality?

  • What biases might be present, either in the data collection or in its interpretation?

  • And perhaps most importantly, who stands to benefit from this data-driven narrative?

Only by asking these tough questions can we ensure that the data we work with is truly informative, not manipulative. As professionals entrusted with the power of data, it’s our duty to challenge surface-level interpretations and seek out the deeper truths hidden beneath the numbers.


Critical thinking and skepticism are not just optional tools for data analysts; they are essential. Without them, we risk allowing flawed, biased, or incomplete data to influence important decisions. In a world where data is used to shape narratives and drive opinions, our responsibility is to approach it with a healthy dose of doubt and a commitment to uncovering the full story.

“Data is everywhere. Stay curious!”

-Jason Thompson, CEO, 33 Sticks

jason thompson

jason is the co-founder and CEO of 33 Sticks, where his purpose is to create positive experiences for employees, customers, and the marketplace.

He is an Industry Fellow at East Tennessee State University’s Research Corporation, providing experiential learning opportunities for students around brand strategy and analytics. He is also the co-author of the analytics children’s book A is for Analytics.

https://www.hippieceolife.com/