Getting Data Quality Right: A Tactical and Strategic Guide

Sean Hewitt
July 16, 2025
ABSTRACT: Data quality issues often reflect broken processes, not bad data. This article outlines a dual-track approach to improvement—balancing tactical fixes with long-term strategy.
What You’ll Learn in This Guide:
Why data quality issues persist—and how misalignment, not malfunction, is often the root cause.
How to strike a balance between quick tactical fixes and long-term strategy.
What a mature DQM program looks like—and how to grow into it step by step.
The role of diagnostic tools and workflow mapping in targeting the right problems.
A five-step tactical framework to stabilize data quality fast.
How to use SIPOC as a practical tool to align teams and uncover root causes.
What is data quality, and why does it remain so elusive, even for modern organizations?
Data quality is best understood as a contextual measure of fitness for purpose. What qualifies as 'good data' depends on how and where it's used. As Sean Hewitt explained during Datalere's recent webinar, How to Handle Nasty, Gnarly, and Pernicious Data Quality Issues, data is considered high-quality only if it is "fit for purpose." That purpose may differ across departments, products, or decision cycles, which makes managing quality a dynamic challenge.
Bad data is rarely malicious or even incorrect in a strict sense. Instead, it is often misunderstood, misapplied, or misaligned with expectations. This is why effective data quality management (DQM) is more about sustained alignment than one-time cleansing. A DQM framework offers that structure: a repeatable process to monitor, manage, and improve the usefulness of data over time.
“We don’t fix data—we fix the processes and systems that create it.” — Sean Hewitt, Data Governance Lead.
The Nasty, Gnarly, and Pernicious Challenges
Improving data quality often starts with fixing fields, but it rarely ends there. It requires examining the deeper processes that produce and manage the data. We describe persistent quality issues as "nasty, gnarly, and pernicious" because they are often buried in the very way organizations operate:
People: Data ownership is unclear. Teams are protective of "their" data, reluctant to share across departments, or fearful that quality improvements might expose past errors. In some cases, this reluctance stems from concerns about job security or territorial control.
Processes: Data gets pulled offline into spreadsheets, modified in silos, or re-entered manually, creating a chain of inconsistency and invisibility. Wayne Eckerson shared a real-world example from the Royal Canadian Mounted Police, where officers manually entered information into custom monitors after an incident, a process that invited inconsistency and error.
Metadata: There's often a lack of shared definitions, business rules, and lineage. What one team calls a "customer" might be radically different in another system.
Wayne noted that these problems intensify when data moves across systems and teams, as in partnerships or multi-team workflows. Synchronization becomes a moving target.
In these conditions, teams often end up chasing ghosts rather than solving real problems. Business users don’t trust the numbers. IT teams revalidate reports repeatedly. The data appears to be incorrect, even when it isn't.
A Multi-Angled Approach: Tactical Meets Holistic
As Sean put it, “you can’t do all of strategy and all of tactical at once,” a reminder that the goal is balance, not strict sequencing. This interplay between quick wins and structural work is especially valuable in environments where trust in data is low but expectations remain high.
Solving these issues requires a blended approach. Teams must act fast to regain trust while simultaneously laying the groundwork for sustainable quality. We suggest a dual path forward:
Tactical fixes restore stability. These are focused actions to reduce noise and address specific bottlenecks.
Holistic strategy builds structure. Over time, teams formalize ownership, governance, and shared practices.
These two modes are not sequential but symbiotic. As you build more strategic capabilities in the background—such as standard definitions, quality controls, or metadata models—you unlock more tools and processes that can be embedded into tactical workflows. The more structure you have in place behind the scenes, the more leverage you gain in addressing real-time issues.
Over time, this layered approach keeps the organization in sync, improves data quality metrics, and steadily reduces the operational backlog.
The Data Quality Management Program Journey
Once organizations acknowledge that tactical and holistic approaches must work in parallel, the next question becomes: what does a structured path toward long-term improvement look like?
The journey typically unfolds across six stages, moving from reactive fixes to embedded governance and continuous improvement:
1. Initiation: Teams begin by assigning responsibility, aligning on goals, and forming a shared vocabulary around data quality. This stage marks the shift from individual firefighting to collective intent.
2. Remediation: A dedicated working group—or “tiger team”—is formed to tackle recurring issues. Tactical fixes become standardized through documentation and process clarity, while quality-by-design practices start influencing new data flows.
3. Planning: At this stage, the organization starts to think strategically. Teams define formal DQM standards, identify critical data elements (CDEs), and lay the groundwork for a Center of Excellence or Community of Practice.
4. Implementation: Plans become operational: platforms are launched, CoEs/CoPs are activated, and proactive monitoring is embedded into workflows. Data quality becomes part of daily execution—not just ad hoc cleanup.
5. Optimization: Organizations expand monitoring, increase automation, resolve systemic root causes, and begin certifying critical reports. DQM is now an integrated part of operations, not a parallel initiative.
6. Sustain: Continuous improvement takes root. Regular reviews, governance integration, and stakeholder feedback loops ensure that quality remains aligned with business needs—even as they evolve.
See our DQM Program Journey slide for a visual summary of these stages.
If you're unsure how to structure or scale your data quality efforts, our consultants can help assess your current practices and recommend the right next steps.
→ Talk to Our Data Quality Experts
From Program to Practice: Applying Tactical Stabilization
Diagnostic Tools and Mapping
Before any fix can be effective, teams need to understand where data issues originate. That’s why the first step in any tactical approach is diagnostic: mapping workflows, interviewing stakeholders, and surfacing the underlying inefficiencies that quietly erode quality. These conversations often reveal undocumented steps, duplicate processes, or mismatched assumptions that prevent sustainable fixes.
With this shared visibility in place, teams are better equipped to take action. Tactical approaches help reduce noise, build trust, and buy time for longer-term improvements to take hold.
Tactical Approach
We recommend a five-step tactical approach, adapted from Lean and Six Sigma, to help teams regain control quickly:
1. Understand the Process and Issues: Map out how data flows across people and systems. Use tools like SIPOC (see section below) to identify where handoffs break down. Intake processes that span multiple departments often reveal inconsistencies, especially when data is entered manually or duplicated across disconnected systems.
2. Design Focused Improvements: Address upstream issues through clearer metadata, standardized terms, or aligned processes. Varying interpretations of core business entities, such as a 'customer', often create friction between teams and require both system-level changes and semantic alignment.
3. Plan the Implementation: Assign responsibilities, define escalation paths, and determine how improvements will be tracked. Without clearly defined ownership and cadence, even well-intentioned fixes can lose traction.
4. Implement and Standardize: Roll out the change through documented SOPs and governance routines. Improvements are more likely to stick when embedded into onboarding, training, and ongoing workflows.
5. Monitor and Manage: Track quality levels continuously and keep communication loops open. Automated monitoring and recurring reviews help maintain visibility and drive ongoing improvement; a minimal sketch of what such automated checks might look like follows this list.
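To make that last step concrete, here is a minimal sketch of rule-based quality monitoring in Python, assuming a hypothetical customer table with customer_id and email columns; the rules, thresholds, and column names are illustrative and would need to reflect your own agreed definitions.

```python
# A minimal sketch of rule-based data quality monitoring.
# The table, columns, rules, and thresholds are hypothetical examples.
import pandas as pd

def check_completeness(df: pd.DataFrame, column: str) -> float:
    """Share of rows where the column is populated."""
    return df[column].notna().mean()

def check_uniqueness(df: pd.DataFrame, column: str) -> float:
    """Share of rows whose value in the column appears only once."""
    return (~df[column].duplicated(keep=False)).mean()

def check_validity(df: pd.DataFrame, column: str, pattern: str) -> float:
    """Share of populated rows matching a simple format rule."""
    populated = df[column].dropna().astype(str)
    if populated.empty:
        return 0.0
    return populated.str.match(pattern).mean()

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Run the agreed rules and report pass rates against thresholds."""
    results = [
        ("customer_id completeness", check_completeness(df, "customer_id"), 1.00),
        ("customer_id uniqueness", check_uniqueness(df, "customer_id"), 1.00),
        ("email validity", check_validity(df, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+$"), 0.95),
    ]
    report = pd.DataFrame(results, columns=["rule", "pass_rate", "threshold"])
    report["status"] = (report["pass_rate"] >= report["threshold"]).map({True: "OK", False: "REVIEW"})
    return report

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [101, 102, 102, None],
        "email": ["a@example.com", "b@example", "c@example.com", None],
    })
    print(run_quality_checks(sample))
```

In practice, checks like these are usually scheduled inside a pipeline or orchestration tool so that pass rates are reviewed on a recurring cadence, which is what turns one-off cleanup into the kind of ongoing monitoring described above.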
The Role of SIPOC in Stabilization
To support this mapping effort, we recommend the SIPOC model—a Lean tool that helps teams break down complex processes into digestible parts.
“Use SIPOC as a conversation starter,” we often advise. “It’s a practical tool for surfacing early insights and getting everyone aligned on what the process really looks like.”
The acronym stands for:
Suppliers
Inputs
Process
Outputs
Customers
By filling out a SIPOC diagram collaboratively, cross-functional teams can align on what matters and who owns it, without needing a detailed systems diagram.
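If it helps to see the shape of a finished SIPOC, the sketch below captures one as a simple, reviewable structure. The customer-intake process and its entries are hypothetical and purely illustrative; a whiteboard or spreadsheet works just as well for the workshop itself.

```python
# A minimal sketch of recording a SIPOC breakdown as structured, reviewable data.
# The process and all entries are hypothetical examples.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Sipoc:
    process_name: str
    suppliers: List[str] = field(default_factory=list)
    inputs: List[str] = field(default_factory=list)
    process_steps: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    customers: List[str] = field(default_factory=list)

    def summary(self) -> str:
        """Render the breakdown as a short text summary for group review."""
        sections = [
            ("Suppliers", self.suppliers),
            ("Inputs", self.inputs),
            ("Process", self.process_steps),
            ("Outputs", self.outputs),
            ("Customers", self.customers),
        ]
        lines = [f"SIPOC: {self.process_name}"]
        for label, items in sections:
            lines.append(f"  {label}: " + "; ".join(items))
        return "\n".join(lines)

# Hypothetical customer-intake process, as it might be filled in during a mapping session.
customer_intake = Sipoc(
    process_name="Customer intake",
    suppliers=["Sales team", "Web signup form"],
    inputs=["Contact details", "Contract terms"],
    process_steps=["Validate details", "Create CRM record", "Notify onboarding"],
    outputs=["Verified customer record"],
    customers=["Onboarding team", "Billing"],
)

print(customer_intake.summary())
```

Keeping the breakdown in a versionable form like this makes it easy to revisit as ownership and definitions evolve, but the value comes from the conversation, not the format.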
[Download our SIPOC template here]
Many organizations engage external consultants at this stage to facilitate mapping sessions or SIPOC workshops. An outside lens often reveals what internal teams may overlook.
→ Talk to Our Data Quality Experts
Conclusion: Begin with What You Can Control
Improving data quality can feel overwhelming, but you don’t have to solve everything at once. Start with what you can see and influence. Map the flow. Find the friction. Fix what you can.
By focusing on tactical stabilization first, teams can quickly reduce noise, restore trust, and create the necessary space to build toward a more comprehensive program. Over time, these small, focused wins help formalize ownership, embed shared standards, and reduce the organizational drag caused by unreliable data.
In our next article, we’ll explore how engineering practices—from test-driven pipelines to reusable data components—can help teams embed quality by design.
Need Help Implementing a Data Quality Program?
Whether you're mapping workflows, planning tactical fixes, or launching a formal DQM program, we can help.
→ Book a Data Quality Assessment
Key Takeaways:
Data quality problems often stem from misalignment—not malfunction. Inconsistent definitions, disconnected workflows, and siloed ownership create more issues than outright data errors.
Tactical fixes stabilize, strategy sustains. Quick interventions help restore trust, but long-term health requires formal programs, shared standards, and proactive monitoring.
Understanding comes before fixing. Tools like workflow mapping and SIPOC uncover hidden breakdowns, helping teams prioritize the right interventions.
Progress is layered. Tactical improvements buy time and confidence, while strategic work in the background builds a stronger foundation.
You don’t need to fix everything at once. Start where you have visibility and influence—then grow your program with intention.

Sean Hewitt
A proven leader in Data Governance, Privacy, and Analytics with a solid track record of managing teams, defining needs and delivering solutions. Over 20 years of experience working in a...
Talk to Us