Data Quality Engineer
Calimala partners with enterprises across the Gulf and Europe to design, build, and scale Data & AI teams. As a Data Quality Engineer, you’ll join a network of practitioners who understand that no matter how advanced the platform or model, neither delivers value unless the underlying data is reliable.
This role sits at the core of how data is validated, monitored, and improved. You’ll design and implement data quality controls across critical pipelines and domains—helping clients define what “good” looks like, detect issues quickly, and feed improvements back into upstream processes and systems.
What you'll be doing
As a Data Quality Engineer at Calimala, you’ll lead and support engagements where trust in data is business-critical. One project might involve defining quality rules and checks for a new data platform; another could focus on building automated tests and monitoring on top of existing pipelines, surfacing issues before they reach dashboards or models.
“We treat data quality as a continuous practice: measurable expectations, automated checks, and clear ownership when something doesn’t look right.”
You’ll work closely with data engineers, governance teams, and business stakeholders to understand how data is used and where errors matter most. You’ll help define quality dimensions and KPIs, implement rule engines and validation logic, and create feedback loops so that recurring issues are fixed at the source rather than patched downstream.
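To make the idea of validation logic concrete, here is a minimal sketch of a row-level rule engine in Python. Everything here is illustrative: the rule names, the sample records, and the engine itself are invented for this example, and real engagements would typically use platform-native or dedicated tooling rather than hand-rolled code.

```python
# Minimal sketch of a row-level validation rule engine (illustrative only).
# Rules are (name, predicate) pairs; failures are collected for reporting.

def run_rules(rows, rules):
    """Apply each (name, predicate) rule to every row; collect failures."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules:
            if not predicate(row):
                failures.append({"row": i, "rule": name})
    return failures

# Invented sample data: one clean order, one with two quality issues.
orders = [
    {"order_id": "A-1", "amount": 120.0, "country": "AE"},
    {"order_id": None,  "amount": -5.0,  "country": "DE"},
]

rules = [
    ("order_id_present", lambda r: r["order_id"] is not None),
    ("amount_non_negative", lambda r: r["amount"] >= 0),
]

print(run_rules(orders, rules))
# The second order fails both rules; the first passes cleanly.
```

The output of a run like this is what feeds the feedback loop: each failure names a rule and a record, so recurring patterns can be traced back to the upstream source rather than patched in place.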
Who we're looking for
You’re comfortable working close to the data, with a strong eye for detail and a practical mindset. You can profile datasets, design checks that matter, and translate business expectations into technical rules and monitoring.
You’ve likely worked in data engineering, BI, or data management roles and found yourself naturally gravitating toward quality and consistency. At Calimala, we value depth, accountability, and partnership—you take pride in knowing that people can rely on the numbers because the right controls are in place. You’ll likely bring:
Strong SQL skills and experience working with large datasets in warehouses or data platforms
Hands-on experience implementing data quality checks, validation rules, or reconciliation processes
Familiarity with data quality and profiling tools (native to the platform or dedicated solutions)
Understanding of key quality dimensions (completeness, accuracy, consistency, timeliness, uniqueness, validity) and how to measure them
Experience working with data engineers and analysts to embed checks into pipelines (batch and/or streaming)
Exposure to monitoring and alerting practices for data quality, including dashboards and reporting to stakeholders
Awareness of how data quality ties into governance, regulatory requirements, and business decision-making
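Each of the quality dimensions listed above can be turned into a measurable metric. Here is a minimal sketch for two of them, completeness and uniqueness, computed over plain Python records; the column and data are invented for illustration, and in practice these metrics would usually be expressed as SQL or in a profiling tool.

```python
# Illustrative metrics for two common data quality dimensions.

def completeness(rows, column):
    """Share of rows where the column is populated (non-null)."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def uniqueness(rows, column):
    """Share of distinct values among populated rows."""
    values = [r[column] for r in rows if r.get(column) is not None]
    if not values:
        return 1.0
    return len(set(values)) / len(values)

# Invented sample: a duplicated email and a missing one.
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "a@example.com"},
    {"id": 3, "email": None},
]

print(completeness(customers, "email"))  # 2 of 3 rows populated
print(uniqueness(customers, "email"))    # 1 distinct value of 2 populated
```

Scores like these only become useful once paired with agreed thresholds and owners—the point of the monitoring and alerting practices mentioned above.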
We’re looking for practitioners who see data quality as part of the product, not an afterthought: people who enjoy making issues visible, fixing root causes, and building systems where reliable data is the default, not the exception.