Data Quality Engineer
Job description
What You'll Do:
· Serve as the voice of data quality within the Data Engineering team.
· Design and implement automated testing and alerting frameworks to monitor data pipelines.
· Drive quality from data ingestion through to data presentation layers.
· Collaborate with data engineers and stakeholders to ensure quality is built in from the start.
· Analyze complex datasets to identify inconsistencies, trends, and areas for improvement.
· Enhance and maintain proprietary and standardized taxonomies.
· Ensure performance and reliability across data systems and pipelines.
· Promote and apply best practices for testing with data in mind across the broader engineering community.
Requirements
What We’re Looking For:
- 5+ years of experience in Data Engineering or Data Quality Engineering roles in fast-paced environments.
- Deep understanding of data flows, patterns, and common error sources in large-scale environments.
- Experience working with large-scale enterprise data warehouses/data lakes, data integration, data migration, and data quality verification.
- Experience with both proprietary and open-source big data technologies and platforms (Snowflake, Databricks, Vertica, Spark, Airflow) and open-source/third-party data quality tools.
- Sound understanding of various cloud technologies, especially AWS.
- Experience defining and crafting automated data quality monitoring, testing, and alerting frameworks for data projects.
- Strong communication and collaboration skills; able to work effectively with developers, analysts, and leadership.
- Proactive mindset with strong problem-solving instincts and a curiosity for root-cause analysis.