Data Quality Engineer

Job description

We are looking for a Data Quality Engineer to join our Data Engineering team. In this role, you will work alongside Data Engineers as the champion of data quality and quality engineering practices, ensuring high standards from ingestion to presentation across our data ecosystem.
You’ll take a hands-on role in the design, development, and scaling of automated data quality frameworks, contributing to improved data reliability, testing coverage, and observability across our pipelines and systems. You will work to shift quality left, embedding quality-focused thinking early in the development lifecycle, and foster a culture of shared quality ownership across teams.
Your day-to-day will involve validating data transformations and pipelines, developing tests to catch anomalies, monitoring and refining data taxonomy structures, identifying issues early, and collaborating with stakeholders to proactively improve data systems and processes.

What You'll Do:

  • Serve as the voice of data quality within the Data Engineering team.
  • Design and implement automated testing and alerting frameworks to monitor data pipelines.
  • Drive quality from data ingestion through to data presentation layers.
  • Collaborate with data engineers and stakeholders to ensure quality is built in from the start.
  • Analyze complex datasets to identify inconsistencies, trends, and areas for improvement.
  • Enhance and maintain proprietary and standardized taxonomies.
  • Ensure performance and reliability across data systems and pipelines.
  • Promote and apply best practices for testing with data in mind across the broader engineering community.



Requirements

What We’re Looking For:

  • 5+ years of experience in Data Engineering or Data Quality Engineering roles in fast-paced environments.
  • Deep understanding of data flows, patterns, and common error sources in large-scale environments.
  • Experience working with large-scale enterprise data warehouses/data lakes, data integration, data migration, and data quality verification.
  • Experience with both proprietary and open-source big data technologies and platforms (Snowflake, Databricks, Vertica, Spark, Airflow) and open-source/third-party data quality tools.
  • Sound understanding of cloud technologies, especially AWS.
  • Experience defining and building automated data quality monitoring, testing, and alerting frameworks for data projects.
  • Strong communication and collaboration skills, able to work effectively with developers, analysts, and leadership.
  • Proactive mindset with strong problem-solving instincts and a curiosity for root-cause analysis.


Want to apply?
Position
Name*
Email*
Phone number*
Country*
City*
LinkedIn
Upload your CV* (max. 4MB)
Upload your photo or video (max. 4MB)