DataOps Engineer

Lisboa

Job description

We are seeking a skilled DataOps Engineer to join our team.
The ideal candidate will have a strong background in data engineering, with expertise in automation, containerization, monitoring, and data governance. This role requires a proactive individual who can design and implement robust data pipelines and infrastructure while adhering to DataOps best practices.

Responsibilities

  • Design, build, and maintain scalable data pipelines and infrastructure to support data-driven applications.
  • Implement and manage CI/CD pipelines for data workflows using automation tools.
  • Deploy and orchestrate containerized applications using Docker and Kubernetes.
  • Monitor data systems using tools like Prometheus, Grafana, or ELK Stack to ensure performance and reliability.
  • Apply DataOps practices to streamline data operations and improve collaboration between data teams.
  • Ensure data security and governance using tools like Open Policy Agent.
  • Develop and maintain solutions using Kafka and Elasticsearch in production environments.
  • Write and optimize code in Python or Java to support data processing and system integrations.
  • Contribute to the design and implementation of text search and analysis systems.
  • Collaborate with cross-functional teams to align data solutions with business objectives.

Requirements

  • Bachelor’s degree in IT or a related field.
  • Minimum of 3 years of proven experience in data engineering or similar roles.

Minimum of 2 years of proven experience with:

  • Automation and CI/CD tools (e.g., GitLab CI, Jenkins).
  • Containers (Docker) and orchestration (Kubernetes).
  • Monitoring tools such as Prometheus, Grafana, or ELK Stack.
  • DataOps practices.

Minimum of 1 year of proven experience with:

  • Infrastructure as Code (IaC) tools like Terraform or Ansible.
  • Data security and governance practices (e.g., Open Policy Agent).
  • Kafka and Elasticsearch in production environments.
  • Programming in Python or Java.
  • Text search and analysis systems.

Additional requirements:

  • Strong problem-solving skills and the ability to work both independently and collaboratively.
  • Excellent communication skills for interacting with technical and non-technical stakeholders.
Nice to have:

  • Certifications in relevant technologies (e.g., AWS Certified Data Engineer, Certified Kubernetes Administrator (CKA)).
  • Experience with cloud platforms (e.g., AWS, Azure, or GCP).
  • Familiarity with Agile or DevOps methodologies.
  • Knowledge of additional data processing frameworks (e.g., Apache Spark, Airflow).

