
Data Cloud Platform Architect (Porto)


Job description

We are seeking a Data Cloud Platform Architect based in Porto, with over 10 years of expertise in AWS data services, extensive experience in data engineering, and a background in European corporate environments.

Exposure to Azure technologies is a plus. The ideal candidate will have experience with Infrastructure as Code and a proven ability to implement and enforce data governance and security standards. You should have a strong track record of leading large-scale cloud data projects, as well as hands-on coding skills in Python and SQL for prototyping and setting technical standards.

You will be responsible for designing, implementing, and maintaining scalable data solutions, primarily on AWS, ensuring alignment with defined data architecture and governance frameworks. You will collaborate with cross-functional teams to develop and optimise data pipelines, ETL processes, and orchestration workflows using tools such as Apache Airflow, AWS Step Functions, Lambda, and Glue. Experience with Iceberg tables and managing large datasets is required. Exposure to Azure platform technologies, including Databricks, is considered an advantage.

As a Data Cloud Platform Architect, you will translate business requirements into technical solutions, distribute tasks effectively within your team, and select the most appropriate tools for cost and performance. You will provide technical leadership, mentorship, and guidance, ensuring adherence to best practices in production awareness, troubleshooting, and security.

Hybrid: 3 days per week onsite

Requirements

Expected contribution and deliverables
• Demonstrate deep expertise in AWS data services, including Lambda, Glue, Step Functions, and Redshift, with exposure to Azure data services.
• Lead the implementation and enforcement of data governance and security standards across cloud platforms.
• Oversee the implementation of Infrastructure as Code (IaC) solutions using Terraform and CloudFormation.
• Lead optimisation projects, ensuring minimal disruption and maximum efficiency.
• Apply hands-on coding skills in Python and SQL to prototype solutions and establish coding standards.
• Develop robust solution architectures with a focus on scalability, performance, security, and cost optimisation.
• Design efficient data models and optimise query performance for large datasets.
• Manage ETL processes and data integration into Redshift, DuckDB, and PostgreSQL.
• Set up and manage logging and tracing mechanisms in AWS using services such as CloudTrail and X-Ray.
• Implement orchestration solutions using Apache Airflow and AWS Step Functions.
• Utilise Athena for interactive query analysis of large datasets in Amazon S3.
• Provide technical leadership and act as a subject matter expert in cloud data engineering.
• Write comprehensive solution and technical documentation.
• Stay updated on emerging technologies and industry trends.
• Challenge business requirements and propose innovative solutions for efficiency and performance improvement.

The base skills we are looking for are:
• Deep expertise in AWS data services, with exposure to Azure data services.
• Extensive experience with Infrastructure as Code (IaC) using Terraform and CloudFormation.
• Proven ability to define and enforce data governance and security standards.
• Demonstrated experience leading large-scale data migration and optimisation projects.
• Strong programming skills in Python and SQL, with experience in prototyping and setting coding standards.
• Experience with Iceberg tables and managing large datasets efficiently.
• Proficiency in designing scalable and efficient data solutions on AWS, following best practices for cloud architecture and infrastructure.
• Experience with orchestration tools such as Apache Airflow and AWS Step Functions.
• Knowledge of ETL tools and experience working with large volumes of data, with a preference for experience with Kafka.
• Proactive approach to production monitoring and troubleshooting.
• Excellent communication and teamwork skills, with the ability to provide technical leadership and mentorship.
• Strong analytical and problem-solving skills, with the ability to analyse requirements and propose innovative solutions.
• Experience in writing solution documents and technical documentation.
• Familiarity with Azure Databricks for data engineering and analytics tasks is an advantage.
