We are building a new capability inside Platform Engineering, within our Infrastructure Shared Services (ISS) organisation: Data Platform Engineering. Our goal is to create a platform that removes friction, providing a consistent way to build, operate, and evolve data-driven services with clear ownership, guardrails, and a strong developer experience.
Requirements
- 6+ years of experience in the design and development of data pipeline automation
- Technical expertise in developing complex automation frameworks, data modeling, and ETL processes using SQL, Python, dbt, Apache Airflow, or similar technologies
- Proficiency across the full data stack, with deep knowledge of Python, SQL, and ETL methodologies
- Hands-on familiarity with the big data ecosystem, including technologies such as Trino, Kafka, and Iceberg
- Strong collaborative communication skills, with a proven ability to work effectively with diverse technical stakeholder groups
Benefits
- Flexible time off
- Wellness resources
- Company-sponsored team events