Our client is seeking a Data Engineer to design, build, and maintain reliable data pipelines and infrastructure that deliver clean, accessible, and actionable data.
Requirements
- 3+ years in data engineering or back-end development.
- Strong Python and SQL skills.
- Experience with at least one major data warehouse (Snowflake, Redshift, BigQuery).
- Familiarity with pipeline orchestration tools (Airflow, Prefect).
- Strong software engineering fundamentals and experience with modern data stacks, with an eye for quality and scalability.
- Experience with dbt for transformations and data modeling.
- Streaming data experience (Kafka, Kinesis, Pub/Sub).
- Cloud-native data platforms (AWS Glue, GCP Dataflow, Azure Data Factory).
- Background in regulated industries (healthcare, finance) with strict compliance.
Responsibilities
- Pipeline development and maintenance.
- Data warehousing and quality governance.
- Streaming and real-time data processing.
- Collaboration and infrastructure management.
Benefits
- Competitive salary
- Opportunities for growth and professional development
- Collaborative and dynamic work environment
- Flexible working hours and remote work options