Design, develop, and maintain robust and scalable data pipelines and infrastructure. Apply expertise to extract, transform, and load critical data from various sources, ensuring high data quality and accessibility for analytics, reporting, and machine learning initiatives across the organization.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related technical field.
- 7+ years of proven experience in data engineering, with a strong focus on building and maintaining large-scale data platforms.
- Proficiency in at least one major programming language (e.g., Python, Java, Scala) and strong SQL skills.
- Extensive experience with cloud data platforms (e.g., AWS, Azure, GCP) and their relevant data services (e.g., S3, Redshift, Snowflake, Databricks, Azure Data Lake, BigQuery).