Design and develop data pipelines using dbt and Snowflake, orchestrate data workflows with Apache Airflow, and optimize data platform efficiency.
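As context for the kind of orchestration described above, here is a minimal sketch of an Airflow DAG that shells out to dbt against a Snowflake target. The DAG name, project path, schedule, and target name (`snowflake_prod`) are illustrative assumptions, not details taken from this posting.

```python
# Minimal sketch, assuming a dbt project deployed alongside Airflow and a
# dbt target named "snowflake_prod" (both hypothetical placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the dbt models against the Snowflake target, then test them.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --target snowflake_prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --target snowflake_prod",
    )
    dbt_run >> dbt_test
```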
Requirements
- Bachelor's or Master's degree in Analytics, Data Science, Mathematics, Computer Science, Information Systems, Computer Engineering, or a related technical field.
- Demonstrated mastery of complex SQL queries and data warehouse/data lake features.
- 5+ years of experience with SQL, data loading, transformations, performance optimization, and security features.
- Proven experience in building and managing complex data transformation pipelines using dbt and Apache Airflow.
- Strong Python scripting skills for data manipulation and API integrations (a brief illustrative sketch follows this list).
- An analytical, independent problem solver with strong attention to detail and effective communication skills.
- Solid understanding of data warehousing concepts, dimensional modeling, and data lake architectures.
- Deep understanding of ELT principles and best practices.
- Familiarity with data quality frameworks, data lineage, and data governance principles.
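As a rough illustration of the Python scripting referenced above, the sketch below pulls records from a REST API and flattens them for downstream loading. The endpoint, token, field names, and output file are hypothetical placeholders, not part of the posting.

```python
# Illustrative only: fetches records from a hypothetical REST endpoint,
# flattens them with pandas, and writes a CSV for downstream loading.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint


def extract_orders(token: str) -> pd.DataFrame:
    """Fetch raw order records and return them as a flat DataFrame."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()
    # json_normalize flattens nested JSON objects into columns.
    return pd.json_normalize(response.json())


if __name__ == "__main__":
    df = extract_orders(token="dummy-token")     # placeholder credential
    df.to_csv("orders_raw.csv", index=False)
```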
To apply for this job, please visit gxs.wd3.myworkdayjobs.com.