Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake.
- Implement data transformation workflows using dbt (Data Build Tool).
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or similar role.
- Strong proficiency in AWS and Snowflake.
- Expertise in dbt and Python programming.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.