We are looking for a highly experienced Data Engineer to design and build scalable data platforms using Azure and Databricks. The role involves leading end-to-end data engineering initiatives, building high-performance data pipelines, and enabling advanced analytics through modern Lakehouse architecture.
Requirements
- 8+ years of experience in Data Engineering
- Strong expertise in Python, PySpark, Apache Spark
- Hands-on experience with Databricks (Delta Lake, Lakehouse architecture)
- Experience with Azure Data Platform (ADF, ADLS, Azure Databricks)
- Strong SQL and data modeling skills
- Experience with Snowflake / Oracle / SQL Server
- Knowledge of orchestration tools like Control-M / Airflow / ADF
- Experience in performance tuning and optimization of large datasets
- Familiarity with Git, CI/CD pipelines, and DevOps practices
To apply for this job, please visit jobs.workable.com.
