Design and maintain CI/CD pipelines for data workflows and machine learning jobs, implement monitoring solutions, and automate deployment of data infrastructure.
Requirements
- 9+ years of experience in DataOps/DevOps
- Hands-on experience with Databricks Workflows, Delta Lake, Unity Catalog, and MLflow
- Proficiency in Python (PySpark) and SQL
- Familiarity with Oracle, MySQL, and PostgreSQL, including their integration with Databricks
- Experience with Airflow, Prefect, Dagster, or Databricks Workflows for scheduling and monitoring complex pipelines
- Understanding of data modeling principles, Delta Lake architecture, and performance tuning for big data environments
To apply for this job, please visit nxp.wd3.myworkdayjobs.com.
