As a DataOps Engineer, you will operate large-scale big data platforms across hybrid (on-premises and cloud) environments, enabling reliable analytics and data-driven use cases.
Requirements
- Degree in Computer Science, Software Engineering, Data Science, or equivalent experience.
- 2–5+ years of hands-on experience in DevOps, Data Platform Engineering, or Infrastructure Engineering, supporting big data or analytics platforms.
- Experience managing on-premises and cloud infrastructure, including compute, storage, and network configuration.
- Hands-on experience operating big data pipelines in cloud and/or hybrid environments, including Spark / PySpark and Airflow at an operational level (job execution, basic tuning, failure handling).
- Knowledge of containerization and orchestration, including Docker fundamentals and Kubernetes (deployments, services, ingress, scaling, resource limits).
- Strong understanding of networking fundamentals, including VPC design, routing, DNS, firewalls, and load balancing.
- Knowledge of security concepts, such as IAM, access control, and compliance basics.
- Experience with CI/CD pipelines and automation using tools such as GitLab CI/CD, Jenkins, or similar.
- Basic understanding of MLOps concepts from a platform and infrastructure support perspective.
- Strong troubleshooting and problem-solving skills in production environments.
- Comfortable working with cross-functional teams (data engineers, ML engineers, security, product).
To apply for this job, please visit careers.starhub.com.