We’re looking for a Senior Data Operations (DataOps) Engineer to lead the evolution of DataOps practices at a global scale, designing highly automated, resilient, and scalable data platforms on GCP.
Requirements
- 8+ years of progressive experience in DataOps, Data Engineering, or Platform Engineering roles.
- Strong expertise in data warehousing, data lakes, and distributed processing technologies (Spark, Hadoop, Kafka).
- Advanced proficiency in SQL and Python; working knowledge of Java or Scala.
- Deep experience with Google Cloud Platform (GCP) data and infrastructure services.
- Expert understanding of microservices architecture and containerization (Docker, Kubernetes).
- Proven hands-on experience with Infrastructure as Code tools (Terraform preferred).
- Strong background in CI/CD methodologies applied to data pipelines.
- Experience designing and implementing data automation frameworks.
- Advanced knowledge of data orchestration, monitoring, and observability tooling.
- Ability to architect highly scalable, resilient, and fault-tolerant data systems.
- Strong problem-solving skills and ability to operate independently in ambiguous environments.
Benefits
- Generous Paid Time Off
- 401(k) Matching
- Retirement Plan
To apply for this job, please visit smashcr.applytojob.com.