Senior Data Operations (DataOps) Engineer P-134

Remote · Full Time · Oregon, United States (Remote) · SMASH

We’re looking for a Senior Data Operations (DataOps) Engineer to lead the evolution of DataOps practices at a global scale, designing highly automated, resilient, and scalable data platforms on GCP.

Requirements

  • 8+ years of progressive experience in DataOps, Data Engineering, or Platform Engineering roles.
  • Strong expertise in data warehousing, data lakes, and distributed processing technologies (Spark, Hadoop, Kafka).
  • Advanced proficiency in SQL and Python; working knowledge of Java or Scala.
  • Deep experience with Google Cloud Platform (GCP) data and infrastructure services.
  • Expert understanding of microservices architecture and containerization (Docker, Kubernetes).
  • Proven hands-on experience with Infrastructure as Code tools (Terraform preferred).
  • Strong background in CI/CD methodologies applied to data pipelines.
  • Experience designing and implementing data automation frameworks.
  • Advanced knowledge of data orchestration, monitoring, and observability tooling.
  • Ability to architect highly scalable, resilient, and fault-tolerant data systems.
  • Strong problem-solving skills and ability to operate independently in ambiguous environments.

Benefits

  • Generous Paid Time Off
  • 401(k) Matching
  • Retirement Plan

To apply for this job please visit smashcr.applytojob.com.
