This role is for one of Weekday's clients. The Data Platform Engineer will design and manage the foundational data infrastructure on which everything else is built. The role involves taking full ownership of the data lakehouse, developing and managing real-time stream-processing frameworks, and implementing cost-observability measures.
Requirements
- 3–12 years of experience in data engineering
- 1–7 years focused on building or managing a data platform
- Deep hands-on expertise with tools such as Spark, Hudi/Delta Lake, Kafka, Airflow, Debezium, Presto/Trino, dbt, and Airbyte
- Comfortable working with the AWS data ecosystem
- Experience managing daily processing of terabytes of data and billions of events
- Track record of reducing infrastructure costs, with metrics demonstrating the impact
- Proficient in Java, Python, or Scala; experience with all three is ideal
- Experience as a pod lead or tech lead is preferred
- Experience with OLAP engines such as Pinot, Druid, or ClickHouse
- Built or contributed to data movement or reverse-ETL APIs
- Familiarity with feature stores (e.g., Feast, Feathr) or data catalog tools (e.g., DataHub)
Benefits
- Competitive salary
- Benefits package
To apply for this job, please visit jobs.workable.com.
