We are looking for a highly skilled Data Engineer with strong expertise in real-time streaming and event-driven architectures.
Responsibilities
- Design and implement real-time data streaming pipelines using Apache Flink and Kafka, primarily in Java
- Build and maintain event-driven architectures for large-scale distributed systems
- Perform JVM tuning and performance optimization for streaming applications
- Develop and deploy applications using containerization tools (Docker, Kubernetes)
- Work with Cloudera platform for data engineering and pipeline orchestration
- Implement robust design patterns and ensure high-quality coding standards
- Troubleshoot and resolve issues across the distributed systems ecosystem
- Collaborate with DevOps teams to maintain CI/CD pipelines (GitHub, Jenkins)
- Work on Linux-based systems, including configuration and shell scripting
- Optimize data processing with caching mechanisms such as Redis (nice to have)