Design and implement data solutions for Kotak 811, lead data engineering projects, mentor junior team members, and collaborate with cross-functional teams to deliver high-quality and scalable data infrastructure.
Responsibilities
- Design and develop scalable, high-performance data architecture and data models.
- Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
- Develop and maintain robust, scalable data pipelines for data ingestion, transformation, and loading across both real-time and batch use cases.
- Implement ETL processes to integrate data from various sources into data storage systems.
- Optimise data pipelines for performance, scalability, and reliability.
- Implement data quality frameworks and processes to ensure high data integrity and consistency.
- Design and enforce data management policies and standards.
- Develop and maintain documentation, data dictionaries, and metadata repositories.
- Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
- Design, develop, and maintain the infrastructure and processes needed to deploy and manage machine learning models in production environments.
- Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
- Optimise model performance and latency for real-time inference in consumer applications.
- Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
- Implement monitoring and logging solutions to track model performance, data drift, and system health.
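To give candidates a flavour of the pipeline work described above, here is a minimal extract-transform-load sketch. All names, fields, and the in-memory SQLite target are illustrative assumptions, not Kotak 811's actual systems or schema:

```python
import sqlite3

# Hypothetical raw records from an upstream source (fields are illustrative).
RAW_EVENTS = [
    {"account_id": "A1", "amount": "1500.50", "status": "SUCCESS"},
    {"account_id": "A2", "amount": "-20.00", "status": "FAILED"},
    {"account_id": "A3", "amount": "99.99", "status": "SUCCESS"},
]

def extract():
    """Extract: pull raw records from the source system."""
    return RAW_EVENTS

def transform(records):
    """Transform: cast types and filter out failed transactions."""
    return [
        (r["account_id"], float(r["amount"]))
        for r in records
        if r["status"] == "SUCCESS"
    ]

def load(rows, conn):
    """Load: write cleaned rows into the target store."""
    conn.execute("CREATE TABLE IF NOT EXISTS txns (account_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO txns VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """Run extract -> transform -> load and return the loaded row count."""
    load(transform(extract()), conn)
    return conn.execute("SELECT COUNT(*) FROM txns").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(run_pipeline(conn))  # 2 rows survive the transform step
```

In production this shape is typically realised with an orchestrator and a distributed engine rather than in-process functions, but the extract/transform/load separation is the same.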
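The data quality and profiling bullets above can be sketched as simple rule-based checks; the record shape and the `kyc_status` field here are hypothetical examples, not a real schema:

```python
def profile(records, field):
    """Basic profiling: null rate and distinct-value count for one field."""
    values = [r.get(field) for r in records]
    nulls = sum(1 for v in values if v is None)
    return {"null_rate": nulls / len(values), "distinct": len(set(values))}

def check_not_null(records, field, max_null_rate=0.0):
    """Data quality rule: pass only if the field's null rate is within bounds."""
    return profile(records, field)["null_rate"] <= max_null_rate

records = [
    {"account_id": "A1", "kyc_status": "VERIFIED"},
    {"account_id": "A2", "kyc_status": None},
    {"account_id": "A3", "kyc_status": "PENDING"},
]

if __name__ == "__main__":
    print(profile(records, "kyc_status"))
    print(check_not_null(records, "account_id"))  # True: no nulls
```

A real data quality framework would persist these results and gate pipeline runs on them; the point here is only the profile-then-enforce pattern.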
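As an assumed illustration of the drift-monitoring bullet above, one simple approach is to compare a live feature window against a baseline window and alert when the mean shifts beyond a threshold (the values and threshold below are made up for demonstration):

```python
import statistics

def mean_shift(baseline, current):
    """Relative shift of the current window's mean versus the baseline mean."""
    b = statistics.mean(baseline)
    return abs(statistics.mean(current) - b) / abs(b)

def drifted(baseline, current, threshold=0.2):
    """Flag drift when the relative mean shift exceeds the threshold."""
    return mean_shift(baseline, current) > threshold

baseline = [100, 102, 98, 101, 99]   # feature values at training time
stable   = [101, 99, 100, 100, 102]  # live window, similar distribution
shifted  = [130, 128, 131, 129, 132] # live window, distribution has moved

if __name__ == "__main__":
    print(drifted(baseline, stable))   # no alert
    print(drifted(baseline, shifted))  # alert
```

Production monitoring would use richer statistics (e.g. distribution-level tests) and feed a metrics/alerting system, but the baseline-versus-live comparison is the core idea.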
