NTT DATA is seeking a Data Engineer (Python, Kafka, Hadoop, Snowflake, Apache Iceberg) for a high-visibility project at Goldman Sachs in Bangalore, India. The successful candidate will perform an end-to-end data store migration from an on-premises data lake to an AWS-hosted lakehouse, and will need a sophisticated understanding of data modeling concepts, including temporal data modeling, schema management, and performance optimization.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
- 3-5 years of professional hands-on coding experience in a collaborative, team-based environment
- Professional proficiency in Python or Java
- Deep familiarity with the full Software Development Life Cycle (SDLC), CI/CD best practices, and Kubernetes (K8s) deployment
- Core data engineering competencies: temporal data modeling, schema management, performance optimization, and architectural theory
Benefits
- Competitive salary
- Opportunities for career growth and development
- Collaborative and dynamic work environment
To apply for this job please visit careers-inc.nttdata.com.