Data Engineer – Python, Kafka, Hadoop (HDFS/Hive), Snowflake, and Apache Iceberg

On Site | Full Time | India | NTT DATA

NTT DATA is seeking a Data Engineer – Python, Kafka, Hadoop, Snowflake, and Apache Iceberg for a high-visibility project at Goldman Sachs in Bangalore, India. The successful candidate will be responsible for an end-to-end data store migration from an on-prem data lake to an AWS-hosted lakehouse, and will need a sophisticated understanding of data modeling concepts, including temporal data modeling, schema management, and performance optimization.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
  • 3–5 years of professional, hands-on coding experience in a collaborative, team-based environment
  • Professional proficiency in Python or Java
  • Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices, plus Kubernetes (K8s) deployment experience
  • Core data engineering competencies: temporal data modeling, schema management, performance optimization, and architectural theory

Benefits

  • Competitive salary
  • Opportunities for career growth and development
  • Collaborative and dynamic work environment


To apply for this job, please visit careers-inc.nttdata.com.


