We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will support analytics and data science teams by ensuring reliable data collection, transformation, and accessibility across the organization.
Responsibilities
- Design, develop, and maintain scalable ETL/ELT data pipelines
- Build and optimize data architectures, data lakes, and data warehouses
- Ensure data quality, integrity, and security
- Work with large datasets (structured and unstructured)
- Collaborate with Data Scientists, Analysts, and Software Engineers
- Optimize data systems for performance and scalability
- Implement monitoring, logging, and data validation processes
- Manage cloud-based data platforms (AWS, Azure, or GCP)
- Automate workflows and improve data reliability
Requirements
- Strong programming skills in Python, SQL, or Scala
- Experience with ETL tools (Airflow, Talend, Informatica, etc.)
- Knowledge of data warehousing solutions (Snowflake, Redshift, BigQuery)
- Experience with cloud platforms (AWS, Azure, GCP)
- Familiarity with big data tools (Spark, Hadoop, Kafka)
- Understanding of database systems (PostgreSQL, MySQL, NoSQL)
- Experience with version control (Git) and CI/CD
Benefits
- Generous Paid Time Off
- 401(k) Retirement Plan with Employer Matching
- Visa Sponsorship
To apply for this job please visit careers-inc.nttdata.com.
