This is a Data Engineer role with the following main responsibilities: understanding the overall domain architecture and managing changes within it, developing big data analytics programs using the Hadoop framework, and working independently as a self-motivated professional.
Requirements
- Bachelor's/Master's degree in Computer Science, IT, Maths, Statistics, a quantitative field, or Business
- Strong knowledge of IT principles and the system development lifecycle
- Good analytical and interpersonal skills
- Sound statistical knowledge; analytical and problem-solving skills are desirable
- Good communication skills to understand requirements and articulate solutions and feedback
- Curiosity to learn new technologies
- Positive attitude
- Proactive rather than reactive approach
- Around 10+ years of experience anticipated
- Hands-on and consulting skills with the Hadoop development framework
- Hands-on experience with big data technologies
- Knowledge of Oracle databases
- Experience with BI tools such as Tableau
- Development capabilities in Python, Spark, and Scala
- Experience with Java, Spring, Spring Boot, Spring Cloud, Oracle, Elasticsearch, Hazelcast, Kafka, REST APIs, and JSON/YAML
Benefits
- Competitive salary
- Core bank funding for retirement savings
- Medical and life insurance
- Flexible and voluntary benefits
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days)
- Flexible working options
- Proactive wellbeing support through Unmind
- Continuous learning culture
- Inclusive and values-driven organisation
To apply for this job please visit jobs.standardchartered.com.