BitGo is seeking a skilled Data Engineer with strong experience in building robust, scalable data pipelines and monitoring systems. The ideal candidate will have expertise in SQL and Python, and hands-on experience with modern data platforms such as Snowflake, dbt, and cloud-native orchestration tools (e.g., Airflow). Familiarity with reconciliation processes, anomaly detection, and data quality monitoring is key.
Requirements
- 5+ years of work experience in a relevant field (Data Engineering, Software Engineering)
- Strong experience with server-side languages (Python)
- Strong experience with SQL databases like Postgres or MySQL
- Experience building data pipelines/ETL and familiarity with design principles
- Experience with data warehouse technologies and data modeling best practices (Snowflake, BigQuery, Spark, etc.)
- Strong experience with systems design and event-driven systems (e.g., Kafka)
- A self-starter capable of adapting quickly and being decisive
- A willingness to be at the forefront of designing for quality and attestable results
- Experience with unit and functional testing and debugging
- Experience with Git/GitHub and branching methodologies, code review tools, CI tools, JIRA, Confluence, etc.
- Ability to work independently in a fast-paced environment
- High integrity and accountability
- Comfortable participating in on-call rotations for system support
- Engineering degree in Computer Science or equivalent
- Effective written and verbal communication skills
Benefits
- Competitive salary
- IT equipment support for work
- Meal & Commute allowance
- Medical Insurance
- Attractive well-being allowance (comprising medical, wellness, and fitness aspects)
- On-the-house snacks in the Bangalore office
- A talented team to learn and grow with