We are looking for a Data Engineer – Web Scraping to work with us remotely on a 4-month contract. The ideal candidate will have proven experience in data engineering and expertise in designing scalable data architectures.
Responsibilities
- Maintain and manage website scraping configurations using Python.
- Monitor scraping configurations for errors and potential crashes.
- Review retrieved data to detect quality issues and scraping blocks.
- Coordinate with stakeholders to understand scraping task requirements and report issues.
- Prepare and share periodic reports on scraping activities with stakeholders.
- Develop the pipelines needed to ingest data into the data lake and perform the required transformations.
Requirements
- Strong experience with ETL processes, data modeling, and data warehousing (Airflow and dbt preferred).
- Expertise in database technologies, both relational (SQL) and NoSQL.
- Knowledge of cloud platforms, particularly Azure.
- Solid understanding of data security measures and compliance standards.
- Excellent Python experience for data engineering and automation.
- Strong collaboration skills to work closely with data scientists and analysts.
- Ability to optimize data pipelines for performance and efficiency.
- Ability to build, test, and maintain data pipelines and related projects.
- Experience with version control systems, such as Git.
- Hands-on experience with Airflow and/or dbt.
- Experience with Terraform for infrastructure management.
- Minimum 2 years of experience in a similar role.
- Strong academic background in a relevant field.
- Fluent in English (French is a plus).
Benefits
- 4-month contract
To apply for this job, please visit jobs.workable.com.