Data Engineer, 80-100%
Remote eligibility: Australia, Bangladesh, Canada, India, Pakistan, South Africa, United Kingdom, United States; maximum difference of +/- 2 hours from Switzerland (GMT+2)
Full-Time, Middle
Salary not disclosed
Job Details
- Experience: At least 3 years
- Required Skills: Docker, Python, SQL, Git, Java, Machine Learning, Airflow, Spark, CI/CD, Terraform, BigQuery, dbt
Requirements
- Knowledgeable and experienced in SQL programming and dbt
- At least 3 years of experience in a similar data science/engineering environment with a strong track record of data projects on Google Cloud Platform (GCP)
- Experience with GCP native services like BigQuery, Cloud Run, Dataproc, and Dataflow highly appreciated
- Experience in developing high-quality software, including unit and integration tests, in one or more languages (such as Python or Java) while leveraging CI/CD tools and Git
- Experience with managing data pipeline workflows with Airflow
- Experience working with containerization (Docker) and infrastructure-as-code frameworks (Terraform)
- Experience with distributed computing technologies like Apache Spark and knowledge of ML concepts/frameworks are a strong plus
- Open to new technologies and challenges
Responsibilities
- Design, implement and continuously optimize platform services on our modern, industry-standard, GCP-based data environment
- Interact with our software architects/engineers, data scientists and business analysts in an agile environment to define data needs and provide solutions
- Develop a high-quality code base for our data pipelines
- Develop, provision and monitor our data products
- Maintain our data pipeline orchestration, data lake and data warehouse
- Room for your own data & engineering initiatives