📍 Germany, Spain, Portugal, Greece
🏢 Company: WorkMotion · 👥 101-250 · 💰 $10,000,000 Series B almost 3 years ago · Compliance, Human Resources, Employee Benefits
- 3-5 years of professional experience in Data Engineering or Software Development with a focus on data
- Strong knowledge of Python, SQL, and PySpark
- Hands-on experience with AWS services (Glue, S3, Athena, EC2)
- Experience with Apache Airflow, preferably in a Dockerized/cloud-native environment
- Familiarity with Delta Lake or similar data lake frameworks
- Proficiency with source control (GitHub) and CI/CD workflows
- Strong understanding of data modeling, ETL best practices, and data pipeline performance optimization
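As a rough illustration of the Python-plus-SQL skill set the list above asks for, here is a minimal sketch; the `payments` table, its columns, and the sample rows are hypothetical, and the stdlib `sqlite3` module stands in for an analytical query layer such as Athena:

```python
import sqlite3

# Hypothetical example: load raw rows with Python, aggregate with SQL.
# sqlite3 stands in for a real query engine; table/column names are invented.
def monthly_totals(rows):
    """Load (month, amount) rows and return per-month totals via a SQL GROUP BY."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (month TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT month, SUM(amount) FROM payments GROUP BY month ORDER BY month"
    )
    result = {month: total for month, total in cur.fetchall()}
    conn.close()
    return result

totals = monthly_totals([("2024-01", 100.0), ("2024-01", 50.0), ("2024-02", 75.0)])
print(totals)  # {'2024-01': 150.0, '2024-02': 75.0}
```

The same shape (ingest with Python, aggregate with SQL) carries over to PySpark DataFrames and Athena queries at larger scale.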
- Design, build, and maintain scalable ETL pipelines using Apache Airflow and AWS Glue (Spark)
- Work with a range of data sources including Salesforce, NetSuite, PostgreSQL, and MongoDB
- Develop and optimize PySpark jobs for large-scale data transformation and analytics
- Manage data lake infrastructure using Delta Lake on S3 with Athena as the query layer
- Ensure data quality, performance, and reliability through monitoring, testing, and documentation
- Collaborate with analytics, product, and engineering teams to define data requirements
- Contribute to CI/CD workflows with GitHub and deployment automation
- Participate in architectural discussions and advocate for best practices in data engineering
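The orchestration pattern behind the responsibilities above (Airflow scheduling extract, transform, and load steps against sources like Salesforce and sinks like Delta Lake) can be sketched in plain Python. This is a toy dependency-ordered runner, not Airflow itself, and all task names are hypothetical:

```python
# Toy sketch of an extract -> transform -> load dependency chain, the shape
# an Airflow DAG expresses. This runner stands in for Airflow's scheduler;
# task and source names are invented for illustration.
def run_pipeline(tasks, dependencies):
    """Run callables in an order that respects the given dependency sets."""
    done, order = set(), []
    while len(done) < len(tasks):
        for name in tasks:
            if name not in done and dependencies.get(name, set()) <= done:
                tasks[name]()          # execute the task body
                done.add(name)
                order.append(name)
    return order

log = []
tasks = {
    "extract_salesforce": lambda: log.append("extracted"),
    "transform_spark": lambda: log.append("transformed"),
    "load_delta_lake": lambda: log.append("loaded"),
}
dependencies = {
    "transform_spark": {"extract_salesforce"},
    "load_delta_lake": {"transform_spark"},
}
print(run_pipeline(tasks, dependencies))
# ['extract_salesforce', 'transform_spark', 'load_delta_lake']
```

In a real deployment, each task would be an Airflow operator (e.g. one that submits a Glue/Spark job), and Airflow would add scheduling, retries, and monitoring on top of the ordering shown here.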
AWS, Docker, Python, Software Development, SQL, Apache Airflow, ETL, Git, Data engineering, Spark, CI/CD, Data modeling
Posted about 5 hours ago