Job Details
- Languages
- English
- Experience
- 7+ years
- Required Skills
    - AWS, Docker, Python, SQL, Apache Airflow, ETL, GCP, Kubernetes, Snowflake, Azure
Requirements
- 7+ years of experience in data warehousing, database administration, or database development
- 5+ years of hands-on experience as a Data Engineer using SQL and Python
- Expertise with cloud-based platforms such as AWS, GCP, or Azure
- Familiarity with containerization tools like Docker or Kubernetes
- Proven experience working with large-scale datasets using technologies such as Snowflake, BigQuery, Redshift, Databricks, or similar
- Strong knowledge of ETL/ELT tools and data integration frameworks, including Airflow, dbt, Python, and REST APIs
- Solid understanding of data modeling, performance optimization, and pipeline maintainability
- Excellent English communication skills
Responsibilities
- Maintain, configure, and optimize the cloud-based data warehouse platform
- Design and implement incremental and batch data integration pipelines
- Collaborate with development, analytics, and operations teams
- Explore and integrate new tools, technologies, and best practices
- Implement and maintain data security, auditing, and monitoring processes
- Troubleshoot and resolve data-related issues
- Contribute to data engineering practices and platform capabilities