Data Engineer / DataOps Engineer

Fully remote · Contract · Middle
Salary not disclosed

Job Details

Languages
English
Experience
5+ years
Required Skills
ETL, Git, Kubernetes, Microsoft Azure, Snowflake, Airflow, CI/CD, DevOps, dbt

Requirements

  • 5+ years of professional experience in data engineering
  • Strong, practical experience with Snowflake (views, tables, performance tuning, orchestrated ELT processes)
  • Solid expertise using dbt for SQL-based transformations
  • Hands-on experience with Airflow for workflow scheduling and automation
  • Experience deploying and maintaining containerized workloads on Kubernetes
  • Familiarity with cloud environments, including a strong understanding of Microsoft Azure services
  • Practical experience building ETL/ELT pipelines and maintaining production data workflows
  • Good understanding of Git-based development, CI/CD pipelines, and general DevOps principles
  • Analytical mindset and ability to troubleshoot issues in complex systems

Responsibilities

  • Develop and maintain end-to-end data pipelines using Snowflake as the core data platform
  • Build ELT workflows using dbt and manage orchestration with Airflow
  • Implement and support DataOps processes, including CI/CD automation, monitoring, and workload deployment on Kubernetes
  • Optimize Snowflake performance, including warehouses, storage usage, and query efficiency
  • Ensure data reliability through data validation, testing, and monitoring practices
  • Integrate various data sources and manage ingestion processes into Snowflake
  • Collaborate with cross-functional teams to deliver reliable, production-ready data solutions
  • Follow engineering best practices, maintain coding standards, and support continuous improvement
  • Support team knowledge sharing and mentor junior developers when needed