Cloud DevOps Engineer (AWS) (Remote, Canada)

Posted 2024-11-20

💎 Seniority level: Senior, 5+ years

📍 Location: Canada

🔍 Industry: Technology

🗣️ Languages: English

⏳ Experience: 5+ years

🪄 Skills: AWS, Docker, Python, SQL, Apache Airflow, Artificial Intelligence, ETL, Git, Java, Machine Learning, Snowflake, Data Engineering, Data Science, CI/CD, DevOps, Terraform, Documentation

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 5+ years of experience in Data Engineering, with at least 3+ years in AWS environments.
  • Strong knowledge of AWS services including SageMaker, Lambda, Glue, and Redshift.
  • Hands-on experience deploying machine learning models in AWS SageMaker.
  • Proficiency in DevOps practices, CI/CD pipelines, Docker, and infrastructure-as-code tools.
  • Advanced SQL skills and experience with complex ETL workflows.
  • Proficiency in Python and working knowledge of Java or Scala.
  • Experience with Apache Airflow for data orchestration.
  • Effective communication skills and a results-oriented approach.
Responsibilities:
  • Design, develop, and maintain ETL pipelines ensuring reliable data flow and high-quality data for analytics.
  • Build and optimize data models to efficiently handle large data volumes in Snowflake.
  • Create complex SQL queries for data processing and analytics.
  • Manage orchestration and scheduling using Apache Airflow.
  • Document data pipelines and architecture.
  • Architect, build, and maintain data science infrastructure on AWS.
  • Collaborate with Data Scientists on deploying ML models using SageMaker.
  • Automate ML model deployment and monitoring with CI/CD and IaC tools.
  • Set up monitoring solutions to ensure effective operation of data pipelines.