Data Engineer

Calgary, Alberta · Full-Time · Senior
Salary not disclosed

Job Details

Experience
5+ years
Required Skills
Docker, Python, SQL, GCP, Airflow, Spark, Databricks

Requirements

  • 5+ years of hands-on experience in data engineering, data integration, or data platform development.
  • Degree in Computer Science, Engineering, Mathematics, or related STEM discipline.
  • Strong programming and query skills in SQL and Python.
  • Experience with Git in an Agile/Scrum environment.
  • Experience designing and orchestrating ETL pipelines, particularly with Databricks.
  • Experience working within cloud environments (GCP, AWS, or Azure).
  • Experience with database systems such as MongoDB and Elasticsearch.
  • Strong understanding of data warehousing and dimensional modeling methodologies.
  • Hands-on experience with Airflow and Hadoop.
  • Experience using Docker for containerized workflows.
  • Strong business awareness and communication skills.

Responsibilities

  • Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines using GCP services.
  • Architect and implement robust data infrastructure for high-volume ingestion and processing.
  • Develop and manage the central data warehouse in Google BigQuery.
  • Design and implement data models, schemas, and table structures.
  • Write clean, efficient, and maintainable SQL and Python code for data transformation.
  • Monitor, troubleshoot, and optimize data infrastructure for performance and cost efficiency.
  • Implement automated data quality checks, validation rules, and monitoring frameworks.
  • Lead client and stakeholder communication to translate business needs into scalable data solutions.
  • Partner with product teams to align technical solutions with business strategy.
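For candidates gauging fit, one of the duties above ("implement automated data quality checks, validation rules, and monitoring frameworks") can be sketched in plain Python. This is an illustrative sketch only; the `Rule`/`validate` names and the sample rules are assumptions for the example, not part of the employer's actual stack.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Rule:
    """A named validation rule applied to one column of each row."""
    name: str
    column: str
    check: Callable[[Any], bool]

def validate(rows: list[dict], rules: list[Rule]) -> dict[str, int]:
    """Run every rule against every row; return a failure count per rule."""
    failures = {rule.name: 0 for rule in rules}
    for row in rows:
        for rule in rules:
            if not rule.check(row.get(rule.column)):
                failures[rule.name] += 1
    return failures

# Hypothetical rules: IDs must be present, amounts must be non-negative numbers.
rules = [
    Rule("id_not_null", "id", lambda v: v is not None),
    Rule("amount_non_negative", "amount",
         lambda v: isinstance(v, (int, float)) and v >= 0),
]

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5},
]
print(validate(rows, rules))  # → {'id_not_null': 1, 'amount_non_negative': 1}
```

In practice checks like these would run inside an Airflow task or a Databricks job and feed a monitoring dashboard, per the tooling listed in the requirements.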