
Senior Data Engineer, Platform

Posted 4 days ago

💎 Seniority level: Senior

📍 Location: Canada

💸 Salary: 125,800 - 170,100 USD per year

🔍 Industry: Software Development

🏢 Company: Jobber · 👥 501-1000 · 💰 $100,000,000 Series D over 2 years ago · SaaS, Mobile, Small and Medium Businesses, Task Management

🪄 Skills: AWS, Python, SQL, Airflow, Data engineering, Spark, CI/CD, Terraform, Data modeling

Requirements:
  • Strong coding skills in Python and SQL.
  • Expertise in building and maintaining ETL pipelines using tools like Airflow and dbt.
  • Experience working with AWS data infrastructure, particularly Redshift, Glue, Lambda, and ECS Fargate.
  • Familiarity with handling large datasets using tools such as Spark or Trino.
  • Experience with Terraform for infrastructure management.
  • Experience with dimensional modeling, star schemas, and data warehousing in a cloud environment (preferably AWS Redshift).
  • Knowledge of CI/CD processes, data ingestion, and optimizing data flow across systems.
  • Proficient in working with high-volume, scalable data infrastructure.
Responsibilities:
  • Build Scalable Data Solutions: Design, develop, and maintain batch and real-time data pipelines within cloud infrastructure (preferably AWS). Leverage Python, SQL, and AWS technologies (Glue, Lambda, ECS Fargate) to ensure smooth data operations. Build scripts, serverless applications, and automated workflows.
  • Empower Internal Teams: Develop tools and frameworks that automate manual processes, set up alerting/monitoring systems, and help teams run data-driven experiments and analyze results. Work closely with cross-functional teams to support their needs and ensure data accessibility.
  • Accelerate Business Growth: Collaborate with data analysts, scientists, and product teams to extract actionable insights from data. Utilize tools like Airflow and dbt to streamline ETL/ELT pipelines and ensure the seamless flow of data.
  • Strategic Planning and Innovation: Lead initiatives to research and propose new technologies and tooling for our data stack, with an emphasis on performance and scalability. Participate in design and code reviews, continuously learning from and mentoring your peers.
  • Data Integrity: Own the integrity of our data and maintain a high level of trust across the organization.