Apply

Senior Staff Data Ops Engineer (Canada)

Posted 10 days ago


💎 Seniority level: Senior, 13+ years

📍 Location: Canada

🔍 Industry: Software Development

🏢 Company: PeakMetrics · 👥 11-50 · 💰 $4,200,000 Seed (about 1 year ago) · Machine Learning, Cyber Security, Marketing Automation, Natural Language Processing, Software

⏳ Experience: 13+ years

🪄 Skills: AWS, Docker, PostgreSQL, Python, SQL, Bash, Cloud Computing, ElasticSearch, ETL, Jenkins, Kafka, Kubernetes, Machine Learning, Airflow, Data engineering, CI/CD, DevOps, Terraform, Ansible, Data modeling, Data management

Requirements:
  • 13+ years of experience in engineering, with at least 5 years in data engineering, data ops, or related roles.
  • Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
  • Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
  • Proficiency in programming and scripting languages (e.g., Python, Java, Bash).
  • Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
  • Track record of successfully managing and scaling high-performing technical teams.
  • Experience with big data technologies (e.g., Kafka).
  • Familiarity with ML/AI infrastructure and frameworks.
  • Hands-on experience with both relational and non-relational databases (e.g., SQL, NoSQL).
  • Certifications in cloud platforms or DevOps practices are a plus.
  • You have maintained data management systems and have built new data pipelines from scratch.
  • You are comfortable automating data flows with resilient code using Python.
  • Experience with Dagster or other data orchestration platforms such as Airflow.
  • You have strong database architecture design and management knowledge in both structured and unstructured data.
  • Advanced knowledge of Elasticsearch or OpenSearch, including configuration, operation, and use for search.
  • Extensive experience working with SQL and SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
Responsibilities:
  • Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
  • Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
  • Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
  • Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
  • Foster a culture of innovation, accountability, and collaboration within the team.
  • Establish best practices for performance management, career development, and skills growth.
  • Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
  • Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Drive the implementation of best practices in data governance, quality, and security.
  • Ensure the availability, reliability, and performance of data systems.
  • Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
  • Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
  • Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
  • Present project updates, performance metrics, and strategic initiatives to leadership.