
Engineering Manager, Data

Posted 7 days ago


💎 Seniority level: Manager, 2+ years

🔍 Industry: Software Development

🏢 Company: Algolia · 👥 501-1000 · 💰 $150,000,000 Series D almost 4 years ago · Semantic Search, Search Engine, Cloud Computing, Vertical Search

🗣️ Languages: English

⏳ Experience: 2+ years

Requirements:
  • 2+ years in a managing/leading role
  • Designed and operated data pipelines with high volume and throughput
  • Experience designing data-intensive applications leveraging relational (e.g., PostgreSQL) or NoSQL (e.g., Bigtable) databases
  • Experience designing new applications with reliability, operability, and availability in mind
  • Experience writing robust Go code
  • Experience with GCP (especially Bigtable/BigQuery), Kubernetes, and Terraform
Responsibilities:
  • Help the team plan, execute, and ship releases on time and to a high standard of quality, working closely with the product and leadership teams
  • Hire, coach and mentor engineers to excel at their work and grow
  • Be a key contributor to the design, development, operation and deployment of all features of the Usage Platform processing pipelines and APIs
  • Be responsible for the quality and the robustness of the APIs and services
  • Improve engineering quality, processes and tooling
  • Work closely with the team’s Product Manager to help define the roadmap and assist internal stakeholders (other product teams) with releasing new products

Related Jobs


📍 INDIA

🧭 Full-Time

🔍 Software Development

🏢 Company: ext_apac

  • 10+ years in Public Cloud based engineering
  • Experience with React components, hooks, and state management.
  • Experience in server-side development using Node.js.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience using cloud platforms like Azure or GCP.
  • Lead a team of talented developers and leads working on full-stack frameworks and data engineering.
  • Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
  • Mine and analyze data from different NCR data sources to drive optimization of operations, and improve customer experience.
  • Assess the effectiveness and accuracy of new data sources and data gathering techniques.
  • Develop custom data models and algorithms to apply to data sets.
  • Use predictive modeling to increase and optimize customer experiences, cost savings, actionable insights and other business outcomes.
  • Develop company A/B testing framework and test model quality.
  • Collaborate with different functional teams to implement models and monitor outcomes.
  • Develop processes and tools to monitor and analyze model performance and data accuracy.
  • Be part of an Agile team, participate in all Agile ceremonies & activities and be accountable for the sprint deliverable
  • Create and maintain optimal data delivery architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure and GCP ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data delivery needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and cloud regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Docker, GraphQL, Leadership, Node.js, Project Management, Python, SQL, Agile, Data Analysis, ETL, Full Stack Development, GCP, Java, JavaScript, Jenkins, Kafka, Kubernetes, People Management, React.js, TypeScript, C#, Azure, Data Engineering, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, DevOps, Problem-Solving Skills, Scala, Team Management, Data Modeling, Software Engineering, Data Analytics

Posted 1 day ago

🧭 Full-Time

🔍 Software Development

🏢 Company: Algolia · 👥 501-1000 · 💰 $150,000,000 Series D almost 4 years ago · Semantic Search, Search Engine, Cloud Computing, Vertical Search

  • Have 3+ years of engineering management experience and 5+ years of engineering experience
  • Are comfortable with our tech stack: a Go/GCP-based backend and a React frontend
  • Are experienced with data-heavy products and platforms
  • Are an excellent communicator able to translate product requirements into technical tasks and vice-versa
  • Are capable of jumping in the trenches with the engineers — managing incidents, handling on-call, etc.
  • Leading a team, as well as hiring, onboarding, and fostering professional development
  • Working closely with other EMs and Product teams to build quality products
  • Proven ability to inspire and guide team members, with excellent communication skills
  • Leading both backend and frontend engineers
  • Ability to plan, build teams, and manage execution to deliver on commitments reliably
Posted 7 days ago

📍 United States

🧭 Full-Time

💸 200,000 - 240,000 USD per year

🔍 Software Development

🏢 Company: Vannevar Labs

  • 3+ years of experience in leading platform engineering teams
  • 10+ years of experience as a backend, data, or search engineer
  • Strong technical expertise in building ETL pipelines, data warehousing and designing APIs
  • Ability to deliver scalable and reliable platform solutions over a diverse collection of data types
  • High ethical standards for handling sensitive data, ensuring adherence to data privacy rules and compliance standards
  • Proficiency coding with Python, TypeScript, or similar
  • Proficiency with PostgreSQL, or other relational databases
  • Proficiency with Elasticsearch, or other search engines
  • Proficiency with software development in AWS or other cloud services
  • Design and build scalable, reliable data pipelines, warehouses, and search capabilities for supporting internal developers
  • Identify opportunities for improving developer experience and time-to-market for product engineers building with core datasets
  • Work closely with engineers, DevOps, and product engineering teams, leveraging your communication skills to ensure effective teamwork
  • Develop and lead a strong cross-functional team that forms strong partnerships with developers across the organization

AWS, Backend Development, Leadership, PostgreSQL, Python, Elasticsearch, ETL, Cross-Functional Team Leadership, API Testing, Data Engineering, Communication Skills, CI/CD, DevOps, Team Management, Software Engineering

Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 165,400 - 255,400 USD per year

🔍 Software Development

🏢 Company: Careers at Drata

  • 10+ years of hands-on experience as a Data/Software Engineer working on data pipelines, distributed systems, or analytics infra
  • 3+ years of experience managing engineering teams, ideally across platform and data disciplines
  • Strong expertise in Python and SQL, with a track record of building reliable ETL/ELT jobs
  • Proficiency in tools like Snowflake, dbt, Airflow, Terraform, Decodable (Flink), Cube.dev, or equivalent
  • Experience owning high-scale production data systems (batch + streaming) and enforcing strong testing & observability practices
  • Familiarity with leveraging AI tools through data platforms such as Snowflake Cortex
  • Excellent communicator and cross-functional partner; comfortable navigating product, infra, and business tradeoffs
  • Excellent communication and storytelling skills with the ability to rationalize the "why" behind decisions
  • Manage, coach, and mentor a team of highly skilled engineers (data, platform)
  • Design and deliver production-grade infrastructure for data transformation and real-time streaming
  • Be a hands-on contributor—review code, own architecture, troubleshoot performance, and unblock engineers
  • Drive data platform reliability across our Snowflake, dbt, and Decodable (Flink) pipelines
  • Work closely with the Business Analytics team, helping them scale analyses, ingest new data sources, and productionize models in dbt
  • Act as a bridge between the Business Analytics and Engineering teams—modeling upstream product data, supporting business operations use cases, and building systems that are analysis-friendly and production-grade
  • Utilize AI capabilities through Snowflake Cortex for tasks such as semantic search, text embedding, and classification
  • Partner with business departments to activate Large Language Models (LLMs) that improve operations, automate workflows, and reduce cognitive load for decision-makers across the company
  • Define, enforce, and evolve SLAs, data governance, lineage, and monitoring standards (Monte Carlo, etc.)
  • Own CI/CD and infra-as-code practices for the data platform (e.g., Snowflake Terraform, Airflow orchestration, observability)
  • Partner with technical leads and report to the VP of Engineering, Data and AI to drive long-term strategy and team growth
  • Collaborate with Engineering to understand the upstream data model and model it properly in the data warehouse. The team streams thousands of production MySQL databases using Decodable and assists Engineering with data quality investigations and surfacing upstream changes that could impact business decisions.
  • Partner with Customer Success to transform upstream data into actionable insights for CSMs. This includes activating playbooks, pushing operational signals to downstream systems, and supporting customer communications.
  • Support Marketing with targeting and segmentation initiatives, as well as attribution modeling and campaign analytics.
  • Enable customer-facing product features by pushing data back into the product platform through Cube.dev, Drata's semantic layer.

AWS, Leadership, Python, SQL, ETL, People Management, Snowflake, Airflow, Apache Kafka, Data Engineering, CI/CD, RESTful APIs, Mentoring, Terraform, Microservices, Compliance, Team Management, Data Modeling, Data Analytics, Customer Success

Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 153,600 - 264,000 USD per year

🔍 FinTech

🏢 Company: Galileo Financial Technologies · 👥 501-1000 · 💰 $77,000,000 Series A over 5 years ago · IT Management, Financial Services, Banking, FinTech

  • 3+ years of experience managing a team of engineers
  • 10+ years of experience with Data Engineering or Business Intelligence
  • BS/MS degree in Computer Science, Engineering or a related subject
  • Consistent track record of performance management and career development of individual contributors
  • Ability to engage in technical and business discussions from both tactical and strategic viewpoints
  • Within Data Engineering, the ability to design, build, and develop a database, not just operate one.
  • Experience working with Python, SQL, Databricks, and Snowflake (the team is transitioning from Oracle to Snowflake).
  • Within Business Intelligence, experience in FinTech building reports for clients and owning internal fraud & risk reports.
  • Familiar with Cognos for reporting.
  • Ability to balance delivering on business objectives with protecting the long-term quality of the platform
  • Experience leading cross-organizational efforts with different teams to ensure successful delivery of projects
  • Own the vision of the future Galileo Data Engineering/Business Intelligence within our Payments platform
  • Lead by example, care for the team and establish credibility with the quality of the team’s technical execution
  • Lead the development, delivery and operation of the Authorization service
  • Proactively ensure the highest levels of system availability
  • Nurture a culture of continuous improvement and blameless retrospection within the organization
  • Grow and mentor a team of engineers, building their talents by exposing them to new challenges and experiences
  • Participate in deep technical design discussions within the team, and across partner teams to ensure that we're building the right systems and keeping the quality high
  • Track project performance against defined milestones/goals.
  • Conduct process improvement projects to increase performance in vital program metrics.
  • Communicate ongoing project health with key stakeholders and business leadership

Backend Development, Python, SQL, Business Intelligence, Java, Snowflake, Data Engineering, Data Visualization, Data Modeling

Posted about 1 month ago

📍 Arizona, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Illinois, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Hampshire, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Texas, Utah, Vermont, Virginia, Washington, Wisconsin, and Washington D.C.

🧭 Full-Time

💸 173,676 - 210,741 USD per year

🔍 Software Development

🏢 Company: ActBlue · 👥 51-100 · 💰 $22,000,000 Series A over 14 years ago · Politics, Non Profit, Enterprise Software

  • At least 5-7 years of experience in data engineering or related roles
  • At least 3 years of staff management experience is preferred
  • Technical experience managing an ML platform, including proactively identifying how to grow the platform to support organizational needs, particularly in areas of governance
  • Experience deploying machine learning models into production, preferably using AWS SageMaker
  • Familiarity with Agile development processes
  • Proficiency in SQL and working with cloud-based data warehouses (e.g., Redshift, PostgreSQL)
  • Experience designing, building, and maintaining ETL/ELT pipelines
  • Motivated to find and implement pragmatic solutions for stakeholders, leveraging strong problem-solving skills
  • Experience setting goals and key results (OKRs) to measure team success
  • Dependability in delivering on commitments and driving projects to completion
  • Experience with security, reliability, scaling, monitoring, and other aspects of production platforms; you should be able to describe, to technical detail, the flow of data into and outside of a production environment
  • An inclusive, generous working style: You like to mentor, collaborate, and elevate your team by supporting your peers
  • Serve as a leader in the Data Department, providing input on strategy and goals
  • Own and maintain the data platform, ensuring it supports high-quality data products, robust data models, centralized dashboards, and reliable ML Ops
  • Oversee the deployment of data and ML models into a production environment
  • Collaborate with a variety of platform users across technical and non-technical stakeholders, including Data Science, Analytics and Customer Support
  • Lead and coach continual process improvement in the Data Platform team; evangelize the data platform to other teams outside of Data
  • Mentor and support team members in their professional goals while maintaining an environment of psychological safety
  • Coordinate with ActBlue leadership to establish measurable goals and consistent metrics
  • Work collaboratively within the department and across ActBlue to help achieve organizational goals; navigating sometimes conflicting stakeholder data needs
  • Assess knowledge gaps in ActBlue’s workforce and strategize filling data needs that will further institutional knowledge and information sharing
  • Direct project management and delegate large, complex projects as assigned

AWS, Leadership, PostgreSQL, Project Management, Python, SQL, Agile, Cloud Computing, ETL, Machine Learning, MLflow, Data Engineering, CI/CD, RESTful APIs, Data Visualization, Team Management, Data Modeling, Data Analytics

Posted about 1 month ago

📍 USA

🧭 Full-Time

💸 180,000 - 250,000 USD per year

🔍 AI observability and evaluation

🏢 Company: Arize AI · 👥 51-100 · 💰 $38,000,000 Series B over 2 years ago · Artificial Intelligence (AI), Machine Learning, Information Technology, Software

  • Strong hands-on technical abilities across infrastructure (Kubernetes, Terraform) and data systems (OLAP databases, Kafka, distributed KV stores).
  • Experience leading engineering teams while maintaining technical depth.
  • Experience debugging complex distributed systems and optimizing for performance at scale.
  • Focus on pragmatic solutions that deliver business value over chasing trending technologies.
  • Lead and scale a team of engineers building our next-generation data platform infrastructure.
  • Collaborate with Tech Lead and CTO to drive architectural decisions for mission-critical systems.
  • Mentor and grow engineering talent while fostering a culture of technical excellence.
  • Partner with engineering teams to understand requirements and deliver robust platform capabilities that accelerate their velocity.
  • Maintain high technical standards while meeting aggressive business objectives.

Kafka, Kubernetes, Terraform

Posted 4 months ago

📍 United States, the Netherlands, United Kingdom, Ireland, Estonia, Portugal, Spain, France, Sweden, Canada

🔍 Climate technology

🏢 Company: Overstory · 👥 1-10 · E-Commerce

  • Passion for climate.
  • Experience in a high growth scale-up environment.
  • Product-minded with demonstrated impact through technology.
  • At least 2 years of experience leading and managing engineers across multiple teams.
  • Familiarity with technologies like React/TypeScript, Python/FastAPI, Postgres, and data pipelines.
  • Comfort in a "build it, run it" environment (GCP, Cloud Run, Grafana, Kubernetes).
  • Strong leadership and mentorship skills.
  • Excellent communication and collaboration abilities.
  • Passion for learning and staying updated with technologies.
  • Based in GMT/CET time zone.
  • Enable Vegetation Modelling and Data Ingestion teams to be highly productive.
  • Grow teams by attracting great talent.
  • Foster an inclusive and caring culture.
  • Provide regular coaching and feedback to team members.
  • Work strategically with product managers on technical decisions.
  • Ensure architecture suitability for long-term goals.
  • Collaborate with other Product & Engineering leaders on practices and capabilities.

PostgreSQL, Python, GCP, Kubernetes, TypeScript, FastAPI, React

Posted 5 months ago

Related Articles

Posted about 1 month ago

How to Overcome Burnout While Working Remotely: Practical Strategies for Recovery

Burnout is a silent epidemic among remote workers. The blurred lines between work and home life, coupled with the pressure to always be “on,” can leave even the most dedicated professionals feeling drained. But burnout doesn’t have to define your remote work experience. With the right strategies, you can recover, recharge, and prevent future episodes. Here’s how.



Posted 5 days ago

Top 10 Skills to Become a Successful Remote Worker by 2025

Remote work is here to stay, and by 2025, the competition for remote jobs will be tougher than ever. To stand out, you need more than just basic skills. Employers want people who can adapt, communicate well, and stay productive without constant supervision. Here’s a simple guide to the top 10 skills that will make you a top candidate for remote jobs in the near future.

Posted 9 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 10 months ago

Read about the recent updates in remote work policies by major companies, the latest tools enhancing remote work productivity, and predictive statistics for remote work in 2024.

Posted 10 months ago

In-depth analysis of the tech layoffs in 2024, covering the reasons behind the layoffs, comparisons to previous years, immediate impacts, statistics, and the influence on the remote job market. Discover how startups and large tech companies are adapting, and learn strategies for navigating the new dynamics of the remote job market.