Senior Analytics Engineer

Posted 5 days ago

💎 Seniority level: Senior, 3+ years

💸 Salary: 110,000 - 170,000 USD per year

⏳ Experience: 3+ years

Requirements:
  • Experienced in SQL, with an emphasis on writing performant code
  • A data modeling expert who has designed schemas at previous organizations, ideally using dbt
  • Strong comprehension of data warehouses and the modern data stack
  • A track record of delivering projects, including datasets used to drive key business outcomes
  • Able to self-serve by building ELT and reverse-ETL pipelines using off-the-shelf tools, or simple DAGs in Airflow leveraging Python
  • Experience leveraging tools for data governance, data quality testing, and documentation
  • Able to communicate clearly about technical subjects to non-technical people, acting as a liaison between Business, Engineering, and Analytics
Responsibilities:
  • Ingest, clean and organize raw data
  • Lead data projects from ingestion to reporting
  • Liaise between Business, Engineering, and Data to translate business requirements into proper data capture and schema design
  • Help shape the data model
  • Work with key stakeholders to understand existing processes and collaborate to design and implement improved and automated processes
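The "ingest, clean and organize raw data" responsibility above can be illustrated with a small, self-contained sketch. This is purely hypothetical; the function and field names (`clean_rows`, `user_id`) are invented for illustration and are not part of the listing:

```python
# Hypothetical sketch of a minimal "clean and organize" step:
# normalize field names, trim whitespace, and drop duplicate records.

def clean_rows(raw_rows):
    """Return deduplicated rows with lowercased keys and stripped string values."""
    seen = set()
    cleaned = []
    for row in raw_rows:
        normalized = {
            k.strip().lower(): (v.strip() if isinstance(v, str) else v)
            for k, v in row.items()
        }
        key = normalized.get("user_id")
        if key is None or key in seen:
            continue  # skip malformed or duplicate records
        seen.add(key)
        cleaned.append(normalized)
    return cleaned

RAW_ROWS = [
    {"User_ID": "u1", "Email": "  a@example.com "},
    {"user_id": "u1", "email": "a@example.com"},   # duplicate of u1
    {"user_id": "u2", "email": "b@example.com"},
]
```

In a real pipeline this logic would typically live in a staging model (dbt) or an Airflow task rather than ad hoc code.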

Related Jobs

📍 Spain

🧭 Full-Time

🏢 Company: Voicemod · 👥 101-250 · 💰 $14,500,000 Series A over 2 years ago · Audio · Developer APIs · Gaming · Software

Requirements:
  • Proven experience in Business Intelligence, Data Engineering, or Data Analytics with advanced SQL skills
  • Solid knowledge of data modelling
  • Experience with dbt and Python
  • Experience with event tracking tools such as Segment, mParticle, or similar
Responsibilities:
  • Collaborate with engineers, product teams and analysts to develop data products that are precise and insightful.
  • Find creative ways to integrate AI into the Data lifecycle.
  • Own the data product lifecycle, including designing tracking plans, developing data models, ELT pipelines, and self-serve data products.
  • Be the data steward, ensuring data quality and consistent metrics.

Python · SQL · Business Intelligence · ETL · Data engineering · Data visualization · Data modeling · Data analytics · Data management

Posted 5 days ago

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,000 - 190,000 USD per year

🔍 Software Development

🏢 Company: Lob · 👥 101-250 · 💰 $50,000,000 Series C over 4 years ago · Developer APIs · Shipping · SaaS · Marketing · Health Care · Software · Courier Service

Requirements:
  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience: at least one big data warehouse system such as Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating live production systems using dbt and Python.
  • 3+ years of BI Software experience: at least one analytics platform such as Looker, Power BI, or Tableau.
  • Empathy and effective communication skills: You can explain complex analytical issues to both technical and non-technical audiences.
  • Strong interpretive skills: You can deconstruct complex source data to compose curated models that can be explored by stakeholders.
  • Product mindset: You build data systems that will be used to generate insights for years to come, not just one-off analyses.
Responsibilities:
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams as they identify ways to improve the quality of the metrics they produce and analyze.
  • Champion data governance, security, privacy, and retention policies to protect end users, customers, and Lob.
  • Support and mentor fellow engineers and data team members through coffee chats, code review, and pair programming.

Python · SQL · ETL · Snowflake · Communication Skills · Analytical Skills · Data visualization · Data modeling

Posted 8 days ago

🔍 GameTech

🏢 Company: Kaizen Gaming · 👥 2700-2700 · Media and Entertainment · Digital Entertainment

Requirements:
  • 6+ years of hands-on experience writing complex, highly optimized SQL code for data manipulation, transformation, and reporting.
  • 4+ years of experience in ETL/ELT development, data modeling, data warehouse architecture, and reporting tools.
  • Proven ability to design and maintain scalable ELT workflows in modern, cloud-native environments.
  • Deep understanding of data modeling techniques and enterprise data warehouse principles.
  • A track record of technical leadership—driving initiatives, mentoring peers, and establishing best practices.
  • Hands-on experience with the modern data stack, including tools like dbt, Airflow, and Databricks.
  • Familiarity with CI/CD practices in data engineering and deployment pipelines.
  • Strong communication skills, with the ability to effectively bridge technical and non-technical audiences.
Responsibilities:
  • Lead the design and implementation of robust, scalable data models and transformation pipelines across key domains.
  • Architect and maintain high-performance ELT workflows, ensuring efficiency, reliability, and observability at scale.
  • Translate complex business problems into elegant technical solutions that align with our long-term data strategy.
  • Establish and champion best practices in data governance, modeling, testing, and documentation across teams.
  • Mentor mid-level analytics engineers and contribute to the development of team-wide standards and frameworks.
  • Collaborate cross-functionally with data, product, and engineering peers to drive strategic initiatives and build innovative data products.
Posted 9 days ago

🏢 Company: Packlink Careers

Requirements:
  • Advanced knowledge of programming languages like Python.
  • Advanced expertise with development best practices and tools.
  • Experience with Streaming and Batch ingestion.
  • Experience with job orchestration tools like Apache Airflow.
  • Experience with SQL-like languages.
  • Experience in data modeling and architecture with dbt or similar.
  • Experience with analytical tools like Looker.
Responsibilities:
  • Work with internal technology teams to deliver key data engineering/analytics capabilities.
  • Design, build, and support data-centric services, including, but not limited to, event streaming, ETL pipelines, and distributed data storage utilizing Airflow, Apache Beam, BigQuery, and dbt.
  • Apply software engineering best practices to our data environment: testing, version control, and CI/CD processes in a GCP cloud environment.
  • Work with architects and technology leads to ensure a reliable, scalable, and robust architecture for our data environment.
  • Improve, evolve, and make the company’s data warehouse scalable.
  • Build and design a horizontal self-service platform to achieve good data governance.
Posted 16 days ago

📍 USA

🧭 Full-Time

🔍 Healthcare

🏢 Company: Atropos Health

Requirements:
  • 5+ years of experience in data infrastructure, data analysis, data visualization, and dashboarding tools
  • Expertise in correct data modeling principles for analytics (dimensional; bonus points for data vault knowledge), integrating raw data from disparate sources, ETL, Python, and building/maintaining data pipelines
  • Strong SQL skills
  • Experience building data warehouses and optimizing how data is modeled for downstream consumers
  • Familiarity with general cloud architecture
  • Quality mindset. Must approach data preparation development with data validation, ensuring accuracy and reliability in all data products
  • Must be curious, humble, a team player, operate with a bias to action, and have the ability to make decisions when confronted with uncertainty
  • Have worked in a fast-paced, high-growth startup environment and are comfortable switching work streams as needed
Responsibilities:
  • Design and develop robust and efficient analytics data products, such as data models, schemas, ETL jobs, data sets, and BI visualizations
  • Collaborate in partnership with engineering, product, and other business teams to assess data and analytics needs
  • As part of development, ensure high level of data quality within data products through data validation and disciplined design practices
  • Contribute to a reliable and usable platform to deliver insights through raw data delivery, SQL access, API endpoints, and BI tooling.
  • Provide refined data to business users to enable KPI tracking and foster a culture of data-driven decision-making
  • Prioritize inbound requests from multiple teams and establish the business context for efficient problem-solving
  • Contribute to data asset documentation, data modeling, data validation and profiling, and tooling decisions

Python · SQL · Cloud Computing · Data Analysis · ETL · Amazon Web Services · Data engineering · REST API · Analytical Skills · Data visualization · Data modeling · Data analytics

Posted 24 days ago

📍 All over the world

🧭 Contract

Requirements:
  • You have proven experience as an Analytics Engineer, Data Engineer, or in a similar role.
  • You possess strong SQL skills and have experience working with modern data warehouses like Snowflake, BigQuery, or Redshift.
  • You have hands-on experience building data pipelines using tools such as dbt, Airflow, or Census.
  • You have strong data modelling skills.
Responsibilities:
  • Build robust, scalable pipelines to onboard new cost data providers into the Cloud Costs platform.
  • Design and implement methodologies to accurately model, allocate, and expose cloud and third-party spend to internal teams.
  • Partner with FinOps engineers and stakeholders to deliver high-impact cost visibility features that unlock real savings opportunities.
  • Develop reusable analytics components and dbt models to streamline cost data integration and reallocation.
  • Rapidly iterate on solutions to meet short-term project milestones while setting the stage for long-term scalability.
  • Collaborate closely with cross-functional teams to ensure alignment, reduce duplication, and drive shared success.

SQL · Cloud Computing · Snowflake · Airflow · Data engineering · Data visualization · Data modeling · Data analytics

Posted 25 days ago

📍 Canada

🧭 Full-Time

💸 150,500 - 188,000 CAD per year

🏢 Company: Babylist · 👥 101-250 · 💰 $40,000,000 Series C over 3 years ago · Internet · Baby · Marketplace · E-Commerce · Health Care · Parenting

Requirements:
  • 5+ years of experience in analytics or data engineering, with a strong focus on building scalable data models
  • Deep fluency in SQL and dimensional modeling (star schema, slowly changing dimensions, surrogate keys, etc.)
  • 3+ years of experience working in dbt, ideally in a production environment with CI/CD and testing as well-established core principles
  • Comfortable working across the full stack: Sigma, dbt, Snowflake, Airflow, Python (or similar tools)
  • Strong stakeholder management skills, including experience presenting to executives or C-level leaders
Responsibilities:
  • Own and evolve core data models that drive company-wide reporting, metrics, and analysis
  • Build robust, maintainable dbt models that form the foundation of our analytics layer
  • Collaborate with cross-functional teams to define and model key business metrics
  • Work across our stack (Snowflake, Airflow, dbt, Python, Sigma) to deliver clean and trustworthy data products
  • Help define, enforce, and improve standards for documentation, testing, observability, and performance
  • Partner with stakeholders—including senior leaders—to understand data needs and ensure solutions align with business goals
  • Mentor others in data modeling and best practices
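The dimensional-modeling concepts this listing asks for (star schemas, slowly changing dimensions, surrogate keys) would normally be implemented in dbt, but the Type 2 slowly-changing-dimension pattern can be sketched in plain Python. Everything below (function name, row fields) is an illustrative assumption, not any company's actual model:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Type 2 slowly changing dimension: close out changed rows and
    append a new version with a fresh surrogate key."""
    today = today or date.today()
    next_key = max((r["surrogate_key"] for r in dimension), default=0) + 1
    current = {r["natural_key"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row["natural_key"])
        if existing and existing["attributes"] == row["attributes"]:
            continue  # unchanged: keep the current version
        if existing:
            existing["is_current"] = False
            existing["valid_to"] = today
        dimension.append({
            "surrogate_key": next_key,       # stable join key for fact tables
            "natural_key": row["natural_key"],
            "attributes": row["attributes"],
            "valid_from": today,
            "valid_to": None,
            "is_current": True,
        })
        next_key += 1
    return dimension
```

In practice this logic would live in a dbt snapshot or incremental model rather than application code; the sketch only shows why each version gets its own surrogate key.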

AWS · Python · SQL · Git · Snowflake · Airflow · CI/CD · Data modeling

Posted 26 days ago

📍 California, Colorado, Florida, Illinois, Kansas, Maryland, Massachusetts, Missouri, New Jersey, New York, Texas, Washington

💸 117000.0 - 201000.0 USD per year

🔍 Gaming

🏢 Company: Mythical Games · 👥 251-500 · 💰 Series C over 1 year ago · 🫂 Last layoff over 2 years ago · Video Games · Blockchain · Gaming · Apps

Requirements:
  • 3+ years of professional experience in analytics and data pipelines/warehouses
  • 3+ years of experience with SQL and a programming language (e.g., Python, Java)
  • Experience with gaming analytics and player behavior metrics
  • Hands-on experience with Google Cloud services (BigQuery, Cloud Functions, Dataflow)
  • Experience with Data Analysis Tools such as Looker or Tableau
Responsibilities:
  • Collaborate with stakeholders to gather requirements and design intuitive, streamlined analytical experiences
  • Develop sophisticated data models in BigQuery using Looker's LookML
  • Design robust data pipelines to ensure required data is available in a data warehouse with a reliable cadence
  • Create reports analyzing Mythical’s game/marketplace data and present them to appropriate stakeholders
  • Optimize analytics performance and implement data structure improvements (e.g., query optimization, data rollups)
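The "data rollups" idea in the last bullet can be sketched briefly: pre-aggregating raw events into one row per day and event type so dashboards scan a small table instead of the raw event stream. The event fields below are invented for illustration, not Mythical's actual schema:

```python
from collections import defaultdict

def daily_rollup(events):
    """Pre-aggregate raw events into one row per (day, event name),
    counting total events and unique players."""
    totals = defaultdict(lambda: {"events": 0, "players": set()})
    for e in events:
        bucket = totals[(e["day"], e["name"])]
        bucket["events"] += 1
        bucket["players"].add(e["player_id"])
    return [
        {"day": day, "name": name,
         "events": agg["events"], "unique_players": len(agg["players"])}
        for (day, name), agg in sorted(totals.items())
    ]
```

In a warehouse this would be a scheduled `GROUP BY` materialization (e.g. a dbt incremental model); the Python version just shows the shape of the output.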

Python · SQL · Data Analysis · ETL · GCP · Tableau · Data engineering · Communication Skills · Analytical Skills · RESTful APIs · Data visualization · Data modeling

Posted 26 days ago
🔥 Senior Analytics Engineer

🧭 Full-Time

💸 150,000 - 170,000 USD per year

🔍 Software Development

🏢 Company: Mantra Health · 👥 51-100 · 💰 $5,000,000 Series A about 2 years ago · Mental Health · Medical · Wellness · Health Care

Requirements:
  • 5-8 years of dedicated experience building and managing data pipelines, warehouses, and analytics solutions, with a proven track record in cloud environments (Snowflake strongly preferred) and startups.
  • Deep SQL proficiency, strong Python skills for data tasks, mastery of dbt (Cloud or Core) development and best practices, and significant experience enabling analytics with BI tools (Looker preferred).
  • Proficient in data modeling techniques (e.g., dimensional modeling, normalization) and can translate complex business requirements into effective, scalable data warehouse structures.
  • Meticulous attention to detail, take pride in data integrity, and understand the importance of robust processes, testing, and security, ideally with experience in regulated data environments (HIPAA/FERPA/SOC2).
Responsibilities:
  • Design, implement, monitor, and maintain scalable and reliable ELT pipelines using modern tooling, ingesting data from diverse sources including APIs, flat files, and databases like PostgreSQL.
  • Architect, develop, and refine robust data models within Snowflake (staging, unified models, data marts) using dbt, optimizing for analytical performance, clarity, and reusability.
  • Proactively partner with Product, Clinical, and other stakeholders to translate ambiguous business problems into analytical questions; conduct exploratory analysis to uncover insights; build and communicate compelling data narratives to drive decisions.
  • Implement rigorous data quality testing (e.g., DBT tests), monitoring, and validation; drive SDLC best practices (version control via Git, CI/CD via GitHub Actions/CircleCI) for all analytics code; ensure all data processes adhere to strict security and compliance standards (HIPAA/FERPA/SOC2); maintain clear documentation.
Posted about 1 month ago
🔥 Senior Analytics Engineer

📍 United States

🧭 Full-Time

💸 160,000 - 175,000 USD per year

🔍 Blockchain Intelligence

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

Requirements:
  • 8+ years of experience in analytics engineering, data engineering, or data science with a strong focus on building and scaling analytics workflows.
  • Strong experience across the entire data engineering lifecycle (ETL, data model design, infrastructure, data quality, architecture, etc.)
  • Deep proficiency in SQL and experience developing robust, modular data models using dbt (or equivalent tools) in a production environment.
  • Strong software engineering fundamentals, including experience with Python, CI/CD pipelines, and automated testing.
  • Proficiency in defining robust and scalable data models using best practices.
  • Experience using LLMs, as well as enabling AI through high-quality data infrastructure.
  • Hands-on experience with cloud data warehouses and infrastructure (e.g., Snowflake, BigQuery, Redshift) and data orchestration tools (e.g., Airflow, Dagster, Prefect).
  • Proficiency in developing compelling dashboards using tools like Looker, Tableau, Power BI, Plotly or similar.
Responsibilities:
  • Lead the development and optimization of analytics pipelines and data models that power TRM’s products, investigations, and decision-making, enabling teams across the company and our customers (including former FBI, Secret Service, and Europol agents) to detect and respond to financial crime in the crypto space.
  • Define and implement best practices in analytics engineering (e.g., testing, observability, versioning, documentation), helping to level up the team’s development workflows and data reliability
  • Improve the scalability and maintainability of our analytics data ecosystem, which processes large volumes of data, through thoughtful architectural decisions and tooling improvements.
  • Partner closely with data scientists, data engineers, product and business teams to deliver production-ready datasets that support models, metrics, and investigative workflows.
  • Establish best-in-class data quality solutions to increase the reliability and accuracy of our data
  • Investigate performance issues, and bring creative, durable solutions to improve long-term reliability, cost and developer experience.
  • Drive adoption of modern data tools and workflows, helping the team evolve toward best-in-class analytics engineering practices.
  • Contribute to team on-call responsibilities that support the health and availability of our analytics and data science infrastructure (on-call is lightweight and shared equitably).

Python · SQL · Cloud Computing · Snowflake · Tableau · Airflow · Data engineering · Data science · Communication Skills · CI/CD · Data modeling

Posted about 1 month ago

Related Articles

Posted about 1 month ago

How to Overcome Burnout While Working Remotely: Practical Strategies for Recovery

Burnout is a silent epidemic among remote workers. The blurred lines between work and home life, coupled with the pressure to always be “on,” can leave even the most dedicated professionals feeling drained. But burnout doesn’t have to define your remote work experience. With the right strategies, you can recover, recharge, and prevent future episodes. Here’s how.



Posted 6 days ago

Top 10 Skills to Become a Successful Remote Worker by 2025

Remote work is here to stay, and by 2025, the competition for remote jobs will be tougher than ever. To stand out, you need more than just basic skills. Employers want people who can adapt, communicate well, and stay productive without constant supervision. Here’s a simple guide to the top 10 skills that will make you a top candidate for remote jobs in the near future.

Posted 9 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 10 months ago

Read about the recent updates in remote work policies by major companies, the latest tools enhancing remote work productivity, and predictive statistics for remote work in 2024.

Posted 10 months ago

In-depth analysis of the tech layoffs in 2024, covering the reasons behind the layoffs, comparisons to previous years, immediate impacts, statistics, and the influence on the remote job market. Discover how startups and large tech companies are adapting, and learn strategies for navigating the new dynamics of the remote job market.