
Senior Analytics Engineer

Posted 1 day ago


💎 Seniority level: Senior, 5+ years

📍 Location: USA

🏢 Company: Engine

⏳ Experience: 5+ years

🪄 Skills: AWS, Docker, SQL, ETL, Git, Snowflake, Airflow

Requirements:
  • 5+ years of industry experience as an Analytics Engineer in high-growth environments.
  • Strong expertise using SQL, Snowflake, Airflow, and BI tools such as Looker.
  • A Bachelor's degree in Computer Science, Information Technology, Engineering, or a related technical field, or equivalent practical experience.
Responsibilities:
  • Develop and implement tools and strategies to improve data quality, reliability, and governance at Engine.
  • Collaborate with engineering, analytics, and business stakeholders to ensure high-quality data empowers every business decision and drives measurable business impact.
  • Enhance data infrastructure and analytics capabilities by working closely with our data infrastructure and analyst teams.
  • Design and build our data pipelines to support long-term business growth without compromising day-to-day execution speed.

Related Jobs


📍 United States

🧭 Full-Time

💸 150,000 - 200,000 USD per year

🔍 Energy

🏢 Company: Arcadia · 👥 501-1000 · 💰 $125,000,000 over 2 years ago · Database, CleanTech, Renewable Energy, Clean Energy, Software

  • 4+ years as an Analytics Engineer or equivalent role; experience with dbt is strongly preferred
  • 6+ years, cumulatively, in the data space (data engineering, data science, analytics, or similar)
  • Expert-level understanding of conceptual data modeling and data mart design
  • An understanding of data structures and/or database design plus deep experience with SQL and Python
  • Experience building data pipelines and managing databases, including Snowflake or similar
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Experience in technical leadership or mentorship
  • Strong communication and collaboration skills
  • Proven ability to solve complex problems in a dynamic and evolving environment
  • Transform, test, deploy, and document data to deliver clean and trustworthy data for analysis to end-users
  • Collaborate with subject matter experts, engineers, and product managers to identify the most elegant and effective data structures to understand our constantly growing and evolving business
  • Help bring engineering best practices (reliability, modularity, test coverage, documentation) to our DAG and to our Data team generally
  • Collaborate with data engineers to build robust, tested, scalable ELT pipelines.
  • Data modeling: model raw data into clean, tested, and reusable datasets to represent our key business data concepts. Define the rules and requirements for the formats and attributes of data
  • Data transformation: build our data lakehouse by transforming raw data into meaningful, useful data elements through joining, filtering, and aggregating source data
  • Data documentation: create and maintain data documentation, including data definitions and understandable data descriptions, to enable broad-scale understanding of the use of data
  • Employ software engineering best practices to write code and coach analysts and data scientists to do the same

AWS, Python, SQL, Data Analysis, ETL, Git, Snowflake, Data engineering, Data Structures, Communication Skills, Analytical Skills, Collaboration, Data visualization, Data modeling, Data management

Posted 2 days ago

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, and WA

🧭 Full-Time

💸 162,000 - 190,000 USD per year

🔍 Software Development

🏢 Company: Lob · 👥 101-250 · 💰 $50,000,000 Series C over 4 years ago · Developer APIs, Shipping, SaaS, Marketing, Health Care, Software, Courier Service

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience: at least one big data warehouse system such as Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating live production systems using dbt and Python.
  • 3+ years of BI Software experience: at least one analytics platform such as Looker, Power BI, or Tableau.
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams as they identify ways to improve the quality of the metrics they produce and analyze.
  • Champion data governance, security, privacy, and retention policies to protect end users, customers, and Lob.
  • Support and mentor fellow engineers and data team members through coffee chats, code review, and pair programming.

Python, SQL, ETL, Snowflake, Communication Skills, Analytical Skills, Data visualization, Data modeling

Posted 16 days ago
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 USA

🧭 Full-Time

🔍 Healthcare

🏢 Company: Atropos Health

  • 5+ years of experience in data infrastructure, data analysis, data visualization, and dashboarding tools
  • Expertise in correct data modeling principles for analytics (dimensional; bonus points for data vault knowledge), integrating raw data from disparate sources, ETL, Python, and building/maintaining data pipelines
  • Strong SQL skills
  • Experience building data warehouses and optimizing how data is modeled for downstream consumers
  • Familiarity with general cloud architecture
  • Quality mindset. Must approach data preparation development with data validation, ensuring accuracy and reliability in all data products
  • Must be curious, humble, and a team player; operate with a bias to action; and be able to make decisions when confronted with uncertainty
  • Has worked in a fast-paced, high-growth startup environment before and is comfortable switching work streams as needed
  • Design and develop robust and efficient analytics data products, such as data models, schemas, ETL jobs, data sets, and BI visualizations
  • Collaborate with engineering, product, and other business teams to assess data and analytics needs
  • As part of development, ensure a high level of data quality within data products through data validation and disciplined design practices
  • Contribute to a reliable and usable platform to deliver insights through raw data delivery, SQL access, API endpoints, and BI tooling.
  • Provide refined data to business users to enable KPI tracking and foster a culture of data-driven decision-making
  • Prioritize inbound requests from multiple teams and establish the business context for efficient problem-solving
  • Contribute to data asset documentation, data modeling, data validation and profiling, and tooling decisions

Python, SQL, Cloud Computing, Data Analysis, ETL, Amazon Web Services, Data engineering, REST API, Analytical Skills, Data visualization, Data modeling, Data analytics

🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 California, Colorado, Florida, Illinois, Kansas, Maryland, Massachusetts, Missouri, New Jersey, New York, Texas, Washington

💸 117,000 - 201,000 USD per year

🔍 Gaming

🏢 Company: Mythical Games · 👥 251-500 · 💰 Series C over 1 year ago · 🫂 Last layoff over 2 years ago · Video Games, Blockchain, Gaming, Apps

  • 3+ years of professional experience in analytics and data pipelines/warehouses
  • 3+ years of experience with SQL and a programming language (e.g., Python, Java)
  • Experience with gaming analytics and player behavior metrics
  • Hands-on experience with Google Cloud services (BigQuery, Cloud Functions, Dataflow)
  • Experience with Data Analysis Tools such as Looker or Tableau
  • Collaborate with stakeholders to gather requirements and design intuitive, streamlined analytical experiences
  • Develop sophisticated data models in BigQuery using Looker's LookML
  • Design robust data pipelines to ensure required data is available in a data warehouse with a reliable cadence
  • Create reports analyzing Mythical’s game/marketplace data and present them to appropriate stakeholders
  • Optimize analytics performance and improve data structures (e.g., query optimization, data rollups)

Python, SQL, Data Analysis, ETL, GCP, Tableau, Data engineering, Communication Skills, Analytical Skills, RESTful APIs, Data visualization, Data modeling

🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 160,000 - 175,000 USD per year

🔍 Blockchain Intelligence

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • 8+ years of experience in analytics engineering, data engineering, or data science with a strong focus on building and scaling analytics workflows.
  • Strong experience across the entire data engineering lifecycle (ETL, data model design, infrastructure, data quality, architecture, etc.)
  • Deep proficiency in SQL and experience developing robust, modular data models using dbt (or equivalent tools) in a production environment.
  • Strong software engineering fundamentals, including experience with Python, CI/CD pipelines, and automated testing.
  • Proficiency in defining robust and scalable data models using best practices.
  • Experience using LLMs, as well as enabling AI through high-quality data infrastructure.
  • Hands-on experience with cloud data warehouses and infrastructure (e.g., Snowflake, BigQuery, Redshift) and data orchestration tools (e.g., Airflow, Dagster, Prefect).
  • Proficiency in developing compelling dashboards using tools like Looker, Tableau, Power BI, Plotly or similar.
  • Lead the development and optimization of analytics pipelines and data models that power TRM’s products, investigations, and decision-making — enabling teams across the company and our customers (including former FBI, Secret Service, and Europol agents) to detect and respond to financial crime in the crypto space.
  • Define and implement best practices in analytics engineering (e.g., testing, observability, versioning, documentation), helping to level up the team’s development workflows and data reliability
  • Improve the scalability and maintainability of our analytics data ecosystem, which processes large volumes of data, through thoughtful architectural decisions and tooling improvements.
  • Partner closely with data scientists, data engineers, product and business teams to deliver production-ready datasets that support models, metrics, and investigative workflows.
  • Establish best-in-class data quality solutions to increase the reliability and accuracy of our data
  • Investigate performance issues and bring creative, durable solutions to improve long-term reliability, cost, and developer experience.
  • Drive adoption of modern data tools and workflows, helping the team evolve toward best-in-class analytics engineering practices.
  • Contribute to team on-call responsibilities that support the health and availability of our analytics and data science infrastructure (on-call is lightweight and shared equitably).

Python, SQL, Cloud Computing, Snowflake, Tableau, Airflow, Data engineering, Data science, Communication Skills, CI/CD, Data modeling


📍 United States

💸 157,300 - 255,200 USD per year

🔍 SaaS

🏢 Company: Calendly · 👥 501-1000 · 💰 $350,000,000 Series B over 4 years ago · 🫂 Last layoff almost 2 years ago · Productivity Tools, Enterprise Software, Collaboration, Meeting Software, Scheduling, Software

  • 5+ years of experience working with a SaaS company, partnering with the GTM domain, and building data products in a cloud data warehouse (BigQuery, Snowflake, Redshift, Databricks, etc.)
  • Expert SQL skills with strong attention to data accuracy and integrity (R or Python is a plus)
  • Experience owning a dbt project with engineering best practices (e.g., unit testing, data quality checks, reusable building blocks, etc.)
  • Hands-on experience with Segment for web event tracking and customer data integration
  • Experience with version control systems like GitHub, Bitbucket, GitLab, etc. (bonus for CI/CD pipeline management)
  • Proficiency in data modeling to meet business and GTM needs for today and beyond
  • Ability to diagnose data issues proactively and build trust in data across GTM teams
  • Experience with website experimentation (A/B testing, personalization, CRO analysis)
  • Experience with Salesforce, Braze, and paid ad platforms, ensuring seamless GTM data flow
  • Experience implementing and driving adoption of data quality tools, such as Monte Carlo or similar anomaly-detection tools
  • Strong growth mindset: you understand how data drives revenue, acquisition, and retention
  • Design scalable queries and data syncs to provide customer traits, event data, and insights for GTM teams
  • Build customer journey models to drive lead prioritization, retention strategies, and precision targeting
  • Support web, CX, and content teams in implementing and analyzing experiments, personalization, and conversion optimizations
  • Drive adoption of Monte Carlo for GTM data monitoring and issue resolution, ensuring high trust in insights
  • Ensure timely and reliable data availability across Salesforce, Braze, Segment, and paid media platforms
  • Establish data governance and analytics engineering (AE) best practices, aligning with the core data platform and Analytics teams
  • Collaborate closely with Marketing, Sales, CX, Ops, and Analytics teams to ensure data is leveraged effectively for business decisions

Python, SQL, Cloud Computing, ETL, Git, Salesforce, Snowflake, API testing, Data engineering, Communication Skills, Analytical Skills, CI/CD, Problem Solving, RESTful APIs, Attention to detail, Cross-functional collaboration, Data visualization, Strategic thinking, Data modeling, Data analytics, Data management, SaaS, A/B testing

Posted 3 months ago

📍 United States

🧭 Full-Time

💸 150,000 - 170,000 USD per year

🔍 Payment technology

  • At least 5 years of experience in analytics engineering.
  • Strong experience with dbt or similar tools for managing data.
  • Proficiency in SQL variants, including MySQL and SnowSQL.
  • Expertise in data warehouse modeling and techniques.
  • Strong programming experience, preferably in Python.
  • Collaborative team player with leadership skills.
  • Ability to build highly reliable systems.
  • Strong diagnostic, organizational, and communication skills.
  • Deliver comprehensive analytics by transforming and modeling data using Snowflake, dbt, and Looker.
  • Work with engineering teams to promote database design best practices.
  • Capture important business data from source systems into the data lake.
  • Design and implement data transformations to support analytics.
  • Build out a semantic layer for BI and reporting tools.
  • Maintain and enhance data quality monitoring systems.
  • Implement data governance best practices.
  • Support internal BI and reporting tools.
  • Provide state-of-the-art analytics to all stakeholders.
  • Provide leadership in technical design and mentor team members.

AWS, PostgreSQL, Python, SQL, MySQL, Snowflake

Posted 5 months ago