Senior Analytics Engineer

Posted about 1 month ago

💎 Seniority level: Senior, 8+ years

📍 Location: United States, PST

💸 Salary: 160,000 - 175,000 USD per year

🔍 Industry: Blockchain Intelligence

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

🗣️ Languages: English

⏳ Experience: 8+ years

🪄 Skills: Python, SQL, Cloud Computing, Snowflake, Tableau, Airflow, Data engineering, Data science, Communication Skills, CI/CD, Data modeling

Requirements:
  • 8+ years of experience in analytics engineering, data engineering, or data science with a strong focus on building and scaling analytics workflows.
  • Strong experience across the entire data engineering lifecycle (ETL, data model design, infrastructure, data quality, architecture, etc.).
  • Deep proficiency in SQL and experience developing robust, modular data models using dbt (or equivalent tools) in a production environment.
  • Strong software engineering fundamentals, including experience with Python, CI/CD pipelines, and automated testing.
  • Proficiency in defining robust and scalable data models using best practices.
  • Experience using LLMs, as well as enabling AI through high-quality data infrastructure.
  • Hands-on experience with cloud data warehouses and infrastructure (e.g., Snowflake, BigQuery, Redshift) and data orchestration tools (e.g., Airflow, Dagster, Prefect); see the orchestration sketch after this list.
  • Proficiency in developing compelling dashboards using tools like Looker, Tableau, Power BI, Plotly or similar.
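
As a hedged illustration of the orchestration tooling named above, the sketch below shows a minimal Airflow DAG that lands raw data and then rebuilds (and tests) the downstream dbt models. It assumes Airflow 2.x's TaskFlow API and the dbt CLI; the schedule, model name, and selector are invented for illustration, not TRM's actual pipeline.

```python
# Minimal Airflow 2.x TaskFlow sketch: refresh a staging model, then rebuild
# (and test) everything downstream of it with dbt. All names are illustrative.
from datetime import datetime
import subprocess

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def analytics_refresh():
    @task
    def land_raw_events() -> str:
        # Placeholder for an extraction step that lands raw events in the
        # warehouse; returns the name of the staging model to rebuild.
        return "stg_events"

    @task
    def rebuild_downstream(model: str) -> None:
        # `dbt build` runs the selected model, its descendants, and their tests.
        subprocess.run(["dbt", "build", "--select", f"{model}+"], check=True)

    rebuild_downstream(land_raw_events())


analytics_refresh()
```

Pairing `dbt build` with the orchestrator is one common way to get the automated testing these requirements describe: the selected models, their descendants, and their tests all run in dependency order.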
Responsibilities:
  • Lead the development and optimization of analytics pipelines and data models that power TRM’s products, investigations, and decision-making — enabling teams across the company and our customers (including former FBI, Secret Service, and Europol agents) to detect and respond to financial crime in the crypto space.
  • Define and implement best practices in analytics engineering (e.g., testing, observability, versioning, documentation), helping to level up the team’s development workflows and data reliability.
  • Improve the scalability and maintainability of our analytics data ecosystem, which processes large volumes of data, through thoughtful architectural decisions and tooling improvements.
  • Partner closely with data scientists, data engineers, product, and business teams to deliver production-ready datasets that support models, metrics, and investigative workflows.
  • Establish best-in-class data quality solutions to increase the reliability and accuracy of our data (a minimal check sketch follows this list).
  • Investigate performance issues and bring creative, durable solutions that improve long-term reliability, cost, and developer experience.
  • Drive adoption of modern data tools and workflows, helping the team evolve toward best-in-class analytics engineering practices.
  • Contribute to team on-call responsibilities that support the health and availability of our analytics and data science infrastructure (on-call is lightweight and shared equitably).
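
The data-quality bullet above is abstract, so here is a minimal sketch of what a codified check might look like in Python; the transaction columns are invented for illustration and are not TRM's schema.

```python
# Assertion-style data-quality checks; purely illustrative.
import pandas as pd


def check_transactions(df: pd.DataFrame) -> list[str]:
    """Return descriptions of failed checks (an empty list means all passed)."""
    failures = []
    if df["tx_hash"].isna().any():
        failures.append("tx_hash contains nulls")
    if df["tx_hash"].duplicated().any():
        failures.append("tx_hash is not unique")
    if (df["amount_usd"] < 0).any():
        failures.append("amount_usd has negative values")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"tx_hash": ["a", "b", "b"], "amount_usd": [10.0, -1.0, 5.0]})
    for failure in check_transactions(sample):
        print("FAILED:", failure)
```

In a dbt project, checks like these would more often live as schema tests (e.g., the built-in `unique` and `not_null` tests) next to the model definitions.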

Related Jobs

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,000 - 190,000 USD per year

🔍 Software Development

🏢 Company: Lob · 👥 101-250 · 💰 $50,000,000 Series C over 4 years ago · Developer APIs, Shipping, SaaS, Marketing, Health Care, Software, Courier Service

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience: at least one big data warehouse system such as Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating live production systems using dbt and Python (see the CI sketch after this list).
  • 3+ years of BI Software experience: at least one analytics platform such as Looker, Power BI, or Tableau.
  • Empathy and effective communication skills: You can explain complex analytical issues to both technical and non-technical audiences.
  • Strong interpretive skills: You can deconstruct complex source data to compose curated models that can be explored by stakeholders.
  • Product mindset: You build data systems that will be used to generate insights for years to come, not just one-off analyses.
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams as they identify ways to improve the quality of the metrics they produce and analyze.
  • Champion data governance, security, privacy, and retention policies to protect end users, customers, and Lob.
  • Support and mentor fellow engineers and data team members through coffee chats, code review, and pair programming.
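
Since the bullets above emphasize operating dbt and Python in production with testing and version control, here is a hedged sketch of dbt's "slim CI" pattern driven from Python: rebuild only the models changed relative to the last production run, plus everything downstream of them. The artifact directory is a placeholder.

```python
# Hedged sketch of a "slim CI" step for a dbt project.
import subprocess


def run_slim_ci(prod_artifacts: str = "prod-artifacts") -> None:
    # `state:modified+` selects models changed vs. the saved production
    # manifest, plus their descendants; `dbt build` runs them and their tests.
    subprocess.run(
        ["dbt", "build", "--select", "state:modified+", "--state", prod_artifacts],
        check=True,  # fail the CI job if any model or test fails
    )


if __name__ == "__main__":
    run_slim_ci()
```

In a real pipeline, the CI job would first download the `manifest.json` from the last production run into the placeholder directory.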

Python, SQL, ETL, Snowflake, Communication Skills, Analytical Skills, Data visualization, Data modeling

Posted 9 days ago

📍 USA

🧭 Full-Time

🔍 Healthcare

🏢 Company: Atropos Health

  • 5+ years of experience in data infrastructure, data analysis, data visualization, and dashboarding tools
  • Expertise in sound data modeling principles for analytics (dimensional; bonus points for data vault knowledge), integrating raw data from disparate sources, ETL, Python, and building/maintaining data pipelines
  • Strong SQL skills
  • Experience building data warehouses and optimizing how data is modeled for downstream consumers
  • Familiarity with general cloud architecture
  • Quality mindset: must approach data preparation with disciplined validation, ensuring accuracy and reliability in all data products
  • Must be curious, humble, a team player, operate with a bias to action, and have the ability to make decisions when confronted with uncertainty
  • Have worked in a fast-paced, high-growth startup environment and be comfortable switching work streams as needed
  • Design and develop robust and efficient analytics data products, such as data models, schemas, ETL jobs, data sets, and BI visualizations
  • Collaborate in partnership with engineering, product, and other business teams to assess data and analytics needs
  • As part of development, ensure a high level of data quality within data products through data validation and disciplined design practices (a validation sketch follows this list)
  • Contribute to a reliable and usable platform to deliver insights through raw data delivery, SQL access, API endpoints, and BI tooling.
  • Provide refined data to business users to enable KPI tracking and foster a culture of data-driven decision-making
  • Prioritize inbound requests from multiple teams and establish the business context for efficient problem-solving
  • Contribute to data asset documentation, data modeling, data validation and profiling, and tooling decisions
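
To ground the validation bullets above, here is a minimal "validate before publish" sketch; the schema, drift threshold, and Parquet write are hypothetical stand-ins, not Atropos Health's actual data products.

```python
# "Validate before publish": sanity checks plus a row-count drift check.
import pandas as pd


def validate(df: pd.DataFrame, previous_row_count: int) -> None:
    """Raise ValueError if the derived table fails basic checks."""
    if df["encounter_id"].duplicated().any():
        raise ValueError("encounter_id must be unique")
    # Flag suspicious volume swings (>50%) relative to the previous load.
    if previous_row_count and abs(len(df) - previous_row_count) > 0.5 * previous_row_count:
        raise ValueError(f"row count {len(df)} drifted >50% from {previous_row_count}")


def publish(df: pd.DataFrame, previous_row_count: int) -> None:
    validate(df, previous_row_count)     # refuse to ship data that fails checks
    df.to_parquet("encounters.parquet")  # stand-in for a warehouse write
```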

Python, SQL, Cloud Computing, Data Analysis, ETL, Amazon Web Services, Data engineering, REST API, Analytical Skills, Data visualization, Data modeling, Data analytics

Posted 24 days ago

📍 All over the world

🧭 Contract

  • You have proven experience as an Analytics Engineer, Data Engineer, or in a similar role.
  • You possess strong SQL skills and have experience working with modern data warehouses like Snowflake, BigQuery, or Redshift.
  • You have hands-on experience building data pipelines using tools such as dbt, Airflow, or Census.
  • You have strong data modeling skills.
  • Build robust, scalable pipelines to onboard new cost data providers into the Cloud Costs platform.
  • Design and implement methodologies to accurately model, allocate, and expose cloud and third-party spend to internal teams (see the allocation sketch after this list).
  • Partner with FinOps engineers and stakeholders to deliver high-impact cost visibility features that unlock real savings opportunities.
  • Develop reusable analytics components and dbt models to streamline cost data integration and reallocation.
  • Rapidly iterate on solutions to meet short-term project milestones while setting the stage for long-term scalability.
  • Collaborate closely with cross-functional teams to ensure alignment, reduce duplication, and drive shared success.
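
The allocation bullet above implies a concrete computation, so here is a tiny sketch of proportional cost allocation; the team names, usage metric, and dollar figures are invented.

```python
# Proportional allocation: split a shared cloud bill by each team's
# share of metered usage. All figures are illustrative.
def allocate_shared_cost(shared_cost: float, usage_by_team: dict[str, float]) -> dict[str, float]:
    total = sum(usage_by_team.values())
    return {team: shared_cost * usage / total for team, usage in usage_by_team.items()}


if __name__ == "__main__":
    usage = {"search": 120.0, "checkout": 60.0, "ml": 20.0}  # e.g., CPU-hours
    print(allocate_shared_cost(10_000.0, usage))
    # {'search': 6000.0, 'checkout': 3000.0, 'ml': 1000.0}
```

Real methodologies get more involved (amortizing reservations, handling untagged spend), but proportional splits like this are a common starting point.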

SQL, Cloud Computing, Snowflake, Airflow, Data engineering, Data visualization, Data modeling, Data analytics

Posted 25 days ago

📍 California, Colorado, Florida, Illinois, Kansas, Maryland, Massachusetts, Missouri, New Jersey, New York, Texas, Washington

💸 117,000 - 201,000 USD per year

🔍 Gaming

🏢 Company: Mythical Games · 👥 251-500 · 💰 Series C over 1 year ago · 🫂 Last layoff over 2 years ago · Video Games, Blockchain, Gaming, Apps

  • 3+ years of professional experience in analytics and data pipelines/warehouses
  • 3+ years of experience with SQL and a programming language (e.g., Python, Java)
  • Experience with gaming analytics and player behavior metrics
  • Hands-on experience with Google Cloud services (BigQuery, Cloud Functions, Dataflow)
  • Experience with Data Analysis Tools such as Looker or Tableau
  • Collaborate with stakeholders to gather requirements and design intuitive, streamlined analytical experiences
  • Develop sophisticated data models in BigQuery using Looker's LookML
  • Design robust data pipelines to ensure required data is available in a data warehouse with a reliable cadence
  • Create reports analyzing Mythical’s game/marketplace data and present them to appropriate stakeholders
  • Optimize analytics performance and drive data structure improvements (e.g., query optimization, data rollups); a rollup sketch follows this list
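
As a hedged illustration of the rollup pattern in the last bullet, the sketch below refreshes a daily aggregate with the google-cloud-bigquery client so dashboards can scan a small table instead of raw events; the dataset, table, and column names are hypothetical.

```python
# Refresh a daily rollup table in BigQuery; names are illustrative.
from google.cloud import bigquery

ROLLUP_SQL = """
CREATE OR REPLACE TABLE analytics.daily_match_rollup AS
SELECT
  DATE(event_ts) AS day,
  game_id,
  COUNT(*) AS matches_played,
  COUNT(DISTINCT player_id) AS active_players
FROM raw.match_events
GROUP BY day, game_id
"""


def refresh_rollup() -> None:
    client = bigquery.Client()         # uses ambient GCP credentials
    client.query(ROLLUP_SQL).result()  # block until the job finishes
```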

Python, SQL, Data Analysis, ETL, GCP, Tableau, Data engineering, Communication Skills, Analytical Skills, RESTful APIs, Data visualization, Data modeling

Posted 27 days ago

📍 United States

💸 157,300 - 255,200 USD per year

🔍 SaaS

🏢 Company: Calendly · 👥 501-1000 · 💰 $350,000,000 Series B over 4 years ago · 🫂 Last layoff almost 2 years ago · Productivity Tools, Enterprise Software, Collaboration, Meeting Software, Scheduling, Software

  • 5+ years of experience working with a SaaS company, partnering with the GTM domain, and building data products in a cloud data warehouse (BigQuery, Snowflake, Redshift, Databricks, etc.)
  • Expert SQL skills with strong attention to data accuracy and integrity (R or Python is a plus)
  • Experience owning a dbt project with engineering best practices (e.g., unit testing, data quality checks, reusable building blocks, etc.)
  • Hands-on experience with Segment for web event tracking and customer data integration (see the tracking sketch after this list)
  • Experience with version control systems like GitHub, Bitbucket, GitLab, etc. (bonus for CI/CD pipeline management)
  • Proficiency in data modeling to meet business and GTM needs for today, and beyond
  • Ability to diagnose data issues proactively and build trust in data across GTM teams
  • Experience with website experimentation (A/B testing, personalization, CRO analysis)
  • Experience with Salesforce, Braze, and paid ad platforms, ensuring seamless GTM data flow
  • Experience implementing and driving adoption of data quality tools, such as Monte Carlo or a similar anomaly detection tool
  • Strong growth mindset—you understand how data drives revenue, acquisition, and retention
  • Design scalable queries and data syncs to provide customer traits, event data, and insights for GTM teams
  • Build customer journey models to drive lead prioritization, retention strategies, and precision targeting
  • Support web, CX, and content teams in implementing and analyzing experiments, personalization, and conversion optimizations
  • Drive adoption of Monte Carlo for GTM data monitoring and issue resolution, ensuring high trust in insights
  • Ensure timely and reliable data availability across Salesforce, Braze, Segment, and paid media platforms
  • Establish data governance and analytics engineering (AE) best practices, aligning with the core data platform and Analytics teams
  • Collaborate closely with Marketing, Sales, CX, Ops, and Analytics teams to ensure data is leveraged effectively for business decisions
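
Since the requirements single out Segment, here is a hedged sketch of server-side event tracking, assuming Segment's analytics-python package; the write key, event name, and properties are placeholders rather than Calendly's actual tracking plan.

```python
# Server-side event tracking via Segment; all identifiers are placeholders.
import analytics

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"  # placeholder


def track_signup(user_id: str, plan: str) -> None:
    analytics.track(
        user_id,
        "Signed Up",                      # event name (illustrative)
        {"plan": plan, "source": "web"},  # properties flow downstream to the warehouse
    )
    analytics.flush()  # force delivery in short-lived scripts
```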

Python, SQL, Cloud Computing, ETL, Git, Salesforce, Snowflake, API testing, Data engineering, Communication Skills, Analytical Skills, CI/CD, Problem Solving, RESTful APIs, Attention to detail, Cross-functional collaboration, Data visualization, Strategic thinking, Data modeling, Data analytics, Data management, SaaS, A/B testing

Posted 2 months ago