Apply

Analytics Engineer

Posted 17 days ago

💎 Seniority level: Senior, 5 - 10 years

📍 Location: United States, Canada

🔍 Industry: Data & Finance

🏢 Company: AirGarage 👥 11-50 💰 $1,000 4 months ago · Internet, Transportation, Online Portals, Payments, Parking, Real Estate, Software

🗣️ Languages: English

⏳ Experience: 5 - 10 years

🪄 Skills: Python, SQL, Bash, Data Analysis, ETL, Git, Kubernetes, Snowflake, Algorithms, API testing, Data engineering, Data Structures, REST API, CI/CD, Data visualization, Data modeling, Data analytics, Data management

Requirements:
  • A degree in engineering, science, math, or a related analytical field.
  • 5 - 10 years of experience in Data Engineering or Analytics Engineering.
  • Experience with the modern data stack (Snowflake / dbt / Polytomic / Hex or equivalents).
  • Advanced fluency in SQL and Python (bonus for Snowpark) — knowing how to write intricate queries and build automated pipelines.
  • Prior experience with data modeling and writing performant queries in a data warehouse setting on large datasets.
  • Passion for data.
  • Extreme attention to detail and nuance.
  • Innate curiosity.
  • Systems thinker.
  • A growth mindset.
Responsibilities:
  • Partner with our founding team and company leadership to build out a best-in-class analytics system that gives our partners insight into their parking lot and garage performance and supercharges our internal organization.
  • Work with our engineering, data, sales, marketing, and product teams to understand the data needs of the business and produce pipelines, data marts and other data solutions that enable better decision-making.
  • Develop scalable code that transforms raw data in an automated and efficient manner to ensure curated and ready-to-analyze data is always available.
  • Identify new data sources we can leverage to improve the company’s performance in core business activities; this can take the form of collaborating with our engineering team to improve data logging, scraping data from the web, or using third-party datasets.
  • Conduct data analyses to assist partner teams in diagnosing issues, understanding opportunities, or measuring performance.
  • Troubleshoot and communicate data issues to internal stakeholders, including building code tests that alert us to problems in our data before they affect end users (a minimal test sketch follows this list).
  • Play a critical role in helping define data team best practices, workflows, and documentation to ensure we are building a sustainable and scalable data function.
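
To make the code-tests bullet above concrete: in the Snowflake/dbt stack named in the requirements, such a check is often a dbt singular test. The sketch below is hypothetical; the model names (fct_parking_sessions, fct_payments) and columns are invented for illustration, not taken from the listing.

```sql
-- Hypothetical dbt singular test (tests/assert_sessions_have_payments.sql).
-- dbt marks the test as failed if this query returns one or more rows,
-- surfacing the data issue before it reaches dashboards or partners.
select
    s.session_id,
    s.gross_revenue
from {{ ref('fct_parking_sessions') }} as s
left join {{ ref('fct_payments') }} as p
    on p.session_id = s.session_id
where s.gross_revenue > 0
  and p.payment_id is null
```
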
Apply

Related Jobs

Apply

📍 United States

💸 125,000 - 145,000 USD per year

🔍 Healthcare

🏢 Company: Lirio 👥 51-100 💰 $3,000,000 Debt Financing over 2 years ago · Artificial Intelligence (AI), Machine Learning, Information Technology, Health Care

  • Bachelor's degree in a related field
  • 7+ years of experience
  • 5+ years of experience creating data analytics methods and tools
  • Experience in a senior technical role
  • Experience working in healthcare and with healthcare data
  • Experience operating systems in regulated environments, adhering to standards such as SOC 2 Type 2 and HITRUST
  • Experience working with domain experts to analyze complex datasets
  • Knowledge of software engineering and data modeling best practices
  • SQL
  • Python
  • Snowflake or similar managed data warehouses
  • Docker
  • Airflow
  • JavaScript, D3.js, dbt, and/or Observable are helpful, but not required
  • Collaborate internally and externally to understand the technical questions that drive analytic requirements, how those questions and reporting needs can be answered by instrumenting and analyzing Lirio’s Behavior Change AI platform, and the insight and reporting requirements for verifying and validating the efficacy of the Lirio solution(s).
  • Develop and support tools facilitating self-service by internal and external users.
  • Own the analytics engineering implementation at Lirio: keep the analytics pipelines operating, assure quality in data and reporting artifacts, and develop improvements to meet client and market needs.
  • Provide expertise and guidance to clients, vendors, and internal teams for Data Analytics initiatives and strategies.
  • Guide the design, implementation, and performance of various data analytics processes including report generation, agent insights, and various aspects of client onboarding.
  • Collaborate with the data engineering team around requirements, timing, quality, etc.
  • Support team efforts around observability and SLAs.
  • Collaborate on research and development around clinical data, machine learning features, and other data assets to support organizational objectives.

Docker, Python, SQL, Data Analysis, ETL, Machine Learning, Snowflake, Airflow, Data engineering, REST API, Data visualization, Data modeling, Data analytics, Data management

Posted 5 days ago
Apply

📍 USA

🧭 Full-Time

🔍 Healthcare

🏢 Company: Stedi

  • Expert in the modern data stack (dbt, Redshift, QuickSight)
  • Expert in SQL, data visualization, and structuring complex database schemas
  • Define business metrics to determine the “network health” of our clearinghouse
  • Instrument dashboards, reports, and alerts that the network operations and engineering teams can act on to continuously improve our payer routing and API logic
  • Work with product, engineering, and design to “productize” these metrics within the Stedi application for end-customer consumption
  • Build internal tools, scripts, and runbooks to help Stedi manage the network and make decisions quickly
  • Manage a well-defined backlog of issues in GitHub

SQL, Data Analysis, Data visualization

Posted 10 days ago
Apply

📍 India, Romania, Berlin, the US, Canada

🧭 Full-Time

🔍 Software Development

🏢 Company: Cresta 👥 101-250 💰 $125,000,000 Series D 4 months ago · Automotive, Customer Service, Artificial Intelligence (AI), Intelligent Systems, Retail, Machine Learning, Telecommunications, Natural Language Processing, Software

  • Previous experience in a startup or product-first company is a plus.
  • Familiarity with ClickHouse or similar columnar databases for managing large-scale, real-time analytical queries.
  • Experience with product analytics, including defining and tracking relevant metrics.
  • Strong proficiency in data engineering, ETL processes, and database management.
  • Proficiency in SQL, a scripting language (Python, R), and a data visualization tool like Hex or Power BI.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork abilities.
  • Collaborate with stakeholders to understand data requirements and design and implement scalable data pipelines.
  • Partner with product teams to define and implement tracking mechanisms for key product metrics.
  • Analyze user behavior, product usage, and other relevant data to provide insights that drive product improvements.
  • Create and maintain dashboards and reports to communicate analytical findings to stakeholders.
  • Collaborate with cross-functional teams to integrate analytics into product development processes.

Python, SQL, Data Analysis, ETL, Product Analytics, ClickHouse, Data engineering, REST API, Data visualization, Data modeling, Scripting

Posted 10 days ago
Apply

📍 United States

🧭 Full-Time

💸 173,000 - 204,000 USD per year

🔍 Software Development

🏢 Company: GlossGenius 👥 51-100 💰 $28,000,000 Series C over 1 year ago · SaaS, Mobile Apps, Beauty, FinTech

  • 4+ years of experience as an analytics engineer, data engineer, or business intelligence engineer and 2+ years developing in dbt, ideally at a SaaS company
  • Advanced proficiency in SQL
  • Previous experience with data modeling, ETL/ELT development principles, and data warehousing concepts
  • Familiarity with the tools in our stack (Segment, Fivetran, Snowflake, dbt, Looker, Airflow) is preferred
  • A naturally inquisitive critical thinker who enjoys and is effective at solving problems
  • A demonstrated self-starter with strong communication and project management skills
  • Build, Refactor & Optimize Data Models in dbt: Design and maintain data transformations that ensure accurate, scalable, and high-quality datasets (a minimal example follows this list).
  • Serve as the Architect for Our dbt Project: Own and evolve the project’s architecture, design patterns, and best practices, ensuring consistency in data definitions and streamlined development.
  • Create a Consistent User Experience: Unify metrics and definitions across our data warehouse and BI tools, driving reliable, self-service analytics for cross-functional stakeholders.
  • Mentor & Uplevel the Team: Provide guidance and code reviews to analysts and data engineers, fostering a culture of collaboration, learning, and excellence in dbt and data modeling.
  • Partner with Data Engineering to Manage Our Stack: Collaborate on the design, ingestion, and transformation pipelines—ensuring they’re scalable, efficient, and aligned with business needs.
  • Champion Data Privacy & Quality: Implement and uphold governance processes and compliance measures to maintain the highest standards of data integrity.
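
As a rough illustration of the dbt modeling work described above (not GlossGenius's actual code), a mart model typically centralizes one metric definition so every downstream BI surface reads the same number. All table and column names below are assumptions.

```sql
-- Hypothetical dbt mart model, e.g. models/marts/fct_monthly_revenue.sql.
-- Upstream staging models are referenced with ref(), and the MRR metric
-- is defined once here so Looker and other tools stay consistent.
with payments as (
    select * from {{ ref('stg_billing__payments') }}
)

select
    date_trunc('month', paid_at) as revenue_month,
    plan_name,
    sum(amount_usd)              as mrr_usd   -- single agreed-upon definition of MRR
from payments
group by 1, 2
```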

Project Management, Python, SQL, Apache Airflow, Business Intelligence, ETL, Snowflake, Data engineering, Communication Skills, Analytical Skills, Data visualization, Data modeling, Data analytics, Data management

Posted 12 days ago
Apply

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: ServiceTrade 👥 51-100 💰 $85,000,000 Private about 3 years ago · E-Commerce, Web Apps, Enterprise Software, Mobile

  • 3-5 years in data engineering or analytics engineering
  • Strong expertise in dbt, SQL, and Python
  • Proven experience with Google Cloud Platform tools
  • Design, develop, and maintain the data warehouse
  • Build and optimize ETL/ELT pipelines
  • Establish and standardize key business metrics
  • Integrate AI-powered tools
  • Collaborate with cross-functional teams
  • Enforce data quality and compliance

Python, SQL, ETL

Posted 13 days ago
Apply

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: HackerOne 👥 201-500 💰 $49,000,000 Series E about 3 years ago 🫂 Last layoff over 1 year ago · Internet, Computer, Network Security

  • 4+ years of experience as an Analytics Engineer, Business Intelligence Engineer, Data Engineer, or similar role, with a proven track record of launching source-of-truth data marts.
  • 4+ years of experience building and optimizing data pipelines, products, and solutions.
  • Strong proficiency in SQL for data manipulation in a fast-paced work environment.
  • Strong proficiency in creating compelling data stories using data visualization tools such as Looker, Tableau, Sigma, Domo, or Power BI.
  • Consistently contribute to the discovery, architecture, and development of high-impact, high-performance, scalable source-of-truth data marts and data products.
  • Lead and deliver cross-functional product and technical initiatives while managing competing priorities and adapting to shifting business objectives.
  • Drive continuous evolution and innovation, adoption of emerging technologies, and implementation of industry best practices.
  • Champion a higher bar for discoverability, usability, consistency, validity, uniqueness, simplicity, completeness, integrity, security, and compliance of information and insights across the company.
  • Provide subject matter expertise, fostering a culture of continuous learning and growth.

AWS, Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 23 days ago
Apply

📍 Germany, India, USA

🧭 Full-Time

🔍 Customer relationship management (CRM)

🏢 Company: HubSpot 👥 1001-5000 💰 $35,000,000 Series E over 12 years ago 🫂 Last layoff about 1 year ago · SaaS, Analytics, Marketing, Copywriting, Social Media

  • Several years of hands-on SQL experience and expertise in relational databases and data modeling.
  • Strong organizational skills and the ability to document technical designs.
  • Proven communication skills to effectively bridge gaps between business leaders, engineers, and data scientists.
  • Experience in distilling complex information for both executives and front-line representatives.
  • Creative problem-solving abilities for flexible solutions to business questions.
  • Demonstrated curiosity and willingness to learn new technologies.
  • Experience with Snowflake, dbt, and/or Looker is preferred.
  • Collaborate with technical and non-technical teams to connect business and technical solutions.
  • Build scalable data models to analyze key business components.
  • Maintain technical best practices in data infrastructure and contribute to long-term data strategies.
  • Support operations teams in using foundational data models for reporting purposes.
  • Conduct complex root cause analyses and implement preventive recommendations.
  • Expand dbt patterns and macros for flexible data structures (a macro sketch follows this list).
  • Scope requirements with stakeholders and guide projects throughout their lifecycle with clear roadmaps.
  • Mentor others to foster an inclusive and diverse team environment.
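
The macro bullet above refers to dbt's Jinja macros, which are reusable snippets that expand into SQL at compile time. The example below is a hypothetical illustration of the pattern; the macro name and usage are invented, not HubSpot's codebase.

```sql
-- Hypothetical dbt macro, e.g. macros/cents_to_dollars.sql.
{% macro cents_to_dollars(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}

-- Example usage inside a model:
-- select {{ cents_to_dollars('amount_in_cents') }} as amount_usd
-- from {{ ref('stg_payments') }}
```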

SQL, Data Analysis, Snowflake, Data modeling

Posted 26 days ago
Apply
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 United States

💸 115,000 - 160,000 USD per year

🔍 Marketplace platform

🏢 Company: Taskrabbit 👥 251-500 💰 Secondary Market over 9 years ago · Marketplace, E-Commerce, Janitorial Service, Facilities Support Services, Freight Service, Peer to Peer, Sharing Economy

  • Expertise in Snowflake SQL and dbt Cloud CLI (5 years).
  • Expertise in dimensional modeling and SCD2 (type 2 slowly changing dimension) use cases (3 years); see the snapshot sketch after this list.
  • Expertise in at least one programming language used for data analysis (e.g., Python, R) (3 years).
  • Experience designing and implementing data quality assurance and monitoring processes (2 years).
  • Experience interacting with executive and senior management stakeholders around data topics (2 years).
  • Familiarity with git (2 years).
  • Commitment to data process development with a focus on transparency, quality, efficiency, and resilience.
  • Design and develop data ontologies for Dolly’s core business objects represented as the enterprise data model.
  • Develop performant and scalable code for data transformation.
  • Design and develop efficient orchestrations for data transformation dependencies.
  • Implement transformation pipelines for raw data focusing on quality and periodicity.
  • Own data definitions for key business metrics.
  • Collaborate with stakeholders to ensure data definitions align with business needs.
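
For the SCD2 requirement flagged above, the common dbt approach is a snapshot, which versions a dimension row each time a tracked column changes. The source table and columns below are invented for illustration; this is a sketch of the pattern, not Taskrabbit's implementation.

```sql
-- Hypothetical dbt snapshot, e.g. snapshots/taskers_snapshot.sql.
-- dbt adds dbt_valid_from / dbt_valid_to columns and closes out the old row
-- whenever one of the check_cols changes, producing an SCD type 2 history.
{% snapshot taskers_snapshot %}

{{
    config(
        target_schema='snapshots',
        unique_key='tasker_id',
        strategy='check',
        check_cols=['home_market', 'hourly_rate', 'status']
    )
}}

select * from {{ source('app_db', 'taskers') }}

{% endsnapshot %}
```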

Python

Posted about 1 month ago
Apply

📍 United States

🧭 Full-Time

🔍 Analytics Engineering

🏢 Company: dbt Labs 👥 251-500 💰 $222,000,000 Series D about 3 years ago 🫂 Last layoff over 1 year ago · Artificial Intelligence (AI), Open Source, Big Data, Analytics, Information Technology, Software

  • 2+ years experience as an Analytics Engineer or related role
  • Expertise in SQL and Python
  • Experience with ETL jobs using dbt
  • Experience with data reporting and dashboarding tools
  • Translate data needs into technical requirements
  • Own dbt instance and maintain data pipelines
  • Establish data integrity standards
  • Develop datasets for product metrics
  • Build dashboards to track performance
  • Create tools for self-serve analytics
  • Influence roadmap from a data perspective

Python, SQL, ETL, Data visualization

Posted about 1 month ago
Apply
🔥 Data Analytics Engineer III
Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 130,000 - 150,000 USD per year

🔍 Health and Fitness

🏢 Company: MyFitnessPal 👥 51-100 💰 $18,000,000 Series A over 11 years ago · Personal Health, Nutrition, Fitness, Apps, Wellness, Health Care, Quantified Self

  • Experience with dbt and modern data modeling practices.
  • Proven expertise in implementing and maintaining QA/QC processes for data pipelines.
  • Proficiency in SQL and development languages like Python.
  • Ability to lead daily efforts and contribute to long-term data governance strategy.
  • Strong problem-solving skills and a self-starter mindset.
  • Experience with data warehouses like Snowflake and relational databases.
  • Understanding of data governance principles and data lineage.
  • Experience with Airflow for managing ETL/ELT pipelines.
  • Familiarity with API integration and design patterns.
  • Experience developing scalable data pipelines and resolving production data issues.
  • Lead day-to-day efforts to ensure data quality, governance, and reliability, driving long-term data governance strategy.
  • Design and implement automated data quality assurance (QA) and quality control (QC) processes (one example check is sketched after this list).
  • Develop and maintain data validation frameworks and testing strategies for pipelines and transformations.
  • Conduct ad hoc analysis and investigations to deliver actionable insights.
  • Collaborate with the Data Platform team to improve data infrastructure and processes.
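
One way to read the automated QA/QC bullet above is as scheduled checks over the warehouse. The sketch below is a hypothetical dbt singular test on Snowflake that flags a sudden drop in daily volume; the model name, columns, and 50% threshold are all assumptions chosen for illustration.

```sql
-- Hypothetical dbt singular test: fail if any day's food-log row count
-- drops by more than half versus the prior day. Uses Snowflake's QUALIFY
-- clause to filter on the window function result.
with daily_counts as (
    select
        date_trunc('day', logged_at) as log_date,
        count(*)                     as row_count
    from {{ ref('fct_food_log_entries') }}
    group by 1
)

select
    log_date,
    row_count,
    lag(row_count) over (order by log_date) as prior_day_count
from daily_counts
qualify row_count < 0.5 * lag(row_count) over (order by log_date)
```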

Python, SQL, Apache Airflow, ETL, Snowflake, Data modeling

Posted about 1 month ago
Apply