
Senior Analytics Engineer

Posted 6 days ago


💎 Seniority level: Senior, 4+ years

🏢 Company: Mixbook · 👥 51-100 · 💰 $10,000,000 Series B over 13 years ago · E-Commerce · Social Network · Consumer Goods

🗣️ Languages: English

⏳ Experience: 4+ years

Requirements:
  • 4+ years of proven experience in Analytics or Data Engineering.
  • Expert proficiency in SQL and LookML - you’ve built and optimized complex data models in Looker.
  • Proven track record with dbt - delivering modular, tested transformations for analytics.
  • Experience with Business Intelligence tools (e.g., Looker, Tableau) and an ability to design intuitive, user-friendly data visualizations.
  • Success in fast-paced, high-growth environments.
  • Comfortable introducing new analytic techniques and driving their adoption across teams.
Responsibilities:
  • Own and maintain executive-level dashboards, ensuring the data models (LookML and dbt) behind them are accurate and scalable.
  • Develop and refine data models to support Mixbook’s growth. Move us from ad-hoc structures to a more robust, sustainable environment for deeper analysis.
  • Build repeatable, scalable solutions and guide end users on leveraging these data models for self-service analytics and consistent reporting.
  • Establish uniform definitions for key metrics, maintain data quality standards, and implement governance frameworks that ensure confidence in our data.

Related Jobs


📍 Canada

🧭 Full-Time

🔍 FinTech

🏢 Company: KOHO

  • 5+ years of mastery in data manipulation and analytics architecture
  • Advanced expertise in dbt (incremental modeling, materializations, snapshots, variables, macros, jinja)
  • Strong command of SQL, query optimization, and data warehouse design, with the ability to write efficient queries
  • Building strong relationships with stakeholders (the finance team), and scoping and prioritizing their analytics requests.
  • Understanding business needs and translating them to requirements.
  • Using dbt (Core for development and Cloud for orchestration) to transform, test, deploy, and document financial data while applying software engineering best practices.
  • Troubleshooting variances in reports, and striving to eliminate them at the source.
  • Building game-changing data products that empower the finance team
  • Architecting solutions that transform complex financial data into actionable insights
  • Monitoring, optimizing and troubleshooting warehouse performance (AWS Redshift).
  • Creating scalable, self-service analytics solutions that democratize data access
  • Occasionally building dashboards and reports in Sigma and Drivetrain.
  • Defining processes, building tools, and offering training to empower all data users in the organization.
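
The dbt skills this listing calls out (incremental models, tests, materializations) come down to a simple idea: only process source rows newer than what the target table already holds. Below is a minimal sketch of that high-water-mark pattern in Python with sqlite3; all table and column names are hypothetical, not from KOHO's actual stack:

```python
import sqlite3

# Illustration of the idea behind dbt incremental models: on each run,
# only rows past the current high-water mark are (re)processed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_payments (id INTEGER, amount REAL, loaded_at TEXT);
CREATE TABLE fct_payments (id INTEGER, amount REAL, loaded_at TEXT);
INSERT INTO raw_payments VALUES (1, 10.0, '2024-01-01'), (2, 20.0, '2024-01-02');
""")

def incremental_load(conn):
    # High-water mark: max timestamp already present in the target table.
    (mark,) = conn.execute(
        "SELECT COALESCE(MAX(loaded_at), '') FROM fct_payments"
    ).fetchone()
    conn.execute(
        "INSERT INTO fct_payments SELECT * FROM raw_payments WHERE loaded_at > ?",
        (mark,),
    )

incremental_load(conn)                      # first run loads both rows
conn.execute("INSERT INTO raw_payments VALUES (3, 30.0, '2024-01-03')")
incremental_load(conn)                      # second run picks up only row 3
rows = conn.execute("SELECT COUNT(*) FROM fct_payments").fetchone()[0]
print(rows)  # 3
```

In dbt itself the same filter lives in an `is_incremental()` block inside the model SQL; the sketch above just makes the mechanics explicit.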

SQL · Data engineering · Communication Skills · Analytical Skills · Data visualization · Data modeling · Finance · Data analytics

Posted 5 days ago

📍 Canada

🧭 Employee

🔍 Software Development

🏢 Company: Lightspeed Commerce · 👥 1001-5000 · 💰 $716,100,000 Post-IPO Equity over 3 years ago · 🫂 Last layoff 4 months ago · E-Commerce · Business Information Systems · Retail Technology · Cloud Management

  • Proficient with SQL and Python (or equivalent language)
  • Experience with MPP data warehouse (we use BigQuery)
  • Proficiency in performing data profiling, data mapping and data quality validation
  • Experience with data orchestrator
  • Solid understanding of industry standards for data warehousing and data modeling
  • Proficiency with data environments (we use GCP) as well as containerization solutions
  • Proficiency in source control, CI/CD and automated testing
  • A security mindset with ability to work with compliance protocols (SOX, GDPR)
  • Excellent communication skills and the ability to communicate with technical and non-technical stakeholders
  • Excellent problem solving skills
  • Ability to work independently and in teams
  • Self-motivated and great at taking initiative
  • Cleanse and transform raw data into business logic to help stakeholders build reports and data models
  • Ensure the right data gets to the right people at the right time
  • Collaborate with a Central Data Team to build data pipelines in accordance with shared best practices and design, and feed proper data models for usage
  • Act as a champion for data quality and usability across the Golf organization
  • Ensure availability, quality and searchability of data and documentation
  • Write clean, maintainable queries and scripts that are performant, well structured, and easy to understand and extend
  • Automate testing of data models and add appropriate validations to ensure quality
  • Help define and improve our internal standards for data models and infrastructure
  • Help define the data vision and roadmap for a new and fast-growing product
  • Mentor and help grow the team

Python · SQL · GCP · Communication Skills · CI/CD · Problem Solving · Compliance · Data modeling · SaaS

Posted 12 days ago

📍 USA

🧭 Full-Time

🔍 Software Development

🏢 Company: PermitFlow · 👥 1-10 · 💰 $5,500,000 Seed almost 2 years ago · Construction · Apps · Software

  • 4+ years of experience in analytics engineering, data engineering, or data analytics roles.
  • Strong proficiency in PostgreSQL, SQL, dbt, or similar data warehousing technologies.
  • Experience with ETL pipelines and tools, or other modern data stack tools.
  • Advanced programming skills in Python or other languages for data transformation and analysis.
  • Proven experience designing data models and building scalable data warehouses for analytics purposes.
  • A deep understanding of data governance, data quality practices, and developing self-serve analytics solutions.
  • Strong communication and collaboration skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Experience with business intelligence tools like Looker, Tableau, or Power BI.
  • Familiarity with cloud platforms like Google Cloud.
  • Design and implement scalable data models optimized for analytics and company-wide reporting, continuously refining them to meet evolving business needs.
  • Build and maintain efficient data pipelines to transform datasets for analytics.
  • Collaborate with product and engineering teams to integrate data from sources like PermitFlow’s CRM, 3rd party vendors, and other internal sources while optimizing for performance and reliability.
  • Implement data governance to ensure consistency, quality, and security. Define key metrics, track data lineage, and enforce data quality checks.
  • Work closely with stakeholders to deliver analytics solutions, develop dashboards, and provide training on data tools and best practices.
  • Manage and optimize PermitFlow’s data stack to support scalable reporting and insights.

PostgreSQL · Python · SQL · Apache Airflow · Business Intelligence · Cloud Computing · Data Analysis · ETL · GCP · Data engineering · RESTful APIs · JSON · Data visualization · Data modeling · Data analytics

Posted 14 days ago

🧭 Full-Time

💸 175,000 - 200,000 USD per year

🔍 Software Development

🏢 Company: Ironclad

  • 5+ years of experience as an analytics engineer, data engineer, or business intelligence engineer and 2+ years developing in dbt, ideally at a B2B SaaS company
  • Advanced proficiency in SQL
  • Previous experience with data modeling, ETL/ELT development principles, and data warehousing concepts
  • Familiarity with the tools in our stack (Segment, Fivetran, BigQuery, dbt, Looker, Airflow) is preferred
  • A naturally inquisitive, critical thinker, who enjoys and is effective at solving problems
  • A demonstrated self-starter with strong communication and project management skills
  • Design and maintain data transformations that ensure accurate, scalable, and high-quality datasets
  • Serve as the Architect for our dbt project: Own and evolve the project’s architecture, design patterns, and best practices, ensuring consistency in data definitions and streamlined development.
  • Unify metrics and definitions across our data warehouse and BI tools, driving reliable, self-service analytics for analytics team members and cross-functional stakeholders.
  • Mentor & Upskill the Team: Provide guidance and code reviews to analysts and analytics engineers, fostering a culture of collaboration, learning, and excellence in dbt and data modeling.
  • Partner with Data Engineering to Manage Our Stack: Collaborate on the design, ingestion, and transformation pipelines—ensuring they’re scalable, efficient, and aligned with business needs.
  • Champion Data Privacy & Quality: Implement and uphold governance processes and compliance measures to maintain the highest standards of data integrity.
Posted 15 days ago

📍 United States

💸 157,300 - 255,200 USD per year

🔍 SaaS

🏢 Company: Calendly · 👥 501-1000 · 💰 $350,000,000 Series B about 4 years ago · 🫂 Last layoff over 1 year ago · Productivity Tools · Enterprise Software · Collaboration · Meeting Software · Scheduling · Software

  • 5+ years of experience working with a SaaS company, partnering with the GTM domain, and building data products in a cloud data warehouse (BigQuery, Snowflake, Redshift, Databricks, etc.)
  • Expert SQL skills with strong attention to data accuracy and integrity (R or Python is a plus)
  • Experience owning a dbt project with engineering best practices (e.g., unit testing, data quality checks, reusable building blocks, etc.)
  • Hands-on experience with Segment for web event tracking and customer data integration
  • Experience with version control systems like GitHub, Bitbucket, GitLab, etc. (bonus for CI/CD pipeline management)
  • Proficiency in data modeling to meet business and GTM needs for today, and beyond
  • Ability to diagnose data issues proactively and build trust in data across GTM teams
  • Experience with website experimentation (A/B testing, personalization, CRO analysis)
  • Experience with Salesforce, Braze, and paid ad platforms, ensuring seamless GTM data flow
  • Experience implementing and driving adoption of data quality tools, like Monte Carlo, or similar anomaly detection tool
  • Strong growth mindset—you understand how data drives revenue, acquisition, and retention
  • Design scalable queries and data syncs to provide customer traits, event data, and insights for GTM teams
  • Build customer journey models to drive lead prioritization, retention strategies, and precision targeting
  • Support web, CX, and content teams in implementing and analyzing experiments, personalization, and conversion optimizations
  • Drive adoption of Monte Carlo for GTM data monitoring and issue resolution, ensuring high trust in insights
  • Ensure timely and reliable data availability across Salesforce, Braze, Segment, and paid media platforms
  • Establish data governance and analytics engineering (AE) best practices, aligning with the core data platform and Analytics teams
  • Collaborate closely with Marketing, Sales, CX, Ops, and Analytics teams to ensure data is leveraged effectively for business decisions

Python · SQL · Cloud Computing · ETL · Git · Salesforce · Snowflake · API testing · Data engineering · Communication Skills · Analytical Skills · CI/CD · Problem Solving · RESTful APIs · Attention to detail · Cross-functional collaboration · Data visualization · Strategic thinking · Data modeling · Data analytics · Data management · SaaS · A/B testing

Posted 21 days ago

🏢 Company: Eventbrite, Inc.

  • Proficiency with data languages, tools, and techniques (including Snowflake, dbt, Airflow, and BI/visualization tools) and how to leverage them for impact
  • Ability to synthesize common data models and actionable frameworks from diverse analytical needs for technical and non-technical stakeholders
  • Ability to analyze business requirements in various business functional areas and translate them into conceptual, logical and physical data models
  • Understanding dimensional modeling techniques and their applications
  • Advanced experience writing, optimizing and validating SQL queries; good command of Python
  • 3-5 years of hands-on experience building, running, and supporting ETL pipelines on large-scale data sets
  • Good command of database optimization and scaling approaches
  • Experience working in Agile framework
  • Attentive to detail with strong quality control, especially in regard to large-scale data changes
  • Design and build new data products and revamp legacy models that enhance business analytics and reporting capabilities across Eventbrite
  • Enable functional analytics team to deliver key dashboards and reports. You’ll own the development of metrics, data models and the underlying logic that powers the dashboards and reporting designed by our functional data partners, which in turn supports strategic decision-making by leadership
  • Collaborate cross functionally to revamp data models that power business critical internal and customer-visible metrics
  • Deliver data marts and ETL processes that enhance self service analytics capabilities for a diverse set of users/use cases
  • Define standards and best practices for improving data quality and automation
  • Translate data consumption needs into requirements and technical specifications for source data and transformations
  • Enable data driven thinking and help elevate data literacy across the company
Posted 22 days ago

📍 Poland

🔍 Software Development

  • Strong experience with SQL
  • Familiarity with Python
  • Proficiency in modern data stack tools such as Snowflake, dbt, ThoughtSpot or similar
  • Help transform raw data into actionable insights through scalable data models and visualization solutions.
  • Bridge the gap between data engineering and analytics, ensuring data is well-structured, accessible, and optimized for business intelligence and reporting.

Python · SQL · Business Intelligence · Data Analysis · ETL · Snowflake · Data engineering · Analytical Skills · Reporting · Data visualization · Financial analysis · Data modeling · Finance · Data management

Posted 27 days ago
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 160,000 - 180,000 USD per year

🔍 Insurance, Financial Services

🏢 Company: Vouch Insurance

  • 5+ years experience developing ETL/ELT workflows as a data analytics engineer, data engineer, or data analyst, supporting a variety of business domains
  • Advanced experience in SQL, dbt, Snowflake, and git
  • Intermediate experience building business reporting processes (e.g. for Finance, GTM, Actuarial, etc.)
  • Intermediate experience leveraging data pipeline services like Stitch and orchestration tools like Airflow
  • Basic experience with business intelligence tools such as Mode Analytics
  • Basic experience in Python, Airflow, Stitch
  • Architect, build and optimize robust data models in dbt to power business reporting processes and analytics use cases
  • Effectively speak with, and listen to, business stakeholders, engineers, designers, executives, etc., so that you can understand business context and support analysts and business stakeholders across GTM, Insurance, Finance, Product, etc. to design and build single-source-of-truth data assets that drive Vouch’s business
  • Act as a liaison between Engineering and Analytics to define source and target fields for data assets
  • Partner with upstream Engineering teams to develop new and adapt existing data models as our technology and business evolve
  • Clearly document specs and data models with source, description, and field definitions to enhance collaboration, scalability, and usability
  • Set up tests to ensure data quality, monitor daily job execution, and diagnose/fix issues to ensure SLAs are met
  • Perform root cause analysis for data quality issues, and partner with upstream teams and business stakeholders to rectify dirty, incomplete, or unreliable data
  • Maintain and enforce consistent data modeling standards, best practices, and design patterns
  • Provide and receive constructive feedback through code reviews, technical specifications, and pair programming sessions
  • Use software development best practices with a focus on version control, testing, reliability and maintainability
  • Work in dbt, SQL, Snowflake, git, Airflow, Python, Stitch, Mode, and we welcome new ideas!
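
The "set up tests to ensure data quality" responsibility above typically means checks like dbt's generic `not_null` and `unique` tests. A minimal Python/sqlite3 sketch of what such checks verify (table and column names are illustrative only, not from Vouch's schema):

```python
import sqlite3

# Minimal sketch of the kind of checks dbt generic tests
# (not_null, unique) perform against a warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER, email TEXT);
INSERT INTO dim_customer VALUES (1, 'a@example.com'), (2, 'b@example.com');
""")

def check_not_null(conn, table, column):
    # Passes when no row has a NULL in the given column.
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    return n == 0

def check_unique(conn, table, column):
    # Passes when no value appears more than once in the given column.
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()
    return n == 0

ok = (check_not_null(conn, "dim_customer", "customer_id")
      and check_unique(conn, "dim_customer", "customer_id"))
print(ok)  # True
```

In dbt these checks are declared in a model's YAML file and run on a schedule, which is how the daily-monitoring and SLA responsibilities above are usually met.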

Python · SQL · Business Intelligence · ETL · Git · Snowflake · Airflow · Data engineering · Data modeling · Data analytics

Posted about 1 month ago
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 US

🧭 Full-Time

💸 173,000 - 204,000 USD per year

🔍 Software Development

🏢 Company: GlossGenius · 👥 51-100 · 💰 $28,000,000 Series C over 1 year ago · SaaS · Mobile Apps · Beauty · FinTech

  • 4+ years of experience as an analytics engineer, data engineer, or business intelligence engineer and 2+ years developing in dbt, ideally at a SaaS company
  • Advanced proficiency in SQL
  • Previous experience with data modeling, ETL/ELT development principles, and data warehousing concepts
  • Familiarity with the tools in our stack (Segment, Fivetran, Snowflake, dbt, Looker, Airflow) is preferred
  • A naturally inquisitive, critical thinker, who enjoys and is effective at solving problems
  • A demonstrated self-starter with strong communication and project management skills
  • Build, Refactor & Optimize Data Models in dbt: Design and maintain data transformations that ensure accurate, scalable, and high-quality datasets
  • Serve as the Architect for Our dbt Project: Own and evolve the project’s architecture, design patterns, and best practices, ensuring consistency in data definitions and streamlined development.
  • Create a Consistent User Experience: Unify metrics and definitions across our data warehouse and BI tools, driving reliable, self-service analytics for cross-functional stakeholders.
  • Mentor & Uplevel the Team: Provide guidance and code reviews to analysts and data engineers, fostering a culture of collaboration, learning, and excellence in dbt and data modeling.
  • Partner with Data Engineering to Manage Our Stack: Collaborate on the design, ingestion, and transformation pipelines—ensuring they’re scalable, efficient, and aligned with business needs.
  • Champion Data Privacy & Quality: Implement and uphold governance processes and compliance measures to maintain the highest standards of data integrity.

Project Management · Python · SQL · Apache Airflow · Business Intelligence · ETL · Snowflake · Data engineering · Communication Skills · Analytical Skills · Data visualization · Data modeling · Data analytics · Data management

Posted about 1 month ago

📍 United States

💸 115,000 - 160,000 USD per year

🔍 Marketplace platform

🏢 Company: Taskrabbit · 👥 251-500 · 💰 Secondary Market over 9 years ago · Marketplace · E-Commerce · Janitorial Service · Facilities Support Services · Freight Service · Peer to Peer · Sharing Economy

  • Expertise in Snowflake SQL and dbt Cloud CLI (5 years).
  • Expertise in dimensional modeling and SCD2 use cases (3 years).
  • Expertise in at least one programming language used for data analysis (e.g., Python, R) (3 years).
  • Experience designing and implementing data quality assurance and monitoring processes (2 years).
  • Experience interacting with executive and senior management stakeholders around data topics (2 years).
  • Familiarity with git (2 years).
  • Commitment to data process development with a focus on transparency, quality, efficiency, and resilience.
  • Design and develop data ontologies for Dolly’s core business objects represented as the enterprise data model.
  • Develop performant and scalable code for data transformation.
  • Design and develop efficient orchestrations for data transformation dependencies.
  • Implement transformation pipelines for raw data focusing on quality and periodicity.
  • Own data definitions for key business metrics.
  • Collaborate with stakeholders to ensure data definitions align with business needs.
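
The SCD2 (Type-2 slowly changing dimension) expertise this listing asks for comes down to versioning dimension rows: when an attribute changes, the current row is closed out and a new row is opened with fresh validity dates, preserving history. A sketch of that pattern in Python with sqlite3 (schema and names are illustrative, not from any real warehouse):

```python
import sqlite3

# Type-2 SCD handling: a change closes the current row version
# and opens a new one, so history is never overwritten.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_tasker (
    tasker_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER
)""")

def upsert_scd2(conn, tasker_id, city, as_of):
    cur = conn.execute(
        "SELECT city FROM dim_tasker WHERE tasker_id = ? AND is_current = 1",
        (tasker_id,),
    ).fetchone()
    if cur and cur[0] == city:
        return                      # no change: nothing to do
    if cur:
        # Close out the current version of the row.
        conn.execute(
            "UPDATE dim_tasker SET valid_to = ?, is_current = 0 "
            "WHERE tasker_id = ? AND is_current = 1",
            (as_of, tasker_id),
        )
    # Open a new current version, valid until further notice.
    conn.execute(
        "INSERT INTO dim_tasker VALUES (?, ?, ?, '9999-12-31', 1)",
        (tasker_id, city, as_of),
    )

upsert_scd2(conn, 7, "Seattle", "2024-01-01")
upsert_scd2(conn, 7, "Portland", "2024-06-01")   # city change: new version
versions = conn.execute(
    "SELECT COUNT(*) FROM dim_tasker WHERE tasker_id = 7"
).fetchone()[0]
print(versions)  # 2
```

In a dbt stack the same effect is usually achieved with snapshots rather than hand-written merge logic; the sketch just shows what those snapshots produce.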

Python

Posted 2 months ago
