Senior Analytics Engineer

Posted 26 days ago

💎 Seniority level: Senior

📍 Location: United States

💸 Salary: 130,000 - 150,000 USD per year

🔍 Industry: Data Analytics

🏢 Company: Velir

🗣️ Languages: English

🪄 Skills: Python, SQL, Apache Airflow, Business Intelligence, Cloud Computing, ETL, Git, Snowflake, Data engineering, REST API, Data visualization, Data modeling, Data analytics

Requirements:
  • Expert in SQL
  • Expert with data transformation tools (e.g. dbt)
  • Expert with at least one cloud data warehouse (e.g. Snowflake)
  • Expert with version control and git
  • Proficiency with data modeling approaches and philosophies (e.g. Kimball, OBT)
  • Proficiency with business intelligence platforms (e.g. Sigma, PowerBI)
  • Knowledge of other common programming languages for data manipulation (e.g. Python, R)
  • Knowledge of common data integration patterns (e.g. CDC, ELT, etc.)
  • Knowledge of common data integration / orchestration platforms (e.g. Fivetran, Azure Data Factory, Apache Airflow)
Responsibilities:
  • Designs data models, implements data governance, selects and implements transformation and analytics tools, and creates efficient data querying and processing methods.
  • Configures and optimizes aspects of a cloud data warehouse such as data permissions, compute and storage clusters, and table schemas.
  • Works with tools like business intelligence (BI) platforms and data visualization tools.
  • Focuses on opportunities to reduce complexity within client data stacks and deliver added value for downstream use cases.
  • Provides constructive feedback on peer code or solution design document reviews.
  • Mentors less experienced team members, onboards new engineers, and helps evolve and document the standards for the analytics engineering practice at Velir.
  • Promotes a positive culture within and across different teams, collaborating with data engineers and data analysts on end-to-end client requirements.
  • Collaborates with clients and functional managers to plan for analytics engineering needs for a product or feature launch.
  • Pairs with a teammate or with someone at a client on strategies for solving an analytics engineering problem.
  • Creates processes or reporting templates that help cross-functional teams solve common analytics engineering problems.
  • Regularly engages with other teams to make our organization more effective.
  • Takes initiative to identify and solve important problems.
  • Coordinates with others on cross-cutting technical issues.
  • Drives data solution improvements that impact the client experience or empower internal stakeholders (teams like Operations, Customer Support, Finance, etc.) to do their jobs effectively.
  • Optimizes for the predictability and regular cadence of deliverables.
  • Keeps reliability, maintainability and scalability of our clients’ systems top of mind.
  • Embraces long-term ownership of projects while training others, to reduce the bus factor and avoid becoming a blocker.
  • Prioritizes and values undesirable/unowned work that enables the team to move faster.
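The Velir posting asks for fluency in data modeling philosophies such as Kimball-style dimensional modeling and OBT (One Big Table). As a rough illustration of the contrast, the sketch below pre-joins a fact table to a dimension table to produce a single wide table; the table and column names are invented for the example, not taken from the posting.

```python
# Sketch (illustrative names, plain Python): Kimball-style star schema
# vs. a denormalized One Big Table (OBT).

# Dimension: one row per customer, keyed by surrogate key
dim_customer = {
    1: {"customer_name": "Acme", "region": "East"},
    2: {"customer_name": "Globex", "region": "West"},
}

# Fact: one row per order, referencing the dimension by key
fct_orders = [
    {"order_id": 10, "customer_id": 1, "amount_usd": 100.0},
    {"order_id": 11, "customer_id": 1, "amount_usd": 250.0},
    {"order_id": 12, "customer_id": 2, "amount_usd": 75.0},
]

# OBT: pre-join facts to dimensions so analysts query one wide table
obt_orders = [
    {**order, **dim_customer[order["customer_id"]]} for order in fct_orders
]

print(obt_orders[0])
# e.g. {'order_id': 10, 'customer_id': 1, 'amount_usd': 100.0,
#       'customer_name': 'Acme', 'region': 'East'}
```

The trade-off the posting alludes to: the star schema keeps one authoritative copy of each dimension attribute, while the OBT duplicates them per fact row in exchange for simpler downstream queries.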
Related Jobs

📍 USA

🧭 Full-Time

🔍 Software Development

🏢 Company: PermitFlow (👥 1-10, 💰 $5,500,000 Seed almost 2 years ago; Construction, Apps, Software)

Requirements:
  • 4+ years of experience in analytics engineering, data engineering, or data analytics roles.
  • Strong proficiency in PostgreSQL, SQL, dbt, or similar data warehousing technologies.
  • Experience with ETL pipelines and tools, or other modern data stack tools.
  • Advanced programming skills in Python or other languages for data transformation and analysis.
  • Proven experience designing data models and building scalable data warehouses for analytics purposes.
  • A deep understanding of data governance, data quality practices, and developing self-serve analytics solutions.
  • Strong communication and collaboration skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Experience with business intelligence tools like Looker, Tableau, or Power BI.
  • Familiarity with cloud platforms like Google Cloud.
Responsibilities:
  • Design and implement scalable data models optimized for analytics and company-wide reporting, continuously refining them to meet evolving business needs.
  • Build and maintain efficient data pipelines to transform datasets for analytics.
  • Collaborate with product and engineering teams to integrate data from sources like PermitFlow’s CRM, 3rd party vendors, and other internal sources while optimizing for performance and reliability.
  • Implement data governance to ensure consistency, quality, and security. Define key metrics, track data lineage, and enforce data quality checks.
  • Work closely with stakeholders to deliver analytics solutions, develop dashboards, and provide training on data tools and best practices.
  • Manage and optimize PermitFlow’s data stack to support scalable reporting and insights.

PostgreSQL, Python, SQL, Apache Airflow, Business Intelligence, Cloud Computing, Data Analysis, ETL, GCP, Data engineering, RESTful APIs, JSON, Data visualization, Data modeling, Data analytics

Posted 13 days ago

📍 United States

💸 157,300 - 255,200 USD per year

🔍 SaaS

🏢 Company: Calendly (👥 501-1000, 💰 $350,000,000 Series B about 4 years ago, 🫂 last layoff over 1 year ago; Productivity Tools, Enterprise Software, Collaboration, Meeting Software, Scheduling, Software)

Requirements:
  • 5+ years of experience working with a SaaS company, partnering with the GTM domain, and building data products in a cloud data warehouse (BigQuery, Snowflake, Redshift, Databricks, etc.)
  • Expert SQL skills with strong attention to data accuracy and integrity (R or Python is a plus)
  • Experience owning a dbt project with engineering best practices (e.g., unit testing, data quality checks, reusable building blocks, etc.)
  • Hands-on experience with Segment for web event tracking and customer data integration
  • Experience with version control systems like GitHub, Bitbucket, GitLab, etc. (bonus for CI/CD pipeline management)
  • Proficiency in data modeling to meet business and GTM needs for today, and beyond
  • Ability to diagnose data issues proactively and build trust in data across GTM teams
  • Experience with website experimentation (A/B testing, personalization, CRO analysis)
  • Experience with Salesforce, Braze, and paid ad platforms, ensuring seamless GTM data flow
  • Experience implementing and driving adoption of data quality tools like Monte Carlo or a similar anomaly detection tool
  • Strong growth mindset: you understand how data drives revenue, acquisition, and retention
Responsibilities:
  • Design scalable queries and data syncs to provide customer traits, event data, and insights for GTM teams
  • Build customer journey models to drive lead prioritization, retention strategies, and precision targeting
  • Support web, CX, and content teams in implementing and analyzing experiments, personalization, and conversion optimizations
  • Drive adoption of Monte Carlo for GTM data monitoring and issue resolution, ensuring high trust in insights
  • Ensure timely and reliable data availability across Salesforce, Braze, Segment, and paid media platforms
  • Establish data governance and analytics engineering (AE) best practices, aligning with the core data platform and Analytics teams
  • Collaborate closely with Marketing, Sales, CX, Ops, and Analytics teams to ensure data is leveraged effectively for business decisions
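The Calendly card emphasizes dbt engineering best practices: unit testing, data quality checks, and anomaly detection. As a rough sketch of what such checks do, the example below expresses the not-null and uniqueness checks that dbt ships as built-in generic tests in plain Python; the dataset and column names are made up for illustration.

```python
# Sketch: not-null and uniqueness data quality checks, analogous to
# dbt's built-in `not_null` and `unique` generic tests. The `leads`
# dataset and its column names are illustrative assumptions.

def check_not_null(rows, column):
    """Return the rows where the given column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values that appear more than once in the given column."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

leads = [
    {"lead_id": "a", "email": "x@example.com"},
    {"lead_id": "b", "email": None},             # null email
    {"lead_id": "a", "email": "y@example.com"},  # duplicate key
]

null_failures = check_not_null(leads, "email")
dupe_failures = check_unique(leads, "lead_id")
print(len(null_failures), dupe_failures)
```

In practice a failing check like either of these would fail the dbt run or page the on-call, which is what builds the "trust in data across GTM teams" the posting asks for.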

Python, SQL, Cloud Computing, ETL, Git, Salesforce, Snowflake, API testing, Data engineering, Communication Skills, Analytical Skills, CI/CD, Problem Solving, RESTful APIs, Attention to detail, Cross-functional collaboration, Data visualization, Strategic thinking, Data modeling, Data analytics, Data management, SaaS, A/B testing

Posted 20 days ago
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 United States

🧭 Full-Time

💸 160,000 - 180,000 USD per year

🔍 Insurance, Financial Services

🏢 Company: Vouch Insurance

Requirements:
  • 5+ years experience developing ETL/ELT workflows as a data analytics engineer, data engineer, or data analyst, supporting a variety of business domains
  • Advanced experience in SQL, dbt, Snowflake, and git
  • Intermediate experience building business reporting processes (e.g. for Finance, GTM, Actuarial, etc.)
  • Intermediate experience leveraging data pipeline services like Stitch and orchestration tools like Airflow
  • Basic experience with business intelligence tools such as Mode Analytics
  • Basic experience in Python, Airflow, Stitch
Responsibilities:
  • Architect, build and optimize robust data models in dbt to power business reporting processes and analytics use cases
  • Effectively speak with, and listen to, business stakeholders, engineers, designers, and executives so that you can understand business context and support analysts and business stakeholders across GTM, Insurance, Finance, Product, etc., designing and building single-source-of-truth data assets to drive Vouch’s business
  • Act as a liaison between Engineering and Analytics to define source and target fields for data assets
  • Partner with upstream Engineering teams to develop new and adapt existing data models as our technology and business evolve
  • Clearly document specs and data models with source, description, and field definitions to enhance collaboration, scalability, and usability
  • Set up tests to ensure data quality, monitor daily job execution, and diagnose/fix issues to ensure SLAs are met
  • Perform root cause analysis for data quality issues, and partner with upstream teams and business stakeholders to rectify dirty, incomplete, or unreliable data
  • Maintain and enforce consistent data modeling standards, best practices, and design patterns
  • Provide and receive constructive feedback through code reviews, technical specifications, and pair programming sessions
  • Use software development best practices with a focus on version control, testing, reliability and maintainability
  • Work in dbt, SQL, Snowflake, git, Airflow, Python, Stitch, Mode, and we welcome new ideas!
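The Vouch card pairs data quality tests with monitoring daily job execution so that SLAs are met. A common form of that monitoring is a freshness check: verify the most recent load is within an allowed age. The sketch below shows the idea; the 24-hour threshold and the dates are illustrative assumptions, not Vouch's actual SLAs.

```python
# Sketch: a data freshness check of the kind used to verify a daily
# pipeline met its SLA (here assumed to be "source data no older than
# 24 hours"). Threshold and timestamps are illustrative.
from datetime import datetime, timedelta, timezone


def is_fresh(last_loaded_at, max_age, now=None):
    """True if the most recent load is within the allowed age."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) <= max_age


now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
sla = timedelta(hours=24)

# Loaded 6 hours ago: within SLA
print(is_fresh(datetime(2024, 1, 2, 6, 0, tzinfo=timezone.utc), sla, now))   # True
# Loaded 30 hours ago: an SLA breach worth diagnosing
print(is_fresh(datetime(2024, 1, 1, 6, 0, tzinfo=timezone.utc), sla, now))   # False
```

dbt source freshness and orchestrator-level sensors implement the same idea declaratively; this is just the underlying comparison spelled out.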

Python, SQL, Business Intelligence, ETL, Git, Snowflake, Airflow, Data engineering, Data modeling, Data analytics

Posted about 1 month ago
🔥 Senior Analytics Engineer
Posted about 1 month ago

📍 US

🧭 Full-Time

💸 173,000 - 204,000 USD per year

🔍 Software Development

🏢 Company: GlossGenius (👥 51-100, 💰 $28,000,000 Series C over 1 year ago; SaaS, Mobile Apps, Beauty, FinTech)

Requirements:
  • 4+ years of experience as an analytics engineer, data engineer, or business intelligence engineer and 2+ years developing in dbt, ideally at a SaaS company
  • Advanced proficiency in SQL
  • Previous experience with data modeling, ETL/ELT development principles, and data warehousing concepts
  • Familiarity with the tools in our stack (Segment, Fivetran, Snowflake, dbt, Looker, Airflow) is preferred
  • A naturally inquisitive, critical thinker, who enjoys and is effective at solving problems
  • A demonstrated self-starter with strong communication and project management skills
Responsibilities:
  • Build, Refactor & Optimize Data Models in dbt: Design and maintain data transformations that ensure accurate, scalable, and high-quality datasets
  • Serve as the Architect for Our dbt Project: Own and evolve the project’s architecture, design patterns, and best practices, ensuring consistency in data definitions and streamlined development.
  • Create a Consistent User Experience: Unify metrics and definitions across our data warehouse and BI tools, driving reliable, self-service analytics for cross-functional stakeholders.
  • Mentor & Uplevel the Team: Provide guidance and code reviews to analysts and data engineers, fostering a culture of collaboration, learning, and excellence in dbt and data modeling.
  • Partner with Data Engineering to Manage Our Stack: Collaborate on the design, ingestion, and transformation pipelines—ensuring they’re scalable, efficient, and aligned with business needs.
  • Champion Data Privacy & Quality: Implement and uphold governance processes and compliance measures to maintain the highest standards of data integrity.

Project Management, Python, SQL, Apache Airflow, Business Intelligence, ETL, Snowflake, Data engineering, Communication Skills, Analytical Skills, Data visualization, Data modeling, Data analytics, Data management

Posted about 1 month ago

📍 United States

💸 115,000 - 160,000 USD per year

🔍 Marketplace platform

🏢 Company: Taskrabbit (👥 251-500, 💰 Secondary Market over 9 years ago; Marketplace, E-Commerce, Janitorial Service, Facilities Support Services, Freight Service, Peer to Peer, Sharing Economy)

Requirements:
  • Expertise in Snowflake SQL and dbt Cloud CLI (5 years).
  • Expertise in dimensional modeling and SCD2 use cases (3 years).
  • Expertise in at least one programming language used for data analysis (e.g., Python, R) (3 years).
  • Experience designing and implementing data quality assurance and monitoring processes (2 years).
  • Experience interacting with executive and senior management stakeholders around data topics (2 years).
  • Familiarity with git (2 years).
  • Commitment to data process development with a focus on transparency, quality, efficiency, and resilience.
Responsibilities:
  • Design and develop data ontologies for Dolly’s core business objects represented as the enterprise data model.
  • Develop performant and scalable code for data transformation.
  • Design and develop efficient orchestrations for data transformation dependencies.
  • Implement transformation pipelines for raw data focusing on quality and periodicity.
  • Own data definitions for key business metrics.
  • Collaborate with stakeholders to ensure data definitions align with business needs.
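The Taskrabbit card asks for expertise in SCD2 (Type 2 slowly changing dimension) use cases: when a dimension attribute changes, the current row is closed out and a new versioned row is opened, so history is preserved rather than overwritten. The sketch below shows that update in miniature; the field names and the tasker example are illustrative assumptions.

```python
# Sketch: a minimal Type 2 slowly changing dimension (SCD2) update.
# A changed attribute closes the current row (sets valid_to) and
# appends a new current row, instead of mutating history in place.

HIGH_DATE = "9999-12-31"  # conventional open-ended valid_to


def scd2_apply(dim_rows, key, new_attrs, effective_date):
    """Close the current row for `key` (if changed) and append a new version."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["valid_to"] == HIGH_DATE),
        None,
    )
    if current and current["attrs"] == new_attrs:
        return dim_rows  # no change: keep history as-is
    if current:
        current["valid_to"] = effective_date  # close out the old version
    dim_rows.append(
        {"key": key, "attrs": new_attrs,
         "valid_from": effective_date, "valid_to": HIGH_DATE}
    )
    return dim_rows


dim = [{"key": "tasker-1", "attrs": {"city": "Boston"},
        "valid_from": "2023-01-01", "valid_to": HIGH_DATE}]

# The tasker moves: the old row is closed, a new current row is added
dim = scd2_apply(dim, "tasker-1", {"city": "Seattle"}, "2024-06-01")
print(len(dim), dim[0]["valid_to"], dim[1]["valid_from"])
```

In a warehouse this is typically a dbt snapshot or a MERGE statement; the point of the sketch is the valid_from/valid_to bookkeeping that makes point-in-time joins possible.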

Python

Posted 2 months ago

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: Apollo.io (👥 501-1000, 💰 $100,000,000 Series D over 1 year ago; Software Development)

Requirements:
  • Experience with relevant tools in data engineering and analytics.
  • Sound understanding of the SaaS industry, especially regarding LTV:CAC, Activation levers, and Conversion Rates.
  • Ability to approach new projects objectively without bias.
  • Great time management skills.
  • Excellent written and oral communication skills for summarizing insights and designing data assets.
Responsibilities:
  • Design, develop, and maintain essential data models for various business functions to ensure consistency and accuracy.
  • Define and implement data engineering standards and best practices, providing guidance to other teams.
  • Collaborate with multiple business areas to understand their requirements and deliver scalable data solutions.
  • Influence the future direction of analytics infrastructure, offering strategic insights to drive business impact.

Business Intelligence, Snowflake, Strategy, Data engineering, Communication Skills, Collaboration, Cross-functional collaboration, Data modeling

Posted 5 months ago