
Analytics Engineer

Posted 29 days ago


💎 Seniority level: Junior, 2-4 years

📍 Location: USA

💸 Salary: 135,000 USD per year

🏢 Company: Extend 👥 51-100 💰 $40,000,000 Series B almost 4 years ago · Mobile Payments · Credit Cards · Payments · FinTech · Software

🗣️ Languages: English

⏳ Experience: 2-4 years

🪄 Skills: SQL, ETL, Snowflake, API testing, Data engineering, REST API, Accounting, JSON, Data visualization, Data modeling, Data analytics

Requirements:
  • 2-4 years of experience as an Analytics Engineer, Data Engineer, or in a similar role focused on operational data systems.
  • Strong SQL skills and hands-on experience with Snowflake and dbt.
  • Proficiency with API interaction and tools like Thunder Client or Postman.
  • Familiarity with reconciliation, financial audit preparation, and cross-system data integrity best practices.
  • Excellent communication skills and a collaborative approach to working across technical and non-technical teams.
  • Strong problem-solving mindset and the ability to proactively identify and address data inconsistencies.
Responsibilities:
  • Own and improve the data workflows related to Claim Reimbursement and Contract Sales, ensuring accurate financial tracking and reconciliation with partners.
  • Build and maintain data pipelines using Snowflake and dbt to support reliable and auditable operations.
  • Collaborate closely with Accounting, Engineering, Operations, and Reporting to align and reconcile internal and external financial data.
  • Ensure data integrity across systems and assist with internal and external audits through well-documented, traceable data practices.
  • Enhance and support integrations with Service Providers and Merchants to improve timeliness and accuracy of claim expense reporting and payments.
  • Utilize tools like Thunder Client and other API clients to interface with production systems when necessary.
  • Identify, investigate, and fix data quality issues, recommending process improvements to reduce errors and manual interventions.
  • Help extend and evolve our data quality standards and operational analytics practices.
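The reconciliation work this role describes can be pictured with a minimal sketch: comparing internal claim records against a partner's report to surface missing entries and amount mismatches. This is an illustrative assumption of the workflow, not Extend's actual system; the `claim_id`/amount mapping and the sample values are hypothetical.

```python
from decimal import Decimal

def reconcile(internal, partner):
    """Compare internal claim records against a partner report.

    Both inputs map claim_id -> amount. Returns claim IDs missing
    from the partner side, IDs missing internally, and IDs whose
    amounts disagree.
    """
    missing_from_partner = sorted(internal.keys() - partner.keys())
    missing_from_internal = sorted(partner.keys() - internal.keys())
    amount_mismatches = sorted(
        cid for cid in internal.keys() & partner.keys()
        if internal[cid] != partner[cid]
    )
    return missing_from_partner, missing_from_internal, amount_mismatches

# Example: one claim absent from the partner report, one amount mismatch.
internal = {"C1": Decimal("10.00"), "C2": Decimal("25.50"), "C3": Decimal("7.00")}
partner = {"C1": Decimal("10.00"), "C2": Decimal("20.00")}
print(reconcile(internal, partner))  # (['C3'], [], ['C2'])
```

In practice this comparison would live in SQL or a dbt model over Snowflake tables rather than in-memory dictionaries, but the set logic is the same.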
Apply

Related Jobs

Apply

📍 United States

🧭 Full-Time

🔍 Industrial Process Manufacturing

🏢 Company: Seeq 👥 101-250 💰 $50,000,000 Series D 10 months ago · Industrial · Internet of Things · Analytics · Commercial · Software

  • Bachelor's degree (or equivalent experience) in a technical field.
  • Minimum of 7 years of experience in process engineering, reliability engineering, technical pre-sales engineering (SaaS business, OT space preferred), or other related industry fields.
  • Deep technical expertise in one or more industrial process manufacturing verticals and a sound understanding of broad market drivers and business priorities.
  • Empathy for, and experience with, the difficulty of solving real-world business and industrial challenges involving dense time-series process data using inadequate or overly complex analytics tools such as Excel, MATLAB, Python, or BI tools.
  • Responsive and adept at building trust and rapport with customers, prospective customers, partners, and internal multidisciplinary teams.
  • Effective and influential technical communicator.
  • Experience using Seeq is highly beneficial.
  • Creative ability to develop and propose technical solutions leveraging the latest features and capabilities in the software and broader technology landscape.
  • Programming experience in Python and/or other programming languages is a plus.
  • Ability to independently prioritize work based on account team strategy, customer business needs, and Seeq company objectives.
  • Broad knowledge of the SaaS industrial analytics ecosystem is a plus (e.g., Azure, AWS, Databricks, AVEVA, Aspen Tech, IOTA, Cognite, C3.ai, Palantir, etc.).
  • Serve as the customer’s trusted technical advisor by building relationships with champions and end-users through solution design, implementation guidance, and technical mentorship.
  • Leverage customer relationships to perform constant discovery and uncover high-value, high-impact use case opportunities directly linked to the customer’s critical business workflows and objectives.
  • Collaborate with the customer and account team to develop and execute strategic plans for adoption of Seeq and completion of key use cases addressing business objectives.
  • Advise customer champions and end-users to implement and operationalize the use cases to deliver a high return on investment for the customer.
  • Upskill Seeq’s partner ecosystem to be an extension of the Analytics Engineering team to provide support globally and in localized languages.
  • Be a product expert, maintaining knowledge of how to implement and troubleshoot solutions in the platform, including integrations with other systems.
  • Identify gaps in Seeq's capabilities required to meet customer needs and give feedback to the product organization to influence the product roadmap.
  • Work with the product team to design, implement, and test new functionalities that meet customer needs.
  • Build trust with customers through the application of industry-specific knowledge and awareness, including market and use case trends specific to their area.
  • Collaborate with internal teams (including industry principals, product marketing, and solution consulting) to influence industry-specific strategy.

Python, SQL, Data Analysis, Data engineering, Communication Skills, Analytical Skills, Problem Solving, Customer service, RESTful APIs, Mentoring, Linux, DevOps, Empathy, Active listening, JSON, Data visualization, Process improvement, Technical support, Data modeling, Data analytics, Customer Success, SaaS

Posted 1 day ago
Apply

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,000 - 190,000 USD per year

🔍 Software Development

🏢 Company: Lob 👥 101-250 💰 $50,000,000 Series C about 4 years ago · Developer APIs · Shipping · SaaS · Marketing · Health Care · Software · Courier Service

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience: at least one big data warehouse system such as Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating live production systems using dbt and Python.
  • 3+ years of BI Software experience: at least one analytics platform such as Looker, Power BI, or Tableau.
  • Empathy and effective communication skills: You can explain complex analytical issues to both technical and non-technical audiences.
  • Strong interpretive skills: You can deconstruct complex source data to compose curated models that can be explored by stakeholders.
  • Product mindset: You build data systems that will be used to generate insights for years to come, not just one-off analyses.
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams as they identify ways to improve the quality of the metrics they produce and analyze.
  • Champion data governance, security, privacy, and retention policies to protect end users, customers, and Lob.
  • Support and mentor fellow engineers and data team members through coffee chats, code review, and pair programming.

Python, SQL, ETL, Snowflake, Communication Skills, Analytical Skills, Data visualization, Data modeling

Posted 6 days ago
Apply
🔥 Analytics Engineer
Posted 7 days ago

📍 United States

🧭 Full-Time

💸 100,000 - 125,000 USD per year

🏢 Company: FSAStore.com

  • 2+ years of experience in analytics, data science, data engineering, business intelligence, or related function
  • Advanced knowledge of modern BI and data visualization tools, with a strong preference for end-to-end Microsoft Power BI development
  • Knowledge of data modeling and data engineering in various cloud environments (both data warehouse and data lakes) such as BigQuery, Microsoft Fabric, Snowflake, etc.
  • Intermediate to advanced knowledge of SQL (excluding TCL/DCL)
  • Python proficiency highly preferred
  • Comfortable with a range of statistical techniques, such as regression analysis, probability, survival modeling, etc.
  • Ability to communicate effectively with a wide range of audiences
  • Ability to quickly ramp up on new tools/software
  • Passion for data and its fundamental ability to create value
  • Extremely curious and excited to dive into complex problems
  • Build data pipelines and data models through ETL/ELT processes in collaboration with the Data Engineering team to better serve business insights
  • Define, develop and implement dashboards and reporting to ensure ongoing progress and status updates.
  • Create informative and well-organized presentations that contribute to overall business needs and goals, explaining complex situations clearly and simply. Be a translator of complex data into concise, actionable business insights
  • Ensure data robustness, accuracy and governance across all pipelines and views
  • Identify opportunities for department-wide improvement and communicate with management to recommend and effectuate change.
  • Support new strategy, processes design, and communications in collaboration with cross-functional teams
  • Fulfill ad-hoc requests for data and analysis

Python, SQL, Apache Airflow, ETL, Microsoft Power BI, Snowflake, Data engineering, Regression testing, Data visualization, Data modeling, Data analytics

Posted 7 days ago
Apply
🔥 Analytics Engineer
Posted 12 days ago

📍 United States, Canada

🧭 Full-Time

💸 136,000 - 170,000 USD per year

🔍 Software Development

🏢 Company: Tailscale 👥 51-100 💰 $100,480,659 Series B about 3 years ago · Infrastructure · Information Technology · Cyber Security · Network Security

  • Experience designing and maintaining production-grade data models and pipelines.
  • Deep understanding of modern data modeling principles and best practices.
  • Strong SQL skills and confidence working with complex datasets and schemas.
  • High attention to detail, with high standards for data accuracy and validation.
  • Flexibility and resourcefulness in navigating manual tasks or imperfect systems.
  • Passion for clean code, scalable architecture, and incrementally reducing complexity over time.
  • Ability to manage time, prioritize effectively, and thrive in a fast-moving environment.
  • Clear communication skills with both technical and non-technical collaborators.
  • Willingness to flexibly support other domains (e.g., Product, Marketing, and Engineering) as the team grows.
  • Dive into complex systems and manual workflows to develop an understanding of our tools and constraints.
  • Thoughtfully design and maintain clean, scalable, production-grade dbt models used for billing pipelines and financial reporting (e.g., ARR, revenue recognition).
  • Work across the full data pipeline, from ingestion to transformation to visualization.
  • Take a pragmatic approach to system challenges, balancing quick fixes with thoughtful long-term design.
  • Collaborate with Finance, Engineering, Sales, and other teams across the company to understand data needs and deliver timely, practical solutions.

PostgreSQL, SQL, Cloud Computing, Data Analysis, ETL, Snowflake, RESTful APIs, Data visualization, Financial analysis, Data modeling, Finance

Posted 12 days ago
Apply
🔥 Analytics Engineer
Posted 16 days ago

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: Funnel Leasing

  • Bachelor’s degree or equivalent related experience; degree in Computer Science preferred.
  • 3+ years of professional experience in analytics and product development
  • Great attention to detail and technical fluency, balanced with the ability to effectively translate insights to both technical and non-technical decision-makers
  • Experience developing client-facing analytics dashboards or products
  • Expert in SQL, with experience in data transformation and data modeling
  • Have a background as a strong individual contributor
  • Expertise with Tableau, Mode, Looker or similar BI tool
  • Experience with dbt (see getdbt.com)
  • Product Management and Data Analytics experience
  • Experience reading code in at least two languages (Python, JavaScript, HTML, etc.)
  • Manage our data stack: Fivetran for our data pipeline, Snowflake for our data warehouse, dbt for data modeling and Looker for reporting
  • Transform data to build the analytics layer of Funnel’s environment to make data standardized and accessible
  • Streamline data collection and transformation of new data sources and iterate on improvement and standardization of existing data workflows
  • Transform, test, deploy, and document data for internal and external stakeholders.
  • Collaborate with and support Product and Customer Experience teams by providing data literacy and performance insights crucial for client onboarding, adoption and product reviews
  • Develop and deliver analytics to provide Funnel’s real estate owners and managers with real insights into their business performance
  • Provide clean data sets to end users, modeling data in a way that empowers end users to answer their own questions
  • Create a culture of continuous improvement and learning
  • Foster collaboration within team and across the company
  • Perform other duties as assigned and modified at manager's discretion

SQL, Snowflake, Tableau, Data engineering, Data visualization, Data modeling, Data analytics

Posted 16 days ago
Apply

📍 USA

🧭 Full-Time

🔍 Healthcare

🏢 Company: Atropos Health

  • 5+ years of experience in data infrastructure, data analysis, data visualization, and dashboarding tools
  • Expertise in correct data modeling principles for analytics (dimensional; bonus points for data vault knowledge), integrating raw data from disparate sources, ETL, Python, and building/maintaining data pipelines
  • Strong SQL skills
  • Experience building data warehouses and optimizing how data is modeled for downstream consumers
  • Familiarity with general cloud architecture
  • Quality mindset. Must approach data preparation development with data validation, ensuring accuracy and reliability in all data products
  • Must be curious, humble, a team player, operate with a bias to action, and have the ability to make decisions when confronted with uncertainty
  • Have worked in a fast-paced high-growth startup environment before and is comfortable switching work streams as needed
  • Design and develop robust and efficient analytics data products, such as data models, schemas, ETL jobs, data sets, BI visualizations, etc.
  • Collaborate in partnership with engineering, product, and other business teams to assess data and analytics needs
  • As part of development, ensure high level of data quality within data products through data validation and disciplined design practices
  • Contribute to a reliable and usable platform to deliver insights through raw data delivery, SQL access, API endpoints, and BI tooling.
  • Provide refined data to business users to enable KPI tracking and foster a culture of data-driven decision-making
  • Prioritize inbound requests from multiple teams and establish the business context for efficient problem-solving
  • Contribute to data asset documentation, data modeling, data validation and profiling, and tooling decisions

Python, SQL, Cloud Computing, Data Analysis, ETL, Amazon Web Services, Data engineering, REST API, Analytical Skills, Data visualization, Data modeling, Data analytics

Posted 21 days ago
Apply

📍 United States

🧭 Full-Time

💸 160,000 - 230,000 USD per year

🔍 Daily Fantasy Sports

🏢 Company: PrizePicks 👥 101-250 💰 Corporate about 2 years ago · Gaming · Fantasy Sports · Sports

  • 7+ years of experience in a Data Engineering or data-oriented software engineering role creating and pushing end-to-end data engineering pipelines.
  • Graduate degree in a quantitative field (Computer Science, Mathematics, Statistics, Business Analytics, Engineering) or equivalent experience.
  • 3+ years of experience acting as technical lead and providing mentorship and feedback to junior engineers.
  • Experience building and optimizing data pipelines for analytics, with a focus on data transformation and modeling.
  • Experience in integrating data from various sources to support analytical needs, including familiarity with data warehousing principles and ELT processes.
  • Strong experience in dimensional modeling, data warehousing, and data mart design.
  • Excellent ability to translate business questions into data requirements and analytical solutions.
  • Partner with Data Engineering to build and optimize robust data pipelines that ensure data accessibility and reliability.
  • Collaborate with Business Intelligence to implement crucial business logic that powers BI dashboards, directly impacting key business decisions and providing broad organizational exposure.
  • Design and implement complex data transformation logic primarily in SQL with a focus on creating reusable data models that support various analytical needs.
  • Build and maintain dbt models to ensure data accuracy, consistency, and reliability.
  • Collaborate with data engineers, data analysts, and business stakeholders to understand data requirements, define key metrics, and deliver actionable insights through data models and reporting solutions.
  • Define and implement data quality checks and validation processes to ensure the accuracy and reliability of data used for analysis.
  • Develop comprehensive documentation of data models, data dictionaries, and transformation logic to facilitate data understanding.
  • Develop and manage CI/CD pipelines to automate and streamline the deployment of data solutions.
  • Ensure that data workflows are thoroughly tested, integrated, and deployed efficiently, following best practices for version control, automation, and quality assurance.
  • Experience in defining and implementing data governance principles, such as data lineage and data cataloging, to improve data discoverability, usability, and trust for analytical purposes.
  • Serve as an Analytics Staff Engineer within the broader PrizePicks technology organization by staying current with emerging analytical techniques, data modeling best practices, and analytics engineering trends.
  • Mentor junior team members and promote a data-driven culture.
  • On-call rotation support, the on-call is shared across the Analytics and Data Engineering teams.
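The "data quality checks and validation processes" named above are typically implemented as dbt schema tests such as `not_null` and `unique`. As a hedged sketch of what those tests verify, here is the same logic over plain in-memory rows; the `user_id` and `email` columns and the sample data are hypothetical, not from any actual PrizePicks model.

```python
from collections import Counter

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return non-null values of `column` that appear more than once."""
    counts = Counter(row.get(column) for row in rows)
    return sorted(v for v, n in counts.items() if n > 1 and v is not None)

rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},              # fails not_null on email
    {"user_id": 1, "email": "c@example.com"},   # fails unique on user_id
]
print(check_not_null(rows, "email"))   # [1]
print(check_unique(rows, "user_id"))   # [1]
```

In dbt itself, the equivalent would be declaring `tests: [not_null, unique]` on a column in a model's YAML file and letting `dbt test` run the generated SQL against the warehouse.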

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering, Communication Skills, Analytical Skills, Collaboration, CI/CD, Mentoring, Data visualization, Data modeling, Data analytics

Posted 22 days ago
Apply

📍 Worldwide

🧭 Contract

  • Proven experience as an Analytics Engineer, Data Engineer, or in a similar role.
  • Strong SQL skills and have experience working with modern data warehouses like Snowflake, BigQuery, or Redshift.
  • Hands-on experience building data pipelines using tools such as dbt, Airflow, or Census.
  • Strong data modelling skills.
  • Excel in collaboration, communicating clearly with both technical and non-technical stakeholders.
  • Consider the end-user experience when building dashboards and data products, ensuring insights are clear, actionable, and easy to interpret.
  • Are detail-oriented, focusing on delivering reliable and trustworthy data outputs.
  • Build robust, scalable pipelines to onboard new cost data providers into the Cloud Costs platform.
  • Design and implement methodologies to accurately model, allocate, and expose cloud and third-party spend to internal teams.
  • Partner with FinOps engineers and stakeholders to deliver high-impact cost visibility features that unlock real savings opportunities.
  • Develop reusable analytics components and dbt models to streamline cost data integration and reallocation.
  • Rapidly iterate on solutions to meet short-term project milestones while setting the stage for long-term scalability.
  • Collaborate closely with cross-functional teams to ensure alignment, reduce duplication, and drive shared success.

SQL, Cloud Computing, Snowflake, Airflow, Data engineering, Data visualization, Data modeling, Data analytics

Posted 22 days ago
Apply

📍 United States, Canada

🧭 Full-Time

💸 154,000 - 281,000 USD per year

🔍 Software Development

🏢 Company: Webflow 👥 501-1000 💰 $120,000,000 Series C about 3 years ago 🫂 Last layoff 10 months ago · CMS · Web Hosting · Web Design

  • 4–6+ years of hands-on experience in analytics engineering or related data roles
  • Strong experience managing/leading the following disciplines: SQL/Python, Data modeling, ETL/ELT, Cloud Warehouses
  • Are tech-savvy and systems-oriented
  • Have a track record of building and scaling data foundations
  • Have strong communication and project management skills
  • Thrive collaborating with other data team members and non-technical stakeholders in a fast-paced and sometimes ambiguous environment, solving complex and nuanced problems
  • Design and build scalable, reliable, and well-documented data models and pipelines that form the foundation of the analytics stack and support cross-functional business needs.
  • Act as a strategic partner to cross-functional teams by deeply understanding business domains, fostering collaboration, and driving a culture of curiosity and continuous improvement through data.
  • Lead the evolution of our data warehouse architecture, analytics engineering workflows, and BI layer—contributing to the development of intuitive dashboards and self-serve tools that empower data-informed decision-making.
  • Collaborate closely with data analysts, data scientists, and data engineers to define and maintain trusted metrics, ensure data quality, and improve usability and accessibility of core data assets.
  • Drive best practices in metric development, testing frameworks, lineage tracking, and documentation standards using tools such as dbt, Snowflake, and Tableau.
  • Serve as a subject matter expert and thought partner in strategic, cross-functional initiatives—applying systems thinking and deep technical expertise to improve data flows and inform high-impact decisions.

Python, SQL, Cloud Computing, ETL, Snowflake, Tableau, Data engineering, Data modeling, Data analytics

Posted 23 days ago
Apply

📍 California, Colorado, Florida, Illinois, Kansas, Maryland, Massachusetts, Missouri, New Jersey, New York, Texas, Washington

💸 117,000 - 201,000 USD per year

🔍 Gaming

🏢 Company: Mythical Games 👥 251-500 💰 Series C over 1 year ago 🫂 Last layoff over 2 years ago · Video Games · Blockchain · Gaming · Apps

  • 3+ years of professional experience in analytics and data pipelines/warehouses
  • 3+ years of experience with SQL and a programming language (e.g., Python, Java)
  • Experience with gaming analytics and player behavior metrics
  • Hands-on experience with Google Cloud services (BigQuery, Cloud Functions, Dataflow)
  • Experience with Data Analysis Tools such as Looker or Tableau
  • Collaborate with stakeholders to gather requirements and design intuitive, streamlined analytical experiences
  • Develop sophisticated data models in BigQuery using Looker's LookML
  • Design robust data pipelines to ensure required data is available in a data warehouse with a reliable cadence
  • Create reports analyzing Mythical’s game/marketplace data and present them to appropriate stakeholders
  • Optimize analytics performance and improve data structures (e.g., query optimization, data rollups)

Python, SQL, Data Analysis, ETL, GCP, Tableau, Data engineering, Communication Skills, Analytical Skills, RESTful APIs, Data visualization, Data modeling

Posted 24 days ago
Apply