
Analytics Engineer

Posted 29 days ago


💎 Seniority level: Junior, 2+ years

💸 Salary: 100,000 - 180,000 USD per year

🔍 Industry: Software Development

🏢 Company: Fingerprint 👥 101-250 💰 $33,000,000 Series C over 1 year ago (Fraud Detection, Cyber Security, Software)

🗣️ Languages: English

⏳ Experience: 2+ years

Requirements:
  • BS/MS in Computer Science or a related field, or equivalent work experience.
  • 2+ years of experience in Analytics Engineering, Data Engineering, Data Analytics, or Data Science.
  • Excellent SQL skills.
  • Practical experience with analytical storage solutions such as ClickHouse, Snowflake, BigQuery, Redshift, Databricks, etc.
  • Familiarity with data transformation frameworks and approaches (e.g., dbt, materialized views, data pipeline workflow tools).
  • General engineering skills: git, IDE, shell.
  • Experience with data visualization tools like Apache Superset, Tableau, Metabase, Looker, etc.
  • Strong foundation in statistics for designing metrics and experiments.
  • Ability to conduct Exploratory Data Analysis (EDA) for investigating ad-hoc questions and identifying anomalies.
Responsibilities:
  • Develop tools for deep data analytics on the Identification API Product.
  • Provide insights about algorithm accuracy, design experiments, detect anomalies, and analyze trends in time-series data.
  • Foster an engineering-focused, data-driven culture within the Fingerprint team by sharing tools and knowledge on effective data analytics approaches.
  • Apply software engineering best practices to deliver clean, transformed data ready for analysis.
  • Strengthen the team behind the industry-leading device identification API through applied data analytics skills.
  • This role includes participation in a shared on-call rotation.
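For illustration, the time-series anomaly detection mentioned in the responsibilities above can be sketched as a rolling z-score check. This is a simplified, stdlib-only example; the window size and threshold are arbitrary choices, not anything specified by the posting:

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value deviates more than `threshold`
    standard deviations from the trailing window's mean."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = mean(hist)
        sigma = stdev(hist)
        # Skip flat windows (sigma == 0) to avoid division by zero.
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A sudden spike stands out against the recent history:
readings = [10, 11, 10, 12, 10, 11, 10, 12, 10, 11, 100]
print(rolling_zscore_anomalies(readings, window=5))  # → [10]
```

Production systems would typically layer seasonality handling and alert deduplication on top of a check like this.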

Related Jobs


📍 UK, Europe

🔍 SaaS or tech

Requirements:
  • 5+ years of experience in analytics engineering, product analytics, or strategic data roles in SaaS or tech companies.
  • Proven ability to translate business needs into data models, dashboards, and actionable recommendations.
  • Expert SQL skills and experience modeling data for BI tools.
  • Strong stakeholder management skills.
  • Highly organized with the ability to prioritize multiple initiatives.
  • Background in Revenue Operations, Product Analytics, or Growth Analytics.
  • Experience integrating and analyzing data from HubSpot, Mixpanel, and customer engagement platforms.
  • Hands-on experience with AI-driven analytics and NLP-based insight extraction.
  • Familiarity with Python for data automation or lightweight modeling tasks.
Responsibilities:
  • Collaborate with teams to define key business questions and data needs.
  • Structure and model datasets for analysis.
  • Lead insight generation, turning data into recommendations.
  • Leverage AI/NLP techniques to extract insights.
  • Identify patterns in customer behavior and product usage.
  • Develop dashboards and reports.
  • Analyze sales and pipeline data.
  • Partner with RevOps and Sales leadership.
  • Support A/B testing and user behavior studies.
  • Build models to assess initiative effectiveness.
  • Develop self-service data tools and training.
  • Lead workshops and promote data literacy.
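As a minimal illustration of the A/B testing support listed above, a two-proportion z-test compares conversion rates between two variants. This is a pure-Python sketch with a pooled standard error; a real analysis would also report a p-value and confidence interval:

```python
from math import sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Z statistic for comparing the conversion rates of two variants.

    conv_a/conv_b: conversion counts; n_a/n_b: sample sizes.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 10% vs 15% conversion on 1,000 users each:
print(round(two_proportion_ztest(100, 1000, 150, 1000), 2))
```

A |z| above roughly 1.96 corresponds to significance at the conventional 5% level for a two-sided test.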

SQL, Business Intelligence, Data Analysis, Salesforce, Tableau, Product Analytics, Data Engineering, Analytical Skills, Data Visualization, Stakeholder Management, Data Modeling, Data Analytics, Data Management, Customer Success, SaaS, A/B Testing

Posted 1 day ago

🧭 Full-Time

Requirements:
  • Bachelor’s Degree in Mechanical, Thermal, Chemical, Energy Systems, or Computer Science
  • Analytical mindset, with the ability to handle multiple tasks simultaneously and work in a fast-paced environment
  • Strong technical background in data engineering, systems integration, or industrial IoT
  • Proficiency in Python, SQL, and working with cloud-native data architectures (AWS, GCP, Azure, etc.)
  • Ability to visualize complex datasets clearly and effectively for business and technical users
  • Hands-on mindset with the ability to troubleshoot across hardware, software, and network layers
  • Experience with edge computing platforms, gateways, or PLC integration (a plus)
  • Prior experience with data analytics for energy or refrigeration systems is highly desirable
  • Familiarity with thermodynamic principles, refrigeration, or energy systems is highly desirable
  • Understanding of refrigeration, HVAC, or thermodynamic systems (a plus)
  • Experience collaborating cross-functionally with engineering, operations, and vendors
  • Self-driven and adaptable to fast-changing needs in field operations
  • Ability to work as an individual contributor
  • Team player and effective communicator
  • Fluent in English
Responsibilities:
  • Assist in creating and scaling a unified data strategy for field installations
  • Develop and maintain pipelines to ingest, clean, tag, and store data from mixed field controllers and edge devices
  • Interface with protocols such as Modbus, M2M2 Loytec Gateway, etc., and integrate with platforms like AWS, GCP, or vendor clouds (e.g., Danfoss Carel boss)
  • Standardize and normalize fragmented datasets into a centralized cloud repository
  • Build business-facing and engineering dashboards using tools like Google Data Studio, Grafana, or Power BI
  • Write and maintain Python or SQL scripts for data transformation, anomaly detection, and automated reporting
  • Work with refrigeration and controls engineers to support field deployments, thermodynamic analysis, and commissioning of edge/IoT devices
  • Oversee the day-to-day health, analysis, and integrity of field data streams and dashboards
  • Collaborate with external vendors or contractors to deploy pilot implementations across selected sites
  • Develop algorithms for data collection, cleanup, and organization
  • Develop visualization dashboards
  • Conduct energy-saving calculations
  • All other projects and duties as assigned
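Responsibilities like standardizing fragmented controller data usually start with a small normalization layer. The sketch below maps vendor-specific readings onto a common schema; the field names (`site`, `tag`, `temp_f`, etc.) are hypothetical examples, not taken from any real controller protocol:

```python
def normalize_reading(raw):
    """Map a vendor-specific field-controller reading onto a common
    schema. All input field names are illustrative assumptions."""
    return {
        # Different vendors label the site differently; fall back gracefully.
        "site_id": str(raw.get("site") or raw.get("site_id", "unknown")),
        "sensor": raw.get("tag") or raw.get("sensor", "unknown"),
        # Convert Fahrenheit to Celsius when the vendor reports °F.
        "temp_c": (raw["temp_f"] - 32) * 5 / 9 if "temp_f" in raw
                  else raw.get("temp_c"),
        "ts": raw.get("timestamp") or raw.get("ts"),
    }

reading = {"site": "A1", "tag": "evap1", "temp_f": 32.0,
           "timestamp": "2024-01-01T00:00:00Z"}
print(normalize_reading(reading))
```

Centralizing conversions like this keeps downstream dashboards and anomaly checks vendor-agnostic.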
Posted 1 day ago

📍 Brazil

🔍 Lending

🏢 Company: RecargaPay 👥 501-1000 💰 $10,000,000 Debt Financing almost 3 years ago (Mobile Payments, Financial Services, FinTech)

Requirements:
  • Deep experience with SQL for querying large datasets, optimizing performance, and creating reusable data logic.
  • Proven experience building and optimizing interactive dashboards in QlikSense or a similar tool; knowledge of app design best practices.
  • Familiarity with Redshift, Databricks, and data warehousing concepts.
  • Solid understanding of credit products, collections strategies, portfolio risk, and financial KPIs.
  • Strong ability to communicate analytical results clearly and collaborate effectively with technical and non-technical teams.
  • Bachelor’s degree in Business, Engineering, Technology, Finance, or related fields.
Responsibilities:
  • Produce forecasts and estimations to evaluate product performance and support short- and long-term strategic decisions.
  • Partner with product managers to ensure all new features and changes are built with measurable KPIs, allowing for seamless performance tracking from day one.
  • Build and enhance QlikSense dashboards to monitor portfolio, collections, and product performance. Add relevant variables and data streams to ensure comprehensive insights.
  • Continuously optimize SQL queries and improve the performance of QlikSense apps to ensure scalability, efficiency, and fast execution of reports.
  • Conduct in-depth analysis on user behavior, repayment trends, and risk dynamics, helping the product team make data-backed decisions.
  • Handle analytical and operational requests from the Finance, Risk, and Data Science teams, delivering high-quality data support and insights.
  • Maintain clear documentation of metrics, queries, and dashboards to promote governance and reproducibility.
  • Analyze and monitor credit portfolio performance, collections effectiveness, and other key financial indicators critical to Lending.
  • Support the ongoing migration from Redshift to Databricks, ensuring no data loss and continuity of insights.

SQL, Data Analysis, ETL, QlikSense, Analytical Skills, Financial Analysis, Data Modeling

Posted 2 days ago

📍 United States

🧭 Full-Time

🔍 Industrial Process Manufacturing

🏢 Company: Seeq 👥 101-250 💰 $50,000,000 Series D 10 months ago (Industrial, Internet of Things, Analytics, Commercial, Software)

Requirements:
  • Bachelor's degree (or equivalent experience) in a technical field.
  • Minimum of 7 years of experience in process engineering, reliability engineering, technical pre-sales engineering (SaaS business, OT space preferred), or other related industry fields.
  • Deep technical expertise in one or more industrial process manufacturing verticals and a sound understanding of broad market drivers and business priorities.
  • Empathy with, and firsthand experience of, the difficulty of solving real-world business and industrial challenges involving dense time-series process data using inadequate or overly complex analytics tools such as Excel, MATLAB, Python, or BI tools.
  • Responsive and adept at building trust and rapport with customers, prospective customers, partners, and internal multidisciplinary teams.
  • Effective and influential technical communicator.
  • Experience using Seeq is highly beneficial.
  • Creative ability to develop and propose technical solutions leveraging the latest features and capabilities in the software and broader technology landscape.
  • Programming experience in Python and/or other programming languages is a plus.
  • Ability to independently prioritize work based on account team strategy, customer business needs, and Seeq company objectives.
  • Broad knowledge of the SaaS industrial analytics ecosystem is a plus (e.g., Azure, AWS, Databricks, AVEVA, Aspen Tech, IOTA, Cognite, C3.ai, Palantir, etc.).
Responsibilities:
  • Serve as the customer’s trusted technical advisor by building relationships with champions and end-users through solution design, implementation guidance, and technical mentorship.
  • Leverage customer relationships to perform constant discovery and uncover high-value, high-impact use case opportunities directly linked to the customer’s critical business workflows and objectives.
  • Collaborate with the customer and account team to develop and execute strategic plans for adoption of Seeq and completion of key use cases addressing business objectives.
  • Advise customer champions and end-users to implement and operationalize the use cases to deliver a high return on investment for the customer.
  • Upskill Seeq’s partner ecosystem to be an extension of the Analytics Engineering team to provide support globally and in localized languages.
  • Be a product expert, maintaining knowledge of how to implement and troubleshoot solutions in the platform, including integrations with other systems.
  • Identify gaps in Seeq's capabilities required to meet customer needs and give feedback to the product organization to influence the product roadmap.
  • Work with the product team to design, implement, and test new functionalities that meet customer needs.
  • Build trust with customers through the application of industry-specific knowledge and awareness, including market and use case trends specific to their area.
  • Collaborate with internal teams (including industry principals, product marketing, and solution consulting) to influence industry-specific strategy.

Python, SQL, Data Analysis, Data Engineering, Communication Skills, Analytical Skills, Problem Solving, Customer Service, RESTful APIs, Mentoring, Linux, DevOps, Empathy, Active Listening, JSON, Data Visualization, Process Improvement, Technical Support, Data Modeling, Data Analytics, Customer Success, SaaS

Posted 4 days ago

📍 Europe

🧭 Full-Time

🔍 Software Development

Requirements:
  • You have at least one significant Analytics Engineering experience with dbt
  • SQL and data modelling in dbt hold no secrets for you
  • Experience with Airflow (with good knowledge of Python) is a plus
  • You know how to work with technical and non-technical stakeholders to collect needs, challenge them, and make clear and informed decisions on the data model
  • You carry a clear technical vision of how to structure data management practices
  • You have excellent communication skills, both written and spoken
  • You have a results-driven and impact-oriented mindset
Responsibilities:
  • Build data models: collect needs for data transformations, define unique sources of truth for our metrics and dimensions, implement data models in dbt, and make the most of dbt’s features by keeping up with its evolving capabilities
  • Govern our data model: keep the overall data model consistent, define policies and standards for data management, and ensure the quality, reliability, and documentation of the data model
  • Provide support and train the main users of your models: deliver the training needed to drive adoption, and iterate on models so they better address user needs
  • Contribute to maintaining the data pipeline: help maintain the Python code base for Airflow, run the incident process with developers to fix refresh failures, and watch for opportunities to improve the pipeline

PostgreSQL, Python, SQL, Tableau, Airflow, Apache Kafka, Communication Skills, Data Visualization, Data Modeling, Data Analytics, Data Management, English Communication

Posted 5 days ago

📍 Spain

🧭 Full-Time

🏢 Company: Voicemod 👥 101-250 💰 $14,500,000 Series A over 2 years ago (Audio, Developer APIs, Gaming, Software)

Requirements:
  • Proven experience in Business Intelligence, Data Engineering, or Data Analytics with advanced SQL skills
  • Solid knowledge of data modelling
  • Experience with dbt and Python
  • Experience with event tracking tools like Segment, mParticle, or similar.
Responsibilities:
  • Collaborate with engineers, product teams and analysts to develop data products that are precise and insightful.
  • Find creative ways to integrate AI into the Data lifecycle.
  • Own the data product lifecycle, including designing tracking plans, developing data models, ELT pipelines, and self-serve data products.
  • Be the data steward, ensuring data quality and consistent metrics.

Python, SQL, Business Intelligence, ETL, Data Engineering, Data Visualization, Data Modeling, Data Analytics, Data Management

Posted 5 days ago

💸 110,000 - 170,000 USD per year

Requirements:
  • Experienced in SQL, with an emphasis on writing performant code
  • Data modeling expert who has designed schemas at previous organizations, ideally using dbt
  • Strong comprehension of data warehouses and the modern data stack
  • Track record of delivering projects, including datasets used to drive key business outcomes
  • Can self-serve by building ELT and reverse-ETL pipelines using off-the-shelf tools, or simple Airflow DAGs leveraging Python
  • Experience leveraging tools for data governance, data quality testing, and documentation
  • Can clearly communicate technical subjects to non-technical people and act as a liaison between Business, Engineering, and Analytics
Responsibilities:
  • Ingest, clean and organize raw data
  • Lead data projects from ingestion to reporting
  • Liaise between Business, Engineering, and Data to translate business requirements into proper data capture and schema design
  • Help shape the data model
  • Work with key stakeholders to understand existing processes and collaborate to design and implement improved and automated processes
Posted 5 days ago
🔥 Analytics Engineer

🔍 Marketing

🏢 Company: Power Digital 👥 1001-5000 💰 Secondary Market about 3 years ago (Digital Marketing, Advertising, SEO, Mobile Advertising, Brand Marketing, Marketing, Marketing Automation)

Requirements:
  • 1-3 years of experience in designing and implementing basic data models for analytics and business intelligence, focusing on supporting centralized and accessible data structures within data warehouses
  • 1-3 years of experience in SQL-based data transformations using tools like dbt, with exposure to ETL/ELT processes and a willingness to learn best practices for optimizing data structures
  • 1+ years of experience working with data warehousing principles and BI tools (e.g., Looker, Tableau, Power BI), assisting in the creation and maintenance of analytics-ready datasets
  • Strong problem-solving skills with the ability to troubleshoot data discrepancies, monitor data pipelines, and support performance optimizations under guidance
  • Experience working with or supporting cross-functional teams, including data engineers, product teams, and business stakeholders, to understand data needs and contribute to analytical solutions
  • Some experience or coursework in documenting data models, definitions, and transformations, with a willingness to improve documentation and maintain data quality standards
  • Exposure to new tools and methodologies in analytics engineering, with a proactive approach to learning and improving data transformation and delivery processes
  • Bachelor's degree (or equivalent experience) in Computer Science, Information Technology, Data Analytics, or a related field
Responsibilities:
  • Assist in developing and maintaining clean, analytics-ready data models (e.g., star, snowflake schemas) under guidance from senior engineers
  • Write and optimize SQL queries and transformations, ensuring efficient data retrieval and accuracy, while reviewing best practices with senior team members
  • Monitor and maintain data pipelines, assisting in debugging issues and working with senior engineers to implement fixes
  • Implement data quality checks using predefined validation rules and work on resolving flagged issues
  • Collaborate with business stakeholders and senior analytics engineers to translate data requirements into structured models
  • Support data marts and curated datasets, ensuring that business users have accurate, up-to-date, and accessible information
  • Handle support tickets, troubleshooting data-related issues and escalating more complex problems to senior engineers
  • Maintain documentation for data models, transformations, and business definitions to help improve transparency and consistency
  • Develop and maintain Looker dashboards and reports, ensuring they meet business needs and provide actionable insights
  • Participate in identifying and suggesting process improvements for data pipelines and transformation tools
  • Employ AI technologies to enhance and optimize business processes
  • Leverage Power Digital's Nova ecosystem as it relates to your department
Posted 8 days ago

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,000 - 190,000 USD per year

🔍 Software Development

🏢 Company: Lob 👥 101-250 💰 $50,000,000 Series C over 4 years ago (Developer APIs, Shipping, SaaS, Marketing, Health Care, Software, Courier Service)

Requirements:
  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience: at least one big data warehouse system such as Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating live production systems using dbt and Python.
  • 3+ years of BI Software experience: at least one analytics platform such as Looker, Power BI, or Tableau.
  • Empathy and effective communication skills: You can explain complex analytical issues to both technical and non-technical audiences.
  • Strong interpretive skills: You can deconstruct complex source data to compose curated models that can be explored by stakeholders.
  • Product mindset: You build data systems that will be used to generate insights for years to come, not just one-off analyses.
Responsibilities:
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams as they identify ways to improve the quality of the metrics they produce and analyze.
  • Champion data governance, security, privacy, and retention policies to protect end users, customers, and Lob.
  • Support and mentor fellow engineers and data team members through coffee chats, code review, and pair programming.

Python, SQL, ETL, Snowflake, Communication Skills, Analytical Skills, Data Visualization, Data Modeling

Posted 8 days ago
