
Senior Analytics Engineer

Posted 30 days ago


💎 Seniority level: Senior

🔍 Industry: Property Management

🏢 Company: Ambient · 👥 1-10 · 💰 $6,500,000 Seed over 1 year ago · Cryptocurrency, Blockchain

Requirements:
  • Advanced SQL skills to extract data from a data warehouse (BigQuery).
  • Data modeling and schema design skills (using dbt).
  • Experience in developing cloud-hosted data ingestion web services (AWS and GCP) with skills in Lambda or containerization (Kubernetes, Docker).
  • Ability to connect observability concepts to specific needs.
  • Familiarity with ETL/ELT tools (Airflow and Stitch) and modern analytical tools and programming languages (Python).
  • Software engineering fundamentals and proficiency in version control systems (Git, GitHub).
  • Experience with data visualization tools (e.g., Sigma Computing).
Responsibilities:
  • Create new data models, views, and data flows from various sources to support product experimentation and device troubleshooting.
  • Collaborate with engineers to enhance telemetry collection and improve downstream alerting metrics.
  • Build data quality tests for IoT device telemetry and ERP systems (see the sketch after this list).
  • Support business users through workshops and monitor query/dashboard performance.
  • Build analytics covering access auditing and qualitative insights on inbound data requests.
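For illustration, a minimal sketch of the kind of data-quality check described in the responsibilities above, written in plain Python so it is self-contained. The field names (device_id, battery_pct, reported_at) and the thresholds are hypothetical; in this role such checks would more likely be implemented as dbt tests against BigQuery.

```python
# Hypothetical telemetry data-quality check; field names and thresholds are illustrative.
from datetime import datetime, timedelta, timezone

def check_telemetry(records, max_staleness=timedelta(hours=1)):
    """Return a list of human-readable data-quality failures."""
    failures = []
    now = datetime.now(timezone.utc)
    for i, rec in enumerate(records):
        if not rec.get("device_id"):
            failures.append(f"row {i}: missing device_id")
        battery = rec.get("battery_pct")
        if battery is None or not 0 <= battery <= 100:
            failures.append(f"row {i}: battery_pct out of range: {battery!r}")
        reported_at = rec.get("reported_at")
        if reported_at is None or now - reported_at > max_staleness:
            failures.append(f"row {i}: stale or missing reported_at: {reported_at!r}")
    return failures

if __name__ == "__main__":
    sample = [
        {"device_id": "lock-42", "battery_pct": 87, "reported_at": datetime.now(timezone.utc)},
        {"device_id": "", "battery_pct": 140, "reported_at": None},  # should fail every check
    ]
    for failure in check_telemetry(sample):
        print(failure)
```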

Related Jobs


📍 US

🔍 Early education technology

  • Passionate about data with strong SQL skills.
  • Proficient in DBT and data modeling.
  • Experience juggling multiple projects with shifting priorities.
  • Self-starter who takes ownership of high impact projects.

  • Join the Analytics team to enable self-service analytics.
  • Engage in deep data discovery based on business problems.
  • Build scalable data models.

SQL, Data modeling

Posted 26 days ago
🔥 Senior Analytics Engineer

📍 United States

🧭 Full-Time

💸 125,000 - 145,000 USD per year

🔍 Fintech

🏢 Company: Patriot Software · 👥 101-250 · 💰 Series B over 1 year ago · Accounting, Human Resources, Financial Services, Bookkeeping and Payroll, SaaS, Software

  • Experience working in Analytics (8+ years)
  • Experience working in SaaS at a fintech company (3+ years)
  • Data Modeling for Data Warehouses (8+ years)
  • dbt Experience (1+ years)
  • Python for Data Engineering (4+ years)
  • Commercial off-the-shelf (COTS) business intelligence tool experience (5+ years)
  • Strong analytical skills with the ability to interpret complex data.
  • Working knowledge of statistical analysis techniques including ML/AI algorithms.
  • Excellent communication skills to present complex data findings clearly.
  • Strong attention to detail and organizational skills.
  • Ability to manage multiple projects simultaneously.
  • Ability to work collaboratively in cross-functional teams.

  • Collaborate and prototype with stakeholders to define business requirements for analysis and reporting.
  • Design, develop, and maintain data models, balancing tradeoffs between possible schema design decisions.
  • Build reports from data models using business intelligence tools to provide meaningful insights.
  • Develop and automate data visualizations, dashboards, and reports that effectively communicate complex data.
  • Utilize Python to extract data from various sources and load it into appropriate data storage solutions.
  • Develop and maintain ETL/ELT workflows to transform and clean raw data (see the sketch after this list).
  • Work closely with cross-functional teams to identify data requirements and define metrics.
  • Conduct exploratory data analysis to uncover opportunities and insights.
  • Stay up-to-date with industry trends and best practices in analytics and data engineering.
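For illustration, a minimal Python sketch in the spirit of the extract/transform/load responsibilities above. The CSV layout, column names, and the SQLite target are stand-ins rather than details from the posting; a real pipeline would load into the company's warehouse through its own connector.

```python
# Illustrative ETL step: extract a CSV, clean it, load it into a local SQLite database.
# File name, columns, and target schema are hypothetical.
import csv
import sqlite3

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        amount = row.get("amount", "").strip()
        if not amount:  # drop rows with no amount
            continue
        yield {
            "customer_id": row["customer_id"].strip(),
            "amount": round(float(amount), 2),
            "invoice_date": row["invoice_date"][:10],  # keep YYYY-MM-DD
        }

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS invoices (customer_id TEXT, amount REAL, invoice_date TEXT)"
    )
    conn.executemany(
        "INSERT INTO invoices VALUES (:customer_id, :amount, :invoice_date)", list(rows)
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("analytics.db")
    load(transform(extract("invoices.csv")), conn)
```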

Python, Business Intelligence, Data Analysis, ETL, Git, Tableau, Algorithms, Data engineering, Communication Skills, Analytical Skills, Attention to detail, Organizational skills, SaaS

Posted about 1 month ago
🔥 Senior Analytics Engineer

📍 United States

🔍 Financial technology

🏢 Company: Forward Financing · 👥 251-500 · 💰 $200,000,000 Debt Financing 4 months ago · Financial Services, Finance, FinTech

  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

  • Design, build, and maintain scalable data models and marts using dbt (see the sketch after this list).
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts’ pull requests.
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards including data quality checks and security compliance.
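For illustration, a rough sketch of how the dbt marts mentioned above might be built and tested from a CI step. The selector name ("marts") and target name ("ci") are hypothetical; the `dbt build --select` command itself is standard dbt CLI usage.

```python
# Illustrative CI wrapper around dbt; selector and target names are placeholders.
import subprocess
import sys

def run(cmd):
    print(f"$ {' '.join(cmd)}")
    return subprocess.run(cmd, check=False).returncode

if __name__ == "__main__":
    # `dbt build` runs models, tests, snapshots, and seeds in dependency order,
    # so a failing test on an upstream model blocks its downstream marts.
    sys.exit(run(["dbt", "build", "--select", "marts", "--target", "ci"]))
```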

SQL, Business Intelligence, Snowflake, Tableau, Data engineering, Data science, DevOps, Compliance, Data modeling

Posted about 2 months ago

📍 UK, Europe, Africa

🔍 Digital and Financial Inclusion

  • Extensive experience with Python and Java.
  • Proficiency in SQL.
  • Experience with data warehouse technologies.
  • Familiarity with BI tools such as Looker or Tableau.
  • Experience in dimensional data modeling for analytics/big data infrastructures.
  • Experience working with orchestration systems such as Airflow.
  • Experience with dbt.

  • Building easy to understand and consistently modeled datasets to serve metrics, dashboards, and exploratory analysis.
  • Creating data transformation pipelines, primarily by using SQL and Python in dbt and Airflow infrastructure (see the sketch after this list).
  • Collaborating with cross-functional product and engineering teams and internal business units to gather requirements and understand business needs.
  • Delivering data-driven recommendations along with applying best practices to build reliable, well-tested, efficient, and documented data assets.
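For illustration, a hedged sketch of a dbt-in-Airflow pipeline like the one described above, using Airflow 2.x BashOperator tasks. The DAG id, schedule, and project path are placeholders, and real deployments often use dedicated dbt operators instead.

```python
# Illustrative Airflow 2.x DAG that runs and then tests a dbt project daily.
# The dag_id, schedule, and project path are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/analytics/dbt && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/analytics/dbt && dbt test",
    )
    dbt_run >> dbt_test  # only test once the models have been rebuilt
```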

Python, SQL, Java, Tableau, Airflow, Data engineering

Posted 2 months ago

📍 United States

🧭 Full-Time

🔍 Technology/Analytics

🏢 Company: Fetch

  • 3+ years of professional experience in a technical role requiring advanced knowledge of SQL.
  • Understand the difference between SQL that works and SQL that performs.
  • Experience with data modeling and orchestration tools.
  • Experience with relational (SQL) and non-relational (NoSQL) databases.
  • Experience with a variety of data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
  • Understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools.
  • Experience clearly communicating about data with internal and external stakeholders from both technical and nontechnical backgrounds.
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams.

  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance.
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices.
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data.
  • Translate business requirements for near-real-time actionable insights into data models and artifacts.
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders.
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure.
  • Test, monitor, and report on data health and data quality (see the sketch after this list).
  • Lead the charge on data documentation and data discovery initiatives.
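For illustration, a minimal sketch of the "test, monitor, and report on data health" responsibility above, assuming the snowflake-connector-python package. The table name, freshness column, and threshold are invented, and credentials would normally come from a secrets manager rather than environment variables.

```python
# Hypothetical freshness check against a Snowflake table; names and thresholds are illustrative.
import os
import snowflake.connector

FRESHNESS_SQL = """
    SELECT DATEDIFF('minute', MAX(loaded_at), CURRENT_TIMESTAMP()) AS minutes_stale,
           COUNT(*) AS row_count
    FROM analytics.raw.receipts
"""

def check_freshness(max_minutes_stale=60):
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="REPORTING_WH",
    )
    try:
        minutes_stale, row_count = conn.cursor().execute(FRESHNESS_SQL).fetchone()
        if minutes_stale is None or minutes_stale > max_minutes_stale:
            raise RuntimeError(f"receipts table is stale ({minutes_stale} min, {row_count} rows)")
        print(f"receipts OK: {row_count} rows, last load {minutes_stale} minutes ago")
    finally:
        conn.close()

if __name__ == "__main__":
    check_freshness()
```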

AWS, Python, SQL, Business Intelligence, DynamoDB, ETL, GCP, MongoDB, Snowflake, Tableau, Airflow, Azure, Postgres, Redis, NoSQL, CI/CD, Data modeling

Posted 2 months ago

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: Apollo.io · 👥 501-1000 · 💰 $100,000,000 Series D over 1 year ago · Software Development

  • Experience with relevant tools in data engineering and analytics.
  • Sound understanding of the SaaS industry, especially regarding LTV:CAC, Activation levers, and Conversion Rates.
  • Ability to approach new projects objectively without bias.
  • Great time management skills.
  • Excellent written and oral communication skills for summarizing insights and designing data assets.

  • Design, develop, and maintain essential data models for various business functions to ensure consistency and accuracy.
  • Define and implement data engineering standards and best practices, providing guidance to other teams.
  • Collaborate with multiple business areas to understand their requirements and deliver scalable data solutions.
  • Influence the future direction of analytics infrastructure, offering strategic insights to drive business impact.

Business Intelligence, Snowflake, Strategy, Data engineering, Communication Skills, Collaboration, Cross-functional collaboration, Data modeling

Posted 3 months ago

📍 Europe, APAC, Americas

🧭 Full-Time

🔍 Technology

  • Experience: 5+ years of experience in data engineering or analytics engineering roles, with a proven track record of leading complex data projects and initiatives.
  • Technical Expertise: Deep expertise in SQL, DBT, and data modeling, with a strong understanding of data pipeline design, ETL processes, and data warehousing.
  • Software Engineering Skills: Proficiency in software engineering principles, including CI/CD pipelines, version control (e.g., Git), and scripting languages (e.g., Python).
  • Data Tools Proficiency: Hands-on experience with tools like Snowflake, DBT, and Looker. Familiarity with additional tools and platforms (e.g., AWS, Kubernetes) is a plus.
  • Problem-Solving: Strong analytical and problem-solving skills, with the ability to diagnose and resolve complex technical issues related to data infrastructure.
  • Leadership: Demonstrated ability to mentor and lead junior engineers, with a focus on fostering a collaborative and high-performance team environment.
  • Communication: Excellent communication skills, with the ability to clearly and concisely convey complex technical concepts to both technical and non-technical stakeholders.
  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.

  • Data Pipeline Leadership: Design, develop, and maintain highly scalable and efficient data pipelines, ensuring timely and accurate collection, transformation, and integration of data from various sources.
  • Advanced Data Modeling: Architect and implement robust data models and data warehousing solutions that enable efficient storage, retrieval, and analysis of large, complex datasets.
  • Cross-Functional Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data requirements, translating them into actionable data models and insights.
  • Data Quality Assurance: Implement and oversee rigorous data validation, cleansing, and error-handling mechanisms to maintain high data quality and reliability.
  • Performance Optimization: Continuously monitor and optimize data pipeline performance, identifying and resolving bottlenecks and inefficiencies to maintain optimal system responsiveness (see the sketch after this list).
  • Mentorship and Leadership: Provide guidance and mentorship to junior analytics engineers, fostering a collaborative and learning-oriented environment.
  • Strategic Contribution: Contribute to the strategic direction of data initiatives, staying abreast of industry best practices, emerging technologies, and trends in data engineering and analytics.
  • Documentation & Knowledge Sharing: Build and maintain user-facing documentation for key processes, metrics, and data models to enhance the data-driven culture within the organization.
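For illustration, a small self-contained sketch of the performance-optimization loop above: measure a query, apply a fix (here, an index), and measure again. It uses SQLite so it runs anywhere; the diagnose-then-fix workflow is the point, not the specific engine.

```python
# Illustrative before/after timing of a filtered aggregate, with and without an index.
import random
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(random.randint(1, 10_000), random.random()) for _ in range(200_000)],
)
conn.commit()

def timed(sql, params=()):
    start = time.perf_counter()
    conn.execute(sql, params).fetchall()
    return time.perf_counter() - start

query = "SELECT SUM(amount) FROM events WHERE user_id = ?"
before = timed(query, (42,))  # full table scan
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = timed(query, (42,))   # index lookup
print(f"full scan: {before:.4f}s, indexed: {after:.4f}s")
```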

Leadership, Python, SQL, ETL, Git, Snowflake, Strategy, Data engineering, Communication Skills, Collaboration, CI/CD

Posted 3 months ago

📍 France

🔍 Health and Wellness

🏢 Company: Fabulous

  • University Degree in Engineering, Computer Science or Applied Mathematics.
  • A minimum of 4 years of experience in Data or Analytics Engineering.
  • Excellent hands-on skills in Data Modelling.
  • Excellent SQL skills (even better if this is coupled with dbt experience or a similar SQL-based data modelling tool).
  • Excellent engineering skills (testing, clean coding, peer review, CI/CD, Git workflows, agile workflows, etc.).
  • Previous experience working with modern data stack tools and cloud-based data warehouses (BigQuery, Snowflake, Redshift, etc.).
  • Sound business acumen to manage your own projects and your business stakeholders.
  • Self-starter with the ability to work autonomously and fully own your projects.
  • Excellent written and verbal communication skills (English).
  • Comfortable in a remote work environment (we are a remote-first organization).

  • You will work on Data Modelling and Analytics Engineering projects to improve, enrich, and maintain our data models and analytics pipelines, in close collaboration with the Head of Data & Analytics as the main stakeholder.
  • You will be responsible for contributing effectively to our code base: building, testing, reviewing, and maintaining solid analytics pipelines using SQL and dbt.
  • Helping manage tech debt and improving engineering practices and the project's architecture are also important responsibilities of this role. Solid, methodical testing is key here to strengthening data observability (see the sketch after this list).
  • You are expected to gradually own some aspects of the team's responsibilities, have a strong say in how the analytics project's architecture should evolve, contribute to the team's evolution and continuous growth, and help data analysts and scientists strengthen their engineering skills.
  • You are expected to speak your mind and contribute proactively and effectively to improving the team's practices, cohesion, impact, and mission.
  • You are expected to be highly autonomous and to show a sense of ownership and the ability to effectively manage your own projects and stakeholders.
  • You will help mentor more junior team members and share knowledge and practices within the team to level up everyone's skills.
  • You are expected to contribute effectively to our functional documentation in a way that is clear, concise, and useful for future collaborators and readers.
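For illustration, a minimal sketch of methodically testing a SQL transformation in isolation, using only the Python standard library. The table and column names are invented; in this role such checks would more likely be expressed as dbt tests, but the principle of asserting on a known fixture is the same.

```python
# Illustrative unit test of a SQL transformation against an in-memory SQLite fixture.
# Table and column names are hypothetical.
import sqlite3
import unittest

REVENUE_BY_DAY_SQL = """
    SELECT date(created_at) AS day, SUM(amount) AS revenue
    FROM orders
    WHERE status = 'completed'
    GROUP BY date(created_at)
"""

class RevenueModelTest(unittest.TestCase):
    def setUp(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE orders (created_at TEXT, amount REAL, status TEXT)")
        self.conn.executemany(
            "INSERT INTO orders VALUES (?, ?, ?)",
            [
                ("2024-05-01 10:00:00", 10.0, "completed"),
                ("2024-05-01 11:00:00", 5.0, "completed"),
                ("2024-05-01 12:00:00", 99.0, "cancelled"),  # must be excluded
            ],
        )

    def test_cancelled_orders_are_excluded(self):
        rows = self.conn.execute(REVENUE_BY_DAY_SQL).fetchall()
        self.assertEqual(rows, [("2024-05-01", 15.0)])

if __name__ == "__main__":
    unittest.main()
```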

Project Management, SQL, Agile, Data Analysis, Git, Snowflake, Data science, Communication Skills, Collaboration

Posted 4 months ago

Related Articles

Posted 5 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 5 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 5 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 5 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 5 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.