Senior Analytics Engineer

Posted 2024-10-17

📍 Location: USA, UK, Philippines, Poland, South Africa

🔍 Industry: Finance and technology (remittances)

🏢 Company: Zepz

🗣️ Languages: English, Akuapem, Amharic, Bengali, Ewe, Fante, Ga, Igbo, Kalenjin, Luganda, Oromo, Somali, Swahili, Wolof, Bulgarian, Croatian, Czech, Danish, Dutch, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish, Swedish

🪄 Skills: AWS, PostgreSQL, Python, SQL, GCP, MySQL, Snowflake, Tableau, Airflow, Azure, Communication Skills, Problem Solving

Requirements:
  • Experience using dbt to design and implement data models.
  • Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
  • Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
  • Confidence working with the tools of modern software engineering: the command line, version control, testing, and code reviews.
  • Prior experience with orchestration tools such as Airflow, dbt Cloud, or Fivetran.
  • Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • You’re comfortable with, or interested in, reading Python scripts and extracting the data transformation logic they implement.
  • You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
  • Strong communication skills and the ability to work effectively in a distributed team environment.
  • You are comfortable collaborating across multiple time zones.
  • You have an open mind with respect to diversity and inclusivity.
Responsibilities:
  • Developing, testing, and implementing dbt data models to ensure data integrity and performance (see the sketch after this list).
  • Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
  • Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability, and interpretation.
  • Optimizing existing data processes and pipelines to improve efficiency and reduce latency.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
  • Troubleshooting and resolving data issues, ensuring data quality and reliability.
  • Ensuring the data and output are of high quality: tested, automated, scalable, and documented.
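
dbt models are usually SQL, but dbt also supports Python models on warehouses such as Snowflake, which ties the first responsibility above to the Python-reading requirement. Below is a minimal, hedged sketch; the model and column names (stg_transfers, AMOUNT_USD, and so on) are illustrative assumptions, not Zepz's actual schema.

```python
# models/marts/fct_daily_transfers.py
# Minimal sketch of a dbt Python model. On Snowflake, dbt hands the model a
# Snowpark session and dbt.ref() returns a Snowpark DataFrame; to_pandas()
# converts it so plain pandas can take over. All names are hypothetical.

def model(dbt, session):
    # Materialize the result as a table in the warehouse.
    dbt.config(materialized="table")

    # Pull an upstream staging model and convert it to pandas.
    transfers = dbt.ref("stg_transfers").to_pandas()

    # One row per day: total volume and number of transfers.
    daily = transfers.groupby("TRANSFER_DATE", as_index=False).agg(
        total_usd=("AMOUNT_USD", "sum"),
        transfer_count=("TRANSFER_ID", "count"),
    )
    return daily  # dbt writes the returned DataFrame back to the warehouse
```

An equivalent SQL model plus unique and not_null tests in schema.yml would be the more conventional route; the Python form is shown only because the posting asks for comfort reading transformation logic in Python.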

Related Jobs

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,500 - 202,500 USD per year

🔍 Analytics and Postal Services

🏢 Company: Lob

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience with big data warehouse systems like Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating production systems using dbt and Python.
  • 3+ years of BI Software experience with analytics platforms like Looker, Power BI, or Tableau.
  • Empathy and effective communication skills to convey complex analytical issues.
  • Strong interpretive skills for deconstructing complex data into usable models.
  • Product mindset to build long-lasting data systems for insights.

  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control (a minimal testing sketch follows this list).
  • Foster a curiosity mindset and guide other teams to improve the quality of their metrics.
  • Champion data governance, security, privacy, and retention policies.
  • Support and mentor fellow engineers and data team members through various means.
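
Of the practices above, testing is the easiest to make concrete. A minimal, warehouse-agnostic sketch follows, using pytest with sqlite3 standing in for the warehouse; the table and column names are hypothetical.

```python
# test_orders_quality.py -- run with `pytest`. sqlite3 stands in for the
# warehouse here; table and column names are hypothetical.
import sqlite3

import pytest


@pytest.fixture
def conn():
    c = sqlite3.connect(":memory:")
    c.execute("CREATE TABLE fct_orders (order_id TEXT, amount REAL)")
    c.execute("INSERT INTO fct_orders VALUES ('o1', 19.99), ('o2', 5.00)")
    yield c
    c.close()


def test_primary_key_unique_and_not_null(conn):
    dupes = conn.execute(
        "SELECT order_id FROM fct_orders GROUP BY order_id HAVING COUNT(*) > 1"
    ).fetchall()
    nulls = conn.execute(
        "SELECT COUNT(*) FROM fct_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    assert dupes == [] and nulls == 0


def test_amounts_non_negative(conn):
    bad = conn.execute(
        "SELECT COUNT(*) FROM fct_orders WHERE amount < 0"
    ).fetchone()[0]
    assert bad == 0
```

In a dbt project the same checks would normally live in schema.yml as unique and not_null tests rather than standalone Python.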

Project Management, Python, SQL, DynamoDB, Elasticsearch, ETL, Snowflake, Tableau, Airflow, NoSQL, Communication Skills

Posted 2024-11-07

📍 United States

🔍 Financial technology

🏢 Company: Forward Financing

  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

  • Design, build, and maintain scalable data models and marts using dbt.
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts’ pull requests (see the CI sketch after this list).
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards including data quality checks and security compliance.
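
One common way to back the pull-request review step with automation is a CI job that builds and tests only the dbt models a PR touched. This is a hedged sketch, not Forward Financing's actual setup: the artifact path is an assumption, and it presumes dbt Core with a production manifest available for state comparison.

```python
# ci_check.py -- sketch of a PR gate: build and test only the dbt models
# changed relative to production, using dbt's state-based selection.
# The prod-artifacts/ path is hypothetical; assumes dbt Core is installed.
import subprocess
import sys


def main() -> int:
    cmd = [
        "dbt", "build",
        "--select", "state:modified+",  # changed models plus downstreams
        "--state", "prod-artifacts/",   # manifest.json from the last prod run
        "--fail-fast",                  # stop at the first failure
    ]
    return subprocess.run(cmd).returncode


if __name__ == "__main__":
    sys.exit(main())
```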

SQL, Business Intelligence, Snowflake, Tableau, Data Engineering, Data Science, DevOps, Compliance

Posted 2024-11-07

📍 UK, Europe, Africa

🔍 Digital and Financial Inclusion

  • Extensive experience with Python and Java.
  • Proficiency in SQL.
  • Experience with data warehouse technologies.
  • Familiarity with BI tools such as Looker or Tableau.
  • Experience in dimensional data modeling for analytics/big data infrastructures.
  • Experience working with orchestration systems such as Airflow.
  • Experience with dbt.

  • Building easy-to-understand, consistently modeled datasets to serve metrics, dashboards, and exploratory analysis.
  • Creating data transformation pipelines, primarily using SQL and Python on dbt and Airflow infrastructure (see the DAG sketch after this list).
  • Collaborating with cross-functional product and engineering teams and internal business units to gather requirements and understand business needs.
  • Delivering data-driven recommendations along with applying best practices to build reliable, well-tested, efficient, and documented data assets.
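
The dbt-plus-Airflow pattern in the second bullet often reduces to two chained tasks: rebuild the models, then test them. A minimal Airflow 2.x sketch follows; the DAG id, schedule, and project path are illustrative assumptions.

```python
# dags/dbt_daily.py -- minimal Airflow 2.x DAG: run dbt models, then test
# them. The project path, DAG id, and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )
    # Only test once the models have been rebuilt.
    dbt_run >> dbt_test
```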

Python, SQL, Java, Tableau, Airflow, Data Engineering

Posted 2024-11-02

📍 United States

🧭 Full-Time

🔍 Technology/Analytics

🏢 Company: Fetch

  • 3+ years of professional experience in a technical role requiring advanced knowledge of SQL.
  • Understand the difference between SQL that works and SQL that performs (see the sketch after this list).
  • Experience with data modeling and orchestration tools.
  • Experience with relational (SQL) and non-relational (NoSQL) databases.
  • Experience with a range of data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
  • Understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools.
  • Experience clearly communicating about data with internal and external stakeholders from both technical and nontechnical backgrounds.
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams.
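
To make the "works vs. performs" distinction concrete, here are two queries with the same result set, held in Python strings. The tables and columns are hypothetical, and the advice (project only the columns you need, keep filters sargable) is general rather than Fetch-specific.

```python
# Same result, different cost. Table and column names are hypothetical.

# Works: returns every column and wraps the filter column in a function,
# which blocks partition pruning and index / cluster-key use.
NAIVE = """
SELECT *
FROM receipts r
JOIN users u ON u.user_id = r.user_id
WHERE CAST(r.scanned_at AS DATE) = '2024-10-01'
"""

# Performs: projects only the needed columns and expresses the filter as a
# range on the raw timestamp, so the engine can prune partitions and use
# clustering or indexes.
PERFORMANT = """
SELECT r.receipt_id, r.user_id, u.state, r.total_amount
FROM receipts r
JOIN users u ON u.user_id = r.user_id
WHERE r.scanned_at >= '2024-10-01'
  AND r.scanned_at <  '2024-10-02'
"""
```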

  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance.
  • Leverage dbt (Data Build Tool), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices.
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data.
  • Translate business requirements for near-real-time actionable insights into data models and artifacts.
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders.
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure.
  • Test, monitor, and report on data health and data quality.
  • Lead the charge on data documentation and data discovery initiatives.

AWS, Python, SQL, Business Intelligence, DynamoDB, ETL, GCP, MongoDB, Snowflake, Tableau, Airflow, Azure, Postgres, Redis, NoSQL, CI/CD

Posted 2024-10-24

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: Apollo.io

  • Experience with relevant tools in data engineering and analytics.
  • Sound understanding of the SaaS industry, especially regarding LTV:CAC, activation levers, and conversion rates (a sample LTV:CAC calculation follows this list).
  • Ability to approach new projects objectively without bias.
  • Great time management skills.
  • Excellent written and oral communication skills for summarizing insights and designing data assets.
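
For reference, one common formulation of LTV:CAC treats lifetime value as margin-adjusted monthly revenue divided by monthly churn, compared against acquisition cost. Definitions vary by company; the sketch below, including every input figure, is illustrative only.

```python
# ltv_cac.py -- one common way to compute LTV:CAC for a SaaS business.
# The formula and all input figures are illustrative assumptions.

def ltv(arpa_monthly: float, gross_margin: float, monthly_churn: float) -> float:
    """Margin-adjusted monthly revenue x expected lifetime (1 / churn)."""
    return arpa_monthly * gross_margin / monthly_churn


if __name__ == "__main__":
    # Hypothetical inputs: $500 ARPA, 80% margin, 2% monthly churn, $6,000 CAC.
    value = ltv(arpa_monthly=500.0, gross_margin=0.80, monthly_churn=0.02)
    cac = 6_000.0
    print(f"LTV = ${value:,.0f}; LTV:CAC = {value / cac:.1f}")  # $20,000; 3.3
```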

  • Design, develop, and maintain essential data models for various business functions to ensure consistency and accuracy.
  • Define and implement data engineering standards and best practices, providing guidance to other teams.
  • Collaborate with multiple business areas to understand their requirements and deliver scalable data solutions.
  • Influence the future direction of analytics infrastructure, offering strategic insights to drive business impact.

Business Intelligence, Snowflake, Strategy, Data Engineering, Communication Skills, Collaboration

Posted 2024-10-21

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: Appfire Technologies, LLC.

  • 5+ years of relevant industry experience.
  • Strong SQL skills for data modeling and query optimization.
  • 3+ years of experience with Python and DBT.
  • Passion for high data quality and impactful analytics engineering work.
  • Strong understanding of data warehousing concepts and best practices.
  • Efficient problem-solving and analytical skills.
  • Ability to work in both collaborative and independent environments.
  • Experience managing concurrent complex data projects.
  • Excellent communication and documentation skills.

  • Understand decision needs by interfacing with Analytics Engineers, Data Engineers, Data Analysts, and Business Partners.
  • Architect, build, and launch efficient and reliable data models and pipelines with Data Engineering, using dbt and Snowflake.
  • Design and implement metrics and dimensions for self-service analysis and structured KPI views.
  • Define and share best practices for metrics, dimension, and data model development.

Python, SQL, Snowflake, Data Engineering, Analytical Skills, Collaboration

Posted 2024-10-19

📍 Europe, APAC, Americas

🧭 Full-Time

🔍 Technology

  • Experience: 5+ years of experience in data engineering or analytics engineering roles, with a proven track record of leading complex data projects and initiatives.
  • Technical Expertise: Deep expertise in SQL, DBT, and data modeling, with a strong understanding of data pipeline design, ETL processes, and data warehousing.
  • Software Engineering Skills: Proficiency in software engineering principles, including CI/CD pipelines, version control (e.g., Git), and scripting languages (e.g., Python).
  • Data Tools Proficiency: Hands-on experience with tools like Snowflake, DBT, and Looker. Familiarity with additional tools and platforms (e.g., AWS, Kubernetes) is a plus.
  • Problem-Solving: Strong analytical and problem-solving skills, with the ability to diagnose and resolve complex technical issues related to data infrastructure.
  • Leadership: Demonstrated ability to mentor and lead junior engineers, with a focus on fostering a collaborative and high-performance team environment.
  • Communication: Excellent communication skills, with the ability to clearly and concisely convey complex technical concepts to both technical and non-technical stakeholders.
  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.

  • Data Pipeline Leadership: Design, develop, and maintain highly scalable and efficient data pipelines, ensuring timely and accurate collection, transformation, and integration of data from various sources.
  • Advanced Data Modeling: Architect and implement robust data models and data warehousing solutions that enable efficient storage, retrieval, and analysis of large, complex datasets.
  • Cross-Functional Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data requirements, translating them into actionable data models and insights.
  • Data Quality Assurance: Implement and oversee rigorous data validation, cleansing, and error-handling mechanisms to maintain high data quality and reliability (a validation sketch follows this list).
  • Performance Optimization: Continuously monitor and optimize data pipeline performance, identifying and resolving bottlenecks and inefficiencies to maintain optimal system responsiveness.
  • Mentorship and Leadership: Provide guidance and mentorship to junior analytics engineers, fostering a collaborative and learning-oriented environment.
  • Strategic Contribution: Contribute to the strategic direction of data initiatives, staying abreast of industry best practices, emerging technologies, and trends in data engineering and analytics.
  • Documentation & Knowledge Sharing: Build and maintain user-facing documentation for key processes, metrics, and data models to enhance the data-driven culture within the organization.

Leadership, Python, SQL, ETL, Git, Snowflake, Strategy, Data Engineering, Communication Skills, Collaboration, CI/CD

Posted 2024-09-29