Senior Analytics Engineer

Posted 2024-11-02

💎 Seniority level: Senior, extensive experience

📍 Location: UK, Europe, Africa (UTC-1 to UTC+3)

🔍 Industry: Digital and Financial Inclusion

⏳ Experience: Extensive experience

🪄 Skills: Python, SQL, Java, Tableau, Airflow, Data engineering

Requirements:
  • Extensive experience with Python and Java.
  • Proficiency in SQL.
  • Experience with data warehouse technologies.
  • Familiarity with BI tools such as Looker or Tableau.
  • Experience in dimensional data modeling for analytics/big data infrastructures (see the sketch after this list).
  • Experience working with orchestration systems such as Airflow.
  • Experience with dbt.
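To make the dimensional-modeling requirement concrete, here is a minimal sketch in Python/pandas that splits a flat orders extract into a star schema (one fact table, one dimension). The table and column names are hypothetical, and in a warehouse this would more typically be expressed as dbt SQL models.

```python
import pandas as pd

# Hypothetical flat extract of order events (names are illustrative only).
orders = pd.DataFrame({
    "order_id":   [1, 2, 3],
    "order_date": ["2024-10-01", "2024-10-01", "2024-10-02"],
    "customer":   ["Ada", "Grace", "Ada"],
    "country":    ["UK", "KE", "UK"],
    "amount_usd": [120.0, 80.0, 45.5],
})

# Dimension table: one row per customer, keyed by a surrogate integer.
dim_customer = (
    orders[["customer", "country"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact table: one row per order, referencing the dimension by surrogate key.
fact_orders = orders.merge(dim_customer, on=["customer", "country"])[
    ["order_id", "order_date", "customer_key", "amount_usd"]
]

print(dim_customer)
print(fact_orders)
```
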
Responsibilities:
  • Building easy-to-understand, consistently modeled datasets to serve metrics, dashboards, and exploratory analysis.
  • Creating data transformation pipelines, primarily using SQL and Python on dbt and Airflow infrastructure (a minimal DAG sketch follows this list).
  • Collaborating with cross-functional product and engineering teams and internal business units to gather requirements and understand business needs.
  • Delivering data-driven recommendations along with applying best practices to build reliable, well-tested, efficient, and documented data assets.
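As a rough illustration of the dbt-plus-Airflow setup described above, the orchestrating DAG might look like the sketch below; the project path, schedule, and task split are assumptions, not this team's actual configuration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical dbt project location; in practice this comes from deployment config.
DBT_DIR = "/opt/analytics/dbt"

with DAG(
    dag_id="daily_dbt_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",              # daily at 06:00 UTC (Airflow 2.4+ keyword)
    catchup=False,
):
    # Build the SQL models defined in the dbt project.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR}",
    )

    # Run the tests declared alongside the models, only after the build succeeds.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```
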

Related Jobs

📍 USA, UK, Philippines, Poland, South Africa

🧭 Permanent

🔍 Finance and technology (remittances)

🏢 Company: Zepz

  • Experience with dbt to design and implement data models.
  • Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
  • Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
  • You have the ability to work confidently with the tools of modern software engineering: the command line, version control, testing, and code reviews.
  • You have previous experience with orchestration tools such as Airflow, dbt Cloud, or Fivetran.
  • Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • You’re comfortable with, or interested in, reading Python scripts and extracting the data transformation logic they implement.
  • You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
  • Strong communication skills and the ability to work effectively in a distributed team environment.
  • You are comfortable collaborating across multiple time zones.
  • You have an open mind with respect to diversity and inclusivity.

  • Developing, testing, and implementing data models with dbt to ensure data integrity and performance (a minimal testing sketch follows this list).
  • Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
  • Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability, and interpretation.
  • Optimizing existing data processes and pipelines to improve efficiency and reduce latency.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
  • Troubleshooting and resolving data issues, ensuring data quality and reliability.
  • Ensuring the data and output are of high quality: tested, automated, scalable, and documented.
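The testing emphasis above usually lands in dbt schema tests (unique, not_null, and the like). The following Python sketch expresses the same checks directly, over a hypothetical modeled table rather than anything from this team's actual stack.

```python
import pandas as pd

def check_unique(df: pd.DataFrame, column: str) -> None:
    """Fail if any value in `column` repeats (the equivalent of dbt's unique test)."""
    dupes = df[df[column].duplicated(keep=False)]
    assert dupes.empty, f"{column}: {len(dupes)} rows share a duplicated value"

def check_not_null(df: pd.DataFrame, column: str) -> None:
    """Fail if `column` contains nulls (the equivalent of dbt's not_null test)."""
    nulls = int(df[column].isna().sum())
    assert nulls == 0, f"{column}: {nulls} null values"

# Hypothetical modeled dataset; in practice this would be read from the warehouse.
fct_remittances = pd.DataFrame({
    "transfer_id": [101, 102, 103],
    "corridor":    ["UK-PH", "US-ZA", "UK-PL"],
    "amount_usd":  [250.0, 90.0, 130.0],
})

check_unique(fct_remittances, "transfer_id")
check_not_null(fct_remittances, "corridor")
print("all checks passed")
```
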

🪄 Skills: AWS, PostgreSQL, Python, SQL, GCP, MySQL, Snowflake, Tableau, Airflow, Azure, Communication Skills, Problem Solving

Posted 2024-10-17