Senior Analytics Engineer

Posted 2 days ago

💎 Seniority level: Senior

📍 Location: UK, Europe, Africa (UTC-1 to UTC+3)

🔍 Industry: Fintech

🏢 Company: M-KOPA

🗣️ Languages: English

🪄 Skills: Python, SQL, Apache Airflow, Business Intelligence, Tableau, Data modeling

Requirements:
  • Proficiency with SQL
  • Experience with Python
  • Experience with orchestration systems such as Airflow
  • Working knowledge of data warehouse technologies
  • Experience with BI tools like Looker or Tableau
  • Experience with dimensional data modeling
Responsibilities:
  • Build datasets for metrics, dashboards, and exploratory analysis (see the sketch after this list)
  • Develop semantic models and dashboards
  • Coach users on self-serve capabilities
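
As a rough illustration of the dataset-building and dimensional modeling work listed above, here is a minimal dbt-style SQL sketch. The model and column names (stg_payments, stg_customers, fct_daily_payments) are hypothetical and are not taken from the posting:

```sql
-- models/marts/fct_daily_payments.sql  (hypothetical model name)
-- Joins a payments fact to a customer dimension so metrics and dashboards
-- can read from one consistently modeled dataset.
with payments as (
    select * from {{ ref('stg_payments') }}    -- assumed staging model
),
customers as (
    select * from {{ ref('stg_customers') }}   -- assumed staging model
)
select
    payments.payment_date,
    customers.customer_country,
    count(*)             as payment_count,
    sum(payments.amount) as total_amount
from payments
left join customers
    on payments.customer_id = customers.customer_id
group by 1, 2
```
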
Apply

Related Jobs

Apply

📍 Germany, India, USA

🧭 Full-Time

🔍 Customer relationship management (CRM)

🏢 Company: HubSpot | 👥 1001-5000 | 💰 $35,000,000 Series E over 12 years ago | 🫂 Last layoff about 1 year ago | Tags: SaaS, Analytics, Marketing, Copywriting, Social Media

Requirements:
  • Several years of hands-on SQL experience and expertise in relational databases and data modeling.
  • Strong organizational skills and the ability to document technical designs.
  • Proven communication skills to effectively bridge gaps between business leaders, engineers, and data scientists.
  • Experience in distilling complex information for both executives and front-line representatives.
  • Creative problem-solving abilities for flexible solutions to business questions.
  • Demonstrated curiosity and willingness to learn new technologies.
  • Experience with Snowflake, dbt, and/or Looker is preferred.
Responsibilities:
  • Collaborate with technical and non-technical teams to connect business and technical solutions.
  • Build scalable data models to analyze key business components.
  • Maintain technical best practices in data infrastructure and contribute to long-term data strategies.
  • Support operations teams in using foundational data models for reporting purposes.
  • Conduct complex root cause analyses and implement preventive recommendations.
  • Expand dbt patterns and macros for flexible data structures (see the sketch after this list).
  • Scope requirements with stakeholders and guide projects throughout their lifecycle with clear roadmaps.
  • Mentor others to foster an inclusive and diverse team environment.
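
The bullet above about expanding dbt patterns and macros could look something like the following; a minimal sketch, with a hypothetical macro name (cents_to_dollars) chosen purely for illustration:

```sql
-- macros/cents_to_dollars.sql  (hypothetical macro)
-- Centralizes a conversion so every model applies the same logic
-- instead of repeating the expression.
{% macro cents_to_dollars(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}

-- usage inside a model:
-- select {{ cents_to_dollars('amount_cents') }} as amount_usd
-- from {{ ref('stg_payments') }}
```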

SQL, Data Analysis, Snowflake, Data modeling

Posted 16 days ago
Apply

📍 United Kingdom, Latvia, Spain, Germany, Denmark, Poland, Portugal, Ireland

🔍 Translation management system

🏢 Company: Lokalise | 👥 101-250 | 💰 $50,000,000 Series B about 3 years ago | Tags: Information Services, Developer APIs, SaaS, Information Technology, Collaboration, Translation Service, Software, Cloud Infrastructure

Requirements:
  • 3+ years of experience in building analytics in a SaaS B2B environment in areas like Product, Marketing, Sales, or Finance.
  • Strong proficiency in SQL and Python.
  • Experience with data warehousing platforms (e.g., Snowflake) and ETL/ELT tools (e.g., dbt).
  • Familiarity with data modeling techniques, such as snowflake schema.
  • Experience with data visualization tools like Metabase, Looker, or Lightdash.
  • Experience working with version control environments (e.g., git).
  • Experience in maintaining data quality alerting and monitoring.
Responsibilities:
  • Work closely with software developers, data analysts, product managers, and business owners to optimize the data analytics lifecycle.
  • Collaborate with data engineers to maintain and optimize ingestion data pipelines.
  • Design, implement, document, and optimize the semantic layer from raw data to foundational business data tables (see the sketch after this list).
  • Own the quality, consistency, and readiness of reporting data tables for endpoints like Metabase and Lightdash.
  • Improve data literacy by educating product managers and business owners on business intelligence tools.
  • Apply software engineering best practices to data transformation code.
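
To make the "raw data to foundational business data tables" bullet above more concrete, here is a minimal sketch of a staging model in the dbt/Snowflake stack the posting mentions; the source, model, and column names are hypothetical:

```sql
-- models/staging/stg_subscriptions.sql  (hypothetical model)
-- First layer of the semantic path: rename, cast, and standardize raw columns
-- so downstream business tables and BI endpoints share one definition.
select
    id                              as subscription_id,
    user_id,
    plan                            as plan_name,
    cast(started_at as timestamp)   as started_at,
    amount_cents / 100.0            as amount_usd
from {{ source('app_db', 'subscriptions') }}   -- assumed raw source
```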

Python, SQL, Apache Airflow, ETL, Git, Snowflake, Data visualization, Data modeling

Posted 29 days ago
Apply

📍 UK, Europe, Africa

🔍 Digital and Financial Inclusion

Requirements:
  • Extensive experience with Python and Java.
  • Proficiency in SQL.
  • Experience with data warehouse technologies.
  • Familiarity with BI tools such as Looker or Tableau.
  • Experience in dimensional data modeling for analytics/big data infrastructures.
  • Experience working with orchestration systems such as Airflow.
  • Experience with dbt.
Responsibilities:
  • Building easy-to-understand, consistently modeled datasets to serve metrics, dashboards, and exploratory analysis.
  • Creating data transformation pipelines, primarily by using SQL and Python in dbt and Airflow infrastructure (see the sketch after this list).
  • Collaborating with cross-functional product and engineering teams and internal business units to gather requirements and understand business needs.
  • Delivering data-driven recommendations along with applying best practices to build reliable, well-tested, efficient, and documented data assets.
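
As a sketch of the kind of SQL transformation an Airflow-scheduled dbt run would execute for the pipeline work above; the model names and incremental key are assumptions, not details from the posting:

```sql
-- models/marts/fct_repayments.sql  (hypothetical model)
-- Incremental materialization: a scheduled `dbt run` only processes new rows.
{{ config(materialized='incremental', unique_key='repayment_id') }}

select
    repayment_id,
    account_id,
    repayment_date,
    amount
from {{ ref('stg_repayments') }}    -- assumed staging model
{% if is_incremental() %}
where repayment_date > (select max(repayment_date) from {{ this }})
{% endif %}
```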

Python, SQL, Java, Tableau, Airflow, Data engineering

Posted 4 months ago
Apply

📍 France

🔍 Health and Wellness

🏢 Company: Fabulous

Requirements:
  • University Degree in Engineering, Computer Science or Applied Mathematics.
  • A minimum of 4 years of experience in Data or Analytics Engineering.
  • Excellent hands-on skills in Data Modelling.
  • Excellent SQL skills (even better if this is coupled with dbt experience or a similar SQL-based data modelling tool).
  • Excellent Engineering skills (testing, clean coding, peer-reviewing, CI/CD, git workflows, agile workflows, etc.).
  • Previous experience working with modern data stack tools and cloud-based data warehouse (BQ, Snowflake, Redshift, etc.).
  • Sound business acumen to manage your own projects and your business stakeholders.
  • Self-Starter with the ability to work autonomously and own one's projects fully.
  • Excellent written and verbal communication skills (English).
  • Comfortable in a remote work environment (we are a remote-first organization).
Responsibilities:
  • You will work on Data Modelling and Analytics Engineering projects to improve, enrich, and maintain our data models and analytics pipelines, in close collaboration with the Head of Data & Analytics as the main stakeholder.
  • You will be responsible for contributing effectively to our code base: building, testing, reviewing, and maintaining solid analytics pipelines using SQL and dbt.
  • Helping manage tech debt and improving engineering practices and the project's architecture are also important responsibilities for this role. Solid, methodical testing is key here to strengthen data observability (see the sketch after this list).
  • You are expected to gradually own some aspects of the team's responsibilities, have a strong say in how the analytics project's architecture should evolve, contribute to the team's evolution and continuous growth, and help data analysts and scientists improve and strengthen their engineering skills.
  • You are expected to speak your mind and contribute proactively and effectively to improving the team's practices, cohesion, impact, and mission.
  • You are expected to be highly autonomous, show a sense of ownership, and manage your own projects and stakeholders effectively.
  • You will help mentor more junior members and share knowledge and practices within the team to level up everyone's skills.
  • You are expected to contribute effectively to our functional documentation in a way that is clear, concise, and useful for future collaborators and readers.
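
As an illustration of the methodical testing mentioned above, here is a minimal dbt singular test; dbt treats any row returned by the query as a failure. The model and column names are hypothetical:

```sql
-- tests/assert_no_negative_durations.sql  (hypothetical singular test)
-- Any row returned by this query fails the test, guarding a basic business rule.
select
    session_id,
    duration_seconds
from {{ ref('fct_sessions') }}      -- assumed model
where duration_seconds < 0
```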

Project Management, SQL, Agile, Data Analysis, Git, Snowflake, Data science, Communication Skills, Collaboration

Posted 5 months ago
Apply