
Analytics Engineer

Posted 2024-10-24


📍 Location: UK

💸 Salary: 50,000 - 55,000 GBP per year

🔍 Industry: AI and Data Infrastructure

🏢 Company: Prolific

🗣️ Languages: English

🪄 Skills: AWS, Leadership, SQL, Cloud Computing, Data Analysis, GCP, Product Management, Communication Skills, Analytical Skills, Collaboration

Requirements:
  • Deep experience with dbt and SQL.
  • Strong familiarity with cloud platforms like AWS and GCP.
  • Passion for ensuring high data quality through tests and documentation.
  • Ability to understand business needs and communicate with non-technical stakeholders.
  • Willingness to advocate for best practices in logging and data modeling.
  • Skilled at working cross-functionally and translating complex concepts.
  • Proficiency in designing repeatable and scalable workflows for data transformation.
Responsibilities:
  • Create complex dbt models, custom macros, and reusable packages.
  • Monitor and maintain dbt workflow jobs for data refreshes.
  • Write tests and assertions for data integrity.
  • Document and standardize dbt processes.
  • Translate technical data issues into understandable business terms.
  • Support junior analysts and data engineers with best practices.
  • Collaborate with product, engineering, and BI teams to align data infrastructure with business needs.

Related Jobs

🔥 Analytics Engineer
Posted 2024-11-09

📍 US, UK, Philippines, Poland, South Africa

🧭 Full-Time

🔍 Financial services / Remittance

🏢 Company: Zepz

Requirements:
  • Comfortable with daily use of SQL in a modern cloud data warehouse environment.
  • Able to automate processes and deploy applications in Python, developing production standard scripts.
  • Confidence working with command line, version control, testing, and code reviews.
  • Problem-solver who understands business issues and communicates commercial impact.
  • Advocate for data-driven decision-making who strives to improve processes.
  • Familiarity with dbt for designing and implementing data models is a nice-to-have.
  • Open-minded with respect to diversity and inclusivity.

Responsibilities:
  • Building and maintaining data models to expose reliable data for analysis and reporting.
  • Communicating with analysts and business stakeholders to understand commercial requirements and translating them into technical solutions.
  • Developing standards and best practices for data consumption, including educating data consumers on data quality.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and the data warehouse.
  • Ensuring data quality is high, including testing, automation, scalability, and documentation.

AWS, Python, SQL, Kubernetes, Airflow


📍 Germany, Europe

🧭 Full-Time

🔍 Employee Apps, Tech

🏢 Company: Flip App

Requirements:
  • 2-4 years of experience in Business Intelligence, Data Analytics, or Data Engineering.
  • Experience with BI tools like Looker, Power BI, Tableau, or Qlik.
  • Proficient in SQL.
  • Experience in stakeholder management and project management.
  • Consulting experience is a plus.
  • Familiarity with Marketing & CRM data, ideally with Hubspot.
  • Strong proactive communication skills.
  • Structured and independent work style with multitasking ability.
  • Fluent in English, good German is a plus.

Responsibilities:
  • Own the entire data lifecycle for Sales and Marketing data.
  • Serve as a central communication hub between departments.
  • Develop KPIs and integrate data sources (ETL).
  • Enhance self-service reporting platform and perform data visualization.
  • Conduct onboarding and training sessions for stakeholders.
  • Identify opportunities for process optimization and automation.
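The "develop KPIs and integrate data sources (ETL)" responsibility above can be sketched as a toy extract-transform-load step in Python. The channel/leads/deals fields and the conversion-rate KPI are invented for illustration; a real pipeline would read from HubSpot or CRM exports rather than an inline CSV.

```python
import csv, io, sqlite3

# Extract: a hypothetical marketing export (inline stand-in for a real source)
raw = io.StringIO(
    "date,channel,leads,deals\n"
    "2024-10-01,email,40,8\n"
    "2024-10-01,ads,100,5\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kpi (date TEXT, channel TEXT, conversion_rate REAL)")

for row in csv.DictReader(raw):                      # extract
    rate = int(row["deals"]) / int(row["leads"])     # transform: derive the KPI
    conn.execute("INSERT INTO kpi VALUES (?, ?, ?)",
                 (row["date"], row["channel"], rate))  # load

for r in conn.execute("SELECT channel, conversion_rate FROM kpi ORDER BY channel"):
    print(r)
```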

SQL, Business Intelligence, ETL, Tableau, Data Engineering

Posted 2024-11-07

📍 UK, Europe, Africa

🔍 Digital and Financial Inclusion

Requirements:
  • Extensive experience with Python and Java.
  • Proficiency in SQL.
  • Experience with data warehouse technologies.
  • Familiarity with BI tools such as Looker or Tableau.
  • Experience in dimensional data modeling for analytics/big data infrastructures.
  • Experience working with orchestration systems such as Airflow.
  • Experience with dbt.

Responsibilities:
  • Building easy-to-understand and consistently modeled datasets to serve metrics, dashboards, and exploratory analysis.
  • Creating data transformation pipelines, primarily by using SQL and Python in dbt and Airflow infrastructure.
  • Collaborating with cross-functional product and engineering teams and internal business units to gather requirements and understand business needs.
  • Delivering data-driven recommendations along with applying best practices to build reliable, well-tested, efficient, and documented data assets.

Python, SQL, Java, Tableau, Airflow, Data Engineering

Posted 2024-11-02

📍 USA, UK, Philippines, Poland, South Africa

🧭 Permanent

🔍 Finance and technology (remittances)

🏢 Company: Zepz

Requirements:
  • Experience with dbt to design and implement data models.
  • Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
  • Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
  • Confidence working with the tools of modern software engineering: the command line, version control, testing, and code reviews.
  • Previous use of orchestration tools such as Airflow, dbt Cloud, or Fivetran.
  • Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • Comfort with, or an interest in, reading and extracting data transformation processes from Python scripts.
  • You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
  • Strong communication skills and the ability to work effectively in a distributed team environment.
  • You are comfortable collaborating across multiple time zones.
  • You have an open mind with respect to diversity and inclusivity.

Responsibilities:
  • Developing, testing, and implementing data models using dbt to ensure data integrity and performance.
  • Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
  • Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability, and interpretation.
  • Optimizing existing data processes and pipelines to improve efficiency and reduce latency.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
  • Troubleshooting and resolving data issues, ensuring data quality and reliability.
  • Ensuring the data and output are of high quality: tested, automated, scalable, and documented.

AWS, PostgreSQL, Python, SQL, GCP, MySQL, Snowflake, Tableau, Airflow, Azure, Communication Skills, Problem Solving

Posted 2024-10-17

📍 United Kingdom

🔍 Esports, Gaming, Betting, Events

🏢 Company: ESL FACEIT Group

Requirements:
  • Extensive knowledge of SQL and hands-on experience with dbt.
  • Strong understanding of database design principles including star/snowflake schemas, data vault, and normal forms.
  • Ability to visualize and communicate data insights effectively.
  • Excellent communication skills for stakeholder management and support.
  • Passion for learning and staying updated on data trends.
  • Experience in the Esports, Gaming, Betting, or Events industry is a plus.

Responsibilities:
  • Ask insightful questions to understand customer needs and design appropriate solutions.
  • Work with business users to encapsulate business logic in the data warehouse.
  • Set standards for documentation, data quality and code, holding the team accountable.
  • Drive towards efficiency, simplifying and improving work processes.
  • Create data products that tell impactful business stories.
  • Ensure high delivery quality, speed, and effective communication with stakeholders.
  • Inspire, teach, and guide team members while forming best practices.

SQL, Agile, Snowflake, Documentation

Posted 2024-09-20