🪄 Skills: Python, SQL, ETL, Ruby, Ruby on Rails, Snowflake, Airflow, Data engineering, Collaboration
Requirements:
Have significant experience in software or data engineering, preferably in a senior or lead role.
Are highly proficient in writing Python and SQL.
Have extensive experience building and optimizing complex ETL/ELT data pipelines.
Have used data transformation tools like dbt.
Are skilled in managing and optimizing script dependencies with tools like Airflow.
Have substantial experience designing and maintaining data warehouses using Snowflake or similar technologies.
Have experience leading and mentoring junior data engineers, and guiding teams through complex projects.
Are able to contribute to the strategic direction of data engineering practices and technologies within the organisation.
Nice to have: experience with Terraform, Ruby, data visualization tools (e.g., Looker, Tableau, Power BI), Amplitude, DevOps, Heroku, Kafka, AWS/GCP, etc.
Responsibilities:
Leading the design, development, and maintenance of robust ETL/ELT data pipelines.
Writing, optimizing, and reviewing advanced SQL queries for data extraction and transformation.
Implementing and managing sophisticated data workflows and dependencies using tools like Airflow.
Designing, building, and maintaining advanced data models and data warehouses using Snowflake or similar technologies.
Collaborating with cross-functional teams to understand complex data requirements and deliver efficient solutions.
Ensuring high data quality, integrity, and security across the data lifecycle.
Continuously improving data engineering processes and infrastructure, and implementing best practices.