Senior Data Engineer - Remote, UK

Posted 16 days ago

💎 Seniority level: Senior

📍 Location: United Kingdom

🔍 Industry: Cybersecurity

🏢 Company: Immersive

🗣️ Languages: English

🪄 Skills: AWS, Python, SQL, Flask, Git, Snowflake, Data engineering, CI/CD, RESTful APIs, Terraform, Data visualization, Data modeling, Software Engineering

Requirements:
  • Proficient in Python programming, with experience using Plotly graphing libraries and building applications; experience with Flask and SQLAlchemy is a plus
  • Experience maintaining data pipelines, managing infrastructure as code, and implementing data model changes
  • Experience following software engineering best practices like version control and continuous integration
  • Strong proficiency using SQL in cloud data warehouses (e.g. BigQuery, Redshift, Snowflake), and comfort with performance optimization, data partitioning, and window functions
  • Experience with dbt for the data transformation layer
  • Experience with IaC tooling such as Terraform or CloudFormation
  • Experience with BI tooling such as Power BI or Looker
  • Experience with AWS, Azure or GCP
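The SQL requirements above (window functions, partitioning, warehouse queries) can be illustrated with a minimal sketch. This uses an in-memory SQLite database for portability, whereas the role names cloud warehouses such as Snowflake or BigQuery; the table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, day INTEGER, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("uk", 1, 100.0), ("uk", 2, 150.0), ("us", 1, 200.0), ("us", 2, 50.0)],
)

# Window function: a running revenue total per region, ordered by day.
# PARTITION BY restarts the running sum for each region.
rows = conn.execute(
    """
    SELECT region, day,
           SUM(revenue) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM events
    ORDER BY region, day
    """
).fetchall()

for region, day, total in rows:
    print(region, day, total)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to Snowflake, BigQuery, and Redshift with only dialect-level differences.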
Responsibilities:
  • Design, build, and maintain high-quality Python applications for customer-facing reporting
  • Maintain and develop data pipelines to ensure data quality and consistency
  • Collaborate closely with analytics engineers to implement data model changes
  • Apply domain knowledge to enable the rest of the business to access the data they need to make informed business decisions
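The pipeline responsibilities above center on data quality and consistency. A minimal, stdlib-only sketch of a validation gate in a pipeline step might look like this; the record fields and thresholds are illustrative assumptions, not the company's actual pipeline.

```python
from dataclasses import dataclass


@dataclass
class Record:
    # Hypothetical fields for illustration only.
    user_id: str
    score: float


def validate(records):
    """Data-quality gate: split rows into clean and rejected sets.

    A row is rejected if it violates a simple invariant
    (missing user_id, or score outside the 0-100 range).
    """
    clean, rejected = [], []
    for r in records:
        if r.user_id and 0.0 <= r.score <= 100.0:
            clean.append(r)
        else:
            rejected.append(r)
    return clean, rejected


raw = [Record("a1", 42.0), Record("", 10.0), Record("b2", 120.0)]
clean, rejected = validate(raw)
print(len(clean), len(rejected))  # 1 valid row, 2 rejected
```

In a production pipeline the rejected rows would typically be routed to a quarantine table and surfaced in monitoring, rather than silently dropped.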