Senior Data Engineer

100% remote work model across Brazil · Full-Time · Senior
Salary not disclosed

Job Details

Languages
English
Required Skills
SQL, Airflow, Data engineering, Data modeling, Databricks, PySpark

Requirements

  • Solid experience with Azure Data Factory and Databricks using PySpark.
  • Strong background in data modeling and building Data Lakes/Lakehouse architectures with Delta Lake.
  • Experience with SQL and large-scale data processing environments.
  • Hands-on expertise in pipeline orchestration tools such as Airflow or equivalent platforms.
  • Strong analytical and problem-solving skills with the ability to work independently in remote environments.
  • Experience leading technical discussions and contributing to architecture decisions.
  • Advanced English proficiency.
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related fields is preferred.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using Azure Data Factory and PySpark in Databricks environments.
  • Lead technical initiatives related to data engineering, ensuring best practices in architecture, performance, and reliability.
  • Build and optimize Data Lakes and Lakehouse solutions using Delta Lake technologies.
  • Develop and maintain robust data models to support analytics, reporting, and business intelligence needs.
  • Orchestrate and monitor data workflows and pipelines using tools such as Airflow or similar solutions.
  • Collaborate with cross-functional teams to understand business requirements and translate them into efficient data solutions.
  • Ensure data quality, governance, scalability, and operational excellence across the data platform.