Data Engineer (Tech-lead)

Brazil / Mexico / Peru / Colombia / Argentina · CST / CDT (Central Time) · Contract · Lead
Salary not disclosed

Job Details

Languages
English
Required Skills
Docker, Python, SQL, Apache Airflow, Bash, Kubernetes, Snowflake, dbt, GitLab

Requirements

  • Advanced proficiency in Python, SQL, and Bash for complex data processing and scripting.
  • Deep experience with Snowflake as a primary data warehouse and dbt for robust transformations.
  • Hands-on experience with Apache Airflow for managing and monitoring sophisticated data workflows.
  • Practical knowledge of containerization and CI/CD tools, specifically Docker, Kubernetes, and GitLab.
  • A proven track record in System Architecture and Team Leadership/Management.
  • Excellent verbal and written communication skills in English for daily technical collaboration and leadership.

Responsibilities

  • Design and architect high-quality data systems and pipelines using the latest industry standards.
  • Lead the development of complex data models and transformations leveraging Snowflake and dbt.
  • Design and implement efficient workflow automation layers using Apache Airflow for complex dependency handling.
  • Ensure environment consistency and deployment reliability by managing Docker, Kubernetes, and GitLab CI/CD pipelines.
  • Take full ownership of the data product lifecycle, driving projects from initial inception through to successful production deployment.
  • Mentor and manage the team, ensuring best practices in Python, SQL, and Bash scripting are maintained across the codebase.