Apply

Data Platform Engineer

Posted 4 days ago


💎 Seniority level: Senior, 5+ years

📍 Location: Brazil

🔍 Industry: Payments platform

🏢 Company: Alternative Payments 👥 11-50 · Financial Services · Online Portals · Payments

⏳ Experience: 5+ years

🪄 Skills: AWS, Python, SQL, Cloud Computing, ETL, Snowflake, Airflow, Data engineering, Communication Skills, Analytical Skills, Collaboration, CI/CD, Mentoring, DevOps, Terraform, Documentation, Data modeling

Requirements:
  • 5+ years in software or data engineering, with hands-on experience provisioning and maintaining cloud data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery)
  • Proficiency with Infrastructure-as-Code tools (Terraform, CloudFormation, Pulumi) to automate data platform deployments
  • Strong SQL skills and experience building ETL pipelines in Python or Java/Scala
  • Familiarity with orchestration frameworks (Airflow, Prefect, Dagster) or transformation tools (dbt)
Responsibilities:
  • Architect and spin up our production and sandbox data warehouse environments using IaC
  • Build and deploy the first wave of ETL pipelines to ingest transactional, event and third-party data
  • Embed data quality tests and SLA tracking into every pipeline
  • Establish coding conventions, pipeline templates and best practices for all future data projects
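
As a rough illustration of the "data quality tests in every pipeline" responsibility above, an ETL step with embedded checks might look like the following sketch. All function names and sample records here are hypothetical, not from the posting:

```python
# Minimal sketch of an ETL step with embedded data quality tests.
# All names and sample data are illustrative only.

def extract():
    # Stand-in for pulling transactional records from a source system.
    return [
        {"txn_id": "t1", "amount": 125.00, "currency": "USD"},
        {"txn_id": "t2", "amount": 80.50, "currency": "brl"},
    ]

def quality_checks(rows):
    """Fail the batch fast if it violates basic expectations."""
    assert rows, "batch must not be empty"
    for row in rows:
        assert row["txn_id"], "txn_id must be present"
        assert row["amount"] >= 0, "amount must be non-negative"
    return rows

def transform(rows):
    # Normalize currency codes before loading.
    return [{**r, "currency": r["currency"].upper()} for r in rows]

def load(rows, sink):
    # Stand-in for a warehouse write; returns the row count loaded.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(quality_checks(extract())), warehouse)
print(loaded)  # prints 2
```

In an orchestrated setup (e.g. Airflow), each of these functions would typically be its own task, so a failed quality check halts the run before any bad data reaches the warehouse.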

Related Jobs


📍 Brazil

🧭 Full-Time

🔍 Software Development

🏢 Company: TELUS Digital Brazil

Requirements:
  • 5+ years of hands-on experience supporting data engineering teams, with a strong emphasis on data pipeline enhancement, optimization, and data integration.
  • Advanced/fluent English communication and documentation skills.
  • Proficiency with cloud computing on GCP.
  • Solid proficiency with Python for data processing.
  • Experience with cloud data-related services such as BigQuery, Dataflow, Cloud Composer, Dataproc, Cloud Storage, Pub/Sub, or the correlated services from other providers.
  • Knowledge of SQL and experience with relational databases.
  • Proven experience optimizing data pipelines for efficiency, lower operational costs, and fewer issues and failures.
  • Solid knowledge of monitoring, troubleshooting, and resolving data pipeline issues.
  • Familiarity with version control systems like Git.
Responsibilities:
  • Design and implement scalable data pipeline architectures in collaboration with Data Engineers.
  • Continuously optimize data pipeline efficiency to reduce operational costs and minimize issues and failures.
  • Monitor performance and reliability of data pipelines, enhancing reliability through data quality, analysis, and testing.
  • Build and manage automated alerting systems for data pipeline issues.
  • Automate repetitive tasks in data processing and management.
  • Develop and manage disaster recovery and backup plans.
  • In collaboration with other Data Engineering teams, conduct capacity planning for data storage and processing needs.
  • Develop and maintain comprehensive documentation for data pipeline systems and processes, and provide knowledge transfer to data-related teams.
  • Monitor, troubleshoot, and resolve production issues in data processing workflows.
  • Maintain infrastructure reliability for data pipelines, enterprise datahub, HPBI, and MDM systems.
  • Conduct post-incident reviews and implement improvements for data pipelines.
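
The "automated alerting for data pipeline issues" duty above could be sketched as a retry wrapper that escalates after repeated failures. This is a minimal illustration; the function names, retry count, and alert channel are assumptions, not details from the posting:

```python
# Minimal sketch of automated alerting for pipeline task failures.
# Names and the retry policy are illustrative assumptions.

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline-alerts")

def run_with_alerting(task_name, task, max_retries=2):
    """Run a pipeline task; emit an alert if every attempt fails."""
    for attempt in range(1, max_retries + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("task %s failed (attempt %d): %s", task_name, attempt, exc)
    # In production this line would page an on-call channel (e.g. a webhook).
    log.error("ALERT: task %s failed after %d attempts", task_name, max_retries)
    return None

# Usage: a task that fails once with a transient error, then succeeds.
state = {"calls": 0}
def flaky_ingest():
    state["calls"] += 1
    if state["calls"] == 1:
        raise RuntimeError("transient source outage")
    return "ok"

result = run_with_alerting("ingest", flaky_ingest)
print(result)  # prints ok
```

Separating the retry policy from the task itself keeps the alerting behavior uniform across pipelines, which matches the monitoring and reliability duties listed above.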

Python, SQL, Apache Airflow, Cloud Computing, GCP, Git, Data engineering, REST API, CI/CD, DevOps, Documentation, Microservices, Troubleshooting, Data modeling

Posted 2 months ago