Data Engineer - Databricks / AWS
Azumo · Software Development
Latin America - Remote (Buenos Aires, Argentina; Uruguay; Colombia; Dominican Republic; Mexico) · Full-Time · Senior
Salary not disclosed
Job Details
- Languages: Professional English proficiency (B2/C1)
- Experience: 5+ years
- Required Skills: AWS, Python, SQL, Apache Airflow, ETL, Data modeling, dbt, Databricks
Requirements
- Strong experience with Databricks
- Strong SQL and Python programming experience
- Experience building scalable ETL/ELT pipelines
- Experience with AWS data ecosystem and cloud-native architectures
- Experience with dbt, Apache Airflow, or similar transformation and orchestration tools
- Strong understanding of data modeling and analytics engineering
- Experience with large-scale structured and semi-structured datasets
- Experience implementing data governance and data quality practices
- BS or Master’s degree in Computer Science or a related field, or equivalent experience
- 5+ years of experience with data-related and data-management responsibilities
- Professional English proficiency (B2/C1)
Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines using Databricks and AWS
- Improve ingestion pipeline quality, reliability, scalability, and governance
- Develop and optimize core data models and foundational data tables
- Build analytics-ready datasets to support player insights and operational reporting
- Implement data governance, data quality, lineage, and observability practices
- Collaborate with product, analytics, and business stakeholders
- Optimize large-scale data processing workflows for performance and cost
- Contribute to the unification of fragmented data ecosystems
- Build and maintain reliable orchestration workflows and scheduling systems
- Participate in architectural discussions around scalability and modernization