🧭 Part-Time
💸 700,000,000 - 900,000,000 COP per year
🔍 Software Development
- 4+ years of experience in Data Engineering
- 3+ years of experience working on Apache Spark applications using Python (PySpark) or Scala
- Experience creating Spark jobs that process at least 1 billion records
- Strong knowledge of ETL architecture and standards
- Software development experience working with Apache Airflow, Spark, MongoDB, MySQL
- Strong SQL knowledge
- Strong command of Python
- Experience creating data pipelines in a production system
- Proven experience in building/operating/maintaining fault tolerant and scalable data processing integrations using AWS
- Experience using Docker or Kubernetes is a plus
- Ability to identify and resolve problems associated with production grade large scale data processing workflows
- Experience writing and maintaining unit tests and working with continuous integration
- Passion for crafting intelligent data pipelines that teams love to use
- Strong capacity to handle numerous projects simultaneously is a must
- Collaborate with data architects, enterprise architects, solution consultants, and product engineering teams to gather customer data integration requirements, conceptualize solutions, and build the required technology stack
- Collaborate with enterprise customers' engineering teams to identify data sources, profile and quantify the quality of those sources, develop tools to prepare data, and build pipelines that integrate customer and third-party data sources
- Develop new features and improve existing data integrations with customer data ecosystem
- Encourage the team to think out of the box and overcome engineering obstacles while incorporating innovative design principles
- Collaborate with a Project Manager to bill and forecast time for product owner solutions
- Build data pipelines
- Reconcile missed data
- Acquire datasets that align with business needs
- Develop algorithms to transform data into useful, actionable information
- Build, test, and maintain database pipeline architectures
- Collaborate with management to understand company objectives
- Create new data validation methods and data analysis protocols
- Ensure compliance with data governance and security policies
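To illustrate the kind of validation work described above (creating data validation methods and reconciling bad records before loading), here is a minimal, hypothetical sketch in plain Python; the field names (`user_id`, `amount`) and rules are illustrative assumptions, not part of the role description:

```python
# Hypothetical record-level validation sketch for a data pipeline.
# Field names and rules are illustrative only.

def validate_record(record):
    """Return a list of validation errors for one input record."""
    errors = []
    if not record.get("user_id"):
        errors.append("missing user_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    return errors

def split_valid(records):
    """Partition records into (valid, rejected-with-reasons)."""
    valid, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            valid.append(rec)
    return valid, rejected

rows = [
    {"user_id": "u1", "amount": 10.0},
    {"user_id": "", "amount": 5},
    {"user_id": "u2", "amount": -3},
]
good, bad = split_valid(rows)
print(len(good), len(bad))  # → 1 2
```

In a production Spark pipeline the same per-record rules would typically be expressed as DataFrame filter expressions so they run distributed, with rejected rows written to a quarantine table for reconciliation.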
Posted 1 day ago