📍 Poland
🔍 Financial services industry
🏢 Company: Capco
Requirements:
- Extensive experience with Databricks, including ETL processes and data migration.
- Experience with additional cloud platforms like AWS, Azure, or GCP.
- Strong knowledge of data warehousing concepts, data modeling, and SQL.
- Proficiency in Python, SQL, and scripting languages.
- Knowledge of data governance frameworks and data security principles.
- Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes.
- Bachelor's or Master's degree in Computer Science or a related field.
Responsibilities:
- Design, develop, and implement robust data architecture solutions using modern data platforms such as Databricks.
- Ensure scalable, reliable, and secure data environments that meet business requirements and support advanced analytics.
- Lead the migration of data from traditional relational databases to Databricks environments.
- Architect and design scalable data pipelines and infrastructure to support the organization's data needs.
- Develop and manage ETL processes using Databricks to ensure efficient data extraction, transformation, and loading (a minimal sketch follows this list).
- Optimize ETL workflows to enhance performance and maintain data integrity.
- Monitor and optimize performance of data systems to ensure reliability, scalability, and cost-effectiveness.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Define best practices for data engineering and ensure adherence to them.
- Evaluate and implement new technologies to improve data pipeline efficiency.
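To illustrate the ETL and migration duties above, here is a minimal, hypothetical PySpark sketch of the kind of job this role describes: extract from a relational source over JDBC, transform, and load into a Delta table. It assumes a Databricks runtime with the appropriate JDBC driver installed; every identifier (connection URL, credentials, table and schema names) is invented for illustration and is not part of the posting.

```python
# Illustrative sketch only: migrate a table from a traditional RDBMS
# into Delta Lake on Databricks. All connection details are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rdbms-to-databricks-etl").getOrCreate()

# Extract: read a source table from a relational database over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://source-db:5432/sales")  # hypothetical source
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "***")  # placeholder; use a secret store in practice
    .load()
)

# Transform: basic cleansing plus an aggregate for downstream analytics.
daily_revenue = (
    orders.filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write the result as a partitioned Delta table.
(
    daily_revenue.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.daily_revenue")  # hypothetical target schema
)
```

In practice a job like this would be scheduled as a Databricks workflow, with credentials read from a secret scope rather than hard-coded.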
Skills: AWS, Docker, Leadership, Python, SQL, ETL, GCP, Kubernetes, Azure, Data engineering, RDBMS, Analytical skills
Posted 2024-11-07