📍 Brazil, United States, Sweden
🔍 Open Banking Payments
- Familiarity with data pipeline tools such as Airflow for batch processing.
- Experience with Kafka for streaming data processes.
- Understanding of data quality practices and observability.
- Knowledge of data tools such as Debezium and Amazon QuickSight.
- Proficiency in applying good coding and development standards.
- Deliver data generated by applications to various stakeholder areas.
- Maintain data from APIs and other tools in a secure and structured manner.
- Work with batch (Airflow) and streaming (Kafka) layers.
- Automate processes to improve data delivery speed and reliability.
- Ensure high data quality and create an observability layer.
- Support Data Science with the necessary infrastructure.
- Follow good coding and development practices, including end-to-end encryption and infrastructure as code.
Python · Software Development · SQL · Apache Airflow · Java · JavaScript · Kafka · Data Engineering
Posted 2024-11-07