📍 Sofia, Sofia City Province, Bulgaria
🧭 Full-Time
🔍 Software Development
- A minimum of 5 years of relevant experience in data engineering
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Strong proficiency in Python for scripting and data processing
- Familiarity with big data technologies such as Hadoop, Spark, and Kafka
- Experience with cloud platforms (AWS, Azure, or Google Cloud) and their data services
- Experience with relational databases such as SQL Server, Oracle, or PostgreSQL
- Solid understanding of data modeling, database design, and data warehousing concepts
- Excellent problem-solving and communication skills
- Ability to work independently and collaboratively in a fast-paced environment
- Design, develop, and maintain scalable data pipelines for processing and analyzing large volumes of data
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data integrity and quality
- Utilize your expertise in Python for scripting and coding tasks related to data processing and analysis
- Understand and implement business rules in Python for data transformation
- Implement ETL processes to integrate data from various sources into data warehouse or data lake solutions (a minimal sketch follows this list)
- Optimize big data storage and processing
- Troubleshoot and resolve data-related issues, ensuring the reliability and performance of our data infrastructure
- Follow emerging trends and technologies in the data engineering space and make recommendations for continuous improvement
- Optimize and tune data workflows for maximum efficiency and scalability
- Implement data security best practices to protect sensitive information and ensure compliance with data protection regulations
- Develop and maintain API integrations to facilitate seamless data exchange between systems and applications (an integration sketch also follows this list)
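As a rough illustration of the Python, business-rule, and ETL work described above, the sketch below extracts records from a CSV file, applies an invented bulk-discount rule, and loads the result into a local SQLite database standing in for the actual warehouse. The file name, column names, and the rule itself are assumptions made for the example, not details of this role.

```python
"""Minimal ETL sketch: extract from CSV, apply a business rule, load to a database.

The file name, column names, and the discount rule are illustrative assumptions,
not taken from this posting.
"""
import sqlite3

import pandas as pd


def extract(csv_path: str) -> pd.DataFrame:
    # Extract: read raw order records from a source file.
    return pd.read_csv(csv_path)


def transform(orders: pd.DataFrame) -> pd.DataFrame:
    # Transform: example business rule -- orders over 1,000 are flagged as
    # "bulk" and receive a 5% discount on the net amount.
    orders = orders.copy()
    orders["is_bulk"] = orders["amount"] > 1_000
    orders["net_amount"] = orders["amount"].where(
        ~orders["is_bulk"], orders["amount"] * 0.95
    )
    return orders


def load(orders: pd.DataFrame, db_path: str) -> None:
    # Load: write the transformed records into a warehouse table.
    # SQLite keeps the sketch self-contained; a production pipeline would
    # target PostgreSQL, SQL Server, or a cloud data warehouse instead.
    with sqlite3.connect(db_path) as conn:
        orders.to_sql("orders_clean", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db")
```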
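Similarly, a minimal sketch of the kind of API integration mentioned above: paging through a hypothetical REST endpoint and collecting records for downstream processing. The URL, authentication scheme, and response shape are placeholders, not an actual system used here.

```python
"""Minimal API-integration sketch: pull records from a paginated REST endpoint.

The endpoint URL, bearer-token auth, and response shape are hypothetical,
used only to illustrate a common integration pattern.
"""
import requests

BASE_URL = "https://api.example.com/v1/records"  # placeholder endpoint


def fetch_all_records(token: str, page_size: int = 100) -> list[dict]:
    # Page through the endpoint until it returns an empty batch,
    # accumulating all records for downstream processing.
    headers = {"Authorization": f"Bearer {token}"}
    records: list[dict] = []
    page = 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=headers,
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()  # surface HTTP errors instead of failing silently
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    print(len(fetch_all_records(token="dummy-token")))
```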
AWS · PostgreSQL · Python · ETL · Hadoop · Kafka · Oracle · Azure · Spark · Data modeling
Posted about 2 months ago