
Senior Data Engineer

Posted 11 days ago


💎 Seniority level: Senior, 5 years

📍 Location: Bulgaria

🔍 Industry: Software Development

🏢 Company: Dreamix Ltd.

🗣️ Languages: English

⏳ Experience: 5 years

🪄 Skills: AWS, PostgreSQL, Python, ETL, Hadoop, Kafka, Oracle, Azure, Spark, Data modeling

Requirements:
  • Minimum 5 years of relevant experience in data engineering
  • Bachelor's degree in Computer Science or related field
  • Strong proficiency in Python
  • Familiarity with big data technologies like Hadoop, Spark, Kafka
  • Experience with cloud platforms (AWS, Azure, Google Cloud)
  • Understanding of data warehousing concepts
  • Experience with databases like SQL Server, Oracle, PostgreSQL
  • Solid understanding of data modeling and database design
  • Excellent problem-solving and communication skills
Responsibilities:
  • Design, develop, and maintain scalable data pipelines
  • Collaborate with data scientists, analysts, and stakeholders
  • Utilize Python for data processing and analysis
  • Implement ETL processes for data integration
  • Optimize big data storage and processing
  • Troubleshoot data-related issues
  • Follow trends and technologies in data engineering
  • Implement data security best practices
  • Develop and maintain API integrations

Related Jobs

🔥 Senior Data Engineer
Posted 3 days ago

📍 Europe, APAC, Americas

🧭 Full-Time

🔍 Software Development

🏢 Company: Docker (👥 251-500, 💰 $105,000,000 Series C almost 3 years ago; Developer Tools, Developer Platform, Information Technology, Software)

Requirements:
  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
Responsibilities:
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

🪄 Skills: Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

🔥 Senior Data Engineer
Posted about 1 month ago

📍 South Africa, Mauritius, Kenya, Nigeria

🔍 Technology, Marketplaces

Requirements:
  • BSc degree in Computer Science, Information Systems, Engineering, or related technical field or equivalent work experience.
  • 3+ years related work experience.
  • Minimum of 2 years' experience building and optimizing 'big data' pipelines and architectures, and maintaining data sets.
  • Experienced in Python.
  • Experienced in SQL (PostgreSQL, MS SQL).
  • Experienced in using cloud services: AWS, Azure or GCP.
  • Proficiency in version control, CI/CD and GitHub.
  • Understanding/experience in Glue and PySpark highly desirable.
  • Experience in managing data life cycle.
  • Proficiency in manipulating, processing and architecting large disconnected data sets for analytical requirements.
  • Ability to maintain and optimise processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Good understanding of data management principles - data quality assurance and governance.
  • Strong analytical skills related to working with unstructured datasets.
  • Understanding of message queuing, stream processing, and highly scalable ‘big data’ datastores.
  • Strong attention to detail.
  • Good communication and interpersonal skills.
Responsibilities:
  • Suggest efficiencies and implement internal process improvements to automate manual processes.
  • Implement enhancements and new features across data systems.
  • Improve and streamline processes within data systems, with support from the Senior Data Engineer.
  • Test CI/CD processes to keep data pipelines optimal.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Build and maintain highly efficient ETL processes.
  • Develop and conduct unit tests on data pipelines as well as ensuring data consistency.
  • Develop and maintain automated monitoring solutions.
  • Support reporting and analytics infrastructure.
  • Maintain data quality and data governance, and keep data infrastructure systems maintained overall.
  • Maintain data warehouse and data lake metadata, data catalogue, and user documentation for internal business users.
  • Ensure database best practices are implemented and maintained.

🪄 Skills: AWS, PostgreSQL, Python, SQL, ETL, Git, CI/CD
