
Middle/Senior Data Engineer

Posted 7 days ago

💎 Seniority level: Middle, 4+ years

📍 Location: Poland, Ukraine, Abroad

🔍 Industry: Data Engineering

🏢 Company: N-iX 👥 1001-5000 · IT Services and IT Consulting

🗣️ Languages: English

⏳ Experience: 4+ years

🪄 Skills: SQL, Apache Airflow, Git, Microsoft SQL Server, Azure

Requirements:
  • Experience with Databricks
  • 4+ years of experience developing database systems (MS SQL/T-SQL)
  • Experience creating and maintaining Azure Pipelines
  • Experience developing robust data pipelines with dbt
Responsibilities:
  • Implement business logic in the Data Warehouse
  • Convert business requirements into data models
  • Manage data pipelines (see the sketch after this list)
  • Tune data loads and query performance
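For orientation, the sketch below shows the kind of scheduled pipeline this role would manage, using Apache Airflow from the skills list. It is illustrative only and not part of the posting: the DAG id, task names, and schedule are assumptions, and both steps are stubs standing in for real extract/load and warehouse business logic.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_staging_table(**context):
    # Stub load step: a real pipeline would extract from the source system
    # and load the warehouse staging layer for the given run date.
    print(f"Loading staging data for {context['ds']}")


def refresh_reporting_layer(**context):
    # Stub transform step: would apply the warehouse business logic
    # (e.g. dbt models or T-SQL procedures) on top of the staging data.
    print("Refreshing reporting layer")


with DAG(
    dag_id="dwh_daily_load",  # assumed name, for illustration only
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    load_staging = PythonOperator(
        task_id="load_staging",
        python_callable=load_staging_table,
    )
    refresh_reporting = PythonOperator(
        task_id="refresh_reporting",
        python_callable=refresh_reporting_layer,
    )

    # Load must finish before the reporting layer is refreshed.
    load_staging >> refresh_reporting
```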

Related Jobs


📍 Location: Poland

🧭 Full-Time

🔍 Industry: Software Development

🏢 Company: N-iX 👥 1001-5000 · IT Services and IT Consulting

Requirements:
  • Minimum of 3-4 years as a data engineer or in a relevant field
  • Advanced experience in Python, particularly in delivering production-grade data pipelines and troubleshooting code-based bugs.
  • Structured approach to data insights
  • Familiarity with cloud platforms (preferably Azure)
  • Experience with Databricks, Snowflake, or similar data platforms
  • Knowledge of relational databases, with proficiency in SQL
  • Experience using Apache Spark
  • Experience in creating and maintaining structured documentation
  • Proficiency in utilizing testing frameworks to ensure code reliability and maintainability
  • Experience with Gitlab or equivalent tools
  • English Proficiency: B2 level or higher
Responsibilities:
  • Design, build, and maintain data pipelines using Python
  • Collaborate with an international team to develop scalable data solutions
  • Conduct in-depth analysis and debugging of system bugs (Tier 2)
  • Develop and maintain smart documentation for process consistency, including the creation and refinement of checklists and workflows
  • Set up and configure new tenants, collaborating closely with team members to ensure smooth onboarding
  • Write integration tests to ensure the quality and reliability of data services (see the sketch after this list)
  • Work with Gitlab to manage code and collaborate with team members
  • Utilize Databricks for data processing and management
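As a rough illustration of the testing expectation above, the snippet below is a minimal pytest-style check of a small pandas transform. The transform, column names, and sample data are assumptions for the example, not part of the role description; in practice such tests would target the real data services rather than an in-memory frame.

```python
import pandas as pd


def deduplicate_events(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only the latest record per event_id (illustrative transform)."""
    return (
        df.sort_values("updated_at")
          .drop_duplicates(subset="event_id", keep="last")
          .reset_index(drop=True)
    )


def test_deduplicate_events_keeps_latest_record():
    raw = pd.DataFrame(
        {
            "event_id": [1, 1, 2],
            "updated_at": ["2024-01-01", "2024-01-02", "2024-01-01"],
            "value": [10, 20, 30],
        }
    )

    result = deduplicate_events(raw)

    # One row per event_id survives, and event 1 keeps its latest value.
    assert sorted(result["event_id"].tolist()) == [1, 2]
    assert result.loc[result["event_id"] == 1, "value"].item() == 20
```

Running `pytest` against a file like this would execute the check as part of CI/CD.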

🪄 Skills: Docker, Python, SQL, Cloud Computing, Data Analysis, ETL, Git, Kubernetes, Snowflake, Apache Kafka, Azure, Data engineering, RDBMS, REST API, Pandas, CI/CD, Documentation, Microservices, Debugging

Posted 7 days ago