Data Engineer
Tecknoworks Europe · Technology Consulting
Romania · Contract · Middle
Salary not disclosed
Job Details
- Experience
- 3+ years
- Required Skills
- Docker, Python, SQL, Kafka, Kubernetes, Snowflake, Airflow, Spark, dbt, Databricks, GitHub Actions, Azure DevOps, AWS Lambda
Requirements
- 3+ years of experience as a Data Engineer or in a similar role
- Strong proficiency in SQL and Python
- Solid understanding of data modeling, ETL/ELT processes, and pipeline orchestration
- Experience working in DevOps environments using CI/CD tools (e.g., GitHub Actions, Azure DevOps)
- Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes, Airflow)
- Familiarity with data cataloging tools like AWS Glue Data Catalog or Azure Purview
- Strong interpersonal and communication skills
- Adaptability in fast-paced environments with shifting client needs and priorities
- Analytical mindset with attention to detail and a commitment to delivering quality results
- Experience with AWS Glue for ETL/ELT processes
- Familiarity with Amazon Redshift, Athena, S3, and Lake Formation
- Use of AWS Lambda, Step Functions, and CloudWatch for data pipeline orchestration and monitoring
- Exposure to Amazon Kinesis or Kafka on AWS for real-time data streaming
- Knowledge of IAM, VPC, and security practices in AWS data environments
- Experience with Azure Data Factory (ADF)/Synapse for data integration and orchestration
- Familiarity with Azure Synapse Analytics, Azure Data Lake Storage (ADLS), and Azure SQL Database
- Hands-on with Databricks on Azure and Apache Spark for data processing and analytics
- Exposure to Azure Event Hubs, Azure Functions, and Logic Apps
- Understanding of Azure Monitor, Log Analytics, and role-based access control
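To illustrate the kind of ETL/ELT work these requirements describe, here is a minimal sketch in Python. It uses the standard-library `sqlite3` module as a stand-in for a real warehouse (Snowflake, Redshift, etc.); the table and column names (`raw_orders`, `clean_orders`) are hypothetical, not part of this role's actual stack.

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform them, load a clean table.
# sqlite3 stands in for a real cloud warehouse; all table and column
# names here are hypothetical.

def run_pipeline(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: read raw source rows
    cur.execute("SELECT user_id, amount, currency FROM raw_orders")
    rows = cur.fetchall()
    # Transform: normalize currency codes, drop non-positive amounts
    cleaned = [
        (user_id, round(amount, 2), currency.upper())
        for user_id, amount, currency in rows
        if amount > 0
    ]
    # Load: rebuild the target table so reruns are idempotent
    cur.execute("DROP TABLE IF EXISTS clean_orders")
    cur.execute(
        "CREATE TABLE clean_orders (user_id INTEGER, amount REAL, currency TEXT)"
    )
    cur.executemany("INSERT INTO clean_orders VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE raw_orders (user_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 10.5, "usd"), (2, -3.0, "eur"), (3, 7.25, "ron")],
    )
    print(run_pipeline(conn))  # prints 2: the negative-amount row is dropped
```

In production this extract/transform/load split is typically spread across orchestrated tasks (e.g. an Airflow DAG) rather than one function, but the shape of the work is the same.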
Responsibilities
- Design, develop, and maintain robust and scalable data pipelines to ingest, transform, and store data from diverse sources
- Optimize data systems for performance, scalability, and reliability in a cloud-native environment
- Work closely with data analysts, data scientists, and other stakeholders to ensure high data quality and availability
- Develop and manage data models using dbt, ensuring modular, testable, and well-documented transformation layers
- Implement and enforce data governance, security, and privacy standards
- Manage and optimize cloud data warehouses, especially Snowflake, for performance, cost-efficiency, and scalability
- Monitor, troubleshoot, and improve data workflows and ETL/ELT processes
- Collaborate in the design and deployment of data lakes, warehouses, and lakehouse architectures