
Senior Data Engineer

Posted about 13 hours ago

💎 Seniority level: Senior

📍 Location: Tallinn, Harju County, Estonia; Barcelona, Catalonia, Spain; Lisbon, Lisbon, Portugal; Bucharest, Bucharest, Romania; Cluj-Napoca, Cluj County, Romania (GMT+3)

🔍 Industry: Influencer Marketing

🏢 Company: Modash

🗣️ Languages: English

🪄 Skills: AWS, Python, SQL, Apache Airflow, DynamoDB, ETL, Data engineering, Spark

Requirements:
  • Experienced as a Data Engineer
  • Used to working with unstructured data
  • Experienced working in a Spark environment
  • Experience managing data workflows with a distributed workflow manager (e.g., Airflow, AWS Step Functions)
  • Hands-on experience in writing code in Python and SQL
  • Experience in building and maintaining ETL & ELT data pipelines
  • Familiarity with AWS ecosystem: DynamoDB, Glue, EMR, Kinesis, SQS, Lambda, ECS
  • Hands-on experience with SQL/NoSQL database design
Responsibilities:
  • Take ownership of building the best data platform for influencer marketing.
  • Design, develop, and test data pipelines to collect, process, and store data.
  • Collaborate with your colleagues, from pair programming to mob reviewing. We are all for one.
  • Take part in every area of the data product, from brainstorming and roadmap planning to implementing, reviewing, and releasing, working with the CEO, CTO, engineers, sales, and customers.
  • Help us choose the best technical direction by providing well-reasoned ideas on which frameworks and tools to use.
  • Implement systems to monitor data quality, ensuring accuracy and clarity.
  • Teach and be taught through code reviews and feedback.

Related Jobs

🔥 Senior Data Engineer
Posted about 13 hours ago

📍 Madrid, Barcelona

🏢 Company: Clarity AI

Requirements:
  • 5+ years in data architecture, data engineering, or a related role, with hands-on experience in data modeling, schema design, and cloud-based data systems
  • Strong Python and SQL skills and proficiency with distributed data stores
  • Proficiency in schema design, data modeling, and building data products
  • Proficient in at least one major programming language, preferably Python, with a software engineering mindset for writing clean, maintainable code
  • Deep understanding of architectural principles in microservices, distributed systems, and data pipeline design
  • Familiarity with containerized environments, public/private API integrations, and security best practices
  • Strong communication and interpersonal skills, with experience working cross-functionally
  • Proven ability to guide teams and drive complex projects to successful completion
  • Self-starter, able to take ownership and initiative, with high energy and stamina
  • Decisive and action-oriented, able to make rapid decisions even with incomplete information
  • Highly motivated, independent and deeply passionate about sustainability and impact
  • Excellent oral and written English communication skills (minimum C1 level, proficient user)
Responsibilities:
  • Transforming raw data into intuitive, high-quality data models
  • Implementing and monitoring data quality checks
  • Collaborating across functions
  • Working closely with engineering and product teams
  • Acting as a bridge between technical and non-technical stakeholders
  • Leading initiatives to improve data practices
  • Guiding the team in experimenting with new tools and technologies
  • Continuously evolving the data architecture
  • Applying a pragmatic approach to performance metrics and scaling decisions
  • Implementing performance metrics to monitor system health
  • Maintaining comprehensive documentation of data systems, processes, and best practices

Docker, Python, SQL, Apache Airflow, Cloud Computing, Kubernetes, Algorithms, Data engineering, Data Structures, Postgres, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Agile methodologies, Microservices, Data visualization, Data modeling, Data analytics, Data management

🔥 Senior Data Engineer
Posted 3 days ago

📍 Europe, APAC, Americas

🧭 Full-Time

🔍 Software Development

🏢 Company: Docker 👥 251-500 💰 $105,000,000 Series C almost 3 years ago · Developer Tools, Developer Platform, Information Technology, Software

Requirements:
  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
Responsibilities:
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

🔥 Senior Data Engineer
Posted 21 days ago

📍 Spain

💸 80,000 - 110,000 EUR per year

🔍 Financial services

Requirements:
  • 5+ years of professional experience in Data Engineering or similar roles.
  • Proficiency in SQL and DBT for data transformations.
  • Fluency in Python or other modern programming languages.
  • Experience with infrastructure as code languages like Terraform.
  • Knowledge of data modeling, data warehouse technologies, and cloud infrastructures.
  • Experience with AWS or other cloud platforms like Azure or GCP.
  • Ability to provide constructive code reviews.
  • Strong communication and collaboration skills.
Responsibilities:
  • Work with engineering managers and tech leads to identify and plan projects based on team goals.
  • Collaborate with teams across engineering, analytics, and product to deliver technology for analytical use cases.
  • Write high-quality, understandable code.
  • Review teammates' work and offer feedback.
  • Serve as a technical mentor for other engineers.
  • Promote a respectful and supportive team environment.
  • Participate in on-call rotation.

AWS, Python, SQL, Terraform, Data modeling

🔥 Senior Data Engineer
Posted 22 days ago

📍 Poland, Spain, United Kingdom

🔍 Beauty marketplace

🏢 Company: Booksy 👥 501-1000 💰 Debt Financing 5 months ago · Mobile Payments, Marketplace, SaaS, Payments, Mobile Apps, Wellness, Software

Requirements:
  • 5+ years of experience in backend and data engineering, with strong system design skills.
  • Practical proficiency in cloud technologies (ideally GCP), with expertise in tools like BigQuery, Dataflow, Pub/Sub, or similar.
  • Hands-on experience with CI/CD tools (e.g., GitLab CI) and infrastructure as code.
  • Strong focus on data quality, governance, and building scalable, automated workflows.
  • Experience designing self-service data platforms and infrastructure.
  • Proven ability to mentor and support others, fostering data literacy across teams.
Responsibilities:
  • Design and implement robust data solutions.
  • Enable teams to make informed, data-driven decisions.
  • Ensure data is accessible, reliable, and well-governed.
  • Play a key role in driving growth, innovation, and operational excellence.

GCP, Data engineering, CI/CD, Data modeling

🔥 Senior Data Engineer
Posted about 1 month ago

📍 South Africa, Mauritius, Kenya, Nigeria

🔍 Technology, Marketplaces

Requirements:
  • BSc degree in Computer Science, Information Systems, Engineering, or related technical field, or equivalent work experience.
  • 3+ years related work experience.
  • Minimum of 2 years’ experience building and optimizing ‘big data’ pipelines and architectures, and maintaining data sets.
  • Experienced in Python.
  • Experienced in SQL (PostgreSQL, MS SQL).
  • Experienced in using cloud services: AWS, Azure or GCP.
  • Proficiency in version control, CI/CD and GitHub.
  • Understanding of or experience with Glue and PySpark is highly desirable.
  • Experience in managing data life cycle.
  • Proficiency in manipulating, processing and architecting large disconnected data sets for analytical requirements.
  • Ability to maintain and optimise processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Good understanding of data management principles - data quality assurance and governance.
  • Strong analytical skills related to working with unstructured datasets.
  • Understanding of message queuing, stream processing, and highly scalable ‘big data’ datastores.
  • Strong attention to detail.
  • Good communication and interpersonal skills.
Responsibilities:
  • Suggest and implement internal process improvements that automate manual processes.
  • Implement enhancements and new features across data systems.
  • Improve and streamline processes within data systems with support from the Senior Data Engineer.
  • Test CI/CD process for optimal data pipelines.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Highly efficient in ETL processes.
  • Develop and conduct unit tests on data pipelines as well as ensuring data consistency.
  • Develop and maintain automated monitoring solutions.
  • Support reporting and analytics infrastructure.
  • Maintain data quality and data governance, as well as the overall upkeep of data infrastructure systems.
  • Maintain data warehouse and data lake metadata, data catalogue, and user documentation for internal business users.
  • Ensure best practice is implemented and maintained on database.

AWS, PostgreSQL, Python, SQL, ETL, Git, CI/CD


📍 Paris, New York, San Francisco, Sydney, Madrid, London, Berlin

🔍 Communication technology

Requirements:
  • Passionate about data engineering.
  • Experience in designing and developing data infrastructure.
  • Technical skills to solve complex challenges.
Responsibilities:
  • Play a crucial role in designing, developing, and maintaining data infrastructure.
  • Collaborate with teams across the company to solve complex challenges.
  • Improve operational efficiency and lead the business towards strategic goals.
  • Contribute to engineering efforts that enhance customer journey.

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Data engineering

Posted 3 months ago