
Senior Data Engineer

Posted about 1 month ago


πŸ’Ž Seniority level: Senior

πŸ“ Location: Philippines, Spain, Germany, France, Italy

πŸ” Industry: Fintech, Healthcare, EdTech, Construction, Hospitality

🏒 Company: Intellectsoft πŸ‘₯ 251-500 · Augmented Reality, Artificial Intelligence (AI), DevOps, Blockchain, Internet of Things, UX Design, Web Development, Mobile Apps, Quality Assurance, Software

πŸͺ„ Skills: AWS, Python, SQL, Apache Airflow, ETL, GCP, Azure, Data modeling

Requirements:
  • Proficiency in SQL for data manipulation and querying large datasets.
  • Strong experience with Python for data processing and scripting.
  • Expertise in pySpark for distributed data processing and big data workflows.
  • Hands-on experience with Airflow for workflow orchestration and automation.
  • Deep understanding of Database Management Systems (DBMS), including design, optimization, and maintenance.
  • Solid knowledge of data modeling, ETL pipelines, and data integration.
  • Familiarity with cloud platforms such as AWS, GCP, or Azure.
Responsibilities:
  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Build and optimize large-scale data processing frameworks using PySpark.
  • Create workflows and automate processes using Apache Airflow (see the sketch after this list).
  • Manage, monitor, and enhance database performance and integrity.
  • Collaborate with cross-functional teams, including data analysts, scientists, and stakeholders, to understand data needs.
  • Ensure data quality, reliability, and compliance with industry standards.
  • Troubleshoot, debug, and optimize data pipelines and workflows.
  • Continuously evaluate and integrate new tools and technologies to enhance data infrastructure.
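
For illustration only (not from the listing): the Airflow responsibility above could be sketched as a minimal Airflow 2.x TaskFlow DAG like the one below. The task bodies, sample data, and the `daily_etl` name are hypothetical placeholders.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_etl():
    """Hypothetical daily pipeline: extract -> transform -> load."""

    @task
    def extract() -> list[dict]:
        # Stand-in for pulling rows from a source system or API.
        return [{"order_id": 1, "amount": "19.99"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize types before loading.
        return [{**r, "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for writing to a warehouse table.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


daily_etl()
```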

Related Jobs

πŸ”₯ Senior Data Engineer
Posted about 21 hours ago

πŸ“ Madrid, Barcelona

🏒 Company: Clarity AI

  • 5+ years in data architecture, data engineering, or a related role, with hands-on experience in data modeling, schema design, and cloud-based data systems
  • Strong Python and SQL skills and proficiency with distributed data stores
  • Proficiency in schema design, data modeling, and building data products
  • Proficient in at least one major programming language, preferably Python, with a software engineering mindset for writing clean, maintainable code
  • Deep understanding of architectural principles in microservices, distributed systems, and data pipeline design
  • Familiarity with containerized environments, public/private API integrations, and security best practices
  • Strong communication and interpersonal skills, with experience working cross-functionally
  • Proven ability to guide teams and drive complex projects to successful completion
  • Self-starter, able to take ownership and initiative, with high energy and stamina
  • Decisive and action-oriented, able to make rapid decisions even with incomplete information
  • Highly motivated, independent and deeply passionate about sustainability and impact
  • Excellent oral and written English communication skills (minimum C1 level, proficient user)
  • Transforming raw data into intuitive, high-quality data models
  • Implementing and monitoring data quality checks (see the sketch after this list)
  • Collaborating across functions
  • Working closely with engineering and product teams
  • Acting as a bridge between technical and non-technical stakeholders
  • Leading initiatives to improve data practices
  • Guiding the team in experimenting with new tools and technologies
  • Continuously evolving the data architecture
  • Applying a pragmatic approach to performance metrics and scaling decisions
  • Implementing performance metrics to monitor system health
  • Maintaining comprehensive documentation of data systems, processes, and best practices
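
As a hedged sketch of the data-quality bullet above (not from the listing): a lightweight pandas check could flag duplicate keys and high null rates before a data model is published. The column names and threshold are invented.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame, key: str, max_null_rate: float = 0.01) -> list[str]:
    """Return human-readable failures; an empty list means the frame passed."""
    failures = []
    if df[key].duplicated().any():
        failures.append(f"duplicate values in key column {key!r}")
    for column in df.columns:
        null_rate = df[column].isna().mean()
        if null_rate > max_null_rate:
            failures.append(f"{column!r} null rate {null_rate:.1%} exceeds {max_null_rate:.1%}")
    return failures


frame = pd.DataFrame({"id": [1, 2, 2], "score": [0.9, None, 0.7]})
print(run_quality_checks(frame, key="id"))
```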

Docker, Python, SQL, Apache Airflow, Cloud Computing, Kubernetes, Algorithms, Data engineering, Data Structures, Postgres, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Agile methodologies, Microservices, Data visualization, Data modeling, Data analytics, Data management

πŸ”₯ Senior Data Engineer
Posted about 21 hours ago

πŸ“ Tallinn, Harju County, Estonia. Barcelona, Catalonia, Spain. Lisbon, Lisbon, Portugal. Bucharest, Bucharest, Romania. Cluj-Napoca, Cluj County, Romania

🧭 Full-Time

πŸ” Influencer Marketing

🏒 Company: Modash

  • Experienced as a Data Engineer
  • Used to working with unstructured data
  • Experienced working in a Spark environment
  • Experience managing data workflows with a distributed workflow manager (e.g., Airflow, AWS Step Functions)
  • Hands-on experience in writing code in Python and SQL
  • Experience in building and maintaining ETL & ELT data pipelines
  • Familiarity with AWS ecosystem: DynamoDB, Glue, EMR, Kinesis, SQS, Lambda, ECS
  • Hands-on experience with SQL/NoSQL database design
  • Take ownership of building the best data platform for influencer marketing.
  • Design, develop, and test data pipelines to collect, process, and store data (see the PySpark sketch after this list).
  • Collaborate with your colleagues, from pair programming to mob reviewing. We are all for one.
  • You are expected to take part in every area of the data product, from brainstorming, roadmap planning, implementing, and reviewing to releasing, working with the CEO, CTO, engineers, sales, and customers.
  • Help us choose the best technical direction by providing well-reasoned ideas on which frameworks and tools to use.
  • Implement systems to monitor data quality for optimized accuracy and clarity.
  • Teach and be taught through code reviews and feedback.
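
To make the Spark requirement concrete (a sketch under assumptions, not Modash's actual pipeline): PySpark can flatten semi-structured JSON events and aggregate engagement per creator. The input path, schema, and field names are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("influencer-events").getOrCreate()

# Semi-structured input: one JSON object per line, with a nested "posts" array.
events = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path

posts = (
    events
    .select(col("creator_id"), explode(col("posts")).alias("post"))
    .select("creator_id", col("post.likes").cast("long").alias("likes"))
)

# Aggregate engagement per creator and persist for downstream jobs.
(
    posts.groupBy("creator_id")
    .sum("likes")
    .write.mode("overwrite")
    .parquet("s3://example-bucket/curated/engagement/")
)
```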

AWS, Python, SQL, Apache Airflow, DynamoDB, ETL, Data engineering, Spark


πŸ“ Germany, Austria, Italy, Spain, Portugal

🧭 Full-Time

πŸ” Digital Solutions for Financial and Real Estate Industries

🏒 Company: PriceHubble πŸ‘₯ 101-250 πŸ’° Non-equity Assistance about 3 years ago · Artificial Intelligence (AI), PropTech, Big Data, Machine Learning, Analytics, Real Estate

  • 3+ years experience building and maintaining production data pipelines
  • Proficient in working with geospatial data (bonus; a geospatial sketch follows this list)
  • Work with backend engineers and data scientists to turn raw data into trusted insights
  • Navigate cost-value trade-offs to deliver value to customers
  • Develop solutions that work in over 10 countries
  • Lead a project from concept to launch
  • Drive the team to deliver high-quality products, services, and processes
  • Improve the performance, data quality, and cost-efficiency of data pipelines
  • Maintain and monitor the data systems
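
Since geospatial data is called out as a bonus, here is a minimal point-in-polygon check with shapely, a common building block when enriching property data with location attributes; the coordinates and names are arbitrary, not PriceHubble's.

```python
from shapely.geometry import Point, Polygon

# Hypothetical district boundary as (lon, lat) pairs, plus a property location.
district = Polygon([(13.37, 52.50), (13.43, 52.50), (13.43, 52.54), (13.37, 52.54)])
property_location = Point(13.40, 52.52)

# Tag the property with the district if it falls inside the boundary.
if district.contains(property_location):
    print("property lies inside the district")
```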

PostgreSQL, Python, SQL, Apache Airflow, ETL, Data engineering

Posted 1 day ago

πŸ“ Europe, APAC, Americas

🧭 Full-Time

πŸ” Software Development

🏒 Company: Docker πŸ‘₯ 251-500 πŸ’° $105,000,000 Series C almost 3 years ago · Developer Tools, Developer Platform, Information Technology, Software

  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
  • Manage and develop ETL jobs, warehouse, and event collection tools (a minimal ETL sketch follows this list)
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture
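
A compact Python-plus-SQL ETL script in the spirit of this listing might look like the sketch below; the standard-library sqlite3 module stands in for Snowflake or BigQuery, and the CSV layout and table name are hypothetical.

```python
import csv
import sqlite3

# Extract: read raw rows from a CSV export (hypothetical layout).
with open("events.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: drop invalid rows and normalize types in plain Python.
clean = [(r["user_id"], int(r["count"])) for r in rows if r["user_id"]]

# Load: idempotent upsert into the warehouse stand-in via SQL.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS usage (user_id TEXT PRIMARY KEY, count INTEGER)")
conn.executemany("INSERT OR REPLACE INTO usage VALUES (?, ?)", clean)
conn.commit()
conn.close()
```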

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 4 days ago

πŸ“ Germany

🧭 Full-Time

πŸ” Insurtech

🏒 Company: Getsafe

  • 4+ years of experience in creating data pipelines using SQL/Python/Airflow
  • Experience designing Data Mart and Data Warehouse
  • Experience with cloud infrastructure, including Terraform
  • Analyze, design, develop, and deliver Data Warehouse solutions
  • Create ETL/ELT pipelines using Python and Airflow (see the ELT sketch after this list)
  • Design, develop, maintain, and support the Data Warehouse & BI platform
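
The ELT half of that bullet reduces to "land the raw data first, then transform it in SQL inside the warehouse"; in the sketch below sqlite3 stands in for the warehouse, and the staging and fact table names are invented.

```python
import sqlite3

conn = sqlite3.connect("dwh.db")

# Load: land raw records untouched in a staging table.
conn.execute("CREATE TABLE IF NOT EXISTS stg_policies (id TEXT, premium TEXT)")
conn.executemany("INSERT INTO stg_policies VALUES (?, ?)", [("p1", "12.50"), ("p2", "9.00")])

# Transform: rebuild the reporting table inside the warehouse with SQL.
conn.executescript("""
    DROP TABLE IF EXISTS fct_policies;
    CREATE TABLE fct_policies AS
    SELECT id, CAST(premium AS REAL) AS premium_eur
    FROM stg_policies
    WHERE premium IS NOT NULL;
""")
conn.commit()
conn.close()
```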

Python, SQL, Apache Airflow, ETL, Terraform

Posted 8 days ago

πŸ“ Spain

πŸ’Έ 80,000 - 110,000 EUR per year

πŸ” Financial services

  • 5+ years of professional experience in Data Engineering or similar roles.
  • Proficiency in SQL and dbt for data transformations (a dbt-style test sketch follows this list).
  • Fluency in Python or other modern programming languages.
  • Experience with infrastructure as code languages like Terraform.
  • Knowledge of data modeling, data warehouse technologies, and cloud infrastructures.
  • Experience with AWS or other cloud platforms like Azure or GCP.
  • Ability to provide constructive code reviews.
  • Strong communication and collaboration skills.
  • Work with engineering managers and tech leads to identify and plan projects based on team goals.
  • Collaborate with teams across engineering, analytics, and product to deliver technology for analytical use cases.
  • Write high-quality, understandable code.
  • Review teammates' work and offer feedback.
  • Serve as a technical mentor for other engineers.
  • Promote a respectful and supportive team environment.
  • Participate in on-call rotation.
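
For context on the SQL/dbt requirement: dbt's built-in `unique` and `not_null` tests compile to SQL along these lines. The sketch runs equivalent checks directly from Python, with sqlite3 standing in for the warehouse and `dim_accounts` as a made-up model name.

```python
import sqlite3

conn = sqlite3.connect("analytics.db")
conn.executescript("""
    DROP TABLE IF EXISTS dim_accounts;
    CREATE TABLE dim_accounts (account_id TEXT, iban TEXT);
    INSERT INTO dim_accounts VALUES ('a1', 'iban-1'), ('a2', 'iban-2');
""")

# Equivalent of dbt's unique test: any key returned here is duplicated.
dupes = conn.execute(
    "SELECT account_id FROM dim_accounts GROUP BY account_id HAVING COUNT(*) > 1"
).fetchall()

# Equivalent of dbt's not_null test: count rows with a missing value.
nulls = conn.execute("SELECT COUNT(*) FROM dim_accounts WHERE iban IS NULL").fetchone()[0]

assert not dupes, f"duplicate account_id values: {dupes}"
assert nulls == 0, f"{nulls} rows with NULL iban"
```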

AWS, Python, SQL, Terraform, Data modeling

Posted 22 days ago

πŸ“ Poland, Spain, United Kingdom

πŸ” Beauty marketplace

🏒 Company: Booksy πŸ‘₯ 501-1000 πŸ’° Debt Financing 5 months ago · Mobile Payments, Marketplace, SaaS, Payments, Mobile Apps, Wellness, Software

  • 5+ years of experience in backend and data engineering, with strong system design skills.
  • Practical proficiency in cloud technologies (ideally GCP), with expertise in tools like BigQuery, Dataflow, Pub/Sub, or similar (see the Beam sketch after this list).
  • Hands-on experience with CI/CD tools (e.g., GitLab CI) and infrastructure as code.
  • Strong focus on data quality, governance, and building scalable, automated workflows.
  • Experience designing self-service data platforms and infrastructure.
  • Proven ability to mentor and support others, fostering data literacy across teams.
  • Design and implement robust data solutions.
  • Enable teams to make informed, data-driven decisions.
  • Ensure data is accessible, reliable, and well-governed.
  • Play a key role in driving growth, innovation, and operational excellence.
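
Dataflow pipelines are typically written with Apache Beam; a minimal Python sketch of the model (runnable locally on the DirectRunner) could look like the following, with the event payloads invented for illustration.

```python
import apache_beam as beam

events = [
    {"service": "haircut", "amount": 30.0},
    {"service": "haircut", "amount": 25.0},
    {"service": "manicure", "amount": 40.0},
]

# Locally this runs on the DirectRunner; on GCP the same pipeline targets Dataflow.
with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create(events)
        | "KeyByService" >> beam.Map(lambda e: (e["service"], e["amount"]))
        | "SumPerService" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```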

GCP, Data engineering, CI/CD, Data modeling

Posted 23 days ago
πŸ”₯ Senior Data Engineer
Posted about 1 month ago

πŸ“ South Africa, Mauritius, Kenya, Nigeria

πŸ” Technology, Marketplaces

  • BSc degree in Computer Science, Information Systems, Engineering, or related technical field or equivalent work experience.
  • 3+ years related work experience.
  • Minimum of 2 years' experience building and optimizing 'big data' pipelines and architectures and maintaining data sets.
  • Experienced in Python.
  • Experienced in SQL (PostgreSQL, MS SQL).
  • Experienced in using cloud services: AWS, Azure or GCP.
  • Proficiency in version control, CI/CD and GitHub.
  • Understanding/experience in Glue and PySpark highly desirable.
  • Experience in managing data life cycle.
  • Proficiency in manipulating, processing and architecting large disconnected data sets for analytical requirements.
  • Ability to maintain and optimise processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Good understanding of data management principles - data quality assurance and governance.
  • Strong analytical skills related to working with unstructured datasets.
  • Understanding of message queuing, stream processing, and highly scalable 'big data' datastores.
  • Strong attention to detail.
  • Good communication and interpersonal skills.
  • Suggest and implement internal process improvements that automate manual processes.
  • Implement enhancements and new features across data systems.
  • Improve and streamline processes within data systems with support from the Senior Data Engineer.
  • Test CI/CD process for optimal data pipelines.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Build highly efficient ETL processes.
  • Develop and run unit tests on data pipelines and ensure data consistency (see the pytest sketch after this list).
  • Develop and maintain automated monitoring solutions.
  • Support reporting and analytics infrastructure.
  • Maintain data quality and data governance as well as upkeep of overall maintenance of data infrastructure systems.
  • Maintain data warehouse and data lake metadata, data catalogue, and user documentation for internal business users.
  • Ensure best practices are implemented and maintained across databases.
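
For the unit-testing bullet above, a common pattern is to keep transforms as pure functions so they are trivially testable with pytest; the `normalise_record` function and its fields are invented for this sketch.

```python
import pytest


def normalise_record(raw: dict) -> dict:
    """Hypothetical transform step: trim strings and coerce amounts to float."""
    return {
        "customer": raw["customer"].strip().lower(),
        "amount": float(raw["amount"]),
    }


def test_normalise_record_cleans_fields():
    assert normalise_record({"customer": " Acme ", "amount": "10.5"}) == {
        "customer": "acme",
        "amount": 10.5,
    }


def test_normalise_record_rejects_bad_amounts():
    with pytest.raises(ValueError):
        normalise_record({"customer": "acme", "amount": "not-a-number"})
```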

AWS, PostgreSQL, Python, SQL, ETL, Git, CI/CD


πŸ“ Paris, New York, San Francisco, Sydney, Madrid, London, Berlin

πŸ” Communication technology

  • Passionate about data engineering.
  • Experience in designing and developing data infrastructure.
  • Technical skills to solve complex challenges.
  • Play a crucial role in designing, developing, and maintaining data infrastructure.
  • Collaborate with teams across the company to solve complex challenges.
  • Improve operational efficiency and lead business towards strategic goals.
  • Contribute to engineering efforts that enhance customer journey.

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Data engineering

Posted 3 months ago