Apache Airflow Jobs

Find remote positions requiring Apache Airflow skills. Browse through opportunities where you can utilize your expertise and grow your career.

Apache Airflow
59 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

๐Ÿ“ USA

๐Ÿงญ Internship

๐Ÿ’ธ 18.0 USD per hour

๐Ÿ” Personal finance

๐Ÿข Company: GOBankingRates

  • Must be available to work 24 hours per week Monday-Friday for 16 weeks.
  • Currently pursuing a degree in Computer Science, Data Science, Engineering, or a related field.
  • Ability to write basic SQL queries for data validation.
  • Familiarity with Python for scripting and automation.
  • Strong problem-solving abilities and attention to detail.
  • Excellent verbal and written communication skills.
  • A passion for learning new tools and technologies in data engineering and quality assurance.

  • Actively participate in agile ceremonies to understand data requirements and ensure proper test coverage.
  • Work closely with data engineers, analysts, and product managers to refine requirements and validate data processes.
  • Assist in validating data transformations and ensuring accuracy across ETL pipelines using SQL queries.
  • Contribute to building automated test scripts using Python for data validation and regression testing.
  • Support testing of data pipelines developed using dbt and Apache Airflow.
  • Help monitor data pipeline performance and troubleshoot issues.
  • Create and maintain documentation for test cases and data pipelines.
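The SQL-based validation work described in the bullets above can be sketched in miniature. This is only an illustrative example: the table names are made up, and sqlite3 stands in for whatever warehouse (e.g., Snowflake) the pipeline actually targets.

```python
import sqlite3

# Hypothetical ETL validation sketch: sqlite3 stands in for the warehouse,
# and the table/column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
    INSERT INTO raw_orders VALUES (1, 10.0), (1, 10.0), (2, 25.5);  -- note the duplicate
    -- Transformation under test: deduplicate raw orders.
    CREATE TABLE clean_orders AS
        SELECT DISTINCT order_id, amount FROM raw_orders;
""")

def validate(conn):
    """Run basic SQL data-quality checks; return a dict of check name -> pass/fail."""
    checks = {
        # After dedup, row count should equal the number of distinct keys.
        "row_count": "SELECT COUNT(*) = COUNT(DISTINCT order_id) FROM clean_orders",
        # No NULL keys should survive the transformation.
        "no_null_keys": "SELECT COUNT(*) = 0 FROM clean_orders WHERE order_id IS NULL",
    }
    return {name: bool(conn.execute(sql).fetchone()[0]) for name, sql in checks.items()}

results = validate(conn)
print(results)
```

In a real internship setting, checks like these would typically be wrapped in pytest cases or dbt tests and scheduled alongside the pipeline in Airflow.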

Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering

Posted about 17 hours ago
Apply

🔥 VP, Data and Analytics

๐Ÿ“ US

๐Ÿ” Compliance Technology

  • Technical expertise to architect data platforms in the cloud.
  • Ability to lead data engineering teams.
  • Strong understanding of data acquisition, ingestion, normalization, and analytics.
  • Experience with artificial intelligence, big data, analytics, and machine learning.

  • Build a world-class data organization to grow and transform data and analytics engineering teams.
  • Responsible for all aspects of Data Engineering including acquisition, ingestion, normalization, and analytics.
  • Architect the future of our cloud-based data platform.
  • Lead various data-driven engineering teams delivering software-based data solutions.
  • Collaborate with business units to drive insights into data on scalable infrastructure.
  • Partner with business and product teams for efficient data processing and predictive insights.
  • Define the approach to AI, big data, analytics, and machine learning leveraging collected data.

AWS, SQL, Apache Airflow, ETL, Machine Learning, Data engineering, Data analytics

Posted about 18 hours ago
Apply

๐Ÿ“ Canada

๐Ÿงญ Full-Time

๐Ÿ” Technology for small businesses

๐Ÿข Company: Jobber๐Ÿ‘ฅ 501-1000๐Ÿ’ฐ $100,000,000 Series D almost 2 years agoSaaSMobileSmall and Medium BusinessesTask Management

  • Proven ability to lead and collaborate in team environments.
  • Strong coding skills in Python and SQL.
  • Expertise in building and maintaining ETL pipelines using tools like Airflow and dbt.
  • Experience with AWS tools such as Redshift, Glue, and Lambda.
  • Familiarity with handling large datasets using tools like Spark.
  • Experience with Terraform for infrastructure management.
  • Knowledge of dimensional modelling, star schemas, and data warehousing.

  • Design, develop, and maintain batch and real-time data pipelines within cloud infrastructure (preferably AWS).
  • Develop tools that automate processes and set up monitoring systems.
  • Collaborate with teams to extract actionable insights from data.
  • Lead initiatives to propose new technologies, participate in design and code reviews, and maintain data integrity.

AWS, Python, SQL, Apache Airflow, ETL, Spark, Terraform

Posted about 23 hours ago
Apply

๐Ÿ“ Canada

๐Ÿงญ Contract

๐Ÿ” Advertising

  • Experience in data architecture and operations.
  • Proficiency in building and managing data pipelines.
  • Knowledge of business intelligence tools and analytics.

  • Accountable for data architecture and data pipelines management.
  • Responsible for building ingestion data pipelines from multiple sources.
  • Architecting fit-for-purpose data models for business needs.
  • Collaborating with BI Engineers to meet stakeholders' requirements.

AWS, Python, SQL, Apache Airflow, Data Analysis, ETL, Data engineering, Data modeling

Posted 2 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Internship

๐Ÿ” B2B technology

๐Ÿข Company: Demandbase๐Ÿ‘ฅ 501-1000๐Ÿ’ฐ $175,000,000 Debt Financing almost 2 years agoSales AutomationAdvertisingBig DataSaaSAnalyticsB2BMarketingMarketing AutomationSoftware

  • Pursuing a four-year degree in Computer Science or related field.
  • Preferably a rising senior.
  • Experience with at least one JVM language (Java, Scala, Kotlin, etc.).
  • Understanding of SDLC principles (CI/CD, Unit Testing, git, etc.).
  • Interest in data-intensive systems and data platforms.

  • Help build backend services/applications to support seamless management and integration of Demandbase's Unified Data Platform.
  • Assist in developing the next generation of Demandbase's Unified Data Platform through a combination of data pipelines, APIs, internal tools, and third-party/open-source tooling.
  • Work closely with and support cross-functional teams integrating with our Unified Data Platform.

Apache Airflow, Cloud Computing, Git, Java, Kafka, Kotlin, Data engineering, CI/CD, Scala

Posted 2 days ago
Apply

๐Ÿ“ Brazil

๐Ÿงญ Full-Time

๐Ÿ” Digital Engineering and Modernization

๐Ÿข Company: Encora๐Ÿ‘ฅ 10001-10001๐Ÿ’ฐ $200,000,000 Private over 5 years agoBig DataCloud ComputingSoftware

  • Experience in data modeling.
  • Experience developing and maintaining data pipelines.
  • Proficiency in SQL.
  • Proficiency in Python.
  • Experience with AWS Redshift.
  • Experience with Apache Airflow.
  • Familiarity with BI tools.

  • Develop and maintain efficient and scalable data pipelines.
  • Model and transform data to meet analysis and reporting needs.
  • Collaborate closely with the customer, including BI and software engineering.
  • Lead other BI or DE team members.
  • Create and maintain detailed technical documentation.
  • Develop dashboards in AWS Quicksight with support from a BI Analyst.

Python, SQL, Apache Airflow, Business Intelligence, Data modeling

Posted 4 days ago
Apply

๐Ÿ“ Copenhagen, London, Stockholm, Berlin, Madrid, Montreal, Lisbon, 35 other countries

๐Ÿงญ Full-Time

๐Ÿ” Financial Technology

  • Strong background in building and managing data infrastructure at scale.
  • Expertise in Python, AWS, dbt, Airflow, and Kubernetes.
  • Ability to translate business and product requirements into technical data solutions.
  • Experience in mentoring and fostering collaboration within teams.
  • Curiosity and enthusiasm for experimenting with new technologies to solve complex problems.
  • Hands-on experience with modern data tools and contributing to strategic decision-making.

  • Partnering with product and business teams to develop data strategies that enable new features and improve user experience.
  • Driving key strategic projects across the organisation, dipping in and out as needed to provide leadership and hands-on support.
  • Supporting multiple teams across Pleo in delivering impactful data and analytics solutions.
  • Building data products that directly support Pleo's product roadmap and business goals.
  • Collaborating with the VP of Data and other data leaders to set the vision for Pleo's data strategy and ensure alignment with company objectives.
  • Enhancing our data infrastructure and pipelines to improve scalability, performance, and data quality.
  • Experimenting with and implementing innovative technologies to keep Pleo's data stack at the forefront of the industry.
  • Mentoring engineers, analysts, and data scientists to foster growth and build a world-class data team.

AWS, Python, Apache Airflow, Kubernetes, Data engineering

Posted 4 days ago
Apply

๐Ÿ“ Mexico

๐Ÿงญ Full-Time

๐Ÿ” Digital engineering and modernization

๐Ÿข Company: Encora๐Ÿ‘ฅ 10001-10001๐Ÿ’ฐ $200,000,000 Private over 5 years agoBig DataCloud ComputingSoftware

  • 7+ years of experience in backend development, focusing on data engineering.
  • Proficient in Python and writing clean, maintainable code.
  • Strong SQL skills for complex queries and optimization.
  • Hands-on experience in ETL processes and data pipelines.
  • Experience in data ingestion and modeling for various use cases.
  • Solid implementation of unit tests and error handling.
  • Hands-on AWS experience with services like S3, ECS, and Lambda.
  • Familiarity with Docker for containerization.
  • Ability to collaborate with data scientists and business teams.
  • Familiarity with version control using Git and CI/CD pipelines.

  • Design, develop, and maintain backend systems using Python.
  • Build and maintain data pipelines for cloud environments.
  • Implement efficient data models for large datasets.
  • Write unit tests and handle errors appropriately.
  • Use AWS services for deploying applications and data pipelines.
  • Collaborate with teams to create and optimize data visualizations.
  • Identify and resolve performance issues.
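The "unit tests and error handling" expectation in this role can be illustrated with a small sketch. The function and field names below are made up; the point is the pattern of validating records explicitly and collecting failures rather than aborting a whole batch.

```python
def parse_record(raw: dict) -> dict:
    """Normalize one raw record; raise ValueError on bad input instead of silently dropping it."""
    try:
        return {
            "user_id": int(raw["user_id"]),
            "amount_cents": round(float(raw["amount"]) * 100),
        }
    except (KeyError, TypeError, ValueError) as exc:
        raise ValueError(f"bad record {raw!r}") from exc

def run_batch(records):
    """Process a batch, quarantining bad records so one failure doesn't abort the run."""
    ok, failed = [], []
    for rec in records:
        try:
            ok.append(parse_record(rec))
        except ValueError:
            failed.append(rec)
    return ok, failed

# Unit-test-style checks: one good record, one missing its key.
ok, failed = run_batch([{"user_id": "7", "amount": "19.99"}, {"amount": "1.0"}])
assert ok == [{"user_id": 7, "amount_cents": 1999}]
assert failed == [{"amount": "1.0"}]
```

In practice the quarantined records would be written to a dead-letter location and surfaced through monitoring, and the asserts would live in a pytest suite wired into CI/CD.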

AWS, Docker, Python, SQL, Apache Airflow, ETL, CI/CD, Data modeling

Posted 7 days ago
Apply

๐Ÿ“ CA, WA, NY, NJ, CT, all other U.S. states

๐Ÿงญ Full-Time

๐Ÿ’ธ 200000.0 - 275000.0 USD per year

๐Ÿ” Financial Technology

  • 8+ years of experience designing, developing, and launching backend systems using Python or Kotlin.
  • Extensive experience with highly available distributed systems utilizing AWS, MySQL, Spark, and Kubernetes.
  • Experience with online, real-time ML infrastructure like model servers or feature stores.
  • Developed offline environments for large scale data analysis and model training using Spark, Kubeflow, Ray, and Airflow.
  • Experience delivering major system features and writing high quality code.
  • Comfortable navigating from low-level language idioms to large system architecture.
  • Mastered gathering feedback and strong communication skills.
  • Bachelor's degree in a related field or equivalent practical experience.

  • Responsible for setting technical strategy for the team on a year-long time scale and linking it with business-impacting projects.
  • Collaborate across teams in the ML development lifecycle with machine learning engineers, platform engineers, and product management.
  • Act as a force-multiplier, defining and advocating for technical solutions and operational processes.
  • Ensure team operations and availability through monitoring, triage rotations, and testing.
  • Foster a culture of quality and ownership by setting standards and advocating beyond the team.
  • Develop talent by providing feedback, guidance, and leading by example.

AWS, Python, Apache Airflow, Kotlin, Kubeflow, Kubernetes, MySQL, Spark

Posted 7 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 120000.0 - 130000.0 USD per year

๐Ÿ” Rewards and loyalty app

๐Ÿข Company: Fetch

  • Proficient in SQL, with an understanding of the difference between SQL that works and SQL that performs.
  • Have worked with data modeling and orchestration tools.
  • Experience with relational (SQL), non-relational (NoSQL), and/or object data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
  • Solid understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools.
  • Prior experience clearly communicating about data with internal and external customers.
  • Highly motivated to work autonomously, managing multiple work streams.
  • Interest in building and experimenting with different tools and tech, sharing learnings with the broader organization.
  • Experience developing and maintaining DBT or Airflow in production environments.
  • Experience programmatically deploying cloud resources on AWS, Azure, or GCP.
  • Implemented data quality, data governance, or disaster recovery initiatives.
  • Proficient in at least one imperative programming language (i.e., Python).

  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance.
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices.
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data.
  • Translate business requirements for near-real-time actionable insights into data models and artifacts.
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders.
  • Administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure.
  • Test, monitor, and report on data health and data quality.
  • Lead data documentation and data discovery initiatives.
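The "test, monitor, and report on data health and data quality" bullet can be sketched with a minimal example. The checks, column names, and thresholds here are hypothetical; in this role the equivalent logic would typically run as warehouse queries or DBT tests orchestrated by Airflow.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical data-health checks: null-rate and freshness over a batch of rows.
# Dicts stand in for warehouse rows; names and thresholds are illustrative.
def null_rate(rows, column):
    """Fraction of rows where `column` is NULL."""
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def is_fresh(rows, column, max_age):
    """True if the newest timestamp in `column` is within `max_age` of now."""
    newest = max(r[column] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age

rows = [
    {"receipt_id": 1, "total": 12.5, "loaded_at": datetime.now(timezone.utc)},
    {"receipt_id": 2, "total": None,
     "loaded_at": datetime.now(timezone.utc) - timedelta(hours=2)},
]

report = {
    "total_null_rate": null_rate(rows, "total"),                      # 1 of 2 rows is NULL
    "fresh_within_1h": is_fresh(rows, "loaded_at", timedelta(hours=1)),
}
print(report)
```

A report like this would feed dashboards or alerting so stakeholders see data-quality regressions before they reach downstream models.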

AWS, Python, SQL, Apache Airflow, Business Intelligence, ETL, Snowflake

Posted 7 days ago
Apply
Showing 10 of 59 jobs.