Airflow Jobs

Find remote positions requiring Airflow skills. Browse opportunities where you can apply your expertise and grow your career.

143 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply
πŸ”₯ Analytics Engineer
Posted about 14 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 115,000 - 129,000 USD per year

πŸ” Education

🏒 Company: amplify_careers

  • BS in Computer Science, Data Science, or equivalent experience.
  • 3+ years of professional software development or data engineering experience
  • Strong computer, data, and analytics engineering fundamentals.
  • Proven fluency in SQL and its use in code-based ETL frameworks, preferably dbt
  • Understanding of ETL/ELT pipelines, analytical data modeling, aggregations, and metrics
  • Strong understanding of analytical modeling architectures, including the Kimball dimensional data model design
  • Ability to clearly communicate and present technical concepts to a broad audience both verbally and in written form
  • Build well-tested and documented ELT data pipelines for both full and incremental dbt models that feed a fact and dimensional data mart (see the dbt/Airflow sketch after this list).
  • Work closely with sales on logistics pipeline forecasting and sales pipeline tracking to help focus our sales teams in the right areas for the biggest impact.
  • Align with finance to make sure we have well-audited data in line with established financial best practices.
  • Engineer novel datasets which express a student's progress and performance through an adaptive learning experience which allows for flexible comparison across students and deep analysis of individual students.
  • Work with the data science team to measure the impact of design changes on an administrator reporting application.
  • Contribute to leading industry data standards, such as Caliper Analytics, EdFi, or xAPI
  • Craft slowly changing dimensional models that take into account the nuances of K-12 education such as School Year changes and students moving schools or classes.
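
As a concrete companion to the dbt and Airflow skills this role lists, here is a minimal, hypothetical sketch of how such ELT runs are commonly orchestrated: an Airflow DAG that triggers a dbt build and then dbt tests. The DAG id, schedule, and project path are assumptions for illustration, not details from the posting.

```python
# Hypothetical orchestration sketch: Airflow triggers a dbt build for the
# mart models, then runs dbt tests. Names and paths are assumptions.
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_marts_daily",       # hypothetical DAG name
    schedule="@daily",              # Airflow 2.4+ `schedule` argument
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    # `dbt run` rebuilds full models and appends new rows to incremental ones.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/marts",  # assumed path
    )
    # `dbt test` keeps the pipelines "well-tested", as the posting asks.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/marts",
    )
    dbt_run >> dbt_test
```

Teams often split full-refresh and incremental selections into separate tasks; a single `dbt run` keeps the sketch short.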

AWS, PostgreSQL, Python, SQL, Business Intelligence, Data Analysis, ETL, Snowflake, Tableau, Airflow, Data engineering, Analytical Skills, CI/CD, Data visualization, Data modeling, Data analytics

Apply
πŸ”₯ Data Engineer
Posted 4 days ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 153,000 - 216,000 USD per year

πŸ” Software Development

  • 3+ years of experience in a data engineering role building products, ideally in a fast-paced environment
  • Good foundations in Python and SQL.
  • Experience with Spark, PySpark, DBT, Snowflake and Airflow
  • Knowledge of visualization tools, such as Metabase, Jupyter Notebooks (Python)
  • Collaborate on the design and improvements of the data infrastructure
  • Partner with product and engineering to advocate best practices and build supporting systems and infrastructure for the various data needs
  • Create data pipelines that stitch together various data sources to produce valuable business insights (see the PySpark sketch after this list)
  • Create real-time data pipelines in collaboration with the Data Science team
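
As a rough illustration of "stitching together various data sources", the PySpark sketch below joins an assumed orders extract with event logs and writes one insight table. Every path, schema, and column name here is invented for the example.

```python
# Minimal PySpark sketch: join two assumed raw sources into one mart table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order_insights").getOrCreate()

orders = spark.read.parquet("s3a://warehouse/raw/orders/")   # assumed path
events = spark.read.json("s3a://warehouse/raw/app_events/")  # assumed path

# Aggregate engagement per customer, then enrich orders with it.
engagement = (
    events.groupBy("customer_id")
          .agg(F.count("*").alias("event_count"),
               F.max("event_ts").alias("last_seen"))
)

insights = (
    orders.join(engagement, on="customer_id", how="left")
          .withColumn("is_recent",
                      F.col("last_seen") >= F.date_sub(F.current_date(), 30))
)

insights.write.mode("overwrite").parquet("s3a://warehouse/marts/order_insights/")
```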

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Spark, RESTful APIs, Data visualization

Apply

πŸ“ Sweden

πŸ” Ad Tech

🏒 Company: eyeo · πŸ‘₯ 51-100 · Internet · Open Source · Privacy · Software · Browser Extensions

  • Experience translating data strategy into scalable, fault-tolerant architectures
  • Familiarity with different approaches to data architecture: warehouse, lake, mesh, batch vs streaming, ETL vs ELT, etc., and how they can be leveraged in different use cases
  • Experience in Python and common data libraries and platforms such as Airflow, Pandas, PySpark, etc
  • Experience with cloud services (ideally Google Cloud), including managing infrastructure with Terraform
  • Expertise with advanced SQL queries and query optimization, ideally in BigQuery
  • Passion for introducing engineering best practices, for instance to ensure testability, data quality and completeness, etc
  • Excellent communication and collaboration skills, both with engineers and non-technical stakeholders
  • Design and build data platforms, endpoints and pipelines that meet business requirements and enable stakeholders to make more data-driven decisions, without losing track of technical quality and maintainability (a pipeline sketch follows this list)
  • Actively collaborate with teams across all of eyeo (like browser extension developers, data analysts and legal counsels) to design data collection systems that are compliant with regulations and respect user privacy
  • Manage software from proof of concept to deployment, to operation, and finally deprecation; and manage data through ingestion, access, schema changes and deletion
  • Improve both our software and data lifecycle processes
  • Implement strategies to ensure that data is accurate, complete, timely and consistent
  • Identify, design and implement process improvements: automating manual processes, simplifying collaboration with data analysts, etc
  • Contribute to ongoing management of our platforms, including performance monitoring, troubleshooting and resolution of technical issues
  • Be a multiplier in your team, encouraging debate, and helping to create an environment that fosters learning and growth
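
One hedged example of such a pipeline on Google Cloud: an Airflow TaskFlow DAG that batch-loads staged CSV files into BigQuery. The bucket, project, dataset, and table names are placeholders, not anything from eyeo's actual stack.

```python
# Assumption-laden sketch: hourly batch load from Cloud Storage to BigQuery.
import pendulum
from airflow.decorators import dag, task
from google.cloud import bigquery

@dag(schedule="@hourly",
     start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
     catchup=False)
def gcs_to_bigquery():
    @task
    def load_batch() -> int:
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_APPEND",
        )
        # The load runs server-side in BigQuery; no local copy is made.
        job = client.load_table_from_uri(
            "gs://example-ingest/telemetry/*.csv",      # assumed bucket/prefix
            "example-project.analytics.telemetry_raw",  # assumed table
            job_config=job_config,
        )
        job.result()           # block until done; raises on failure
        return job.output_rows

    load_batch()

gcs_to_bigquery()
```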

Python, SQL, Cloud Computing, ETL, GCP, Airflow, Algorithms, Data engineering, Data Structures, REST API, Pandas, Communication Skills, Collaboration, Problem Solving, Linux, DevOps, Terraform, Data visualization, Data modeling, Data management

Posted 4 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 220,000 USDC per year

πŸ” Software Development

🏒 Company: Orca · πŸ‘₯ 11-50 · πŸ’° $18,000,000 Series A over 3 years ago · Cryptocurrency · Blockchain · Online Portals · Information Technology

  • A strong track record of working on high-performance, scalable systems with expertise in release engineering, infrastructure, and operations.
  • Extensive experience with AWS services (e.g., ECS, Copilot, CloudWatch) and the ability to troubleshoot and optimize cloud-based systems.
  • Hands-on experience with tools like GitHub Actions for reliable and efficient deployment workflows.
  • Familiarity with tools like Datadog to build actionable monitoring and alerting systems.
  • Proficiency in infrastructure-as-code tools like Terraform, and containerization tools like Docker. Experience with orchestrators like Kubernetes or Airflow is a plus.
  • Comfortable working independently in an async environment while collaborating effectively with a team. You understand trade-offs and advocate for pragmatic solutions.
  • Familiarity with Decentralized Finance (DeFi) concepts, AMMs, and the Solana ecosystem is a plus but not required.
  • Design, manage, and optimize AWS infrastructure with a focus on scalability, reliability, and cost efficiency.
  • Triage and resolve critical infrastructure issues proactively.
  • Build and refine CI/CD processes using modern tools, ensuring seamless, secure, and efficient deployments.
  • Develop robust monitoring, logging, and alerting systems using tools like Datadog or Grafana to improve visibility and system performance (a CloudWatch-based sketch follows this list).
  • Architect systems that handle growth effortlessly, minimize downtime, and maintain high performance.
  • Implement effective alerting mechanisms to prioritize and address critical issues proactively.
  • Optimize and document infrastructure processes, leveraging tools like Terraform, Docker, and Airflow to create scalable and maintainable systems.
  • Partner with engineering teams to design and refine infrastructure that powers features like real-time monitoring, automated transaction execution, and analytics.
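
The posting names Datadog and Grafana, which are proprietary services; as a neutral stand-in, the boto3 sketch below creates an equivalent CloudWatch alarm on ECS CPU. The cluster, service, threshold, and SNS topic are invented for illustration.

```python
# Hedged sketch: a CloudWatch alarm on ECS CPU utilization via boto3.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="ecs-service-cpu-high",                     # hypothetical name
    Namespace="AWS/ECS",
    MetricName="CPUUtilization",
    Dimensions=[
        {"Name": "ClusterName", "Value": "prod"},         # assumed cluster
        {"Name": "ServiceName", "Value": "api-service"},  # assumed service
    ],
    Statistic="Average",
    Period=60,                 # evaluate 1-minute averages...
    EvaluationPeriods=5,       # ...over five consecutive periods
    Threshold=80.0,            # alarm above 80% CPU
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:oncall"],  # assumed topic
    TreatMissingData="notBreaching",
)
```

Because `put_metric_alarm` is keyed by `AlarmName`, re-running the call updates the alarm in place, which suits idempotent, Terraform-style infrastructure workflows.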

AWS, Docker, PostgreSQL, Kubernetes, Airflow, Grafana, Rust, CI/CD, Linux, DevOps, Terraform

Posted 4 days ago
Apply

πŸ“ United Kingdom

πŸ” Ad tech

🏒 Company: eyeo · πŸ‘₯ 51-100 · Internet · Open Source · Privacy · Software · Browser Extensions

  • Experience translating data strategy into scalable, fault-tolerant architectures
  • Familiarity with different approaches to data architecture: warehouse, lake, mesh, batch vs streaming, ETL vs ELT, etc., and how they can be leveraged in different use cases
  • Experience in Python and common data libraries and platforms such as Airflow, Pandas, PySpark, etc
  • Experience with cloud services (ideally Google Cloud), including managing infrastructure with Terraform
  • Expertise with advanced SQL queries and query optimization, ideally in BigQuery
  • Passion for introducing engineering best practices, for instance to ensure testability, data quality and completeness, etc
  • Excellent communication and collaboration skills, both with engineers and non-technical stakeholders
  • Design and build data platforms, endpoints and pipelines that meet business requirements and enable stakeholders to make more data-driven decisions, without losing track of technical quality and maintainability
  • Actively collaborate with teams across all of eyeo (like browser extension developers, data analysts and legal counsels) to design data collection systems that are compliant with regulations and respect user privacy
  • Manage software from proof of concept to deployment, to operation, and finally deprecation; and manage data through ingestion, access, schema changes and deletion
  • Improve both our software and data lifecycle processes
  • Implement strategies to ensure that data is accurate, complete, timely and consistent (see the data-quality sketch after this list)
  • Identify, design and implement process improvements: automating manual processes, simplifying collaboration with data analysts, etc
  • Contribute to ongoing management of our platforms, including performance monitoring, troubleshooting and resolution of technical issues
  • Be a multiplier in your team, encouraging debate, and helping to create an environment that fosters learning and growth
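
A minimal sketch of what "accurate, complete, timely and consistent" can mean in code: plain-Python checks over a Pandas frame. Column names and thresholds are assumptions; in practice such checks would typically run as an Airflow task or a dbt test.

```python
# Assumed schema: user_id, event_id, event_ts (tz-aware), amount.
from datetime import datetime, timedelta, timezone
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    failures = []
    # Completeness: required keys must never be null.
    if df["user_id"].isna().any():
        failures.append("completeness: null user_id")
    # Consistency: no duplicate primary keys.
    if df["event_id"].duplicated().any():
        failures.append("consistency: duplicate event_id")
    # Timeliness: newest record must be under two hours old.
    staleness = datetime.now(timezone.utc) - df["event_ts"].max()
    if staleness > timedelta(hours=2):
        failures.append(f"timeliness: data is {staleness} stale")
    # Accuracy: values must stay inside the valid domain.
    if (df["amount"] < 0).any():
        failures.append("accuracy: negative amount")
    return failures
```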

Python, Software Development, SQL, Cloud Computing, Data Analysis, ETL, GCP, Airflow, Data engineering, REST API, Pandas, Communication Skills, Analytical Skills, Collaboration, Problem Solving, Terraform, Data visualization, Data modeling, Software Engineering

Posted 5 days ago
Apply

πŸ“ Ireland

πŸ” Ad tech

🏒 Company: eyeo · πŸ‘₯ 51-100 · Internet · Open Source · Privacy · Software · Browser Extensions

  • Experience translating data strategy into scalable, fault-tolerant architectures
  • Familiarity with different approaches to data architecture: warehouse, lake, mesh, batch vs streaming, ETL vs ELT, etc., and how they can be leveraged in different use cases
  • Experience in Python and common data libraries and platforms such as Airflow, Pandas, PySpark, etc
  • Experience with cloud services (ideally Google Cloud), including managing infrastructure with Terraform
  • Expertise with advanced SQL queries and query optimization, ideally in BigQuery (see the query sketch after this list)
  • Passion for introducing engineering best practices, for instance to ensure testability, data quality and completeness, etc
  • Design and build data platforms, endpoints and pipelines that meet business requirements and enable stakeholders to make more data-driven decisions, without losing track of technical quality and maintainability
  • Actively collaborate with teams across all of eyeo (like browser extension developers, data analysts and legal counsels) to design data collection systems that are compliant with regulations and respect user privacy
  • Manage software from proof of concept to deployment, to operation, and finally deprecation; and manage data through ingestion, access, schema changes and deletion
  • Improve both our software and data lifecycle processes
  • Implement strategies to ensure that data is accurate, complete, timely and consistent
  • Identify, design and implement process improvements: automating manual processes, simplifying collaboration with data analysts, etc
  • Contribute to ongoing management of our platforms, including performance monitoring, troubleshooting and resolution of technical issues
  • Be a multiplier in your team, encouraging debate, and helping to create an environment that fosters learning and growth
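
To make the BigQuery query-optimization point concrete, the sketch below issues a parameterized query that filters on an assumed date-partition column, so BigQuery prunes partitions instead of scanning the whole table. The project, table, and column names are invented.

```python
# Sketch: partition-pruned, parameterized BigQuery query.
import datetime
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT country, COUNT(*) AS sessions
    FROM `example-project.analytics.sessions`   -- assumed table
    WHERE event_date BETWEEN @start AND @end    -- assumed partition column
    GROUP BY country
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", datetime.date(2024, 6, 1)),
        bigquery.ScalarQueryParameter("end", "DATE", datetime.date(2024, 6, 7)),
    ]
)
job = client.query(sql, job_config=job_config)
rows = list(job.result())
# total_bytes_processed shows whether the partition filter limited the scan.
print(job.total_bytes_processed)
```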

Docker, Python, SQL, Cloud Computing, ETL, Airflow, Apache Kafka, Data engineering, Pandas, Communication Skills, Analytical Skills, Collaboration, Problem Solving, RESTful APIs, Terraform, Microservices, Data visualization, Data modeling, Software Engineering, Data management

Posted 5 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 215,900 - 254,000 USD per year

πŸ” Software Development

🏒 Company: Headway · πŸ‘₯ 201-500 · πŸ’° $125,000,000 Series C over 1 year ago · Mental Health Care

  • 5+ years of experience as a Data Engineer
  • 5+ years of experience as an Engineering Manager
  • Prior experience with Python, Spark
  • Prior experience building infrastructure for ML and AI systems supporting iterative model training
  • Have built data platform systems on common PaaS such as AWS, Fivetran, Snowflake and/or dbt (see the Snowflake sketch after this list)
  • Lead and manage a team of 5+ engineers with headcount for growth
  • Help plan and build our next generation data platform systems with an eye toward scalability and stability
  • Work directly with business, finance, and product stakeholders to roadmap and align projects
  • Develop and grow your team through weekly 1-1s, mentorship, and feedback
  • Work with and help develop aspiring engineering leaders within the engineering team
  • Contribute to the broader company strategy and product roadmap
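
As one small, hypothetical building block of such a platform, the sketch below uses snowflake-connector-python to COPY staged Parquet files into a raw table. The account, credentials, stage, and table names are placeholders, and real credentials belong in a secrets manager.

```python
# Hedged sketch: load staged Parquet files into Snowflake with COPY INTO.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",   # placeholder account identifier
    user="PLATFORM_SVC",         # placeholder service user
    password="change-me",        # placeholder; fetch from a secrets manager
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Files are assumed to be staged already (e.g., by Fivetran or an S3 unload).
    cur.execute("""
        COPY INTO raw_events
        FROM @raw_stage/events/
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```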

AWS, Docker, Leadership, Project Management, Python, SQL, ETL, Kubernetes, People Management, Snowflake, Cross-functional Team Leadership, Airflow, Data engineering, REST API, Spark, Communication Skills, Analytical Skills, CI/CD, Agile methodologies, Terraform, Team management, Stakeholder management, Mentorship, Data modeling, Data analytics, Data management

Posted 5 days ago
Apply

πŸ“ Arizona, CA, AZ, CO, NC, UT

🧭 Internship

πŸ’Έ 29.60 - 31.45 USD per hour

πŸ” Software Development

🏒 Company: Rocket Lawyer · πŸ‘₯ 251-500 · πŸ’° $223,000,000 Debt Financing almost 4 years ago · Legal Tech · Law Enforcement · Legal

  • Senior Undergraduate Student pursuing a Bachelor’s degree in Computer Science or a related field.
  • Proficiency in SQL, Python, Git, and Linux.
  • Ability to work collaboratively in a fast-paced, team-oriented environment.
  • Familiarity with Google Cloud Platform, Airflow, and Snowflake.
  • Analyze, extract, transform, and load data from multiple internal and external sources into a Snowflake warehouse to support data analysis and business intelligence efforts.
  • Use SQL and API calls to extract data, and then transform and load it using SQL-based processes.
  • Use Airflow in Google Cloud to manage and orchestrate data flows, ensuring seamless data pipeline operations (see the DAG sketch after this list).
  • Participate in daily team huddles, working alongside other engineers, report developers, and product teams to ensure alignment and progress.
  • Work closely with report developers to create useful target tables and views, supporting various business intelligence reports.
  • Partner with product owners, release managers, and QA staff to validate and promote code to production.
  • Update a JIRA Kanban board during certain weeks and a sprint board in others, helping to manage and track progress in an agile environment.
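
A hedged sketch of this workflow: an Airflow DAG, as it might run on Cloud Composer, that extracts records from a REST API and hands them to a load step. The endpoint and target table are invented, and the load is stubbed so the example stays self-contained.

```python
# Illustrative intern-style pipeline: API extract -> (stubbed) Snowflake load.
import pendulum
import requests
from airflow.decorators import dag, task

@dag(schedule="@daily",
     start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
     catchup=False)
def api_to_snowflake():
    @task
    def extract() -> list[dict]:
        # Pull the latest records from an assumed external API.
        resp = requests.get("https://api.example.com/v1/records", timeout=30)
        resp.raise_for_status()
        return resp.json()["records"]

    @task
    def load(records: list[dict]) -> None:
        # A real version would run a SQL-based transform/load into Snowflake;
        # printing keeps this sketch runnable without credentials.
        print(f"would load {len(records)} rows into ANALYTICS.RAW.RECORDS")

    load(extract())

api_to_snowflake()
```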

Python, SQL, ETL, Git, Snowflake, Airflow, Linux

Posted 5 days ago
Apply

πŸ“ CA, AZ, CO, NC, UT

🧭 Internship

πŸ’Έ 29.60 - 31.45 USD per hour

πŸ” Software Development

🏒 Company: Rocket Lawyer · πŸ‘₯ 251-500 · πŸ’° $223,000,000 Debt Financing almost 4 years ago · Legal Tech · Law Enforcement · Legal

  • Senior Undergraduate Student pursuing a Bachelor’s degree in Computer Science or a related field
  • Proficiency in SQL, Python, Git, and Linux
  • Ability to work collaboratively in a fast-paced, team-oriented environment
  • Familiarity with Google Cloud Platform, Airflow, and Snowflake
  • Analyze, extract, transform, and load data from multiple internal and external sources into a Snowflake warehouse
  • Use SQL and API calls to extract data, and then transform and load it using SQL-based processes
  • Use Airflow in Google Cloud to manage and orchestrate data flows, ensuring seamless data pipeline operations
  • Participate in daily team huddles, working alongside other engineers, report developers, and product teams to ensure alignment and progress
  • Work closely with report developers to create useful target tables and views, supporting various business intelligence reports
  • Partner with product owners, release managers, and QA staff to validate and promote code to production
  • Update a JIRA Kanban board during certain weeks and a sprint board in others, helping to manage and track progress in an agile environment

Python, SQL, Business Intelligence, ETL, Git, Snowflake, Airflow, API testing, Data engineering, Linux

Posted 5 days ago
Apply

πŸ“ California, Arizona, Colorado, North Carolina, Utah

🧭 Internship

πŸ’Έ 29.60 - 31.45 USD per hour

πŸ” Software Development

🏒 Company: Rocket Lawyer · πŸ‘₯ 251-500 · πŸ’° $223,000,000 Debt Financing almost 4 years ago · Legal Tech · Law Enforcement · Legal

  • Senior Undergraduate Student pursuing a Bachelor’s degree in Computer Science or a related field.
  • Proficiency in SQL, Python, Git, and Linux.
  • Ability to work collaboratively in a fast-paced, team-oriented environment.
  • Familiarity with Google Cloud Platform, Airflow, and Snowflake.
  • Analyze, extract, transform, and load data from multiple internal and external sources into a Snowflake warehouse to support data analysis and business intelligence efforts.
  • Use SQL and API calls to extract data, and then transform and load it using SQL-based processes.
  • Use Airflow in Google Cloud to manage and orchestrate data flows, ensuring seamless data pipeline operations.
  • Participate in daily team huddles, working alongside other engineers, report developers, and product teams to ensure alignment and progress.
  • Work closely with report developers to create useful target tables and views, supporting various business intelligence reports.
  • Partner with product owners, release managers, and QA staff to validate and promote code to production.
  • Update a JIRA Kanban board during certain weeks and a sprint board in others, helping to manage and track progress in an agile environment.

Python, SQL, ETL, Git, Snowflake, Airflow, API testing, Data engineering, Linux, Data modeling

Posted 5 days ago
Showing 10 of 143 jobs