Airflow Jobs

Find remote positions requiring Airflow skills. Browse opportunities where you can apply your expertise and grow your career.

Airflow
118 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States, Latin America, India

πŸ” Data solutions and cloud data platforms

  • 10+ years of hands-on experience as a Solutions Architect and/or Data Engineer.
  • Team lead and mentorship experience.
  • Programming expertise in Java, Python, and/or Scala.
  • Experience with core cloud data platforms like Snowflake, Spark, AWS, Azure, Databricks, and GCP.
  • Proficient in SQL with the ability to write, debug, and optimize queries.
  • Strong client-facing written and verbal communication skills.
  • 4-year Bachelor's degree in Computer Science or a related field.
  • Design and implement end-to-end technical solutions into production.
  • Ensure performance, security, scalability, and robust data integration.
  • Lead and mentor other engineers.
  • Create and deliver detailed presentations and documentation.

AWS · Python · SQL · GCP · Hadoop · Java · Kafka · Snowflake · Airflow · Azure · Spark · Scala

Posted 2 days ago
Apply
πŸ”₯ Data Engineer

πŸ“ California

🧭 Full-Time

πŸ’Έ 145,000 USD per year

πŸ” Health Insurance

🏒 Company: Sidecar Health · πŸ‘₯ 101-250 · πŸ’° $165,000,000 Series D 7 months ago · πŸ«‚ Last layoff over 2 years ago · Health Insurance, InsurTech, Insurance, Health Care, FinTech

  • Master’s degree or foreign degree equivalent in Computer Science or a related field.
  • 1+ years of experience in Data Engineering or Software Engineering roles.
  • Proficiency in SQL and Python, with the ability to write complex SQL statements.
  • Hands-on experience with ETL processes, real-time and batch data processing.
  • Familiarity with Spark, Athena, Docker, and version control systems like Git.
  • Knowledge of secure, scalable, cloud-based architectures compliant with HIPAA or PCI.
  • Experience in creating data visualizations using Tableau or ThoughtSpot.
  • Ability to translate business requirements into scalable software solutions.
  • Use SQL and Python on AWS to build ETL jobs and data pipelines for data integration into Snowflake.
  • Leverage dbt to transform data, consolidate records, and create clean data models.
  • Utilize AWS technologies to send reports and support business teams.
  • Containerize and orchestrate data pipelines with Docker and Airflow (see the DAG sketch after this list).
  • Perform data quality checks and ensure data reliability.
  • Develop reports and dashboards using Tableau and ThoughtSpot.
  • Participate in agile development activities.
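
As an illustration of the orchestration work this role describes, here is a minimal, hypothetical Airflow DAG that loads a batch into Snowflake and then triggers a dbt run. The DAG name, schedule, placeholder load step, and dbt project path are assumptions for the sketch, not details from the listing.

```python
# Minimal sketch of a daily load-then-transform DAG (Airflow 2.x style).
# Task names, schedule, and the dbt project path are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def load_to_snowflake(**context):
    """Placeholder extract/load step; a real task would use a Snowflake hook."""
    print("Loading batch for", context["ds"])


with DAG(
    dag_id="etl_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )

    # dbt is typically invoked as a CLI step from the scheduler's environment.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    load >> transform
```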

AWS · Docker · Python · SQL · ETL · Snowflake · Tableau · Airflow · Spark

Posted 3 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 170,000 - 195,000 USD per year

πŸ” Healthcare

🏒 Company: Parachute Health · πŸ‘₯ 101-250 · πŸ’° $1,000 about 5 years ago · Medical, Health Care, Software

  • 5+ years of relevant experience.
  • Experience in Data Engineering with Python.
  • Experience building customer-facing software.
  • Strong listening and communication skills.
  • Time management and organizational skills.
  • Proactive, driven self-starter who can work independently or as part of a team.
  • Ability to think with the 'big picture' in mind.
  • Passionate about improving patient outcomes in the healthcare space.
  • Architect solutions to integrate and manage large volumes of data across various internal and external systems.
  • Establish best practices and data governance standards to ensure that data infrastructure is built for long-term scalability.
  • Build and maintain a reporting product for external customers that visualizes data and provides tabular reports.
  • Collaborate across the organization to assess data engineering needs.

Python · ETL · Airflow · Data engineering · Data visualization

Posted 7 days ago
Apply

πŸ“ Philippines

🏒 Company: Manila Recruitment · πŸ‘₯ 11-50 · Staffing Agency, Consulting, Human Resources, Recruiting, Social Media

  • At least 3 years of experience in Ruby on Rails.
  • At least 3 years of experience in TypeScript/React/Next.js.
  • Experience with any cloud platform, ideally AWS.
  • Strong relationship management and solution development expertise.
  • Exceptional communication, presentation, and negotiation skills.
  • Collaborative mindset with a proactive, problem-solving approach.
  • Ability to manage complex technical conversations and align cross-functional teams.
  • Taking full responsibility for the delivery of major features or entire systems, ensuring quality and meeting deadlines.
  • Providing technical mentorship and guidance to less experienced team members, helping with their career growth.
  • Working closely with stakeholders, such as product managers and business leaders, to align technical solutions with business goals.
  • Ensuring code meets high standards of quality, performance, and maintainability.
  • Identifying opportunities to improve development practices, performance, and team efficiency.
  • Designing systems that can scale and meet performance requirements, addressing architectural challenges.
  • Ensuring that systems and software are secure, robust, and adhere to compliance standards where required.

AWS · Node.js · Python · ElasticSearch · Ruby on Rails · Snowflake · TypeScript · Airflow · Next.js · React · Terraform

Posted 7 days ago
Apply
πŸ”₯ GCP Data Engineer

πŸ“ India

🧭 Full-Time

πŸ” Experience Management

🏒 Company: Experience.com · πŸ‘₯ 101-250 · πŸ’° $14,575,000 Series A about 6 years ago · Customer Service, Consumer, Information Services, Consulting, SaaS, Analytics, Quality Assurance, Information Technology, Software

  • 4+ years of experience with PySpark and SQL for building scalable ETL pipelines.
  • Strong proficiency in Python programming.
  • Knowledge of GCP Data Analytics ecosystem (BigQuery, PySpark, SQL, etc.).
  • Experience with Airflow/Composer for workflow orchestration.
  • Experience with in-memory applications, database design, and data integration.
  • Strong analytical thinking and problem-solving abilities.
  • Design, build, and maintain scalable and robust ETL/ELT pipelines using PySpark and SQL (a minimal sketch follows this list).
  • Work on data extraction, transformation, and loading processes from multiple sources into data warehouses such as BigQuery.
  • Leverage GCP data analytics tools (BigQuery, DataProc, Cloud Functions, etc.) to process and analyze data.
  • Optimize data workflows for benchmarking, performance, and tuning to ensure efficiency and reliability.
  • Collaborate with engineering and analytics teams to develop data integration solutions that meet business needs.
  • Ensure the accuracy and quality of data by implementing strong in-memory applications and database designs.
  • Implement monitoring and alerting for pipelines and workflows to ensure data consistency and issue resolution.
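
To make the pipeline work concrete, below is a minimal, hypothetical PySpark job of the kind the listing describes: read raw events, aggregate them, and write the result to BigQuery. The bucket, dataset, and column names are assumptions, and the write step assumes the spark-bigquery connector is available on the cluster.

```python
# Minimal sketch of a PySpark ETL step that writes an aggregate to BigQuery.
# Paths, table names, and the staging bucket are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_to_bigquery").getOrCreate()

# Extract: read raw events (placeholder source path).
events = spark.read.parquet("gs://example-bucket/raw/events/")

# Transform: a simple daily count per user.
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Load: write to BigQuery via the connector's GCS staging bucket.
(
    daily.write.format("bigquery")
    .option("table", "analytics.daily_user_events")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save()
)
```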

Python · SQL · ElasticSearch · ETL · GCP · MongoDB · Airflow

Posted 11 days ago
Apply

πŸ“ Bulgaria, Georgia, Moldova, Poland, Romania, Ukraine

🧭 Full-Time

πŸ” Advertising

🏒 Company: Coherent Solutions · πŸ‘₯ 501-1000 · Outsourcing, Software

  • At least 3 years of experience as a DevOps Engineer.
  • Proven experience with data pipelines and tools such as Airflow and Spark (or similar).
  • Experience with databases such as Redis, MongoDB, and ElasticSearch.
  • Expertise in Kubernetes administration.
  • Strong knowledge of Linux administration.
  • Proven experience with CI/CD process implementation.
  • Proficiency in scripting languages such as Bash, Python, or any general-purpose programming language.
  • Experience with cloud platforms, particularly AWS.
  • Familiarity with monitoring and observability tools like OpenTelemetry, Grafana, Prometheus, and ElasticSearch.
  • English proficiency at B1 level or above for effective communication.
  • Operate and maintain infrastructure, ensuring services are healthy, automated, and scalable.
  • Contribute to system design and network implementation with a focus on automation.
  • Collaborate with developers on operational issues, providing guidance and solutions.
  • Develop tools to monitor infrastructure and applications (see the callback sketch after this list).
  • Troubleshoot system issues across the stack to maintain reliability.
  • Document system designs and operational procedures.
  • Build, monitor, and optimize production systems.
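
As a small illustration of the pipeline monitoring this role covers, here is a hedged sketch of an Airflow failure callback that forwards task failures to an alerting webhook. The endpoint URL and payload shape are assumptions; a real setup would more likely post to Alertmanager, Grafana, or a chat integration.

```python
# Sketch: route Airflow task failures to an alerting webhook.
# The endpoint and payload fields are illustrative assumptions.
import json
import urllib.request


def alert_on_failure(context):
    """Airflow on_failure_callback: post a short failure summary."""
    ti = context["task_instance"]
    payload = {
        "dag": ti.dag_id,
        "task": ti.task_id,
        "logical_date": str(context["logical_date"]),
    }
    req = urllib.request.Request(
        "https://alerts.example.internal/hook",  # placeholder endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)


# Attached per DAG, e.g.: default_args = {"on_failure_callback": alert_on_failure}
```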

AWS · Python · Bash · ElasticSearch · Kubernetes · MongoDB · Airflow · Grafana · Prometheus · Redis · Spark · CI/CD · Linux

Posted 12 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 185,800 - 322,000 USD per year

πŸ” Technology, Internet Services

🏒 Company: Reddit · πŸ‘₯ 1001-5000 · πŸ’° $410,000,000 Series F over 3 years ago · πŸ«‚ Last layoff over 1 year ago · News, Content, Social Network, Social Media

  • 3-10+ years of industry experience as a machine learning engineer or software engineer developing backend/infrastructure at scale.
  • Experience building machine learning models using PyTorch or TensorFlow.
  • Experience with search & recommender systems and pipelines.
  • Production-quality code experience with testing, evaluation, and monitoring using Python and Golang.
  • Familiarity with GraphQL, REST, HTTP, Thrift, or gRPC and design of APIs.
  • Experience developing applications with large scale data stacks such as Kubeflow, Airflow, BigQuery, Kafka, Redis.
  • Develop and enhance search retrieval and ranking models (a minimal scoring sketch follows this list).
  • Design and build pipelines and algorithms for user answers.
  • Collaborate with product managers, data scientists, and platform engineers.
  • Develop and test new pipeline components and deploy ML models.
  • Ensure high uptime and low latency for search systems.
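
To ground the retrieval side of this role, here is a minimal, hypothetical PyTorch snippet showing the core of embedding-based candidate scoring: cosine similarity between a query vector and a document matrix, followed by top-k selection. The dimensions and random vectors are stand-ins; production systems use trained encoders and approximate-nearest-neighbor indexes.

```python
# Sketch: score candidate documents against a query embedding and take top-k.
# Random tensors stand in for trained encoder outputs.
import torch

torch.manual_seed(0)
dim, num_docs, k = 64, 1000, 5

doc_embeddings = torch.randn(num_docs, dim)  # stand-in for an indexed corpus
query = torch.randn(dim)                     # stand-in for an encoded query

# Cosine similarity between the query and every document.
scores = torch.nn.functional.cosine_similarity(
    doc_embeddings, query.unsqueeze(0), dim=1
)
top_scores, top_ids = torch.topk(scores, k)
print(top_ids.tolist(), top_scores.tolist())
```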

GraphQL · Python · Kafka · Kubeflow · Machine Learning · PyTorch · Airflow · Redis · TensorFlow

Posted 14 days ago
Apply

πŸ“ USA

🧭 Full-Time

🏒 Company: Global Channel Management, Inc

  • 5+ years of experience in building data engineering pipelines on on-premise and cloud platforms (Snowflake, Databricks).
  • Strong coding experience in Python, PySpark, and SQL, and in building automations.
  • Knowledge of Cybersecurity, IT infrastructure, and software concepts.
  • Knowledge of IT Asset Management, ITIL, ITSM practices is a plus.
  • 3+ years of experience using data warehousing/data lake techniques in cloud environments.
  • 3+ years of developing data visualizations using Tableau, Plotly, and Streamlit.
  • Experience with ELT/ETL tools like dbt, Airflow, Cribl, Glue, FiveTran, AirByte, etc.
  • Experience capturing incremental data changes, streaming data ingestion, and stream processing (see the watermark sketch below).
  • Experience in processes supporting data governance, data structures, and metadata management.
  • Solid grasp of data and analytics concepts and methodologies including data science, data engineering, and data storytelling.
  • Streamline the process of sourcing and organizing data from various sources using Python, PySpark, SQL, and Spark.
  • Accelerate data for analysis to support analytics efforts.
  • Support the data curation process by feeding the data catalog and knowledge bases.
  • Create data tools to assist analytics and data science team members in building and optimizing data products.
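
As one hedged example of the incremental-change capture the requirements mention, here is a minimal high-watermark pattern: persist the last-seen timestamp and pull only newer rows on each run. The table, column, and state-file names are assumptions for illustration.

```python
# Sketch: high-watermark incremental extraction.
# Table, column, and state-file names are illustrative assumptions.
import json
from pathlib import Path

STATE_FILE = Path("state/orders_watermark.json")


def read_watermark() -> str:
    """Return the last processed updated_at value (ISO timestamp)."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["updated_at"]
    return "1970-01-01T00:00:00"


def save_watermark(value: str) -> None:
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"updated_at": value}))


def build_query(watermark: str) -> str:
    """Select only rows changed since the last successful run."""
    return (
        "SELECT * FROM orders "
        f"WHERE updated_at > '{watermark}' "
        "ORDER BY updated_at"
    )
```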

Python · SQL · ETL · Snowflake · Tableau · Airflow · Data engineering · Data visualization

Posted 21 days ago
Apply

πŸ“ US

🧭 Full-Time

πŸ’Έ 206,700 - 289,400 USD per year

πŸ” Social Media / Technology

  • MS or PhD in a quantitative discipline: engineering, statistics, operations research, computer science, informatics, applied mathematics, economics, etc.
  • 7+ years of experience with large-scale ETL systems and building clean, maintainable code (Python preferred).
  • Strong programming proficiency in Python, SQL, Spark, Scala.
  • Experience with data modeling, ETL concepts, and patterns for data governance.
  • Experience with data workflows, data modeling, and engineering.
  • Experience in data visualization and dashboard design using tools like Looker, Tableau, and D3.
  • Deep understanding of relational and MPP database designs.
  • Proven track record of cross-functional collaboration and excellent communication skills.
  • Act as the analytics engineering lead within the Ads DS team, contributing to data science, data quality, and automation initiatives.
  • Work on ETLs, reporting dashboards, and data aggregations for business tracking and ML model development.
  • Develop and maintain robust data pipelines for data ingestion, processing, and transformation.
  • Create user-friendly tools for internal team use, streamlining analysis and reporting processes.
  • Lead efforts to build a data-driven culture by enabling data self-service.
  • Provide technical guidance and mentorship to data analysts.

Python · SQL · ETL · Airflow · Spark · Scala · Data visualization · Data modeling

Posted 22 days ago
Apply

πŸ“ Armenia

🧭 Full-Time

πŸ’Έ 90,000 - 110,000 USD per year

πŸ” Ecommerce

🏒 Company: Constructor

  • Hands-on experience building and owning services for production with Python.
  • Experience with monitoring and quality assurance for services affecting customer experience.
  • Willingness to explore the experimental/data analytics domain.
  • Curiosity about how users interact with the product beyond technical aspects.
  • Experience with PySpark or ETL pipelines (roughly 5-10% of time goes to improving these pipelines).
  • Experience building analytical services or experiment platforms.
  • Owning the whole experimentation process, from traffic splitting to revealing experiment results (see the bucketing sketch after this list).
  • Developing the internal experiments platform for Machine Learning and Data Science teams.
  • Implementing user-facing features for running and analyzing the results of experiments.
  • Improving the performance and scalability of services.
  • Enhancing the trustworthiness and efficiency of experiments by integrating advanced A/B testing techniques.
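
As an illustration of the traffic splitting this role owns, here is a hedged sketch of deterministic hash-based bucketing, a common way to assign users to experiment variants without storing assignments. The salt format and variant weights are assumptions.

```python
# Sketch: deterministic hash-based traffic splitting for A/B tests.
# Variant weights and the salt format are illustrative assumptions.
import hashlib


def assign_variant(user_id: str, experiment: str, weights=None) -> str:
    """Map a user to a variant; identical inputs always land in the same bucket."""
    weights = weights or {"control": 0.5, "treatment": 0.5}
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
    cumulative = 0.0
    for variant, weight in weights.items():
        cumulative += weight
        if bucket <= cumulative:
            return variant
    return "control"  # fallback for floating-point edge cases


print(assign_variant("user-42", "ranking-v2"))
```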

AWS · PostgreSQL · Python · SQL · Flask · Airflow · Data science · FastAPI · A/B testing

Posted 29 days ago
Apply
Showing 10 of 118 jobs.