Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today!

Remote IT Jobs
Airflow
78 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 United States, Latin America, India

🔍 Software Development

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
  • Programming expertise in Java, Python, and/or Scala
  • Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP
  • Strong SQL skills, including the ability to write, debug, and optimize SQL queries
  • Client-facing written and verbal communication skills and experience
  • 4-year Bachelor's degree in Computer Science or a related field
  • Develop end-to-end technical solutions into production, helping ensure performance, security, scalability, and robust data integration (a minimal Airflow pipeline sketch follows this listing).
  • Create and deliver detailed presentations
  • Produce detailed solution documentation (POCs, roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)
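
For context on what this kind of role involves day to day, here is a minimal sketch of an Airflow ETL pipeline. It assumes Airflow 2.4+ and uses hypothetical task names; it is an illustration of the pattern, not any employer's actual code.

```python
# Illustrative only: hypothetical task names, Airflow 2.4+ assumed
# (for the `schedule` argument).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical source pull; real tasks would hit an API or database.
    return [{"id": 1, "value": 42}]


def transform(ti):
    # Pull the upstream task's return value from XCom and reshape it.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


def load(ti):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows into the warehouse")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```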

AWS, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Java, Kafka, Snowflake, Airflow, Azure, Data engineering, Spark, Communication Skills, Scala, Data modeling, Software Engineering

Posted 1 day ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • 7+ years of hands-on experience architecting distributed systems, guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and dbt.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time (see the streaming sketch after this listing).
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
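
To illustrate the streaming work described above, here is a minimal PySpark Structured Streaming sketch that reads from Kafka. The broker address, topic name, schema, and sink paths are all assumptions, and the Kafka source additionally needs the spark-sql-kafka connector package on the classpath.

```python
# Illustrative only: broker, topic, schema, and paths are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("example_stream").getOrCreate()

# Hypothetical event schema for messages on the topic.
schema = StructType([
    StructField("tx_hash", StringType()),
    StructField("chain", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "chain-events")                  # hypothetical topic
    .load()
    # Kafka delivers bytes; decode the value and parse the JSON payload.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/tmp/chain_events")        # assumed sink
    .option("checkpointLocation", "/tmp/ckpt")  # required by Structured Streaming
    .start()
)
query.awaitTermination()
```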

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kafka, Kubernetes, Airflow, Data engineering, Postgres, Spark, Terraform, Data modeling

Posted 1 day ago
Apply

📍 CA, CO, CT, FL, IL, MA, MD, NC, NJ, NY, OR, VA, VT, WA, United Kingdom

🧭 Full-Time

💸 175,000 - 191,300 USD per year

🔍 Crowdfunding

🏢 Company: Kickstarter PBC

  • 8+ years of experience in data engineering, analytics engineering, or related fields.
  • Strong experience with cloud-based data warehouses (Redshift, Snowflake, or BigQuery) and query performance optimization.
  • Expertise in SQL, Python, and data transformation frameworks like dbt.
  • Experience building scalable data pipelines with modern orchestration tools (Airflow, MWAA, Dagster, etc.).
  • Knowledge of real-time streaming architectures (Kafka, Kinesis, etc.) and event-based telemetry best practices.
  • Experience working with business intelligence tools (e.g. Looker) and enabling self-serve analytics.
  • Ability to drive cost-efficient and scalable data solutions, balancing performance with resource management.
  • Familiarity with machine learning operations (MLOps) and experimentation tooling is a plus.
  • Develop, own and improve Kickstarter’s data architecture—optimize our Redshift warehouse, implement best practices for data storage, processing, and orchestration.
  • Design and build scalable ETL/ELT pipelines to transform raw data into clean, usable datasets for analytics, product insights, and machine learning applications (a minimal load-and-transform sketch follows this listing).
  • Enhance data accessibility and self-service analytics by improving Looker models and enabling better organizational data literacy.
  • Support real-time data needs by optimizing event-based telemetry and integrating new data streams to fuel new products, personalization, recommendations, and fraud detection.
  • Lead cost optimization efforts—identify and implement more efficient processes and tools to lower costs.
  • Drive data governance and security best practices—ensure data integrity, access controls, and proper lineage tracking.
  • Collaborate across teams to ensure data solutions align with product, growth, and business intelligence needs.
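
As a rough illustration of the ELT pattern this listing describes, the sketch below bulk-loads files from S3 into Redshift and then transforms them in-warehouse. The cluster, credentials, table names, bucket, and IAM role are all hypothetical.

```python
# Illustrative only: hypothetical cluster, tables, bucket, and IAM role.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="...",  # fetch from a secrets manager in practice
)

# "E/L": bulk-load raw files from S3 with Redshift's COPY command.
COPY_SQL = """
    COPY raw.pledges
    FROM 's3://example-bucket/pledges/2024-01-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    FORMAT AS PARQUET;
"""

# "T": transform inside the warehouse once the raw data has landed.
TRANSFORM_SQL = """
    INSERT INTO analytics.daily_pledges
    SELECT project_id, DATE(created_at) AS day, SUM(amount) AS pledged
    FROM raw.pledges
    GROUP BY 1, 2;
"""

with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)
    cur.execute(TRANSFORM_SQL)
conn.close()
```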

Python, SQL, ETL, Kafka, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 3 days ago
Apply

📍 São Paulo, Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

🔍 Data field

🏢 Company: TELUS Digital Brazil

  • 3+ years in the Data field
  • Experience in the construction and optimization of data pipelines, architectures, and 'big data' datasets
  • Proficiency with Apache Spark using a general-purpose programming language such as Python or Scala
  • Ability to create processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL databases
  • Experience with data pipeline and workflow management tools
  • Be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams;
  • Create and maintain optimal data pipeline architecture;
  • Assemble large, complex data sets that meet business requirements;
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

AWS, Python, SQL, Cloud Computing, ETL, GCP, Hadoop, Kafka, Airflow, Azure, Data engineering, Scala

Posted 5 days ago
Apply

📍 United States

🧭 Full-Time

💸 177,000 - 213,000 USD per year

🔍 FinTech

🏢 Company: Flex

  • A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
  • A minimum of 6 years of experience with Python and SQL.
  • A minimum of 3 years of industry experience using DBT.
  • A minimum of 3 years of industry experience using Snowflake and its basic features.
  • Familiarity with AWS services, including industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
  • Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
  • Industry experience working with relational and NoSQL databases in a production environment.
  • Strong fundamentals in data structures, algorithms, and design patterns.
  • Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
  • Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
  • Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
  • Create scalable real-time streaming pipelines and offline ETL pipelines.
  • Design, implement, and manage a data warehouse that provides secure access to large datasets.
  • Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Create engineering documentation for design, runbooks, and best practices.

AWS, Python, SQL, Bash, Design Patterns, ETL, Hadoop, Java, Kafka, Snowflake, Airflow, Algorithms, Cassandra, Data engineering, Data Structures, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, Terraform, Written communication, Documentation, Data modeling, Debugging

Posted 6 days ago
Apply

📍 United States

🧭 Full-Time

💸 168,000 - 200,000 USD per year

🔍 Healthcare

🏢 Company: Datavant

  • 6+ years of engineering experience, including multiple years in leadership, managing individual contributors, building teams from scratch, and leading large, high-impact customer go-lives and multi-phase integration rollouts.
  • Experience managing a highly functioning team of customer-facing engineers
  • Several years of experience with AWS, Airflow, Python, SQL, and API integrations
  • Experience with Healthcare interoperability standards like HL7 or FHIR or writing proprietary integrations
  • Build, grow, and lead a world-class team of data engineers
  • Create the technical vision for Datavant’s delivery of Site Connect.
  • Partner with Customer Success and Product to lead successful go-lives.
  • Be a critical leader of the payer products and business, delivering rapid revenue and customer growth by directing customer go-lives, custom capabilities, and more to create a winning product and customer experience.
  • Be the champion & driver of innovative processes and initiatives.
  • Be hands-on by leading high impact efforts.
  • Be a strong communicator, coach, and mentor.

AWS, Leadership, Python, SQL, People Management, Cross-functional Team Leadership, Airflow, API testing, Data engineering, Communication Skills, Problem Solving, Team management, Customer Success

Posted 9 days ago
Apply

📍 Latin America

🧭 Full-Time

💸 62,400 - 83,200 USD per year

🔍 Retail Security

🏢 Company: Panoptyc · 👥 1-10 · Electronics · Artificial Intelligence (AI) · Information Technology

  • Minimum of 3 years of experience in data engineering or BI development
  • Strong programming skills in Python, SQL, and data processing frameworks
  • Experience with ETL development and data pipeline orchestration tools
  • Proven ability to optimize data models and query performance at scale
  • Experience working with cloud data warehouses (Snowflake, BigQuery, Redshift)
  • Knowledge of data modeling techniques and dimensional modeling
  • Strong troubleshooting and problem-solving skills with a keen attention to detail
  • Ability to work independently in a fast-paced, remote environment
  • Strong verbal and written communication skills for effective collaboration with technical and non-technical stakeholders
  • Availability to work 40 hours per week on Eastern Standard Time (EST)
  • Design and implement data pipelines to extract, transform, and load data into data warehouses
  • Develop scalable data models and optimize query performance for business intelligence applications
  • Create and maintain BI processes using programming languages and data engineering tools
  • Build automated data validation frameworks to ensure data quality and integrity (see the validation sketch after this listing)
  • Develop APIs and interfaces for data access and integration across systems
  • Implement data monitoring tools and performance metrics to ensure system reliability
  • Collaborate with data scientists and analysts to productionize data solutions
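
To make the data-validation responsibility above concrete, here is a tiny pandas-based sketch; the column names and the 1% null-rate tolerance are illustrative assumptions, and a production framework would cover far more checks.

```python
# Illustrative only: column names and the 1% null tolerance are assumptions.
import pandas as pd


def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures for a transactions frame."""
    failures = []
    if df["transaction_id"].duplicated().any():
        failures.append("duplicate transaction_id values")
    if df["amount"].lt(0).any():
        failures.append("negative amounts")
    null_rate = df["store_id"].isna().mean()
    if null_rate > 0.01:  # assumed tolerance: at most 1% missing
        failures.append(f"store_id null rate {null_rate:.1%} exceeds 1%")
    return failures


df = pd.DataFrame({
    "transaction_id": [1, 2, 2],
    "amount": [9.99, -1.00, 5.00],
    "store_id": [10, None, 12],
})
print(validate(df))  # all three checks fail on this toy frame
```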

Python, SQL, Business Intelligence, ETL, Snowflake, Airflow, Data engineering, Data modeling

Posted 9 days ago
Apply

📍 United States

🧭 Full-Time

💸 200,000 - 250,000 USD per year

🔍 Health Tech

🏢 Company: SmarterDx · 👥 101-250 · 💰 $50,000,000 Series B 10 months ago · Artificial Intelligence (AI) · Hospital · Information Technology · Health Care

  • 12+ years of software engineering experience
  • 3+ years as an engineering manager
  • Expertise running an agile process (preferably Kanban) for a team and fluency with the agile toolkit
  • Experience working with product managers to structure requirements into user stories
  • Experience with Python
  • Experience with ETL/ELT processes and tools like Airflow and dbt
  • Expertise with one or more database systems, especially PostgreSQL
  • Deep experience designing and implementing microservice architectures, especially on Kubernetes
  • Excellent written and verbal communication skills
  • Bachelor’s or Master’s in Computer Science, Engineering, or a related field, or equivalent experience
  • Experience working at startups, especially in the health tech space
  • Be accountable for all aspects of the team’s performance including velocity of feature delivery, quality, operational excellence, hiring, retention, professional growth, and well-being
  • Develop and operate software systems that integrate with EHRs to ingest clinical data and organize it into data sets and APIs
  • Implement, monitor, and iterate on team metrics to further improve team processes
  • Collaborate across disciplines to understand our users and iterate on new ideas
  • Protect patients’ privacy by teaching secure coding practices and ensuring they are used throughout the codebase
  • Support our apps and systems in production
  • Work with the Head of Engineering to improve the broader engineering organization.

AWS, Backend Development, Leadership, PostgreSQL, Python, SQL, Agile, ETL, Kubernetes, People Management, Airflow, Data engineering, Data Structures, Postgres, REST API, CI/CD, RESTful APIs, Terraform, Microservices, Software Engineering, Debugging

Posted 12 days ago
Apply

📍 United States, Canada

🧭 Full-Time

💸 139,000 - 218,000 USD per year

🔍 Software Development

  • 5+ years developing and deploying complex web applications, with a proven track record of shipping performant quality code.
  • Proficiency in Java, Python, or another high performance back-end language.
  • Experience working with high-performance real-time analytics, event processing, and large-scale distributed systems.
  • Strong data engineering skills, including experience with relational and non-relational databases.
  • Knowledge of REST APIs and event-driven architectures.
  • Can debug production issues across services and multiple levels of the stack.
  • Experience with testing frameworks (e.g. Jest, Mocha, Playwright, Cypress, TestNG).
  • Architect, design, and implement scalable multi-tenant backend services and APIs.
  • Work on technologies such as Java, MongoDB, Druid, Airflow, Amazon Web Services (EC2, S3, Lambda, RDS), and more.
  • Design and implement data processing pipelines that include ingestion, transformation, storage, and querying.
  • Work with Druid and other data stores to efficiently integrate and query large-scale event data (see the query sketch after this listing).
  • Ensure system scalability and reliability by optimizing distributed architectures, caching strategies, and event-driven systems for low-latency performance.
  • Lead projects that directly contribute to team and engineering organization's deliverables.
  • Produce and elevate the quality of maintainable, tested, performant, and scalable code.
  • Build and maintain unit and integration tests.
  • Author, collaborate on, and evaluate design documents.
  • Influence technical designs and team-level prioritization as well as participate in technical solutions.
  • Collaborate with software engineers, product managers, and designers in an autonomous, supportive team environment.
  • Mentor other engineers in technical skills, best practices, and quality.
  • Participate in engineering citizenship activities such as co-authoring engineering blogs, strengthening and improving our hiring processes, and leading internal hackathon teams.
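
As a sketch of the Druid work mentioned above: Druid exposes a SQL endpoint over HTTP, so large-scale event data can be queried with an ordinary POST request. The endpoint URL and the telemetry datasource below are assumptions.

```python
# Illustrative only: the endpoint URL and datasource name are assumptions.
import requests

DRUID_SQL_URL = "http://localhost:8888/druid/v2/sql"

payload = {
    "query": """
        SELECT TIME_FLOOR(__time, 'PT1H') AS hour, COUNT(*) AS events
        FROM "telemetry"
        WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '1' DAY
        GROUP BY 1
        ORDER BY 1
    """
}

# Druid answers SQL over plain HTTP; the default response is a JSON array.
resp = requests.post(DRUID_SQL_URL, json=payload, timeout=30)
resp.raise_for_status()
for row in resp.json():
    print(row["hour"], row["events"])
```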

AWS, Backend Development, SQL, Java, MongoDB, MySQL, Airflow, Algorithms, Amazon Web Services, Data engineering, Data Structures, REST API, Microservices, Data modeling, Software Engineering

Posted 12 days ago
Apply

📍 United States

💸 145,000 - 155,000 USD per year

🔍 Education Technology

🏢 Company: Echo360 Inc

  • 7+ years of experience in data engineering, 2+ years of experience in a lead role.
  • Solid understanding of data lake, lakehouse, and warehouse data architectures.
  • Deep hands-on expertise in AWS cloud architecture and data-related services, patterns, and practices, inclusive of S3, EMR, Glue, Athena, Redshift, QuickSight, Kinesis or equivalent streaming, SQS and other queue/messaging components, Lambda, Step Functions, Airflow or other job management as appropriate.
  • Strong command of SQL, Python and related Python libraries.
  • Knowledge of data visualization tools (QuickSight preferred) and advanced analytics techniques.
  • Proven track record of successfully building and scaling reliable, secure, cost efficient, and performant data and analytics solutions.
  • Demonstrable experience with designing and implementing data quality processes.
  • Excellent leadership and communication skills, with the ability to motivate and inspire teams.
  • Strong analytical and problem-solving skills, with a focus on data-driven decision-making.
  • Ability to work collaboratively across diverse teams and prioritize effectively in a fast-paced environment.
  • An understanding of data security and data privacy and related regulatory requirements.
  • Prior experience working in a global environment with offshore team members.
  • Prior experience in education technology product development and learning analytics is strongly preferred.
  • Recent, relevant experience with Machine Learning and Generative AI as they apply to data analytics use cases for discovery, analysis, and prediction is strongly preferred.
  • Bachelor’s degree in computer science or related engineering/technical discipline
  • Provide technical and people leadership and direction for the data team, setting goals, defining technical approaches, architecture, planning and executing work following Agile practices to consistently deliver on commitments.
  • Partner closely with product management, application, SRE, and security teams to build consensus on needed capabilities, designs, APIs, infrastructure, and ensure security and privacy of data.
  • Oversee the architecture and implementation of data pipelines, structured and unstructured storage, data transformation, data warehouses, analytics solutions, and data APIs.
  • Ensure the delivery of high-quality, timely, and accurate data solutions.
  • Drive the adoption of best practices in data quality and security.
  • Monitor emerging technologies and trends in data and analytics to recommend and implement innovative solutions.
  • Manage project timelines, budgets, and resources to meet internal and customer commitments.
  • Build and maintain strong relationships with internal stakeholders, external vendors, and customers.
  • Recruit, develop, and retain strong talent for the data team, inclusive of employees and off-shore/near-shore partners.

AWS, Backend Development, Leadership, Project Management, Python, SQL, Agile, Cloud Computing, Data Analysis, ETL, Machine Learning, People Management, Airflow, Apache Kafka, API testing, Data engineering, Data Structures, REST API, Serverless, Communication Skills, CI/CD, Microservices, Data visualization, Team management, Strategic thinking, Data modeling, Data analytics, Data management

Posted 13 days ago
Apply
Showing 10 of 78

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We’ve developed a well-thought-out service for matching people with home-based jobs, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 remote work offers with full-time or part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in online work from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.