Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today — fast and easy!

Remote IT Jobs
Airflow
84 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: NerdWallet

  • 8+ years in software engineering, with a strong background in backend development, distributed systems and data pipelines.
  • 3+ years of experience with AWS, Snowflake, DBT, Airflow or any other compatible systems.
  • 3+ years of experience building APIs and scalable backend systems.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
  • Proficiency in modern programming languages such as Java, Python, or TypeScript.
  • Experience with microservices architecture, RESTful APIs, and cloud infrastructure (AWS, GCP, or Azure).
  • Strong understanding of database systems (both SQL and NoSQL), with experience in high-volume data processing.
  • Knowledge of security best practices, particularly in financial services.
  • Familiarity with CI/CD pipelines, containerization, and orchestration technologies like Docker and Kubernetes.
  • Experience in consumer credit, lending, loans, or insurance, with a solid understanding of industry regulations, underwriting processes, and risk assessment.
  • Driving meaningful and real revenue for NerdWallet’s CLAW division
  • Serving as a mentor to the engineers on the team you are assigned to
  • Serving as a trusted advisor and tech lead for our Engineering Managers
  • Delivering a high volume of features to production with high quality, serving as an example of what good looks like to the team
  • Helping to drive our existing culture towards better engineering practices
  • Helping to drive our existing culture towards strong continuous improvement thinking
  • Helping to drive our existing culture towards failing fast and repairing faster
  • Partnering with Management to continuously develop the engineering roadmap for our team

AWS, Backend Development, Docker, Leadership, Python, SQL, Java, Kubernetes, Snowflake, TypeScript, Airflow, Data engineering, NoSQL, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Agile methodologies, RESTful APIs, Mentoring, DevOps, Microservices, Teamwork, Data visualization, Data modeling, Software Engineering, Data analytics

Posted about 7 hours ago
Apply

📍 United States, Latin America, India

🔍 Software Development

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
  • Programming expertise in Java, Python and/or Scala
  • Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP
  • Proficiency in SQL, including the ability to write, debug, and optimize queries
  • Client-facing written and verbal communication skills and experience
  • 4-year Bachelor's degree in Computer Science or a related field
  • Develop end-to-end technical solutions into production, helping ensure performance, security, scalability, and robust data integration.
  • Create and deliver detailed presentations
  • Create detailed solution documentation (e.g., POCs and roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)

AWS, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Java, Kafka, Snowflake, Airflow, Azure, Data engineering, Spark, Communication Skills, Scala, Data modeling, Software Engineering

Posted 1 day ago
Apply

📍 United States

🧭 Full-Time

💸 240000.0 - 265000.0 USD per year

🔍 Software Development

🏢 Company: TRM Labs 👥 101-250 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • 7+ years of hands-on experience architecting distributed systems, guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, etc.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kafka, Kubernetes, Airflow, Data engineering, Postgres, Spark, Terraform, Data modeling

Posted 1 day ago
Apply

📍 CA, CO, CT, FL, IL, MA, MD, NC, NJ, NY, OR, VA, VT, WA, United Kingdom

🧭 Full-Time

💸 175000.0 - 191300.0 USD per year

🔍 Crowdfunding

🏢 Company: Kickstarter PBC

  • 8+ years of experience in data engineering, analytics engineering, or related fields.
  • Strong experience with cloud-based data warehouses (Redshift, Snowflake, or BigQuery) and query performance optimization.
  • Expertise in SQL, Python, and data transformation frameworks like dbt.
  • Experience building scalable data pipelines with modern orchestration tools (Airflow, MWAA, Dagster, etc.).
  • Knowledge of real-time streaming architectures (Kafka, Kinesis, etc.) and event-based telemetry best practices.
  • Experience working with business intelligence tools (e.g. Looker) and enabling self-serve analytics.
  • Ability to drive cost-efficient and scalable data solutions, balancing performance with resource management.
  • Familiarity with machine learning operations (MLOps) and experimentation tooling is a plus.
  • Develop, own and improve Kickstarter’s data architecture—optimize our Redshift warehouse, implement best practices for data storage, processing, and orchestration.
  • Design and build scalable ETL/ELT pipelines to transform raw data into clean, usable datasets for analytics, product insights, and machine learning applications.
  • Enhance data accessibility and self-service analytics by improving Looker models and enabling better organizational data literacy.
  • Support real-time data needs by optimizing event-based telemetry and integrating new data streams to fuel new products, personalization, recommendations, and fraud detection.
  • Lead cost optimization efforts—identify and implement more efficient processes and tools to lower costs.
  • Drive data governance and security best practices—ensure data integrity, access controls, and proper lineage tracking.
  • Collaborate across teams to ensure data solutions align with product, growth, and business intelligence needs.

Python, SQL, ETL, Kafka, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 2 days ago
Apply

📍 São Paulo, Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

🔍 Data field

🏢 Company: TELUS Digital Brazil

  • 3+ years in the Data field
  • Experience in the construction and optimization of data pipelines, architectures, and 'big data' datasets
  • Proficiency with Apache Spark with a general-purpose programming language such as Python or Scala
  • Ability to create processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL databases
  • Experience with data pipeline and workflow management tools
  • Be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams;
  • Create and maintain optimal data pipeline architecture;
  • Assemble large, complex data sets that meet business requirements;
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

AWS, Python, SQL, Cloud Computing, ETL, GCP, Hadoop, Kafka, Airflow, Azure, Data engineering, Scala

Posted 5 days ago
Apply

📍 United States

🏢 Company: HSO 👥 1001-5000 · Information Technology

  • 5+ years of experience in management of data and analytics projects
  • 5+ years of managing small to large Data projects and development teams
  • Proven experience in delivering data projects in the Microsoft Azure ecosystem
  • Expertise in Azure DevOps and Jira for project management, sprint planning, tracking, and reporting
  • Implement DevOps strategies and methods throughout the development and deployment process
  • Experience with different SDLC project methodologies (such as agile and waterfall)
  • Experience working with SaaS solutions
  • Experience with cloud-based technologies, preferably Azure
  • Strong understanding of business processes/business process analysis
  • PMP certification is highly desirable
  • Manage and perform tasks throughout the full life cycle course of implementations
  • Participate in or lead Design Sessions which are used to gather requirements from project stakeholders
  • Lead steering committee meetings and facilitate communication with all parties involved
  • Support and manage project teams, project scope, project schedules, project risks/issues, and expectations

Leadership, Project Management, Agile, Business Analysis, Cloud Computing, Microsoft Azure, People Management, Project Coordination, Jira, Cross-functional Team Leadership, ActiveMQ, Airflow, Algorithms, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, CI/CD, Microsoft Office, RESTful APIs, DevOps, Organizational skills, Time Management, Written communication, Documentation, Excellent communication skills, Reporting, Cross-functional collaboration, Risk Management, Stakeholder management, Data modeling, Data analytics, Data management, Change Management, SaaS, Budget management

Posted 6 days ago
Apply

📍 United States

🧭 Full-Time

💸 177000.0 - 213000.0 USD per year

🔍 FinTech

🏢 Company: Flex

  • A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
  • A minimum of 6 years of experience with Python and SQL.
  • A minimum of 3 years of industry experience using DBT.
  • A minimum of 3 years of industry experience using Snowflake and its basic features.
  • Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
  • Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
  • Industry experience working with relational and NoSQL databases in a production environment.
  • Strong fundamentals in data structures, algorithms, and design patterns.
  • Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
  • Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
  • Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
  • Create scalable real-time streaming pipelines and offline ETL pipelines.
  • Design, implement, and manage a data warehouse that provides secure access to large datasets.
  • Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Create engineering documentation for design, runbooks, and best practices.

AWS, Python, SQL, Bash, Design Patterns, ETL, Hadoop, Java, Kafka, Snowflake, Airflow, Algorithms, Cassandra, Data engineering, Data Structures, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, Terraform, Written communication, Documentation, Data modeling, Debugging

Posted 6 days ago
Apply

📍 United States

🧭 Full-Time

💸 185000.0 - 210000.0 USD per year

🔍 Data and Integrations

🏢 Company: Axonius 👥 600 💰 $200,000,000 Series E about 1 year ago · Asset Management · Cloud Security · Information Technology · Cyber Security · Network Security

  • 8+ years of experience in Data and related domains including big data pipelines, cloud data warehouses, SQL and NoSQL databases, data analysis and integrations.
  • 3+ years of experience recruiting and managing technical teams
  • Experience building and maintaining data catalogs and making data widely available
  • Experience collaborating with business partners to develop roadmaps and driving outcomes
  • Proficiency with one or more programming languages - Python/Scala/Java/Go.
  • Proficiency with SQL and experience with ETL and data modeling.
  • Experience working with and integrating SaaS applications such as Salesforce, Marketo, NetSuite, and Workday.
  • Experience in an agile development methodology
  • Bachelor's degree in Computer Science or equivalent experience.
  • Lead the overall data integration strategy, identifying key data sources and establishing data governance policies for data accuracy and consistency.
  • Collaborate with business functions across the enterprise to understand system capabilities and data needs, communicating integration plans and delivering data solutions.
  • Recruit and manage a team of data professionals (data engineers, integration specialists, etc.) to design, develop, and maintain data pipelines and a comprehensive data catalog.
  • Select and implement data integration tools to efficiently move data between systems.
  • Ensure data is user-friendly, performant, and accessible across the organization.
  • Ensure all data and pipelines adhere to data privacy regulations and security standards.
  • Transform raw data into models using dbt.

Leadership, Python, SQL, Agile, Cloud Computing, Data Analysis, ETL, Snowflake, Airflow, Algorithms, Data engineering, Data Structures, REST API, Communication Skills, Analytical Skills, Microsoft Excel, Problem Solving, Agile methodologies, Excellent communication skills, Recruitment, JSON, Data visualization, Team management, Data modeling, Data management, SaaS

Posted 7 days ago
Apply

📍 United States

🧭 Full-Time

💸 168000.0 - 200000.0 USD per year

🔍 Healthcare

🏢 Company: Datavant

  • 6+ years of engineering experience, including multiple years in leadership: managing individual contributors, building teams from scratch, and leading large, high-impact customer go-lives and multiphase integration rollouts.
  • Experience managing a high-functioning team of customer-facing engineers
  • Several years of experience with AWS, Airflow, Python, SQL, and API integrations
  • Experience with healthcare interoperability standards such as HL7 or FHIR, or with writing proprietary integrations
  • Build, grow, and lead a world-class team of data engineers
  • Create the technical vision for Datavant’s delivery of Site Connect.
  • Partner with Customer Success and Product to lead successful go-lives.
  • Be a critical leader of the payer products and business, delivering rapid revenue and customer growth by directing customer go-lives, custom capabilities, and more to create a winning product and customer experience.
  • Be the champion and driver of innovative processes and initiatives.
  • Be hands-on by leading high impact efforts.
  • Be a strong communicator, coach, and mentor.

AWS, Leadership, Python, SQL, People Management, Cross-functional Team Leadership, Airflow, API testing, Data engineering, Communication Skills, Problem Solving, Team management, Customer Success

Posted 8 days ago
Apply

📍 Latin America

🧭 Full-Time

💸 62400.0 - 83200.0 USD per year

🔍 Retail Security

🏢 Company: Panoptyc 👥 1-10 · Electronics · Artificial Intelligence (AI) · Information Technology

  • Minimum of 3 years of experience in data engineering or BI development
  • Strong programming skills in Python, SQL, and data processing frameworks
  • Experience with ETL development and data pipeline orchestration tools
  • Proven ability to optimize data models and query performance at scale
  • Experience working with cloud data warehouses (Snowflake, BigQuery, Redshift)
  • Knowledge of data modeling techniques and dimensional modeling
  • Strong troubleshooting and problem-solving skills with a keen attention to detail
  • Ability to work independently in a fast-paced, remote environment
  • Strong verbal and written communication skills for effective collaboration with technical and non-technical stakeholders
  • Availability to work 40 hours per week on Eastern Standard Time (EST)
  • Design and implement data pipelines to extract, transform, and load data into data warehouses
  • Develop scalable data models and optimize query performance for business intelligence applications
  • Create and maintain BI processes using programming languages and data engineering tools
  • Build automated data validation frameworks to ensure data quality and integrity
  • Develop APIs and interfaces for data access and integration across systems
  • Implement data monitoring tools and performance metrics to ensure system reliability
  • Collaborate with data scientists and analysts to productionize data solutions

Python, SQL, Business Intelligence, ETL, Snowflake, Airflow, Data engineering, Data modeling

Posted 8 days ago
Apply
Showing 10 of 84

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We’ve developed a well-thought-out service for home job matching, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 remote work offers with full-time or part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — Virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side job from home or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in remote work from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new job openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.