Remote Working

Working remotely from home offers convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, is quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today!

Remote IT Jobs
Airflow
106 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply
🔥 Senior Engineering Manager, Data
Posted about 24 hours ago

📍 India

🧭 Full-Time

🔍 Software Development

🏢 Company: Frequence

  • 10+ years of experience in software development, including 5+ years in leadership roles managing managers and leading high-performing engineering teams.
  • A deep understanding of design patterns, software development methodologies, and distributed systems architecture.
  • Hands-on experience with relevant technologies and tools, such as Airflow, Spark, and streaming architectures (see the sketch after this list).
  • Well-versed in data pipelining, data warehousing, data modeling, data streaming, streaming architectures, database performance and ETL processes.
  • Experience with cloud infrastructure (preferably GCP) and a track record of building large-scale, distributed systems with a focus on reliability, observability, and cost efficiency.
  • Data-driven, using metrics and analytics to track team performance, identify areas for improvement, and drive strategic decisions.
  • Identify and drive new strategic opportunities.
  • Weave AI into current data and reporting solutions.
  • Ensure the reliability, security and performance of database systems.
  • Work closely with executive leadership and other engineering leaders.
  • Manage managers and work effectively with skip-level employees.
  • Drive the planning and estimation process for cross-functional initiatives.
  • Proactively identify risks and implement mitigation strategies.
  • Advocate for strong engineering practices and software architecture improvements.
  • Ensure that engineering outcomes have a measurable impact on company top-line or bottom-line performance.
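Since the requirements above lean on Airflow, here is a minimal sketch of the kind of daily DAG such a role involves. It is illustrative only: the DAG id, task names, and retry policy are hypothetical, and it assumes Airflow 2.4+ (older 2.x releases use schedule_interval instead of schedule).

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(ds, **_):
    # Placeholder: a real task might read from Kafka or an object store.
    print(f"extracting raw events for {ds}")


def load_warehouse(ds, **_):
    # Placeholder: a real task might submit a Spark job or load a warehouse table.
    print(f"loading partition {ds} into the warehouse")


with DAG(
    dag_id="daily_events_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load
```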

Leadership, Project Management, Python, Software Development, SQL, Cloud Computing, Data Analysis, ETL, GCP, Kafka, Kubernetes, Machine Learning, People Management, Software Architecture, Cross-functional Team Leadership, Airflow, Algorithms, Data engineering, REST API, Strategic Management, Spark, Communication Skills, Analytical Skills, CI/CD, Agile methodologies, DevOps, Microservices, Risk Management, Data visualization, Team management, Stakeholder management, Data modeling, Data management, Budget management

Apply
🔥 Middle NLP Engineer
Posted 1 day ago

📍 Serbia

🧭 Full-Time

🔍 Software Development

🏢 Company: Social Discovery Group 👥 501-1000 · Venture Capital · Finance · Information Technology

  • Proficiency in Python (pandas, NumPy, scikit-learn, matplotlib, Plotly); a toy baseline follows this list
  • Excellent coding skills
  • Strong knowledge of LLMs: APIs, best models, and prompting techniques
  • Familiarity with Data & ML production pipelines (Airflow, MLflow)
  • Proven experience working on ML solutions
  • Work on NLP-related tasks as part of the team, including writing prompts, creating functions, and fine-tuning models when necessary
  • Build ML/AI models, write services (if needed for the project), wrap them into Docker
  • Collaborate with a cross-functional team: Product Owner, Analytics, ML
  • Share ideas and knowledge with the team
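As a toy illustration of the Python stack this role names (scikit-learn in particular), here is a tiny text-classification baseline; the training data and labels are made up and the model choice is arbitrary.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up training data: 1 = positive, 0 = negative.
texts = ["great product", "terrible support", "love it", "never again"]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["support was great"]))
```

In practice a role like this would swap the toy classifier for prompted or fine-tuned LLMs, per the requirements above.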

AWS, Docker, Python, Data Analysis, GCP, Machine Learning, MLflow, NumPy, Airflow, Algorithms, Azure, REST API, Pandas

Apply

📍 Poland

🧭 Full-Time

🔍 Software Development

🏢 Company: Craft Machine Inc

  • 2+ years of experience in Data Engineering.
  • 2+ years of experience with Python.
  • Experience in developing, maintaining, and ensuring the reliability, scalability, fault tolerance and observability of data pipelines in a production environment.
  • Strong knowledge of SDLC and solid software engineering practices.
  • Knowledge of and experience with Amazon Web Services (AWS) and Databricks.
  • Demonstrated curiosity through asking questions, digging into new technologies, and always trying to grow.
  • Strong problem solving and the ability to communicate ideas effectively.
  • Familiar with infrastructure-as-code approach.
  • Self-starter, independent, likes to take initiative.
  • Have fundamental knowledge of data engineering techniques: ETL/ELT, batch and streaming, DWH, Data Lakes, distributed processing.
  • Familiarity with at least some technologies in our current tech stack: Python, PySpark, Pandas, SQL (PostgreSQL), Airflow, Docker, Databricks & AWS (S3, Batch, Athena, RDS, DynamoDB, Glue, ECS), CircleCI, GitHub, Terraform (a PySpark sketch follows this list)
  • Building and optimizing data pipelines (batch and streaming).
  • Extracting, analyzing and modeling of rich and diverse datasets.
  • Designing software that is easily testable and maintainable.
  • Support in setting data strategies and our vision.
  • Keep track of emerging technologies and trends in the Data Engineering world, incorporating modern tooling and best practices at Craft.
  • Work on extensible data processing systems that allow adding and scaling pipelines.
  • Applying machine learning techniques such as anomaly detection, clustering, regression, classification, and summarization to extract value from our data sets.
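Since the stack above names PySpark, here is a minimal batch transform of the kind such pipelines contain. The bucket, paths, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

# Hypothetical input: raw order events landed as Parquet on S3.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Aggregate completed orders into a per-day revenue table.
daily_revenue = (
    orders.filter(F.col("status") == "completed")
    .groupBy(F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")
```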

AWS, Docker, PostgreSQL, Python, SQL, ETL, Machine Learning, Airflow, Amazon Web Services, Data engineering, Pandas, CI/CD, Terraform, Data modeling, Software Engineering

Posted 2 days ago
Apply

📍 United States, Latin America, India

🔍 Software Development

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
  • Programming expertise in Java, Python and/or Scala
  • Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP
  • SQL and the ability to write, debug, and optimize SQL queries
  • Client-facing written and verbal communication skills and experience
  • 4-year Bachelor's degree in Computer Science or a related field
  • Develop end-to-end technical solutions into production and help ensure performance, security, scalability, and robust data integration.
  • Create and deliver detailed presentations
  • Detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views)

AWS, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Java, Kafka, Snowflake, Airflow, Azure, Data engineering, Spark, Communication Skills, Scala, Data modeling, Software Engineering

Posted 3 days ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

🏢 Company: TRM Labs 👥 101-250 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • 7+ years of hands-on experience architecting distributed systems, guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, etc.
  • Expertise in data processing technologies and streaming workflows, including Spark, Kafka, and Flink (see the consumer sketch after this list).
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
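To make the streaming requirement above concrete, here is a bare-bones consumer sketch using kafka-python; the topic, broker address, and message fields are hypothetical, not TRM's actual pipeline.

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "blockchain-transactions",           # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    tx = message.value
    # A real pipeline would enrich the record and write it to a data store.
    print(tx.get("hash"), tx.get("amount"))
```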

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kafka, Kubernetes, Airflow, Data engineering, Postgres, Spark, Terraform, Data modeling

Posted 3 days ago
Apply

📍 CA, CO, CT, FL, IL, MA, MD, NC, NJ, NY, OR, VA, VT, WA, United Kingdom

🧭 Full-Time

💸 175,000 - 191,300 USD per year

🔍 Crowdfunding

🏢 Company: Kickstarter PBC

  • 8+ years of experience in data engineering, analytics engineering, or related fields.
  • Strong experience with cloud-based data warehouses (Redshift, Snowflake, or BigQuery) and query performance optimization.
  • Expertise in SQL, Python, and data transformation frameworks like dbt.
  • Experience building scalable data pipelines with modern orchestration tools (Airflow, MWAA, Dagster, etc.).
  • Knowledge of real-time streaming architectures (Kafka, Kinesis, etc.) and event-based telemetry best practices.
  • Experience working with business intelligence tools (e.g. Looker) and enabling self-serve analytics.
  • Ability to drive cost-efficient and scalable data solutions, balancing performance with resource management.
  • Familiarity with machine learning operations (MLOps) and experimentation tooling is a plus.
  • Develop, own and improve Kickstarter’s data architecture—optimize our Redshift warehouse, implement best practices for data storage, processing, and orchestration.
  • Design and build scalable ETL/ELT pipelines to transform raw data into clean, usable datasets for analytics, product insights, and machine learning applications.
  • Enhance data accessibility and self-service analytics by improving Looker models and enabling better organizational data literacy.
  • Support real-time data needs by optimizing event-based telemetry and integrating new data streams to fuel new products, personalization, recommendations, and fraud detection.
  • Lead cost optimization efforts—identify and implement more efficient processes and tools to lower costs.
  • Drive data governance and security best practices—ensure data integrity, access controls, and proper lineage tracking.
  • Collaborate across teams to ensure data solutions align with product, growth, and business intelligence needs.

Python, SQL, ETL, Kafka, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 4 days ago
Apply

📍 São Paulo, Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

🔍 Data field

🏢 Company: TELUS Digital Brazil

  • 3+ years in the Data field
  • Experience in the construction and optimization of data pipelines, architectures, and 'big data' datasets
  • Proficiency with Apache Spark with a general-purpose programming language such as Python or Scala
  • Ability to create processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL databases
  • Experience with data pipeline and workflow management tools
  • Be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams;
  • Create and maintain optimal data pipeline architecture;
  • Assemble large, complex data sets that meet business requirements;
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

AWS, Python, SQL, Cloud Computing, ETL, GCP, Hadoop, Kafka, Airflow, Azure, Data engineering, Scala

Posted 7 days ago
Apply

📍 United States

🧭 Full-Time

💸 177,000 - 213,000 USD per year

🔍 FinTech

🏢 Company: Flex

  • A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
  • A minimum of 6 years of experience with Python and SQL.
  • A minimum of 3 years of industry experience using DBT.
  • A minimum of 3 years of industry experience using Snowflake and its basic features.
  • Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
  • Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
  • Industry experience working with relational and NoSQL databases in a production environment.
  • Strong fundamentals in data structures, algorithms, and design patterns.
  • Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
  • Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
  • Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
  • Create scalable real-time streaming pipelines and offline ETL pipelines (a Kinesis sketch follows this list).
  • Design, implement, and manage a data warehouse that provides secure access to large datasets.
  • Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Create engineering documentation for design, runbooks, and best practices.
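As a sketch of the real-time ingestion mentioned above, here is a minimal Kinesis producer using boto3; the stream name, region, and event shape are hypothetical.

```python
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # hypothetical region

# Hypothetical event; the partition key groups records for ordered processing.
event = {"user_id": 123, "action": "payment_submitted"}

kinesis.put_record(
    StreamName="example-events",  # hypothetical stream
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=str(event["user_id"]),
)
```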

AWS, Python, SQL, Bash, Design Patterns, ETL, Hadoop, Java, Kafka, Snowflake, Airflow, Algorithms, Cassandra, Data engineering, Data Structures, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, Terraform, Written communication, Documentation, Data modeling, Debugging

Posted 8 days ago
Apply

📍 United States

🧭 Full-Time

💸 185,000 - 210,000 USD per year

🔍 Data and Integrations

🏢 Company: Axonius 👥 600-600 💰 $200,000,000 Series E about 1 year ago · Asset Management · Cloud Security · Information Technology · Cyber Security · Network Security

  • 8+ years of experience in Data and related domains including big data pipelines, cloud data warehouses, SQL and NoSQL databases, data analysis and integrations.
  • 3+ years of experience recruiting and managing technical teams
  • Experience building and maintaining data catalogs and making data widely available
  • Experience collaborating with business partners to develop roadmaps and driving outcomes
  • Proficiency with one or more programming languages - Python/Scala/Java/Go.
  • Proficiency with SQL and experience with ETL and data modeling.
  • Experience working with and integrating with SaaS applications such as Salesforce, Marketo, Netsuite and Workday.
  • Experience in an agile development methodology
  • Bachelor's degree in Computer Science or equivalent experience.
  • Lead the overall data integration strategy, identifying key data sources and establishing data governance policies for data accuracy and consistency.
  • Collaborate with business functions across the enterprise to understand system capabilities and data needs, communicating integration plans and delivering data solutions.
  • Recruit and manage a team of data professionals (data engineers, integration specialists, etc.) to design, develop, and maintain data pipelines and a comprehensive data catalog.
  • Select and implement data integration tools to efficiently move data between systems.
  • Ensure data is user-friendly, performant, and accessible across the organization.
  • Ensure all data and pipelines adhere to data privacy regulations and security standards.
  • Transform raw data into models using dbt.

Leadership, Python, SQL, Agile, Cloud Computing, Data Analysis, ETL, Snowflake, Airflow, Algorithms, Data engineering, Data Structures, REST API, Communication Skills, Analytical Skills, Microsoft Excel, Problem Solving, Agile methodologies, Excellent communication skills, Recruitment, JSON, Data visualization, Team management, Data modeling, Data management, SaaS

Posted 9 days ago
Apply
🔥 Lead Engineer, BI
Posted 10 days ago

📍 United States, Canada

🔍 Business Intelligence

🏢 Company: OneSix - External

  • 6+ years of experience in business intelligence, data engineering, or a similar role.
  • Expert-level proficiency in SQL, data modeling, and ETL development.
  • Extensive experience with BI tools such as Power BI, Tableau, or Looker.
  • Deep understanding of cloud data platforms (AWS, Azure) and data warehousing solutions.
  • Strong programming skills in Python, R, or other scripting languages for data automation.
  • Proven ability to lead teams and manage large-scale BI initiatives.
  • Excellent problem-solving, strategic thinking, and stakeholder communication skills.
  • Experience with machine learning, AI-driven analytics, or predictive modeling.
  • Familiarity with data pipeline orchestration tools (Airflow, dbt, or similar).
  • Knowledge of version control (Git) and CI/CD for data workflows.
  • Background in Agile methodologies and project management.
  • Define and drive the BI strategy, ensuring alignment with business objectives.
  • Lead and mentor a team of BI engineers, fostering a culture of innovation and continuous improvement.
  • Collaborate with executives and key stakeholders to understand business needs and translate them into BI solutions.
  • Design, implement, and optimize scalable data models and architectures for reporting and analytics.
  • Oversee the development and maintenance of ETL pipelines, ensuring data integrity and performance.
  • Manage cloud-based data platforms (AWS, Azure, GCP) and data warehousing solutions (Snowflake)
  • Lead the development of enterprise-level dashboards, reports, and self-service BI tools.
  • Implement and optimize BI solutions using Power BI, Tableau, Looker, or similar tools.
  • Integrate advanced analytics, predictive modeling, and AI-driven insights into BI platforms.
  • Establish and enforce BI best practices, including data governance, security, and documentation.
  • Monitor and improve query performance, data processing efficiency, and system scalability.
  • Ensure compliance with data privacy and security regulations.
  • Work cross-functionally with data engineers, analysts, and business teams to align data strategies.
  • Act as a key advisor on data-driven initiatives, influencing decision-making at the leadership level.
  • Facilitate training and enablement programs to enhance BI adoption across the organization.

AWS, Leadership, Python, SQL, Agile, Business Intelligence, Cloud Computing, Data Analysis, ETL, Git, Machine Learning, Snowflake, Tableau, Airflow, Azure, Data engineering, CI/CD, Problem Solving, Excellent communication skills, Data visualization, Mentorship, Strategic thinking, Data modeling, Data management

Showing 10 of 106

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We’ve developed a well-thought-out home job matching service, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 
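As a toy illustration of the skill-filter idea (not our actual matching code), narrowing postings down to those that cover a seeker's core competencies can be as simple as a set comparison:

```python
# Made-up postings with skill tags already extracted.
postings = [
    {"title": "Data Engineer", "skills": {"python", "airflow", "sql"}},
    {"title": "BI Lead", "skills": {"sql", "tableau"}},
]

def matches(posting, required):
    # True when every required skill appears in the posting's tags.
    return required <= posting["skills"]

my_skills = {"python", "airflow"}
print([p["title"] for p in postings if matches(p, my_skills)])  # ['Data Engineer']
```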

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 remote full-time and part-time job offers from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply to up to 5 vacancies per day for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in online work from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.