Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today, fast and easy!

Remote IT Jobs
Apache Airflow
152 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply
πŸ”₯ Senior Data Analyst

πŸ“ Ireland, United States (AZ, CA, CO, CT, FL, GA, IL, IN, IA, KS, MA, MI, MN, MO, NE, NJ, NY, NC, PA, SD, TN, TX, UT, WA, WI)

πŸ” Data Analysis

🏒 Company: Sojern Β· πŸ‘₯ 501-1000 Β· πŸ’° $9,842,000 over 1 year ago Β· πŸ«‚ Last layoff over 1 year ago Β· Digital Marketing, Advertising Platforms, SaaS, Travel

  • 3+ years of experience in a data-driven role, preferably in analytics, data engineering, or business intelligence.
  • Strong proficiency in SQL (writing complex queries, working with relational databases).
  • Exposure to building ETL pipelines
  • Familiarity with BI tools (Tableau, Power BI, Looker, etc.) and the ability to create meaningful data visualizations.
  • Ability to work in a fast-paced environment, manage multiple priorities, and communicate insights effectively to both technical and non-technical stakeholders.
  • Work closely with Sales, Finance, Product, and Marketing teams to understand business needs and translate them into data solutions.
  • Analyze large datasets to identify trends, performance metrics, and actionable insights that support strategic business initiatives.
  • Develop detailed reports and dashboards using Tableau to help stakeholders make data-driven decisions.
  • Collaborate with data scientists and operations analysts to support testing and analysis on campaigns.
  • Build, optimize, and maintain ETL pipelines to process and transform raw data into structured, meaningful datasets.
  • Work with SQL, Python, and BigQuery to extract, clean, and manipulate data from multiple sources.
  • Ensure data quality, integrity, and consistency across different data sources.

Python, SQL, Apache Airflow, Business Intelligence, Data Analysis, ETL, Jenkins, Tableau, Algorithms, Data engineering, Data Structures, REST API, Reporting, Data entry, Data visualization, Data modeling, Data analytics, Data management

Posted about 7 hours ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 232,000 - 310,000 USD per year

πŸ” Software Development

  • 10+ years of experience designing, developing and launching backend systems at scale using languages like Python or Kotlin.
  • Strong experience leading multiple engineering teams to deliver high quality software
  • Track record of successfully leading engineering teams at both rapidly scaling startups and complex larger technology companies.
  • Expertise in synthesizing complex technical requirements, designs, trade-offs, and capabilities into clear decisions to influence ML & engineering direction
  • Extensive experience developing highly available distributed systems using technologies like AWS, MySQL, Spark and Kubernetes.
  • Experience building and operating online, real-time ML infrastructure including a model server and a feature store
  • Experience developing an offline environment for large scale data analysis and model training using technologies including Spark, Kubeflow, Ray, and Airflow
  • Experience delivering major features and system components
  • Set the multi-year, multi-team technical strategy for ML Platform and deliver it through direct implementation or broad technical leadership
  • Partner with technical leaders across the company to create joint roadmaps that will achieve business impacting goals through the advancement of machine learning
  • Act as a force-multiplier for your teams through your definition and advocacy of technical solutions and operational processes
  • You have an ownership mindset, and you will proactively champion investments in availability so that every project in your area achieves its availability targets
  • You will foster a culture of quality and ownership on your team by setting system design standards for your team, and advocating for them beyond your team through your writing and tech talks
  • You will help develop talent on your team by providing feedback and guidance, and leading by example

AWS, Backend Development, Leadership, Project Management, Python, Apache Airflow, Data Analysis, Kotlin, Kubeflow, Kubernetes, Machine Learning, MySQL, Software Architecture, Cross-functional Team Leadership, Data engineering, Spark, Communication Skills, RESTful APIs, DevOps

Posted about 13 hours ago
Apply

πŸ“ Argentina, Colombia, Peru, Bolivia, Plurinational State of, Mexico

🧭 Contract

πŸ’Έ 2,300 - 2,500 USD per month

πŸ” Software Development

🏒 Company: Workana

  • Experience with Selenium, Puppeteer or Playwright.
  • Experience with Java (Spring Boot, Jsoup, HttpClient) and Python (Scrapy, Selenium, Playwright, FastAPI).
  • Experience with pandas and NumPy.
  • Knowledge in SQL databases (PostgreSQL, MySQL, SQL Server).
  • Implementation of scrapers in scalable environments with Docker and Kubernetes.
  • Deployment in AWS or GCP.
  • Development and maintenance of scrapers for extracting data from web portals.
  • Refactorization and optimization of legacy scrapers in Java towards Python.
  • Implementation of more efficient architectures to improve performance.

AWS, Backend Development, Docker, GraphQL, PostgreSQL, Python, SQL, Apache Airflow, ETL, GCP, Java, Java EE, Kubernetes, MySQL, NumPy, Spring Boot, Data engineering, FastAPI, REST API, Pandas, Selenium, CI/CD, Microservices, JSON, Data management

Posted 1 day ago
Apply

πŸ“ Lithuania

πŸ’Έ 4,000 - 6,000 EUR per month

πŸ” Software Development

🏒 Company: Softeta

  • 4+ years of experience as a Data Engineer
  • Experience with Azure (Certifications are a Plus)
  • Experience with Databricks, Azure Data Lake, Data Factory and Apache Airflow
  • CI/CD or infrastructure as code
  • Knowledge of Medallion Architecture or Multihop architecture
  • Experience developing and administering ETL processes in the Cloud (Azure, AWS or GCP) environment
  • Strong programming skills in Python and SQL
  • Strong problem-solving and analytical skills
  • Design, develop, and maintain data pipelines and ETL processes
  • Data modeling, data cleansing
  • Automating data processing workflows using tools such as Airflow or other workflow management tools
  • Optimizing the performance of databases, including designing and implementing data structures and using indexes appropriately
  • Implement data quality and data governance processes
  • Being a data advocate and helping unlock business value by using data

Python, SQL, Apache Airflow, ETL, Azure, Data engineering, CI/CD, Data modeling

Posted 1 day ago
Apply

πŸ“ United States

πŸ’Έ 144,000 - 180,000 USD per year

πŸ” Software Development

🏒 Company: Hungryroot Β· πŸ‘₯ 101-250 Β· πŸ’° $40,000,000 Series C almost 4 years ago Β· Artificial Intelligence (AI), Food and Beverage, E-Commerce, Retail, Consumer Goods, Software

  • 5+ years of experience in ETL development and data modeling
  • 5+ years of experience in both Scala and Python
  • 5+ years of experience in Spark
  • Excellent problem-solving skills and the ability to translate business problems into practical solutions
  • 2+ years of experience working with the Databricks Platform
  • Develop pipelines in Spark (Python + Scala) in the Databricks Platform
  • Build cross-functional working relationships with business partners in Food Analytics, Operations, Marketing, and Web/App Development teams to power pipeline development for the business
  • Ensure system reliability and performance
  • Deploy and maintain data pipelines in production
  • Set an example of code quality, data quality, and best practices
  • Work with Analysts and Data Engineers to enable high quality self-service analytics for all of Hungryroot
  • Investigate datasets to answer business questions, ensuring data quality and business assumptions are understood before deploying a pipeline

AWS, Python, SQL, Apache Airflow, Data Mining, ETL, Snowflake, Algorithms, Amazon Web Services, Data engineering, Data Structures, Spark, CI/CD, RESTful APIs, Microservices, JSON, Scala, Data visualization, Data modeling, Data analytics, Data management

Posted 1 day ago
Apply

πŸ“ India

πŸ” Software Development

🏒 Company: Apollo.io Β· πŸ‘₯ 501-1000 Β· πŸ’° $100,000,000 Series D over 1 year ago Β· Software Development

  • 8+ years of experience building Machine Learning or AI systems
  • Experience deploying and managing machine learning models in the cloud
  • Experience working with fine tuning LLMs and prompt engineering
  • Strong analytical and problem-solving skills
  • Proven software engineering skills in production environment, primarily using Python
  • Experience with Machine Learning software tools and libraries (e.g., Scikit-learn, TensorFlow, Keras, PyTorch, etc.)
  • Design, build, evaluate, deploy and iterate on scalable Machine Learning systems
  • Understand the Machine Learning stack at Apollo and continuously improve it
  • Build systems that help Apollo personalize their users’ experience
  • Evaluate the performance of machine learning systems against business objectives
  • Develop and maintain scalable data pipelines that power our algorithms
  • Implement automated monitoring, alerting, self-healing (restartable/graceful failures) features while productionizing data & ML workflows
  • Write unit/integration tests and contribute to engineering wiki

Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, Keras, Machine Learning, MLFlow, NumPy, PyTorch, Algorithms, REST API, TensorFlow, Software Engineering

Posted 1 day ago
Apply

πŸ“ United States

πŸ’Έ 255,000 - 300,000 USD per year

πŸ” Mental Healthcare

🏒 Company: Headway Β· πŸ‘₯ 201-500 Β· πŸ’° $125,000,000 Series C over 1 year ago Β· Mental Health Care

  • 5+ years of experience in a senior data leadership role with proven experience building high-functioning data or analytics team inΒ  a fast-paced environment
  • 5+ years of hands-on experience working with data at various levels
  • Understanding of various technology stacks, and be native in different tools and languages
  • Strong at simplifying data, be able to explain complex data stories to different stakeholders
  • Hire, inspire, and manage a growing team of data and product analysts
  • Partner with the Engineering, Infrastructure, Product, Operations, Finance and Business functions closely to define the direction of the data team and to drive results
  • Be responsible for the tracking and communicating of KPIs, both through central reporting and liberalization of the data
  • Be a key decision-maker on projects involving data within the company
  • Be a key ownerΒ  in insuring the successful execution of the various projects involving data
  • Be a critical thinker, able to quickly grasp metrics, numbers, financial information, and converse with the various stakeholders in a thoughtful manner

AWS, Leadership, Project Management, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Machine Learning, People Management, Cross-functional Team Leadership, Product Analytics, Data engineering, Data science, Communication Skills, Analytical Skills, Reporting, Data visualization, Stakeholder management, Data modeling, Data analytics, Data management

Posted 3 days ago
Apply

πŸ“ United States

πŸ” Software Development

🏒 Company: ge_externalsite

  • Exposure to industry standard data modeling tools (e.g., ERWin, ER Studio, etc.).
  • Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend
  • Exposure to industry standard data catalog, automated data discovery and data lineage tools (e.g., Alation, Collibra, TAMR etc., )
  • Hands-on experience in programming languages like Java, Python or Scala
  • Hands-on experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or HiveQL
  • Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (i.e. Cassandra or HBase)
  • Exposure to unstructured datasets and ability to handle XML, JSON file formats
  • Work independently as well as with a team to develop and support Ingestion jobs
  • Evaluate and understand various data sources (databases, APIs, flat files etc. to determine optimal ingestion strategies
  • Develop a comprehensive data ingestion architecture, including data pipelines, data transformation logic, and data quality checks, considering scalability and performance requirements.
  • Choose appropriate data ingestion tools and frameworks based on data volume, velocity, and complexity
  • Design and build data pipelines to extract, transform, and load data from source systems to target destinations, ensuring data integrity and consistency
  • Implement data quality checks and validation mechanisms throughout the ingestion process to identify and address data issues
  • Monitor and optimize data ingestion pipelines to ensure efficient data processing and timely delivery
  • Set up monitoring systems to track data ingestion performance, identify potential bottlenecks, and trigger alerts for issues
  • Work closely with data engineers, data analysts, and business stakeholders to understand data requirements and align ingestion strategies with business objectives.
  • Build technical data dictionaries and support business glossaries to analyze the datasets
  • Perform data profiling and data analysis for source systems, manually maintained data, machine generated data and target data repositories
  • Build both logical and physical data models for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) solutions
  • Develop and maintain data mapping specifications based on the results of data analysis and functional requirements
  • Perform a variety of data loads & data transformations using multiple tools and technologies.
  • Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
  • Maintain metadata structures needed for building reusable Extract, Transform & Load (ETL) components.
  • Analyze reference datasets and familiarize with Master Data Management (MDM) tools.
  • Analyze the impact of downstream systems and products
  • Derive solutions and make recommendations from deep dive data analysis.
  • Design and build Data Quality (DQ) rules needed

AWS, PostgreSQL, Python, SQL, Apache Airflow, Apache Hadoop, Data Analysis, Data Mining, Erwin, ETL, Hadoop HDFS, Java, Kafka, MySQL, Oracle, Snowflake, Cassandra, ClickHouse, Data engineering, Data Structures, REST API, NoSQL, Spark, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted 3 days ago
Apply

πŸ“ United States

πŸ” Medical device, pharmaceutical, clinical, or biotechnology

🏒 Company: Jobgether Β· πŸ‘₯ 11-50 Β· πŸ’° $1,493,585 Seed about 2 years ago Β· Internet

  • Proficiency in SQL and programming with R or Python
  • Experience with Google Cloud Platform (BigQuery, Storage, Compute Engine) is highly valued.
  • Strong problem-solving skills and the ability to work independently or as part of a team.
  • Excellent communication skills, able to convey complex statistical concepts to non-technical stakeholders.
  • Organize and merge data from diverse sources, ensuring data quality and integrity.
  • Identify and resolve bottlenecks in data processing and analysis, implementing solutions like automation and optimization.
  • Collaborate with clinical and technical teams to streamline data collection and entry processes.
  • Perform statistical analysis, data visualization, and generate reports for clinical studies and other projects.
  • Prepare summary statistics, tables, figures, and listings for presentations and publications.

Python, SQL, Apache Airflow, Data Mining, ETL, GCP, Algorithms, Data engineering, Pandas, Spark, Data visualization, Data modeling, Data analytics, Data management

Posted 3 days ago
Apply

πŸ“ States of SΓ£o Paulo and Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

πŸ” Data Engineering

🏒 Company: TELUS Digital Brazil

  • At least 3 years of experience as Data Engineer
  • Have actively participated in the design and development of data architectures
  • Hands-on experience in developing and optimizing data pipelines
  • Experience working with databases and data modeling projects, as well as practical experience utilizing SQL
  • Effective English communication - able to explain technical and non-technical concepts to different audiences
  • Experience with a general-purpose programming language such as Python or Scala
  • Ability to work well in teams and interact effectively with others
  • Ability to work independently and manage multiple tasks simultaneously while meeting deadlines
  • Develop and optimize scalable, high-performing, secure, and reliable data pipelines that address diverse business needs and considerations
  • Identify opportunities to enhance internal processes, implement automation to streamline manual tasks, and contribute to infrastructure redesign
  • Act as a guide and mentor to junior engineers, supporting their professional growth and fostering an inclusive working environment
  • Collaborate with cross-functional teams to ensure data quality and support data-driven decision-making to strive for greater functionality in our data systems
  • Collaborate with project managers and product owners to assist in prioritizing, estimating, and planning development tasks
  • Provide constructive feedback, and share expertise with fellow team members, fostering mutual growth and learning
  • Engage in ongoing research and adoption of new technologies, libraries, frameworks, and best practices to enhance the capabilities of the data team
  • Demonstrate a commitment to accessibility and ensure that your work considers and positively impacts others

AWS, Docker, Python, SQL, Agile, Apache Airflow, Cloud Computing, ETL, Kubernetes, Data engineering, Data science, Communication Skills, Analytical Skills, Teamwork, Data modeling, English communication

Posted 5 days ago
Apply
Showing 10 of 152

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We've developed a well-thought-out service for home job matching, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 full-time and part-time remote work offers from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming β€” software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative β€” graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales β€” digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring β€” teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content β€” creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) β€” Virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting β€” bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time β€” the ideal choice for those who value stability and predictability;
  • part-time β€” perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract β€” suited for professionals who want to work on projects for a set period.
  • Temporary β€” short-term work that can be either full-time or part-time. These positions are often offered for seasonal or urgent tasks;
  • Internship β€” a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners β€” ideal positions for those just starting their journey in internet working from home;
  • for intermediate specialists β€” if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts β€” roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and your preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.