Remote Working

Remote working from home offers convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today!

Data engineering
802 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

🔥 Director | Data Science
Posted about 1 hour ago

📍 United States

🧭 Full-Time

🔍 Healthcare

🏢 Company: Machinify 👥 51-100 💰 $10,000,000 Series A over 6 years ago | Artificial Intelligence (AI), Business Intelligence, Predictive Analytics, SaaS, Machine Learning, Analytics

  • 10+ years of data science experience, with at least 5 years in leadership roles, including at a start-up; proven track record of managing data teams and delivering complex, high-impact products from concept to deployment
  • Strong knowledge of data privacy regulations and best practices in data security
  • Exceptional team management abilities, with experience in building and leading high-performing teams
  • Ability to think strategically and execute methodically
  • Ability to drive change and inspire a distributed team
  • Strong problem-solving skills and a data-driven mindset
  • Ability to communicate effectively, collaborating with diverse groups to solve complex problems
  • Provide direction and guidance to a team of Senior and Staff Data Scientists, enabling them to do their best work
  • Collaborate with the leadership team to define key technical and business metrics and objectives
  • Translate objectives into internal team priorities and assignments
  • Drive sprints and work with cross-functional stakeholders to appropriately prioritize various initiatives to improve customer metrics
  • Hire, mentor and develop team members
  • Foster a culture of innovation, collaboration, and continuous improvement
  • Communicate technical concepts and strategies to technical and non-technical stakeholders effectively
  • Own the success of models in the field by continuously monitoring KPIs and initiating projects to improve quality (a minimal monitoring sketch follows this list).
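
As a candidate-facing illustration of that last responsibility, here is a minimal, hypothetical Python sketch of KPI monitoring: it computes weekly precision from logged predictions and flags weeks that fall below an invented quality bar. The table, threshold, and metric choice are assumptions for the example, not part of the listing.

```python
# Minimal, hypothetical KPI-monitoring sketch: weekly precision from
# logged predictions, flagging weeks below an illustrative quality bar.
import pandas as pd

preds = pd.DataFrame({
    "week":      ["2024-W18", "2024-W18", "2024-W19", "2024-W19"],
    "predicted": [1, 1, 1, 1],   # model flagged all four rows positive
    "actual":    [1, 0, 1, 1],   # ground-truth outcomes
})

# Precision per week over the rows the model flagged positive.
weekly_precision = preds[preds["predicted"] == 1].groupby("week")["actual"].mean()

THRESHOLD = 0.80  # invented quality bar
for week, precision in weekly_precision.items():
    status = "OK" if precision >= THRESHOLD else "INVESTIGATE"
    print(f"{week}: precision={precision:.2f} [{status}]")
```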

AWS, Leadership, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Keras, Machine Learning, MLFlow, Numpy, People Management, Cross-functional Team Leadership, Algorithms, Data engineering, Data science, Data Structures, Pandas, Spark, Tensorflow, Communication Skills, Problem Solving, Agile methodologies, RESTful APIs, Mentoring, Data visualization, Team management, Strategic thinking, Data modeling

Apply
🔥 Senior Data Engineer
Posted about 15 hours ago

📍 Worldwide

🧭 Full-Time

🔍 Software Development

🏢 Company: Kit 👥 11-50 💰 over 1 year ago | Education, Financial Services, Apps

  • Strong command of SQL, including DDL and DML.
  • Proficient in Python
  • Strong understanding of DBMS internals, including an appreciation for platform-specific nuances.
  • A willingness to work with Redshift and deeply understand its nuances (a staged-upsert sketch follows this list).
  • Familiarity with our key tools (Redshift, Segment, dbt, GitHub)
  • 8+ years in data, with at least 3 years specializing in Data Engineering
  • Proven track record managing and optimizing OLAP clusters
  • Experience refactoring problematic data pipelines without disrupting business operations
  • History of implementing data quality frameworks and validation processes
  • Dive into our Redshift warehouse, dbt models, and workflows.
  • Evaluate the CRM data lifecycle, including source extraction, warehouse ingestion, transformation, and reverse ETL.
  • Refine and start implementing your design for source extraction and warehouse ingestion.
  • Complete the implementation of the CRM source extraction/ingestion project and use the learnings to refine your approach in preparation for other, similar initiatives including, but by no means limited to, web traffic events and product usage logs.
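
For candidates curious what "Redshift nuances" can look like in practice, here is a minimal sketch of one long-standing pattern: staging new data and applying a delete-and-insert upsert in a single transaction. The cluster endpoint, credentials, table names, S3 path, and IAM role are all hypothetical, and psycopg2 is just one client library that speaks Redshift's Postgres-compatible protocol.

```python
# Minimal sketch of a staged Redshift upsert: COPY into a temp staging
# table, then delete-and-insert inside one transaction. All names,
# credentials, and ARNs below are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",  # hypothetical endpoint
    port=5439, dbname="analytics", user="etl_user", password="...",
)

with conn, conn.cursor() as cur:  # the with-block commits on success
    cur.execute("CREATE TEMP TABLE stage_subscribers (LIKE subscribers);")
    cur.execute("""
        COPY stage_subscribers
        FROM 's3://example-bucket/subscribers/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS PARQUET;
    """)
    # Replace existing versions of the incoming rows, then insert fresh ones.
    cur.execute("""
        DELETE FROM subscribers
        USING stage_subscribers
        WHERE subscribers.id = stage_subscribers.id;
    """)
    cur.execute("INSERT INTO subscribers SELECT * FROM stage_subscribers;")
```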

Python, SQL, ETL, Git, Data engineering, RDBMS, Data modeling, Data management

Apply

📍 Poland

💸 22,900 - 29,900 PLN per month

🔍 Threat Intelligence

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 2+ years of experience as a Data Engineer, working with large-scale distributed systems.
  • Proven expertise in Lakehouse architecture and Apache Hudi in production environments.
  • Experience with Airflow, Kafka, or streaming data pipelines.
  • Strong programming skills in Python and PySpark.
  • Comfortable working in a cloud-based environment (preferably AWS).
  • Design, build, and manage scalable data pipelines using Python, SQL, and PySpark.
  • Develop and maintain lakehouse architectures, with hands-on use of Apache Hudi for data versioning, upserts, and compaction (a minimal upsert sketch follows this list).
  • Implement efficient ETL/ELT processes for both batch and real-time data ingestion.
  • Optimize data storage and query performance across large datasets (partitioning, indexing, compaction).
  • Ensure data quality, governance, and lineage, integrating validation and monitoring into pipelines.
  • Work with cloud-native services (preferably AWS – S3, Athena, EMR) to support modern data workflows.
  • Collaborate closely with data scientists, analysts, and platform engineers to deliver reliable data infrastructure
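
As a small illustration of the Hudi upserts this role calls for, here is a minimal PySpark sketch writing a DataFrame to a Hudi table with the upsert operation; the table name, field names, and S3 path are hypothetical, and the session assumes the Hudi Spark bundle is on the classpath.

```python
# Minimal PySpark + Apache Hudi upsert sketch. Assumes the Hudi Spark
# bundle is available; table name, fields, and path are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hudi-upsert-sketch")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .getOrCreate()
)

df = spark.createDataFrame(
    [("u1", "2024-01-01", 10), ("u2", "2024-01-01", 7)],
    ["record_id", "event_date", "score"],
)

hudi_options = {
    "hoodie.table.name": "events",
    "hoodie.datasource.write.recordkey.field": "record_id",    # dedup key
    "hoodie.datasource.write.precombine.field": "event_date",  # newest wins
    "hoodie.datasource.write.partitionpath.field": "event_date",
    "hoodie.datasource.write.operation": "upsert",
}

(
    df.write.format("hudi")
    .options(**hudi_options)
    .mode("append")                           # append mode + upsert operation
    .save("s3://example-bucket/lake/events")  # hypothetical path
)
```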

AWS, Python, SQL, Cloud Computing, ETL, Kafka, Airflow, Data engineering, CI/CD, Terraform

Posted about 15 hours ago
Apply
🔥 Senior Data Governance Engineer
Posted about 16 hours ago

📍 Brazil

🔍 Data Governance

🏢 Company: TELUS Digital Brazil

  • At least 3 years of experience in Data Governance, Metadata Management, Data Cataloging, or Data Engineering.
  • Have actively participated in the design, implementation, and management of data governance frameworks and data catalogs.
  • Experience working with Collibra and a strong understanding of the Collibra Operating Model, workflows, domains, and policies.
  • Experience working with APIs (see the sketch after this list).
  • Experience with a low-code/no-code ETL tool, such as Informatica
  • Experience working with databases and data modeling projects, as well as practical experience utilizing SQL
  • Effective English communication - able to explain technical and non-technical concepts to different audiences
  • Experience with a general-purpose programming language such as Python or Scala
  • Ability to work well in teams and interact effectively with others
  • Ability to work independently and manage multiple tasks simultaneously while meeting deadlines
  • Conduct detailed assessments of the customer’s data governance framework and current-state Collibra implementation
  • Translate business needs into functional specifications for Collibra use cases
  • Serve as a trusted advisor to the customer’s data governance leadership
  • Lead requirement-gathering sessions and workshops with business users and technical teams
  • Collaborate with cross-functional teams to ensure data quality and support data-driven decision-making to strive for greater functionality in our data systems
  • Collaborate with project managers and product owners to assist in prioritizing, estimating, and planning development tasks
  • Provide constructive feedback and share expertise with fellow team members, fostering mutual growth and learning
  • Demonstrate a commitment to accessibility and ensure that your work considers and positively impacts others
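
Since the listing pairs Collibra with API work, here is a minimal, hypothetical Python sketch of querying a Collibra-style REST API for catalog assets. The instance URL, credentials, and query parameters are placeholders; verify endpoint paths against your instance's API documentation.

```python
# Minimal, hypothetical sketch of querying a Collibra-style REST API for
# catalog assets. URL, credentials, and parameters are placeholders.
import requests

BASE_URL = "https://example.collibra.com"  # hypothetical instance
session = requests.Session()
session.auth = ("svc_governance", "app-password")  # placeholder credentials

resp = session.get(
    f"{BASE_URL}/rest/2.0/assets",  # verify against your instance's API docs
    params={"name": "customer", "nameMatchMode": "ANYWHERE", "limit": 10},
    timeout=30,
)
resp.raise_for_status()

for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```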

Python, SQL, ETL, Data engineering, RESTful APIs, Data visualization, Data modeling, Data analytics, Data management

Apply
🔥 Health Data Analytics Engineer
Posted about 16 hours ago

📍 United States

🧭 Full-Time

💸 135,000 - 170,000 USD per year

🔍 Healthcare

🏢 Company: SmarterDx 👥 101-250 💰 $50,000,000 Series B about 1 year ago | Artificial Intelligence (AI), Hospital, Information Technology, Health Care

  • 3+ years of analytics engineering experience in the healthcare industry, involving clinical and/or billing/claims data.
  • You are very well-versed in SQL and ETL processes; significant experience with dbt is a must
  • You have experience in a general purpose programming language (Python, Java, Scala, Ruby, etc.)
  • You have strong experience in data modeling and its implementation in production data pipelines.
  • You are comfortable with the essentials of data orchestration
  • Designing, developing, and maintaining dbt data models that support our healthcare analytics products.
  • Integrating and transforming customer data to conform to our data specifications and standards.
  • Collaborating with cross-functional teams to translate data and business requirements into effective data models.
  • Configuring and improving data pipelines that integrate and connect the data models.
  • Conducting QA and testing on data models to ensure data accuracy, reliability, and performance (a minimal dbt invocation sketch follows this list).
  • Applying industry standards and best practices around data modeling, testing, and data pipelining.
  • Participating in a rota with other engineers to help investigate and resolve data related issues.
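
As a small illustration of the dbt workflow this role centers on, here is a minimal Python sketch that drives the dbt CLI to build a model selection and then run its tests; the selector name is hypothetical and a configured dbt project is assumed.

```python
# Minimal sketch: drive the dbt CLI from Python, building a selection of
# models and then running their tests. The selector name is hypothetical
# and a configured dbt project/profile is assumed.
import subprocess

def dbt(*args: str) -> None:
    """Run a dbt CLI command, raising if it exits non-zero."""
    subprocess.run(["dbt", *args], check=True)

dbt("run", "--select", "staging_claims")   # build the (hypothetical) models
dbt("test", "--select", "staging_claims")  # then run their schema/data tests
```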

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering, Data modeling, Data analytics

Apply

📍 United States

💸 175,000 - 200,000 USD per year

🔍 Manufacturing

  • 10+ years of experience in data, analytics, and solution architecture roles.
  • Proven success in a pre-sales or client-facing architecture role.
  • Deep understanding of Microsoft Azure data services, including: Azure Synapse, Data Factory, Data Lake, Databricks/Spark, Fabric, etc.
  • Experience aligning modern data architectures to business strategies in the manufacturing or industrial sector.
  • Experience scoping, pricing, and estimating delivery efforts in early-stage client engagements.
  • Clear and confident communicator who can build trust with technical and business stakeholders alike.
  • Lead strategic discovery sessions with executives and business stakeholders to understand business goals, challenges, and technical landscapes.
  • Define solution architectures and delivery roadmaps that align with client objectives and Microsoft best practices.
  • Scope, estimate, and present proposals for complex Data, AI, and Azure-based projects.
  • Architect end-to-end data and analytics solutions using Microsoft Azure (Data Lake, Synapse, ADF, Power BI, Microsoft Fabric, Azure AI Services, etc.).
  • Oversee solution quality from ideation through handoff to delivery teams.
  • Support and guide sales consultants to align solutions, progress deals, and effectively close business.
  • Collaborate with Delivery to ensure project success and customer satisfaction.
  • Contribute to thought leadership, IP development, and reusable solution frameworks.

SQLCloud ComputingETLMachine LearningMicrosoft AzureMicrosoft Power BIAzureData engineeringRESTful APIsData modelingData analyticsData management

Posted about 17 hours ago
Apply
🔥 Product Manager
Posted about 18 hours ago

📍 United States, Africa, Latin America, Asia

🧭 Full-Time

🔍 AI

🏢 Company: Pareto.AI 👥 101-250 💰 $4,500,000 Seed about 3 years ago | Software

  • 3+ years shipping data infrastructure or ML-adjacent SaaS products with measurable adoption.
  • Proven record of rapid, iterative releases—multiple significant deployments per quarter.
  • Competence with SQL and basic Python; can self-serve analysis without data-engineering support.
  • Familiarity with data-collection pipelines, LLM training stages and workflows, and privacy frameworks (GDPR, CCPA).
  • Comfortable translating technical detail into clear product choices.
  • Exceptional written communication; produce single-page briefs that unblock design and engineering the same day.
  • Own the end-to-end life cycle for assigned platform modules—requirements, prioritization, release, and iteration.
  • Define and monitor metrics for data latency, cost per datapoint, and quality compliance; drive course-corrections in real time (see the sketch after this list).
  • Conduct regular customer and stakeholder interviews; translate insights into backlog refinements and roadmap updates.
  • Facilitate sprint rituals and decision reviews, ensuring blockers are identified and resolved within service-level targets.
  • Present concise status and recommendations to leadership, backing arguments with data.
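
To make the metrics bullet concrete, here is a minimal pandas sketch computing two of the named metrics, data latency and cost per datapoint, from an invented batch log; the columns and numbers are assumptions for the example.

```python
# Minimal sketch: data latency and cost-per-datapoint from an invented
# batch log. Columns and values are illustrative assumptions.
import pandas as pd

batches = pd.DataFrame({
    "collected_at":   pd.to_datetime(["2024-05-01 10:00", "2024-05-01 11:00"]),
    "available_at":   pd.to_datetime(["2024-05-01 10:20", "2024-05-01 11:12"]),
    "batch_cost_usd": [4.00, 3.50],
    "datapoints":     [1000, 800],
})

batches["latency_min"] = (
    (batches["available_at"] - batches["collected_at"]).dt.total_seconds() / 60
)

print("p95 latency (min):", batches["latency_min"].quantile(0.95))
print("cost per datapoint (USD):",
      batches["batch_cost_usd"].sum() / batches["datapoints"].sum())
```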

Python, SQL, Data Analysis, Machine Learning, Product Management, Data engineering, RESTful APIs

Apply
🔥 Senior Data Engineer
Posted about 19 hours ago

📍 United States of America

🏢 Company: IDEXX

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Systems, Information Systems Engineering, or a related field plus 5 years of experience; or a Master’s degree in one of those fields plus 3 years of related professional experience.
  • Advanced SQL knowledge and experience working with relational databases, including Snowflake, Oracle, Redshift.
  • Experience with AWS or Azure cloud platforms
  • Experience with data pipeline and workflow scheduling tools: Apache Airflow, Informatica.
  • Experience with ETL/ELT tools and data processing techniques
  • Experience in database design, development, and modeling
  • 3 years of related professional experience with object-oriented languages: Python, Java, and Scala
  • Design and implement scalable, reliable distributed data processing frameworks and analytical infrastructure
  • Design metadata and schemas for assigned projects based on a logical model
  • Create scripts for physical data layout
  • Write scripts to load test data (see the sketch after this list)
  • Validate schema design
  • Develop and implement node cluster models for unstructured data storage and metadata
  • Design advanced Structured Query Language (SQL), data definition language (DDL), and Python scripts
  • Define, design, and implement data management, storage, backup and recovery solutions
  • Design automated software deployment functionality
  • Monitor structural performance and utilization, identifying problems and implementing solutions
  • Lead the creation of standards, best practices and new processes for operational integration of new technology solutions
  • Ensure environments are compliant with defined standards and operational procedures
  • Implement measures to ensure data accuracy and accessibility, constantly monitoring and refining the performance of data management systems
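
Several of the bullets above (physical layout scripts, test-data loads, schema validation) fit in one small script. Here is a minimal sketch using SQLite so it runs anywhere; warehouse dialects such as Snowflake, Redshift, or Oracle differ in types and constraints, and the table is hypothetical.

```python
# Minimal sketch: script the DDL, load test data, and validate the schema.
# SQLite keeps it self-contained; warehouse dialects will differ, and the
# table below is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE lab_results (
        result_id   INTEGER PRIMARY KEY,
        clinic_id   INTEGER NOT NULL,
        result_date TEXT    NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO lab_results (result_id, clinic_id, result_date) VALUES (?, ?, ?)",
    [(1, 101, "2024-05-01"), (2, 102, "2024-05-02")],
)
# Validate: the test rows should round-trip through the schema.
assert conn.execute("SELECT COUNT(*) FROM lab_results").fetchone()[0] == 2
print("schema OK")
```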

AWS, Python, SQL, Apache Airflow, Cloud Computing, ETL, Java, Oracle, Snowflake, Azure, Data engineering, Scala, Data modeling, Data management

Apply

📍 United States

🧭 Full-Time

💸 138,000 - 150,000 USD per year

🔍 Education

🏢 Company: Gradient Learning

  • 5+ years of experience working in the K-12 space
  • 3+ years of experience in data product development
  • 3+ years of experience translating complex data into educator-friendly visualizations using Tableau
  • 3+ years of people management experience
  • 3+ years of experience with Snowflake or comparable cloud-based data warehousing platforms (strongly preferred)
  • Experience using AI or machine learning to enhance data analysis or deliver scalable, educator-facing insights (strongly preferred)
  • Familiarity with LTI standards, LMS platforms, and education data interoperability; direct experience with Canvas LMS (strongly preferred)
  • Knowledge of data privacy, security, and protection standards, particularly as they relate to PII and educational data (FERPA, COPPA, etc.) (preferred)
  • Design, Refine, and Lead the Data & Insights Product Strategy
  • Oversee Data & Insights Product Development and Delivery
  • Strengthen Data Infrastructure in Partnership with Information Systems
  • Lead the Data & Insights Product Delivery Team

SQLData AnalysisETLMachine LearningPeople ManagementProduct ManagementSnowflakeUser Experience DesignCross-functional Team LeadershipTableauProduct DevelopmentData engineeringCommunication SkillsAgile methodologiesRESTful APIsData visualizationStakeholder managementStrategic thinkingData modelingData analyticsData management

Posted about 20 hours ago
Apply

📍 United States

🧭 Full-Time

🔍 Information Security

  • 5+ years of experience in security engineering, with a primary focus on SIEM platforms.
  • Hands-on experience with at least two of the following SIEM platforms: Splunk, Microsoft Sentinel, Elastic, Google SecOps, CrowdStrike NG-SIEM, LogScale
  • 2+ years of experience with Cribl or similar observability pipeline tools (e.g., Logstash, Fluentd, Kafka).
  • Strong knowledge of log formats, data normalization, and event correlation.
  • Familiarity with detection engineering, threat modeling, and MITRE ATT&CK framework.
  • Proficiency with scripting (e.g., Python, PowerShell, Bash) and regular expressions (a parsing sketch follows this list).
  • Deep understanding of logging from cloud (AWS, Azure, GCP) and on-prem environments.
  • Architect, implement, and maintain SIEM solutions with a focus on modern platforms
  • Design and manage log ingestion pipelines using tools such as Cribl Stream, Edge, or Search (or similar).
  • Optimize data routing, enrichment, and filtering to improve SIEM efficiency and cost control.
  • Collaborate with cybersecurity, DevOps, and cloud infrastructure teams to integrate log sources and telemetry data.
  • Develop custom parsers, dashboards, correlation rules, and alerting logic for security analytics and threat detection.
  • Maintain and enhance system reliability, scalability, and performance of logging infrastructure.
  • Provide expertise and guidance on log normalization, storage strategy, and data retention policies.
  • Lead incident response investigations and assist with root cause analysis leveraging SIEM insights.
  • Mentor junior engineers and contribute to strategic security monitoring initiatives.
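
As a small illustration of the regex-driven log normalization this role involves, here is a minimal Python sketch that parses an sshd-style authentication-failure line into a normalized event dict; the line format and field names are illustrative, not tied to any particular SIEM's schema.

```python
# Minimal sketch: normalize an sshd-style auth-failure log line with a
# regular expression. Line format and output fields are illustrative.
import re

LINE = ("May  1 10:02:17 host1 sshd[912]: "
        "Failed password for root from 203.0.113.7 port 52144 ssh2")

PATTERN = re.compile(
    r"(?P<ts>\w{3}\s+\d+\s[\d:]{8})\s(?P<host>\S+)\s"
    r"sshd\[\d+\]:\sFailed password for (?P<user>\S+) "
    r"from (?P<src_ip>[\d.]+) port (?P<src_port>\d+)"
)

match = PATTERN.search(LINE)
if match:
    # Normalized event, ready for enrichment and SIEM ingestion.
    event = {**match.groupdict(), "action": "auth_failure", "product": "sshd"}
    print(event)
```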

AWS, Python, Bash, Cloud Computing, GCP, Kafka, Kubernetes, API testing, Azure, Data engineering, CI/CD, RESTful APIs, Linux, DevOps, JSON, Ansible, Scripting

Posted about 21 hours ago
Apply
Showing 10 of 802 jobs.

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Do Job Seekers Choose Our Platform for Remote Work Opportunities?

We’ve developed a well-thought-out service for remote job matching, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Updates

Our platform features over 40,000 remote full-time and part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in online work from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and your preferred position.
  3. Receive notifications about new vacancies and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.