Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today!

Data engineering
798 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Brazil

🧭 Full-Time

πŸ” Software Development

🏒 Company: Grupo QuintoAndar

  • Solid understanding of the engineering challenges of deploying machine learning systems to production;
  • Solid understanding of systems, software, and data engineering best practices;
  • Proficiency with cloud-based services;
  • Proficiency in Python or another major programming language;
  • Experience building backend systems, event-driven architectures, and REST/HTTP applications;
  • Experience building solutions with LLMs (RAG, fine-tuning);
  • Experience building AI agents and AI-powered applications, with familiarity with agentic frameworks (LangGraph, LangChain, AutoGen, CrewAI);
  • Experience leading teams and managing careers.
  • Lead a small team of Data Scientists and Machine Learning Engineers to build solutions based on AI/ML.
  • Be a technical reference for the team, including doing hands-on engineering work.
  • Shape the technical direction of our products, translating business requirements into solutions.
  • Discuss business requirements with the Product Manager and other stakeholders.

AWS, Backend Development, Docker, Leadership, Project Management, Python, Software Development, SQL, Artificial Intelligence, Cloud Computing, Kubernetes, Machine Learning, Software Architecture, Algorithms, Data engineering, Data science, Data Structures, REST API, Communication Skills, CI/CD, Agile methodologies, RESTful APIs, Excellent communication skills, Problem-solving skills, Team management, Technical support, Data modeling

Posted about 8 hours ago
Apply

πŸ“ Portugal

🧭 Full-Time

πŸ” Real Estate Tech

🏒 Company: Grupo QuintoAndar

  • Solid understanding of the engineering challenges of deploying machine learning systems to production
  • Proficiency with cloud-based services
  • Proficiency in Python or another major programming language
  • Experience building backend systems, event-driven architectures, and REST/HTTP applications
  • Experience building solutions with LLMs (RAG, fine-tuning)
  • Experience building AI agents and AI-powered applications, with familiarity with agentic frameworks (LangGraph, LangChain, AutoGen, CrewAI)
  • Lead a small team of Data Scientists and Machine Learning Engineers to build solutions based on AI/ML.
  • Be a technical reference for the team, including doing hands-on engineering work.
  • Shape the technical direction of our products, translating business requirements into solutions.
  • Discuss business requirements with the Product Manager and other stakeholders.

AWS, Backend Development, Docker, Leadership, Python, Software Development, SQL, Apache Airflow, Cloud Computing, Flask, Kubernetes, Machine Learning, NumPy, Data engineering, Data science, RDBMS, REST API, Pandas, Communication Skills, CI/CD, RESTful APIs

Posted about 8 hours ago
Apply
🔥 Engineering Manager (Data)
Posted about 10 hours ago

πŸ“ Romania

🧭 Full-Time

πŸ” Software Development

🏒 Company: Plain ConceptsπŸ‘₯ 251-500ConsultingAppsMobile AppsInformation TechnologyMobile

  • At least 3 years of experience as a Delivery Manager, Engineering Manager, or similar role in software, data-intensive or analytics projects.
  • Proven experience managing client relationships and navigating stakeholder expectations.
  • Strong technical background in Data Engineering (e.g., Python, Spark, SQL) and Cloud Data Platforms (e.g., Azure Data Services, AWS, or similar).
  • Solid understanding of scalable software and data architectures, CI/CD practices for data pipelines, and cloud-native data solutions.
  • Experience with data pipelines, sensor integration, edge computing, or real-time analytics is a big plus.
  • Ability to read, write, and discuss technical documentation with confidence.
  • Strong analytical and consultative skills to identify impactful opportunities.
  • Agile mindset, always focused on delivering real value fast.
  • Conflict resolution skills and a proactive approach to identifying and mitigating risks.
  • Understanding the business and technical objectives of data-driven projects.
  • Leading multidisciplinary teams to deliver scalable and robust software and data solutions on time and within budget.
  • Maintaining proactive and transparent communication with clients, helping them understand the impact of data products.
  • Supporting the team during key client interactions and solution presentations.
  • Designing scalable architectures for data ingestion, processing, and analytics.
  • Collaborating with data engineers, analysts, and data scientists to align solutions with client needs.
  • Ensuring the quality and scalability of data solutions and deliverables across cloud environments.
  • Analyzing system performance and recommending improvements using data-driven insights.
  • Providing hands-on technical guidance and mentorship to your team and clients when needed.

AWS, Python, SQL, Agile, Cloud Computing, Azure, Data engineering, Spark, Communication Skills, CI/CD, Client relationship management, Team management, Stakeholder management, Data analytics

Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 140000.0 - 160000.0 USD per year

πŸ” Software Development

🏒 Company: JobgetherπŸ‘₯ 11-50πŸ’° $1,493,585 Seed about 2 years agoInternet

  • 5+ years of experience building scalable backend applications and APIs.
  • Proficiency in Go, Python, or Java, with a strong grasp of SQL and NoSQL databases (e.g., Bigtable, BigQuery, DynamoDB).
  • Experience working with cloud infrastructure, preferably AWS or GCP, and CI/CD pipelines.
  • Familiarity with containerization technologies such as Docker and Kubernetes.
  • Strong problem-solving and analytical skills, with the ability to communicate complex concepts clearly.
  • Design and implement ETL pipelines capable of processing large-scale datasets efficiently.
  • Build and maintain robust APIs for data retrieval, including support for complex query types.
  • Architect scalable data storage and retrieval systems using SQL/NoSQL technologies.
  • Transform raw data into structured, high-value data products to support business and operational decisions.
  • Collaborate with internal stakeholders to align data architecture with product and customer needs.
  • Document technical processes and mentor junior team members.
  • Ensure performance, security, and scalability across the data platform.

AWS, Backend Development, Docker, Python, SQL, DynamoDB, ETL, GCP, Java, Kubernetes, API testing, Data engineering, Go, NoSQL, CI/CD, RESTful APIs, Data modeling, Software Engineering, Data analytics

Posted about 10 hours ago
Apply
🔥 Enterprise Data Architect
Posted about 11 hours ago

πŸ“ GBR

🧭 Full-Time

πŸ” Insurance

🏒 Company: dxcjobs

  • Experience in defining high quality Logical Data Models and gaining approval for such models in a complex stakeholder environment.
  • Experience of stakeholder management including senior stakeholders, 3rd party partners and standards organisations.
  • Experience of working with industry standard data models and extending those models to support business requirements.
  • Experience of working in an insurance business domain and with insurance logical data models, including commercial insurance.
  • Experience with a wide variety of technical domains from green field microservice architectures to workflow management systems, integration tooling and mainframe.
  • Experience of Logical Physical data mapping and of large-scale transformation projects moving from complex legacy system domains to modern digital applications.
  • Experience of defining transition plans and road maps for data migration, including both bulk and gradual migration.
  • Experience of defining patterns for and implementing BI/MI, analytics, and machine learning on large-volume datasets.
  • Experience designing and implementing data security policies and regulatory compliance.
  • Experience of defining and implementing governance processes for data management.
  • Experience of working with external standards organisations.
  • Experience of working with TOGAF framework.
  • Define and manage an Enterprise Data Strategy, supporting the business strategy and objectives.
  • Conduct detailed analysis of business processes, requirements, and scenarios to produce high quality Logical Data Models and work with senior stakeholders, third party partners, and standards organisations to gain approval for finished artefacts.
  • Work with and customise Industry Standard Logical Data Models to produce and maintain models tailored to the specific business domain.
  • Define, document and maintain business value chain for enterprise data.
  • Analyse business processes to identify and document key data dependency through key end-to-end business process flows.
  • Work with internal and external development teams and technical architects to drive logical service architecture design, external API contract design, and internal physical data model design.
  • Work with Technical BAs to analyse existing system behaviour, data models, and business process implementations to facilitate logical-to-physical data mapping into existing systems, driving system integration designs and data transformation.
  • Design data models and patterns for downstream data exposure to facilitate accessibility for analytics, BI, MI, and reporting, both internal and external.
  • Work with external 3rd party standards bodies to define and maintain global messaging standards.
  • Based on business process analysis, understand and document life cycles of key data entities and define non-functional requirements for data consistency, integrity, availability, latency, and auditability.
  • Work with Security Architects to define and agree on data security classifications for data defined in data models.
  • Define and manage data governance framework for the program and work with existing architecture team to hand into BAU.

SQL, Business Intelligence, Cloud Computing, Data Mining, ETL, API testing, Data engineering, REST API, Microservices, Data visualization, Stakeholder management, Data modeling, Data analytics, Data management

Apply
🔥 Operations Engineer
Posted about 12 hours ago

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: ArrivedπŸ‘₯ 1-50

  • Software engineering background with strong programming skills in Python, SQL, and modern development practices
  • Experience building and maintaining production data systems, APIs, or backend services
  • Hands-on experience with data pipeline tools (dbt, Fivetran, or similar)
  • Strong SQL skills and experience with data warehouses (Snowflake)
  • Build production data systems – design and implement scalable data pipelines, ETL processes, and APIs that handle structured and unstructured data from multiple sources
  • Develop internal applications – write maintainable code for internal tools, automation services, and integrations
  • Own system architecture – make technical decisions about data flow, system design, and technology stack choices that balance immediate needs with long-term scalability

AWS, Backend Development, Project Management, Python, SQL, ETL, Snowflake, Algorithms, API testing, Data engineering, Data Structures, REST API, Communication Skills, CI/CD, RESTful APIs, DevOps, Data visualization, Data modeling, Software Engineering, Data analytics, Data management

Apply
🔥 Senior Analytics Engineer
Posted about 13 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 150000.0 - 200000.0 USD per year

πŸ” Energy

🏒 Company: ArcadiaπŸ‘₯ 501-1000πŸ’° $125,000,000 over 2 years agoDatabaseCleanTechRenewable EnergyClean EnergySoftware

  • 4+ years as an Analytics Engineer or equivalent role; experience with dbt is strongly preferred
  • 6+ years, cumulatively, in the data space (data engineering, data science, analytics, or similar)
  • Expert-level understanding of conceptual data modeling and data mart design
  • An understanding of data structures and/or database design plus deep experience with SQL and Python
  • Experience building data pipelines and database management including Snowflake or similar
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Experience in technical leadership or mentorship
  • Strong communication and collaboration skills
  • Proven ability to solve complex problems in a dynamic and evolving environment
  • Transform, test, deploy, and document data to deliver clean and trustworthy data for analysis to end-users
  • Collaborate with subject matter experts, engineers, and product managers to identify the most elegant and effective data structures to understand our constantly growing and evolving business
  • Help bring engineering best practices (reliability, modularity, test coverage, documentation) to our DAG and to our Data team generally
  • Collaborate with data engineers to build robust, tested, scalable ELT pipelines.
  • Data modeling: model raw data into clean, tested, and reusable datasets to represent our key business data concepts. Define the rules and requirements for the formats and attributes of data
  • Data transformation: build our data lakehouse by transforming raw data into meaningful, useful data elements through joining, filtering, and aggregating source data
  • Data documentation: create and maintain data documentation, including data definitions and understandable data descriptions, to enable broad-scale understanding of the use of data
  • Employ software engineering best practices to write code and coach analysts and data scientists to do the same

AWS, Python, SQL, Data Analysis, ETL, Git, Snowflake, Data engineering, Data Structures, Communication Skills, Analytical Skills, Collaboration, Data visualization, Data modeling, Data management

Apply

πŸ“ Chile, Colombia

πŸ” Software Development

🏒 Company: OfferUpπŸ‘₯ 251-500πŸ’° $120,000,000 about 5 years agoπŸ«‚ Last layoff over 2 years agoMobile PaymentsMarketplaceE-CommerceE-Commerce PlatformsAppsMobileClassifieds

  • 3+ years of professional software development experience
  • Strong ability with distributed systems for large-scale data processing
  • Ability to communicate technical information effectively to technical and non-technical audiences
  • Proficiency in SQL and Python
  • Experience leveraging open source data infrastructure projects, such as Airflow, Kafka, Avro, Parquet, Hadoop, Hive, HBase, Presto or Druid
  • Experience building scalable data pipelines and real-time data streams
  • Experience building software in AWS or a similar cloud environment
  • Experience with AWS services like Kinesis, Firehose, Lambda, Sagemaker is a big plus
  • Experience with GCP services like BigQuery, Cloud Functions is a big plus
  • Experience with operational tools like Terraform, Datadog and Pagerduty is a big plus
  • Design and develop applications to process large amounts of critical information to power analytics and user-facing features.
  • Monitor and resolve data pipeline or data integrity issues.
  • Work across multiple teams to understand their data needs.
  • Maintain and expand our data infrastructure.

AWS, Python, Software Development, SQL, Apache Airflow, Cloud Computing, GCP, Apache Kafka, Data engineering, Terraform, Data visualization

Posted about 13 hours ago
Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 118300.0 - 147800.0 USD per year

πŸ” Software Development

🏒 Company: TwilioπŸ‘₯ 5001-10000πŸ’° $378,215,525 Post-IPO Equity almost 4 years agoπŸ«‚ Last layoff over 1 year agoMessagingSMSMobile AppsEnterprise SoftwareSoftware

  • 3+ years of success in a solutions engineering (presales) role with 6+ years of progressive professional experience supporting technical products.
  • Experience with Data Engineering, Cloud Data Warehouses, Data Modeling, and APIs
  • Experience with marketing technology such as marketing automation, personalization, journey orchestration, or advertising.
  • Partner with Account Executives to execute pre-sales activities, including opportunity qualification, discovery, demonstrations, Proof-of-Concept, RFP responses, and justifying business value.
  • Become an expert builder and evangelist of Segment's products and partner ecosystem.
  • Lead technical evaluations with our prospects to uncover their goals and technical pain, in order to design, demonstrate, and present innovative solutions that unlock business-level gains.
  • Develop subject matter expertise on Customer Data Platforms (CDPs) and Segment's role within the customer data ecosystem.
  • Act as a trusted advisor, consulting with customers and prospects to influence their customer data strategy and architecture design.
  • Build a perspective on customer and market trends by listening to prospects and advocating for customer interests to influence Segment's strategy and vision.

AWS, Cloud Computing, ETL, Data engineering, Data science, Communication Skills, Problem Solving, RESTful APIs, Sales experience, Marketing, Data modeling, Scripting, Data analytics, Data management, Customer Success

Posted about 14 hours ago
Apply
🔥 Data Platform Engineer
Posted about 14 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 180000.0 - 220000.0 USD per year

πŸ” Software Development

🏒 Company: PreparedπŸ‘₯ 51-100πŸ’° $27,000,000 Series B 8 months agoEnterprise SoftwarePublic Safety

  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g., streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
  • Ability to work independently and take initiative
  • Proficiency in containerization and orchestration tools (e.g., Docker, Kubernetes)
  • Design, implement, and maintain scalable data pipelines and infrastructure
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics

AWS, Docker, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kubernetes, Snowflake, Apache Kafka, Data engineering, Spark, Scala, Data modeling

Apply
Showing 10 of 798 jobs

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We've developed a well-thought-out home-job matching service that makes the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 full-time and part-time remote work offers from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming – software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative – graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales – digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring – teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content – creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) – virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting – bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time – the ideal choice for those who value stability and predictability;
  • Part-time – perfect for those looking for a side job from home or seeking a balance between work and personal life;
  • Contract – suited for professionals who want to work on projects for a set period;
  • Temporary – short-term work that can be either full-time or part-time; these positions are often offered for seasonal or urgent tasks;
  • Internship – a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • For beginners – ideal positions for those just starting their journey in working from home online;
  • For intermediate specialists – if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • For experts – roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.