Remote Data Analyst Jobs

Data engineering
798 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Brazil

🧭 Full-Time

πŸ” Software Development

🏒 Company: Grupo QuintoAndar

  • Solid understanding of the engineering challenges of deploying machine learning systems to production;
  • Solid understanding of systems, software and data engineering best practices;
  • Proficiency with cloud-based services;
  • Proficiency in Python or another major programming language;
  • Experience building backend systems, event-driven architectures, and REST/HTTP applications;
  • Experience building solutions with LLMs (RAG, fine-tuning; a minimal sketch follows this list);
  • Experience building AI agents and AI-powered applications, and familiarity with agentic frameworks (LangGraph, LangChain, AutoGen, CrewAI);
  • Experience leading teams and managing careers.
  • Lead a small team of Data Scientists and Machine Learning Engineers to build solutions based on AI/ML.
  • Be a technical reference to the team, including doing hands-on engineering work.
  • Shape the technical direction of our products, translating business requirements into solutions.
  • Discuss business requirements with the Product Manager and other stakeholders.
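
A minimal sketch of the RAG pattern named above, assuming the `openai` Python client; the documents, model names, and question are hypothetical stand-ins, and any provider or vector store could be swapped in.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Assumes the `openai` client and an OPENAI_API_KEY in the environment;
# DOCS is a toy in-memory corpus standing in for a real vector store.
from openai import OpenAI

client = OpenAI()

DOCS = [
    "Listings in Sao Paulo rent within 30 days on average.",
    "Deposit-free renting is available to qualified tenants.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [d.embedding for d in resp.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / ((sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5))

def answer(question):
    doc_vecs = embed(DOCS)
    q_vec = embed([question])[0]
    # Retrieve: pick the document most similar to the question.
    context = max(zip(DOCS, doc_vecs), key=lambda p: cosine(q_vec, p[1]))[0]
    # Generate: answer grounded in the retrieved context.
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Answer using this context: {context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("How fast do listings rent?"))
```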

AWS Β· Backend Development Β· Docker Β· Leadership Β· Project Management Β· Python Β· Software Development Β· SQL Β· Artificial Intelligence Β· Cloud Computing Β· Kubernetes Β· Machine Learning Β· Software Architecture Β· Algorithms Β· Data engineering Β· Data science Β· Data Structures Β· REST API Β· Communication Skills Β· CI/CD Β· Agile methodologies Β· RESTful APIs Β· Excellent communication skills Β· Problem-solving skills Β· Team management Β· Technical support Β· Data modeling

Posted about 4 hours ago
Apply

πŸ“ Portugal

🧭 Full-Time

πŸ” Real Estate Tech

🏒 Company: Grupo QuintoAndar

  • Solid understanding of the engineering challenges of deploying machine learning systems to production
  • Proficiency with cloud-based services
  • Proficiency in Python or another major programming language
  • Experience building backend systems, event-driven architectures, and REST/HTTP applications (a minimal REST example follows this list)
  • Experience building solutions with LLMs (RAG, fine-tuning)
  • Experience building AI agents and AI-powered applications, and familiarity with agentic frameworks (LangGraph, LangChain, AutoGen, CrewAI)
  • Lead a small team of Data Scientists and Machine Learning Engineers to build solutions based on AI/ML.
  • Be a technical reference to the team, including doing hands-on engineering work.
  • Shape the technical direction of our products, translating business requirements into solutions.
  • Discuss business requirements with the Product Manager and other stakeholders.
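
Since the skill tags below list Flask, here is a minimal sketch of the REST/HTTP backend work this role describes; the routes, payloads, and in-memory store are hypothetical.

```python
# Minimal Flask REST sketch: one read endpoint, one create endpoint.
# LISTINGS is a toy in-memory store standing in for a real database.
from flask import Flask, jsonify, request

app = Flask(__name__)
LISTINGS = {1: {"city": "Lisbon", "rent": 1200}}

@app.get("/listings/<int:listing_id>")
def get_listing(listing_id):
    listing = LISTINGS.get(listing_id)
    if listing is None:
        return jsonify(error="not found"), 404
    return jsonify(listing)

@app.post("/listings")
def create_listing():
    body = request.get_json(force=True)
    new_id = max(LISTINGS) + 1
    LISTINGS[new_id] = body
    return jsonify(id=new_id), 201

if __name__ == "__main__":
    app.run(debug=True)
```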

AWS Β· Backend Development Β· Docker Β· Leadership Β· Python Β· Software Development Β· SQL Β· Apache Airflow Β· Cloud Computing Β· Flask Β· Kubernetes Β· Machine Learning Β· Numpy Β· Data engineering Β· Data science Β· RDBMS Β· REST API Β· Pandas Β· Communication Skills Β· CI/CD Β· RESTful APIs

Posted about 4 hours ago
Apply
πŸ”₯ Engineering Manager (Data)
Posted about 6 hours ago

πŸ“ Romania

🧭 Full-Time

πŸ” Software Development

🏒 Company: Plain Concepts πŸ‘₯ 251-500 Β· Consulting Β· Apps Β· Mobile Apps Β· Information Technology Β· Mobile

  • At least 3 years of experience as a Delivery Manager, Engineering Manager, or similar role in software, data-intensive or analytics projects.
  • Proven experience managing client relationships and navigating stakeholder expectations.
  • Strong technical background in Data Engineering (e.g., Python, Spark, SQL) and Cloud Data Platforms (e.g., Azure Data Services, AWS, or similar).
  • Solid understanding of scalable software and data architectures, CI/CD practices for data pipelines, and cloud-native data solutions.
  • Experience with data pipelines, sensor integration, edge computing, or real-time analytics is a big plus (a minimal Spark sketch follows this list).
  • Ability to read, write, and discuss technical documentation with confidence.
  • Strong analytical and consultative skills to identify impactful opportunities.
  • Agile mindset, always focused on delivering real value fast.
  • Conflict resolution skills and a proactive approach to identifying and mitigating risks.
  • Understanding the business and technical objectives of data-driven projects.
  • Leading multidisciplinary teams to deliver scalable and robust software and data solutions on time and within budget.
  • Maintaining proactive and transparent communication with clients, helping them understand the impact of data products.
  • Supporting the team during key client interactions and solution presentations.
  • Designing scalable architectures for data ingestion, processing, and analytics.
  • Collaborating with data engineers, analysts, and data scientists to align solutions with client needs.
  • Ensuring the quality and scalability of data solutions and deliverables across cloud environments.
  • Analyzing system performance and recommending improvements using data-driven insights.
  • Providing hands-on technical guidance and mentorship to your team and clients when needed.
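
A minimal PySpark sketch of the batch data-pipeline work described above; the S3 paths, column names, and sensor-rollup aggregation are hypothetical.

```python
# Minimal PySpark batch pipeline: read raw sensor events, roll them
# up to daily averages per sensor, and write Parquet output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sensor-rollup").getOrCreate()

readings = spark.read.json("s3://example-bucket/sensor-readings/")  # hypothetical path

daily = (
    readings
    .withColumn("day", F.to_date("event_time"))
    .groupBy("sensor_id", "day")
    .agg(F.avg("value").alias("avg_value"), F.count("*").alias("n_readings"))
)

daily.write.mode("overwrite").parquet("s3://example-bucket/rollups/daily/")
```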

AWS Β· Python Β· SQL Β· Agile Β· Cloud Computing Β· Azure Β· Data engineering Β· Spark Β· Communication Skills Β· CI/CD Β· Client relationship management Β· Team management Β· Stakeholder management Β· Data analytics

Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 140000.0 - 160000.0 USD per year

πŸ” Software Development

🏒 Company: JobgetherπŸ‘₯ 11-50πŸ’° $1,493,585 Seed about 2 years agoInternet

  • 5+ years of experience building scalable backend applications and APIs.
  • Proficiency in Go, Python, or Java, with a strong grasp of SQL and NoSQL databases (e.g., Bigtable, BigQuery, DynamoDB).
  • Experience working with cloud infrastructure, preferably AWS or GCP, and CI/CD pipelines.
  • Familiarity with containerization technologies such as Docker and Kubernetes.
  • Strong problem-solving and analytical skills, with the ability to communicate complex concepts clearly.
  • Design and implement ETL pipelines capable of processing large-scale datasets efficiently (a minimal sketch follows this list).
  • Build and maintain robust APIs for data retrieval, including support for complex query types.
  • Architect scalable data storage and retrieval systems using SQL/NoSQL technologies.
  • Transform raw data into structured, high-value data products to support business and operational decisions.
  • Collaborate with internal stakeholders to align data architecture with product and customer needs.
  • Document technical processes and mentor junior team members.
  • Ensure performance, security, and scalability across the data platform.
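
A minimal, stdlib-only sketch of the memory-bounded ETL idea above: stream a large file in fixed-size batches rather than loading it whole. The file name, schema, and loader are hypothetical; a real sink might be a DynamoDB batch write or a warehouse load job.

```python
# Chunked extract-transform-load skeleton using generators, so memory
# use stays bounded regardless of input size.
import csv
from itertools import islice

def extract(path):
    with open(path, newline="") as fh:
        yield from csv.DictReader(fh)

def transform(rows):
    for row in rows:
        # Example transform: normalize currency to integer cents.
        yield {"id": row["id"], "amount_cents": int(float(row["amount"]) * 100)}

def batches(rows, size=500):
    it = iter(rows)
    while batch := list(islice(it, size)):
        yield batch

def load(batch):
    # Stand-in for a real sink, e.g. a DynamoDB batch write.
    print(f"loading {len(batch)} rows")

for chunk in batches(transform(extract("payments.csv"))):
    load(chunk)
```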

AWS Β· Backend Development Β· Docker Β· Python Β· SQL Β· DynamoDB Β· ETL Β· GCP Β· Java Β· Kubernetes Β· API testing Β· Data engineering Β· Go Β· NoSQL Β· CI/CD Β· RESTful APIs Β· Data modeling Β· Software Engineering Β· Data analytics

Posted about 6 hours ago
Apply
πŸ”₯ Enterprise Data Architect
Posted about 7 hours ago

πŸ“ United Kingdom

🧭 Full-Time

πŸ” Insurance

🏒 Company: dxcjobs

  • Experience in defining high-quality Logical Data Models and gaining approval for such models in a complex stakeholder environment.
  • Experience of stakeholder management including senior stakeholders, 3rd party partners and standards organisations.
  • Experience of working with industry standard data models and extending those models to support business requirements.
  • Experience of working in an insurance business domain and with insurance logical data models, including commercial insurance.
  • Experience with a wide variety of technical domains from green field microservice architectures to workflow management systems, integration tooling and mainframe.
  • Experience of logical-to-physical data mapping and of large-scale transformation projects moving from complex legacy system domains to modern digital applications (a toy mapping example follows this list).
  • Experience of defining transition plans and road maps for data migration, including both bulk and gradual migration.
  • Experience of defining patterns for and implementing BI/MI, analytics, and machine learning on large-volume datasets.
  • Experience of designing and implementing data security policies and regulatory compliance.
  • Experience of defining and implementing governance processes for data management.
  • Experience of working with external standards organisations.
  • Experience of working with TOGAF framework.
  • Define and manage an Enterprise Data Strategy, supporting the business strategy and objectives.
  • Conduct detailed analysis of business processes, requirements, and scenarios to produce high-quality Logical Data Models, and work with senior stakeholders, third party partners, and standards organisations to gain approval for finished artefacts.
  • Work with and customise Industry Standard Logical Data Models to produce and maintain models tailored to the specific business domain.
  • Define, document and maintain business value chain for enterprise data.
  • Analyse business processes to identify and document key data dependency through key end-to-end business process flows.
  • Work with internal and external development teams and technical architects to drive logical service architecture design, external API contract design, and internal physical data model design.
  • Work with Technical BAs to analyse existing system behaviour, data models, and business process implementations to facilitate logical-to-physical data mapping into existing systems and drive system integration designs and data transformation.
  • Design data models and patterns for downstream data exposure to facilitate accessibility for analytics, BI, MI, and reporting, both internal and external.
  • Work with external 3rd party standards bodies to define and maintain global messaging standards.
  • Based on business process analysis, understand and document life cycles of key data entities and define non-functional requirements for data consistency, integrity, availability, latency, and auditability.
  • Work with Security Architects to define and agree on data security classifications for data defined in data models.
  • Define and manage a data governance framework for the program and work with the existing architecture team to hand it over into BAU.
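
A toy illustration of the logical-to-physical mapping mentioned above: a logical entity declared once and rendered as physical DDL. The entity, attribute types, and type mapping are hypothetical, not any industry-standard insurance model.

```python
# Logical model declared as data, then mapped to a physical schema.
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    logical_type: str  # e.g. "Identifier", "Money", "Text"

@dataclass
class LogicalEntity:
    name: str
    attributes: list = field(default_factory=list)

# One place where logical types are bound to physical column types.
PHYSICAL_TYPES = {"Identifier": "BIGINT", "Money": "NUMERIC(12,2)", "Text": "VARCHAR(255)"}

def to_ddl(entity):
    cols = ",\n  ".join(
        f"{a.name} {PHYSICAL_TYPES[a.logical_type]}" for a in entity.attributes
    )
    return f"CREATE TABLE {entity.name.lower()} (\n  {cols}\n);"

policy = LogicalEntity("Policy", [
    Attribute("policy_id", "Identifier"),
    Attribute("annual_premium", "Money"),
    Attribute("insured_name", "Text"),
])
print(to_ddl(policy))
```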

SQL Β· Business Intelligence Β· Cloud Computing Β· Data Mining Β· ETL Β· API testing Β· Data engineering Β· REST API Β· Microservices Β· Data visualization Β· Stakeholder management Β· Data modeling Β· Data analytics Β· Data management

Apply
πŸ”₯ Operations Engineer
Posted about 8 hours ago

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: Arrived πŸ‘₯ 1-50

  • Software engineering background with strong programming skills in Python, SQL, and modern development practices
  • Experience building and maintaining production data systems, APIs, or backend services
  • Hands-on experience with data pipeline tools (dbt, Fivetran, or similar)
  • Strong SQL skills and experience with data warehouses (Snowflake)
  • Build production data systems – design and implement scalable data pipelines, ETL processes, and APIs that handle structured and unstructured data from multiple sources (a minimal sketch follows this list)
  • Develop internal applications – write maintainable code for internal tools, automation services, and integrations
  • Own system architecture – make technical decisions about data flow, system design, and technology stack choices that balance immediate needs with long-term scalability

AWS Β· Backend Development Β· Project Management Β· Python Β· SQL Β· ETL Β· Snowflake Β· Algorithms Β· API testing Β· Data engineering Β· Data Structures Β· REST API Β· Communication Skills Β· CI/CD Β· RESTful APIs Β· DevOps Β· Data visualization Β· Data modeling Β· Software Engineering Β· Data analytics Β· Data management

Apply
πŸ”₯ Senior Analytics Engineer
Posted about 8 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 150,000 - 200,000 USD per year

πŸ” Energy

🏒 Company: Arcadia πŸ‘₯ 501-1000 πŸ’° $125,000,000, over 2 years ago Β· Database Β· CleanTech Β· Renewable Energy Β· Clean Energy Β· Software

  • 4+ years as an Analytics Engineer or equivalent role; experience with dbt is strongly preferred
  • 6+ years, cumulatively, in the data space (data engineering, data science, analytics, or similar)
  • Expert-level understanding of conceptual data modeling and data mart design
  • An understanding of data structures and/or database design plus deep experience with SQL and Python
  • Experience building data pipelines and database management including Snowflake or similar
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Experience in technical leadership or mentorship
  • Strong communication and collaboration skills
  • Proven ability to solve complex problems in a dynamic and evolving environment
  • Transform, test, deploy, and document data to deliver clean and trustworthy data for analysis to end-users (a sketch of such data tests follows this list)
  • Collaborate with subject matter experts, engineers, and product managers to identify the most elegant and effective data structures to understand our constantly growing and evolving business
  • Help bring engineering best practices (reliability, modularity, test coverage, documentation) to our DAG and to our Data team generally
  • Collaborate with data engineers to build robust, tested, scalable ELT pipelines.
  • Data modeling: model raw data into clean, tested, and reusable datasets to represent our key business data concepts. Define the rules and requirements for the formats and attributes of data
  • Data transformation: build our data lakehouse by transforming raw data into meaningful, useful data elements through joining, filtering, and aggregating source data
  • Data documentation: create and maintain data documentation, including data definitions and understandable data descriptions, to enable broad-scale understanding of the use of data
  • Employ software engineering best practices to write code and coach analysts and data scientists to do the same
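
A hedged sketch of the "transform, test, document" workflow above, expressing three dbt-style data tests (unique, not_null, accepted_values) in pandas; the dataset and column names are hypothetical.

```python
# dbt-style data tests written as pandas assertions on a toy dataset.
import pandas as pd

df = pd.DataFrame({
    "account_id": [1, 2, 3],
    "status": ["active", "churned", "active"],
})

# unique: no duplicate primary keys
assert not df["account_id"].duplicated().any(), "account_id must be unique"

# not_null: required column is fully populated
assert df["status"].notna().all(), "status must not be null"

# accepted_values: enum-like column stays within its domain
assert df["status"].isin({"active", "churned"}).all(), "unexpected status value"

print("all data tests passed")
```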

AWS Β· Python Β· SQL Β· Data Analysis Β· ETL Β· Git Β· Snowflake Β· Data engineering Β· Data Structures Β· Communication Skills Β· Analytical Skills Β· Collaboration Β· Data visualization Β· Data modeling Β· Data management

Apply

πŸ“ Chile, Colombia

πŸ” Software Development

🏒 Company: OfferUpπŸ‘₯ 251-500πŸ’° $120,000,000 about 5 years agoπŸ«‚ Last layoff over 2 years agoMobile PaymentsMarketplaceE-CommerceE-Commerce PlatformsAppsMobileClassifieds

  • 3+ years of professional software development experience
  • Strong grounding in distributed systems for large-scale data processing
  • Ability to communicate technical information effectively to technical and non-technical audiences
  • Proficiency in SQL and Python
  • Experience leveraging open source data infrastructure projects, such as Airflow, Kafka, Avro, Parquet, Hadoop, Hive, HBase, Presto or Druid
  • Experience building scalable data pipelines and real-time data streams (a minimal consumer sketch follows this list)
  • Experience building software in AWS or a similar cloud environment
  • Experience with AWS services like Kinesis, Firehose, Lambda, Sagemaker is a big plus
  • Experience with GCP services like BigQuery, Cloud Functions is a big plus
  • Experience with operational tools like Terraform, Datadog and Pagerduty is a big plus
  • Design and develop applications to process large amounts of critical information to power analytics and user-facing features.
  • Monitor and resolve data pipeline or data integrity issues.
  • Work across multiple teams to understand their data needs.
  • Maintain and expand our data infrastructure.
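
A minimal real-time stream consumer sketch using the kafka-python client (one of several Kafka libraries; the posting does not name one). The topic, broker address, and event fields are hypothetical.

```python
# Consume JSON events from a Kafka topic and process them one by one.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "listing-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",     # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Stand-in for enrichment/aggregation before writing downstream.
    print(event.get("event_type"), event.get("listing_id"))
```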

AWS Β· Python Β· Software Development Β· SQL Β· Apache Airflow Β· Cloud Computing Β· GCP Β· Apache Kafka Β· Data engineering Β· Terraform Β· Data visualization

Posted about 8 hours ago
Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 118300.0 - 147800.0 USD per year

πŸ” Software Development

🏒 Company: TwilioπŸ‘₯ 5001-10000πŸ’° $378,215,525 Post-IPO Equity almost 4 years agoπŸ«‚ Last layoff over 1 year agoMessagingSMSMobile AppsEnterprise SoftwareSoftware

  • 3+ years of success in a solutions engineering (presales) role with 6+ years of progressive professional experience supporting technical products.
  • Experience with Data Engineering, Cloud Data Warehouses, Data Modeling, and APIs
  • Experience with marketing technology such as marketing automation, personalization, journey orchestration, or advertising.
  • Partner with Account Executives to execute pre-sales activities including opportunity qualification, discovery, demonstrations, Proof-of-Concept, RFP responses, and justifying business value.
  • Become an expert builder and evangelist of Segment's products and partner ecosystem.
  • Lead technical evaluations with our prospects to uncover their goals and technical pain, in order to design, demonstrate and present innovative solutions that unlock business-level gains.
  • Develop subject matter expertise on Customer Data Platforms (CDPs) and Segment's role within the customer data ecosystem (a minimal event-tracking sketch follows this list).
  • Act as a trusted advisor to consult with customers and prospects in order to influence their customer data strategy and architecture design.
  • Build a perspective on customer and market trends by listening to prospects and advocating for customer interests to influence Segment's strategy and vision.
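
A minimal sketch of feeding customer data into a CDP, assuming Segment's analytics-python library; the write key, user, and event payload are placeholders.

```python
# Identify a user and track an event via Segment's Python library.
import analytics

analytics.write_key = "YOUR_WRITE_KEY"  # placeholder

# Attach traits to a user profile in the CDP.
analytics.identify("user_123", {"plan": "enterprise", "industry": "retail"})

# Record a behavioral event tied to that user.
analytics.track("user_123", "Demo Requested", {"product": "Segment CDP"})

analytics.flush()  # block until queued events are delivered
```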

AWS Β· Cloud Computing Β· ETL Β· Data engineering Β· Data science Β· Communication Skills Β· Problem Solving Β· RESTful APIs Β· Sales experience Β· Marketing Β· Data modeling Β· Scripting Β· Data analytics Β· Data management Β· Customer Success

Posted about 10 hours ago
Apply
πŸ”₯ Data Platform Engineer
Posted about 10 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 180,000 - 220,000 USD per year

πŸ” Software Development

🏒 Company: Prepared πŸ‘₯ 51-100 πŸ’° $27,000,000 Series B, 8 months ago Β· Enterprise Software Β· Public Safety

  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g., streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal; a minimal DAG sketch follows this list)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
  • Ability to work independently and take initiative
  • Proficiency in containerization and orchestration tools (e.g., Docker, Kubernetes)
  • Design, implement, and maintain scalable data pipelines and infrastructure
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics
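
A minimal Apache Airflow DAG sketch for the orchestration work listed above, assuming Airflow 2.4+ (where `schedule=` is the scheduling parameter); the DAG id, schedule, and task bodies are hypothetical.

```python
# Two-task daily DAG: extract, then transform.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw call records")   # stand-in for a real extract step

def transform():
    print("clean and model records")  # stand-in for a real transform step

with DAG(
    dag_id="call_records_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # run transform only after extract succeeds
```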

AWS Β· Docker Β· PostgreSQL Β· Python Β· SQL Β· Apache Airflow Β· ETL Β· Kubernetes Β· Snowflake Β· Apache Kafka Β· Data engineering Β· Spark Β· Scala Β· Data modeling

Apply
Showing 10 of 798

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Remote Data Analyst Jobs

Data analysts are highly sought after as companies prioritize data-driven strategies. Whether you are just starting out or looking to advance your career from home, our platform is the place to find and apply for the best remote jobs worldwide. Register on Remoote.app today and explore data analyst openings at leading companies hiring across analytics!

Data Analyst Responsibilities

Analysts help businesses make smart decisions based on data. Their responsibilities include:

  • gathering, processing, and analyzing large datasets from various sources (a short example follows this list);
  • creating clear and informative reports and visualizations;
  • identifying patterns, trends, and anomalies in data;
  • supporting strategic business planning with data-driven insights;
  • collaborating with different departments to align analytics with company goals.
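
As a short example of the first two items, here is a toy pandas analysis that aggregates a dataset and surfaces a month-over-month trend; the data is invented.

```python
# Aggregate revenue by month and compute the month-over-month change.
import pandas as pd

sales = pd.DataFrame({
    "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
    "region": ["EU", "US", "EU", "US"],
    "revenue": [120, 200, 150, 180],
})

by_month = sales.groupby("month")["revenue"].sum()
print(by_month)
print("month-over-month change:")
print(by_month.pct_change().round(3))
```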

The skills employers most commonly look for in data analysts are:

  • experience with Excel, SQL, Python, or R;
  • knowledge of BI platforms such as Power BI, Tableau, Looker;
  • ability to visualize data and create reports;
  • skills in statistical analysis and machine learning (for advanced positions).

Requirements vary with the company's industry and the level of expertise required.

Find Top Remote Data Analyst Jobs with Remoote.app

Our service offers top advantages for those who want to find remote jobs quickly:

  • daily updated listings from a wide range of companies worldwide, so you can explore opportunities in multiple countries and regions;
  • advanced search filters tailored to your experience level, preferred work model (full-time, part-time), and location, making it easy to find the perfect match for your skills and lifestyle;
  • AI-powered vacancy text processing that saves you time by highlighting the most important information from the ad;
  • personalized notifications sent via email or Telegram, so you never miss new vacancies that fit your profile;
  • helpful resources including resume-building tools and interview tips to boost your chances of landing your ideal position.

You can apply to up to 5 jobs per day for free. Need more? Choose one of our flexible subscription plans (weekly, monthly, or annual) to maximize your chances of landing the perfect position.