Remote Data Analyst Jobs

ETL
702 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 140,000 - 160,000 USD per year

πŸ” Software Development

🏒 Company: Jobgether | πŸ‘₯ 11-50 | πŸ’° $1,493,585 Seed about 2 years ago | Internet

  • 5+ years of experience building scalable backend applications and APIs.
  • Proficiency in Go, Python, or Java, with a strong grasp of SQL and NoSQL databases (e.g., Bigtable, BigQuery, DynamoDB).
  • Experience working with cloud infrastructure, preferably AWS or GCP, and CI/CD pipelines.
  • Familiarity with containerization technologies such as Docker and Kubernetes.
  • Strong problem-solving and analytical skills, with the ability to communicate complex concepts clearly.
  • Design and implement ETL pipelines capable of processing large-scale datasets efficiently (see the sketch after this list).
  • Build and maintain robust APIs for data retrieval, including support for complex query types.
  • Architect scalable data storage and retrieval systems using SQL/NoSQL technologies.
  • Transform raw data into structured, high-value data products to support business and operational decisions.
  • Collaborate with internal stakeholders to align data architecture with product and customer needs.
  • Document technical processes and mentor junior team members.
  • Ensure performance, security, and scalability across the data platform.
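
Below is a minimal sketch of the kind of ETL pipeline this listing describes, using only the Python standard library; the source URL, column names, and table are invented placeholders rather than the employer's actual stack.

```python
# Hypothetical ETL sketch: source URL, schema, and table name are
# illustrative assumptions only.
import csv
import io
import sqlite3
import urllib.request

SOURCE_URL = "https://example.com/events.csv"  # placeholder source


def extract(url: str) -> list[dict]:
    """Pull raw CSV rows from an HTTP source."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))


def transform(rows: list[dict]) -> list[tuple]:
    """Drop invalid rows and normalize types for loading."""
    records = []
    for row in rows:
        if not row.get("user_id"):
            continue  # skip rows missing the join key
        records.append((row["user_id"], row["event"], float(row.get("value") or 0)))
    return records


def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
    """Append transformed records to a local SQL table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS events (user_id TEXT, event TEXT, value REAL)"
    )
    con.executemany("INSERT INTO events VALUES (?, ?, ?)", records)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract(SOURCE_URL)))
```

A production version would swap the placeholders for real sources and a warehouse such as BigQuery or DynamoDB, as the listing's tag line suggests.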

AWS, Backend Development, Docker, Python, SQL, DynamoDB, ETL, GCP, Java, Kubernetes, API testing, Data engineering, Go, NoSQL, CI/CD, RESTful APIs, Data modeling, Software Engineering, Data analytics

Posted about 6 hours ago
Apply
πŸ”₯ Enterprise Data Architect
Posted about 6 hours ago

πŸ“ GBR

🧭 Full-Time

πŸ” Insurance

🏒 Company: dxcjobs

  • Experience in defining high-quality Logical Data Models and gaining approval for such models in a complex stakeholder environment (a toy model fragment follows this list).
  • Experience of stakeholder management including senior stakeholders, 3rd party partners and standards organisations.
  • Experience of working with industry standard data models and extending those models to support business requirements.
  • Experience of working in an insurance business domain and with insurance logical data models, including commercial insurance.
  • Experience with a wide variety of technical domains, from greenfield microservice architectures to workflow management systems, integration tooling, and mainframe.
  • Experience of logical-to-physical data mapping and of large-scale transformation projects moving from complex legacy system domains to modern digital applications.
  • Experience of defining transition plans and road maps for data migration, including both bulk and gradual migration.
  • Experience of defining patterns for and implementing BI/MI, analytics, and machine learning on large-volume datasets.
  • Experience of designing and implementing data security policies and regulatory compliance.
  • Experience of defining and implementing governance processes for data management.
  • Experience of working with external standards organisations.
  • Experience of working with TOGAF framework.
  • Define and manage an Enterprise Data Strategy, supporting the business strategy and objectives.
  • Conduct detailed analysis of business processes, requirements, and scenarios to produce high quality Logical Data Models and work with senior stakeholders, third party partners, and standards organisations to gain approval for finished artefacts.
  • Work with and customise Industry Standard Logical Data Models to produce and maintain models tailored to the specific business domain.
  • Define, document, and maintain the business value chain for enterprise data.
  • Analyse business processes to identify and document key data dependencies through key end-to-end business process flows.
  • Work with internal and external development teams and technical architects to drive logical service architecture design, external API contract design, and internal physical data model design.
  • Work with Technical BAs to analyse existing system behaviour, data models, and business process implementations to facilitate logical-to-physical data mapping into existing systems, driving system integration designs and data transformation.
  • Design data models and patterns for downstream data exposure to facilitate accessibility for analytics, BI, MI, and reporting, both internal and external.
  • Work with external 3rd party standards bodies to define and maintain global messaging standards.
  • Based on business process analysis, understand and document life cycles of key data entities and define non-functional requirements for data consistency, integrity, availability, latency, and auditability.
  • Work with Security Architects to define and agree on data security classifications for data defined in data models.
  • Define and manage the data governance framework for the program and work with the existing architecture team to hand over into BAU.
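
As a toy illustration of the logical modelling work described above, here is a fragment of a hypothetical insurance data model expressed as Python dataclasses; all entity and attribute names are invented and do not come from any industry-standard model.

```python
# Hypothetical logical-model fragment; names are illustrative only.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Party:
    party_id: str
    name: str


@dataclass
class Coverage:
    coverage_id: str
    peril: str            # e.g. "fire", "flood"
    limit_amount: float


@dataclass
class Policy:
    policy_id: str
    insured: Party
    inception: date
    expiry: date
    coverages: list[Coverage] = field(default_factory=list)


policy = Policy(
    "P-001",
    Party("PTY-1", "Acme Ltd"),
    date(2025, 1, 1),
    date(2026, 1, 1),
    [Coverage("C-1", "fire", 1_000_000.0)],
)
```

In practice the logical model would live in a modelling tool and be mapped to physical schemas, but the entity-relationship shape is the same idea.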

SQL, Business Intelligence, Cloud Computing, Data Mining, ETL, API testing, Data engineering, REST API, Microservices, Data visualization, Stakeholder management, Data modeling, Data analytics, Data management

Posted about 6 hours ago
Apply

πŸ“ United States, Canada

πŸ” Healthcare

🏒 Company: Veradigm | πŸ‘₯ 5001-10000 | πŸ’° $100,000,000 Post-IPO Equity almost 10 years ago | Information Services, Electronic Health Record (EHR), Hospital, Information Technology, Health Care

  • Master’s degree in Data Science, Computer Science, Biomedical Informatics, or a related field.
  • Proficiency in Python and SQL.
  • Experience developing ML models, particularly with NLP techniques and libraries (spaCy, scikit-learn, etc.).
  • Experience with Azure OpenAI or other large language models.
  • Familiarity with EMR systems, healthcare data standards (e.g. ICD-10, SNOMED).
  • Exposure to cloud environments and big data tools (e.g., Snowflake, Azure, Spark).
  • Demonstrated ability to work with clinical text and health data (e.g., EMRs).
  • Design, develop, and deploy NLP and ML models using Python to extract insights from unstructured and semi-structured clinical text (see the sketch after this list).
  • Leverage Azure OpenAI and other LLM frameworks to enhance clinical data structuring and semantic understanding.
  • Partner closely with clinical subject matter experts to validate models, ensuring high precision and accuracy in clinical contexts.
  • Iterate on model development using feedback from real-world applications and clinician review.
  • Collaborate with Operations teams to design, implement, and optimize an end-to-end data science pipeline from R&D to production.
  • Ensure data integrity, reproducibility, and compliance with healthcare regulations and best practices.
  • Work with clients and stakeholders to identify key business problems and expected outcomes.
  • Explore and assess internal and external data sources to extract relevant features and patterns.
  • Communicate findings, methodologies, and actionable insights to both technical and non-technical audiences.
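
For flavour, here is a minimal sketch of clinical-text entity extraction with spaCy; the model name and sample note are illustrative assumptions (a real pipeline would use a clinical model and de-identified data).

```python
# Hypothetical NLP sketch; model choice and text are placeholders.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this small English model is installed

note = "Patient reports chest pain since Monday; prescribed aspirin 81 mg daily."
doc = nlp(note)

for ent in doc.ents:
    # The generic model surfaces dates and quantities; a clinical model
    # would add diagnoses, medications, and similar entity types.
    print(ent.text, ent.label_)
```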

Python, SQL, Cloud Computing, Data Analysis, ETL, Machine Learning, Algorithms, Azure, Data science, Data Structures, CI/CD, RESTful APIs, Data visualization

Posted about 7 hours ago
Apply
πŸ”₯ Senior Business Analyst
Posted about 7 hours ago

πŸ“ Colombia

🏒 Company: GoDaddy | πŸ‘₯ 5001-10000 | πŸ’° $800,000,000 Post-IPO Equity over 3 years ago | πŸ«‚ Last layoff over 1 year ago | Web Hosting, Domain Registrar, Web Development, Online Portals

  • 5+ years in a technical role in a corporate setting.
  • Proficiency in SQL for data discovery, aggregation, and extraction; experience with large datasets and AWS/Redshift preferred.
  • Expertise with data visualization and business intelligence tools such as Tableau or Google Analytics.
  • Strong communication skills to clearly explain complex analyses and support conclusions with effective visualizations.
  • Ability to collaborate across teams in various time zones, navigating both technical and business discussions.
  • Develop analytical project requirements and provide insights to support GoDaddy's business objectives and growth strategies.
  • Conduct in-depth data analysis to identify actionable insights and effectively communicate findings to influence business decisions.
  • Build and maintain dynamic dashboards and visualizations using advanced business intelligence tools to aid in understanding business drivers.
  • Design and implement end-to-end data solutions, including enterprise-wide views, dashboards, and custom reporting.
  • Perform large-scale data analysis and develop models for segmentation, classification, and customer behavior analysis (a segmentation-query sketch follows this list).
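
Here is a hedged sketch of the kind of segmentation query this role involves, run against an in-memory SQLite table; the table, columns, and thresholds are invented for illustration (a real version would target Redshift).

```python
# Hypothetical customer-segmentation query; schema and thresholds invented.
import sqlite3

SEGMENT_SQL = """
SELECT customer_id,
       COUNT(*)         AS orders,
       SUM(order_total) AS revenue,
       CASE WHEN SUM(order_total) >= 1000 THEN 'high_value'
            WHEN COUNT(*) >= 5            THEN 'frequent'
            ELSE 'standard' END           AS segment
FROM orders
GROUP BY customer_id
"""

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer_id TEXT, order_total REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 1200.0), ("b", 40.0), ("b", 55.0)],
)
for row in con.execute(SEGMENT_SQL):
    print(row)  # would feed a Tableau extract or dashboard in practice
```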

AWS, SQL, Business Intelligence, Data Analysis, ETL, Google Analytics, Tableau, Communication Skills, Analytical Skills, Data visualization, Data modeling, Data analytics

Posted about 7 hours ago
Apply
πŸ”₯ Operations Engineer
Posted about 8 hours ago

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: Arrived | πŸ‘₯ 1-50

  • Software engineering background with strong programming skills in Python, SQL, and modern development practices
  • Experience building and maintaining production data systems, APIs, or backend services
  • Hands-on experience with data pipeline tools (dbt, Fivetran, or similar)
  • Strong SQL skills and experience with data warehouses (Snowflake)
  • Build production data systems – design and implement scalable data pipelines, ETL processes, and APIs that handle structured and unstructured data from multiple sources (see the API sketch after this list)
  • Develop internal applications – write maintainable code for internal tools, automation services, and integrations
  • Own system architecture – make technical decisions about data flow, system design, and technology stack choices that balance immediate needs with long-term scalability
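
As a sketch of the internal-tooling side of this role, here is a minimal data-retrieval API using only the Python standard library; the endpoint, port, and payload are invented placeholders, not Arrived's stack.

```python
# Hypothetical internal metrics API; route and data are placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

METRICS = {"rows_synced": 1240, "pipeline_status": "green"}  # stub data


class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/metrics":
            body = json.dumps(METRICS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    # Serve on localhost:8000; try GET /metrics with a browser or curl.
    HTTPServer(("127.0.0.1", 8000), MetricsHandler).serve_forever()
```

A production service would sit behind a framework with auth and tests, but the request/response shape is the same.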

AWS, Backend Development, Project Management, Python, SQL, ETL, Snowflake, Algorithms, API testing, Data engineering, Data Structures, REST API, Communication Skills, CI/CD, RESTful APIs, DevOps, Data visualization, Data modeling, Software Engineering, Data analytics, Data management

Posted about 8 hours ago
Apply
πŸ”₯ Manager, Analytics
Posted about 8 hours ago

πŸ“ Australia

πŸ” Software Development

🏒 Company: Avetta, LLC

  • Deep expertise in SQL, including writing complex queries and performance tuning.
  • Extensive hands-on experience with multiple reporting and BI tools such as Jasper Server, MicroStrategy, Sisense, SQL Server Reporting Services (SSRS), Power BI, etc.
  • Extensive hands-on experience with multiple database platforms such as MySQL, PostgreSQL, Snowflake.
  • 3+ years in a BI/Analytics management or senior/lead BI developer role within a software development company.
  • Experience architecting and supporting distributed BI systems/cloud analytics solutions.
  • Strong understanding of data warehousing principles, data modeling (star/snowflake schemas), and data governance (a star-schema sketch follows this list).
  • Proven success managing multiple projects with Agile methodologies, especially involving large data sets and complex reporting requirements.
  • Excellent interpersonal and communication skills, with a demonstrated ability to translate technical work into business value for non-technical audiences.
  • Lead, mentor, and develop a team of BI and analytics professionals, including guidance on best practices for data modeling, ETL, and reporting.
  • Collaborate with product managers, business units, and stakeholders to gather and clarify BI requirements and to translate them into scalable technical solutions.
  • Architect, enhance, and support data warehouses and analytic data marts, primarily utilizing Jasper Server, Power BI and Snowflake.
  • Oversee the design, implementation, and optimization of ETL processes to ensure the accuracy and timeliness of data integrations from multiple sources.
  • Ensure projects are tracked in JIRA, with thorough code reviews, data quality checks, and UAT/testing prior to deployment.
  • Facilitate Agile ceremonies, including sprint planning, daily stand-ups, and retrospectives for the BI and Analytics team.
  • Drive continuous improvement in BI processes, adoption of new tools/technologies, and foster a culture of data quality and governance.
  • Troubleshoot and resolve data issues, working with engineering and business teams as needed to remove blockers.
  • Ensure smooth collaboration with engineering, infrastructure, and business analysts to provide actionable BI solutions across teams.
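
To make the star-schema point concrete, here is a toy fact/dimension join in SQLite; table and column names are invented for illustration.

```python
# Hypothetical star-schema query: one fact table, one dimension.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (1, '2025-01'), (2, '2025-02');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Aggregate the fact table by a dimension attribute, the core BI pattern.
for month, total in con.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales AS f
    JOIN dim_date  AS d USING (date_key)
    GROUP BY d.month
"""):
    print(month, total)
```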

Leadership, PostgreSQL, SQL, Agile, Business Intelligence, ETL, MySQL, Snowflake, Communication Skills, Problem Solving, Data visualization, Data modeling, Data analytics

Posted about 8 hours ago
Apply
πŸ”₯ Senior Analytics Engineer
Posted about 8 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 150,000 - 200,000 USD per year

πŸ” Energy

🏒 Company: Arcadia | πŸ‘₯ 501-1000 | πŸ’° $125,000,000 over 2 years ago | Database, CleanTech, Renewable Energy, Clean Energy, Software

  • 4+ years as an Analytics Engineer or equivalent role; experience with dbt is strongly preferred
  • 6+ years, cumulatively, in the data space (data engineering, data science, analytics, or similar)
  • Expert-level understanding of conceptual data modeling and data mart design
  • An understanding of data structures and/or database design plus deep experience with SQL and Python
  • Experience building data pipelines and managing databases, including Snowflake or similar
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Experience in technical leadership or mentorship
  • Strong communication and collaboration skills
  • Proven ability to solve complex problems in a dynamic and evolving environment
  • Transform, test, deploy, and document data to deliver clean and trustworthy data for analysis to end-users (see the sketch after this list)
  • Collaborate with subject matter experts, engineers, and product managers to identify the most elegant and effective data structures to understand our constantly growing and evolving business
  • Help bring engineering best practices (reliability, modularity, test coverage, documentation) to our DAG and to our Data team generally
  • Collaborate with data engineers to build robust, tested, scalable ELT pipelines.
  • Data modeling: model raw data into clean, tested, and reusable datasets to represent our key business data concepts. Define the rules and requirements for the formats and attributes of data
  • Data transformation: build our data lakehouse by transforming raw data into meaningful, useful data elements through joining, filtering, and aggregating source data
  • Data documentation: create and maintain data documentation, including data definitions and understandable data descriptions, to enable broad-scale understanding of the use of data
  • Employ software engineering best practices to write code and coach analysts and data scientists to do the same
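
Here is a plain-Python sketch of the transform-and-test loop described above; in practice this would be dbt models with schema tests, and the data here is invented.

```python
# Hypothetical "transform, test, document" loop; dataset is invented.
rows = [
    {"account_id": "A1", "kwh": 320.0},
    {"account_id": "A2", "kwh": 150.0},
]

# Transform: derive a reusable, analysis-ready column.
for r in rows:
    r["usage_band"] = "high" if r["kwh"] > 300 else "normal"

# Test: dbt-style unique / not-null checks before exposing the model.
ids = [r["account_id"] for r in rows]
assert len(ids) == len(set(ids)), "account_id must be unique"
assert all(r["kwh"] is not None for r in rows), "kwh must be not null"

print(rows)  # tested output ready for downstream analysis
```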

AWS, Python, SQL, Data Analysis, ETL, Git, Snowflake, Data engineering, Data Structures, Communication Skills, Analytical Skills, Collaboration, Data visualization, Data modeling, Data management

Posted about 8 hours ago
Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 118,300 - 147,800 USD per year

πŸ” Software Development

🏒 Company: Twilio | πŸ‘₯ 5001-10000 | πŸ’° $378,215,525 Post-IPO Equity almost 4 years ago | πŸ«‚ Last layoff over 1 year ago | Messaging, SMS, Mobile Apps, Enterprise Software, Software

  • 3+ years of success in a solutions engineering (presales) role with 6+ years of progressive professional experience supporting technical products.
  • Experience with Data Engineering, Cloud Data Warehouses, Data Modeling, and APIs
  • Experience with marketing technology such as marketing automation, personalization, journey orchestration, or advertising.
  • Partner with Account Executives to execute pre-sales activities including opportunity qualification, discovery, demonstrations, Proof-of-Concept, RFP responses, and justifying business value.
  • Become an expert builder and evangelist of Segment’s products and partner ecosystem (a minimal event-tracking sketch follows this list).
  • Lead technical evaluations with our prospects to uncover their goals and technical pain, in order to design, demonstrate and present innovative solutions that unlock business-level gains.
  • Develop subject matter expertise on Customer Data Platforms (CDPs) and Segment’s role within the customer data ecosystem.
  • Act as a trusted advisor to consult with customers and prospects in order to influence their customer data strategy and architecture design.
  • Build a perspective on customer and market trends by listening to prospects and advocating for customer interests to influence Segment’s strategy and vision.
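
For context, here is a minimal event-tracking sketch using Segment's Python library (the segment-analytics-python package); the write key, user, and event are placeholders, not a real integration.

```python
# Hypothetical Segment tracking calls; write key and IDs are placeholders.
import segment.analytics as analytics

analytics.write_key = "YOUR_WRITE_KEY"  # placeholder credential

# identify() attaches traits to a user; track() records a behavioral event.
analytics.identify("user-123", {"plan": "pro"})
analytics.track("user-123", "Demo Requested", {"source": "pricing_page"})

analytics.flush()  # send queued events before the script exits
```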

AWS, Cloud Computing, ETL, Data engineering, Data science, Communication Skills, Problem Solving, RESTful APIs, Sales experience, Marketing, Data modeling, Scripting, Data analytics, Data management, Customer Success

Posted about 9 hours ago
Apply
πŸ”₯ Data Platform Engineer
Posted about 10 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 180,000 - 220,000 USD per year

πŸ” Software Development

🏒 Company: Prepared | πŸ‘₯ 51-100 | πŸ’° $27,000,000 Series B 8 months ago | Enterprise Software, Public Safety

  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g., streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
  • Ability to work independently and take initiative
  • Proficiency in containerization and orchestration tools (e.g., Docker, Kubernetes)
  • Design, implement, and maintain scalable data pipelines and infrastructure (see the DAG sketch after this list)
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics
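
Here is a minimal Airflow 2.x DAG sketch for the pipeline work described above; the task logic, names, and schedule are illustrative placeholders (the `schedule` argument assumes Airflow 2.4+).

```python
# Hypothetical daily pipeline DAG; tasks are stubs for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull from a source: stream, flat file, or API")


def load():
    print("write to the warehouse, e.g. Snowflake")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```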

AWS, Docker, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kubernetes, Snowflake, Apache Kafka, Data engineering, Spark, Scala, Data modeling

Posted about 10 hours ago
Apply
πŸ”₯ Senior Data Engineer
Posted about 10 hours ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 145,000 - 200,000 USD per year

πŸ” Daily Fantasy Sports

🏒 Company: PrizePicks | πŸ‘₯ 101-250 | πŸ’° Corporate about 2 years ago | Gaming, Fantasy Sports, Sports

  • 5+ years of experience in a data engineering or data-oriented software engineering role, creating and shipping end-to-end data engineering pipelines.
  • 2+ years of experience acting as technical lead and providing mentorship and feedback to junior engineers.
  • Extensive experience building and optimizing cloud-based data streaming pipelines and infrastructure (a streaming-consumer sketch follows this list).
  • Extensive experience exposing real-time predictive model outputs to production-grade systems leveraging large-scale distributed data processing and model training.
  • Experience in most of the following:
    • SQL/NoSQL databases/warehouses: Postgres, BigQuery, BigTable, Materialize, AlloyDB, etc.
    • Replication/ELT services: Data Stream, Hevo, etc.
    • Data transformation services: Spark, Dataproc, etc.
    • Scripting languages: SQL, Python, Go
    • Cloud platform services in GCP and analogous systems: Cloud Storage, Cloud Compute Engine, Cloud Functions, Kubernetes Engine, etc.
    • Data processing and messaging systems: Kafka, Pulsar, Flink
    • Code version control: Git
    • Data pipeline and workflow tools: Argo, Airflow, Cloud Composer
    • Monitoring and observability platforms: Prometheus, Grafana, ELK stack, Datadog
    • Infrastructure as Code platforms: Terraform, Google Cloud Deployment Manager
    • Other platform tools such as Redis, FastAPI, and Streamlit
  • Enhance the capabilities of our existing Core Data platforms and develop new integrations with both internal and external APIs within the Data organization.
  • Work closely with DevOps, architects, and engineers to ensure the success of the Core Data platform.
  • Collaborate with Analytics Engineers to enhance data transformation processes, streamline CI/CD pipelines, and optimize team collaboration workflows.
  • Architect and implement Infrastructure as Code (IaC) solutions to automate and streamline the deployment and management of data infrastructure.
  • Develop and manage CI/CD pipelines to automate and streamline the deployment of data solutions.
  • Ensure code is thoroughly tested, effectively integrated, and efficiently deployed, in alignment with industry best practices for version control, automation, and quality assurance.
  • Serve as a Data Engineering thought leader within the broader PrizePicks technology organization by staying current with emerging technologies, implementing innovative solutions, and sharing knowledge and best practices with junior team members and collaborators.
  • Provide on-call support as part of a shared rotation between the Data and Analytics Engineering teams to maintain system reliability and respond to critical issues.
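
Here is a hedged sketch of a streaming consumer with the confluent-kafka client; the broker, topic, and group id are placeholders, not PrizePicks infrastructure.

```python
# Hypothetical Kafka consumer loop; connection details are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "example-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["example-events"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None or msg.error():
            continue
        # Hand the record to transformation or model scoring downstream.
        print(msg.key(), msg.value())
finally:
    consumer.close()
```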

Leadership, PostgreSQL, Python, SQL, Apache Airflow, Bash, Cloud Computing, ETL, GCP, Git, Kafka, Kubernetes, Data engineering, Data science, REST API, CI/CD, RESTful APIs, Mentoring, Terraform, Data modeling

Posted about 10 hours ago
Apply
Showing 10 of 702 jobs.

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Remote Data Analyst Jobs

Data analysts are highly sought after as companies prioritize data-driven strategies. Whether you are just starting out or looking to advance your career from home, our platform is the top place to find and apply for the best jobs worldwide. Register on Remoote.app today and explore numerous data analyst jobs with leading companies hiring in the field of analytics!

Data Analyst Responsibilities

Analysts help businesses make smart decisions based on data. Their responsibilities include:

  • gathering, processing, and analyzing large datasets from various sources;
  • creating clear and informative reports and visualizations;
  • identifying patterns, trends, and anomalies in data;
  • supporting strategic business planning with data-driven insights;
  • collaborating with different departments to align analytics with company goals.

The most commonly sought-after skills that employers look for in data analysts are:

  • experience with Excel, SQL, Python, or R;
  • knowledge of BI platforms such as Power BI, Tableau, Looker;
  • ability to visualize data and create reports;
  • skills in statistical analysis and machine learning (for advanced positions).

Requirements vary depending on the company’s field of activity and the level of expertise needed.

Find Top Remote Data Analyst Jobs with Remoote.app

Our service offers top advantages for those who want to find remote jobs quickly:

  • daily updated listings from a wide range of companies worldwide, so you can explore opportunities in multiple countries and regions;
  • advanced search filters tailored to your experience level, preferred work model (full-time, part-time), and location, making it easy to find the perfect match for your skills and lifestyle;
  • AI-powered vacancy text processing that saves you time by highlighting the most important information from the ad;
  • personalized notifications sent via email or Telegram, so you never miss new vacancies that fit your profile;
  • helpful resources including resume-building tools and interview tips to boost your chances of landing your ideal position.

You can apply to up to 5 jobs per day for free. Need more? Choose one of our flexible subscription plans β€” weekly, monthly, or annual β€” to maximize your chances of landing the perfect position.