Remote Data Science Jobs

Data modeling
1,785 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 215900.0 - 254000.0 USD per year

πŸ” Healthcare

  • You have 6+ years building platforms that leverage ML for automated decision-making, preferably in healthcare, insurance, or adjacent risk domains.
  • You understand the complexities of health insurance - from claims processing to benefits determination - and have built systems that improve billing accuracy or clean claims rates.
  • You're comfortable diving deep with engineering and data science teams on complex technical problems and can drive architectural decisions for scalable platforms.
  • You've led initiatives requiring difficult tradeoffs between accuracy, speed, and user experience, and can influence stakeholders across multiple teams.
  • You excel at analyzing complex datasets to identify root causes and drive systematic improvements to platform performance.
  • Own the benefits determination platform strategy and roadmap to achieve company goals for billing accuracy and collections loss reduction.
  • Build ML-powered systems that interpret complex insurance data and handle thousands of payer-specific edge cases.
  • Drive technical decisions on platform architecture, model deployment, and system scalability with engineering leadership.
  • Collaborate with Engineering and Data Science on developing models that improve benefits interpretation accuracy and confidence scoring.
  • Partner with Insurance Operations and Customer Care to understand manual intervention patterns and automate high-volume workflows.

SQL, Data Analysis, Machine Learning, Product Management, Cross-functional Team Leadership, Data science, RESTful APIs, Data modeling

Posted 31 minutes ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 138000.0 - 150000.0 USD per year

πŸ” Education

🏒 Company: Gradient Learning

  • 5+ years of experience working in the K-12 space
  • 3+ years of experience in data product development
  • 3+ years of experience translating complex data into educator-friendly visualizations using Tableau
  • 3+ years of people management experience
  • 3+ years of experience with Snowflake or comparable cloud-based data warehousing platforms (strongly preferred)
  • Experience using AI or machine learning to enhance data analysis or deliver scalable, educator-facing insights (strongly preferred)
  • Familiarity with LTI standards, LMS platforms, and education data interoperability; direct experience with Canvas LMS (strongly preferred)
  • Knowledge of data privacy, security, and protection standards, particularly as they relate to PII and educational data (FERPA, COPPA, etc.) (preferred)
  • Design, Refine, and Lead the Data & Insights Product Strategy
  • Oversee Data & Insights Product Development and Delivery
  • Strengthen Data Infrastructure in Partnership with Information Systems
  • Lead the Data & Insights Product Delivery Team

SQL, Data Analysis, ETL, Machine Learning, People Management, Product Management, Snowflake, User Experience Design, Cross-functional Team Leadership, Tableau, Product Development, Data engineering, Communication Skills, Agile methodologies, RESTful APIs, Data visualization, Stakeholder management, Strategic thinking, Data modeling, Data analytics, Data management

Posted 41 minutes ago
Apply

πŸ“ United States

πŸ’Έ 78000.0 - 117000.0 USD per year

πŸ” Healthcare

🏒 Company: dkc_external

  • Associates or Bachelor's degree
  • 3+ years of data analysis and Excel pivot table experience; ability to analyze complex data to support proactive business decisions
  • Expert-level proficiency in Microsoft Office tools, with an emphasis on Microsoft Excel, Access, Outlook, Word, and PowerPoint
  • Ability to work independently, multi-task, and meet deadlines in a fast-paced, deadline driven environment
  • Excellent presentation skills and experience in project management
  • Fluent in the written and verbal communication skills necessary to perform the essential functions, duties, and responsibilities of the position
  • Demonstrated strengths in organization, attention-to-detail, follow-through, reasoning, critical thinking, and problem-solving skills
  • Experience working in and coding in SQL and/or Python, VBA, Microsoft Access, Excel Macros, and Tableau
  • Relevant coding and Excel certifications
  • Experience with various data analytics tools and processes (including SQL, database development and management, and data visualization)
  • Perform data analyses and identify trends (see the illustrative sketch after this list)
  • Effectively create and communicate key messages from analysis and propose and drive efficiencies
  • Drive and lead projects start to finish
  • Data and reporting analysis, management, and presentation using appropriate tools such as Excel, data warehouses, SQL servers, and Tableau
  • Compile and transform data across multiple databases to complete root cause analysis, quality assurance, and trending, and to build complex analytical models and determine variable assumptions
  • Summarize and package analytical insights into appropriate platforms and forums to share with stakeholders, partners and leaders
  • Develop, pitch, and initiate action plans or process improvements
  • Translate business requirements into specifications that will be used to implement and generate recurring reports that provide business and systems insight
  • Partner, communicate, and lead working sessions with executive stakeholders to deliver accurate data, findings, and recommendations; maintain relationships and provide high level updates on initiatives
  • Oversee smaller initiatives from inception to completion, ensuring that projects are completed on time and meet stakeholders’ requirements. This includes developing project plans, managing timelines, and tracking progress against established goals.
  • Create visual and graphical material to present to leaders and partner teams to provide updates and findings throughout deep dives and process innovations
  • Understands how metrics and processes across multiple business areas are interrelated, and the implications of overall work on broader business objectives
  • Innovation and automation ventures
  • Provides direction and support for others’ work; viewed as a mentor and expert across the team
  • User Acceptance Testing (UAT) and validation tasks
  • Understand and adhere to all HIPAA Guidelines, DaVita Teammate Handbook, and Safety and Security Policy and Procedures.
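
As a rough illustration of the pivot-table trend analysis this role calls for, here is a minimal pandas sketch; the table and column names (claims, region, month, denied, total) are assumptions made for illustration, not details from the listing.

```python
# Minimal sketch: a pandas equivalent of an Excel pivot table for trend analysis.
# Table and column names are illustrative assumptions.
import pandas as pd

claims = pd.DataFrame({
    "month":  ["2024-01", "2024-01", "2024-02", "2024-02", "2024-03", "2024-03"],
    "region": ["East", "West", "East", "West", "East", "West"],
    "denied": [12, 30, 15, 28, 22, 25],
    "total":  [400, 420, 410, 430, 415, 440],
})

# Pivot denied/total counts by region and month, then derive a denial rate,
# the kind of trend a root cause deep dive would start from.
pivot = pd.pivot_table(claims, index="region", columns="month",
                       values=["denied", "total"], aggfunc="sum")
denial_rate = pivot["denied"] / pivot["total"]
print(denial_rate.round(3))
```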

Project Management, SQL, Data Analysis, Excel VBA, Microsoft Access, Tableau, Analytical Skills, Microsoft Excel, Problem Solving, Data visualization, Data modeling, Data analytics

Posted about 2 hours ago
Apply
🔥 Data Analyst - Finance
Posted about 4 hours ago

📍 USA

🧭 Full-Time

🔍 Fintech

🏢 Company: Comun

  • Expert-level SQL knowledge with demonstrated ability to optimize complex queries (non-negotiable)
  • 3+ years of practical experience in data engineering or analytics roles with financial data
  • 3+ years of experience in a similar role at a fintech or financial services company in the payments or lending space
  • Solid understanding of finance concepts and principles
  • Proven track record building data pipelines and ETL processes
  • Experience implementing cost modeling and optimization analytics
  • Problem-solving mindset with strong analytical skills
  • Excellent communication skills to explain complex technical and financial concepts
  • Design and implement scalable data pipelines to establish and maintain a solid fund flow process
  • Automate financial reconciliation processes and generate actionable reports
  • Develop and maintain revenue and cost models to identify growth opportunities and provide insights for strategic decision-making
  • Build analytical tools to identify and quantify cost optimization opportunities across the organization
  • Monitor vendor performance metrics and evaluate new vendor opportunities
  • Implement data solutions to detect financial anomalies and uncover efficiency opportunities that drive business value
  • Perform cohort-level performance analysis to develop a deeper understanding of customer unit economics (see the illustrative sketch after this list)
  • Collaborate with finance, data, growth, product, and engineering teams to develop robust financial data architecture
  • Contribute to our mission of financial inclusion by enabling data-informed product and pricing decisions
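
As a rough illustration of the cohort-level unit-economics analysis mentioned above, here is a minimal pandas sketch; the column names (user_id, signup_date, event_date, revenue, cost) are assumptions made for illustration, not details from the listing.

```python
# Minimal sketch: cohort-level unit economics in pandas.
# Column names are illustrative assumptions; both date columns are
# expected to be datetime64 dtype.
import pandas as pd

def cohort_unit_economics(events: pd.DataFrame) -> pd.DataFrame:
    """Group users by signup month and track net margin per month of account age."""
    df = events.copy()
    df["cohort"] = df["signup_date"].dt.to_period("M")
    df["age_months"] = (
        (df["event_date"].dt.year - df["signup_date"].dt.year) * 12
        + (df["event_date"].dt.month - df["signup_date"].dt.month)
    )
    summary = (
        df.groupby(["cohort", "age_months"])
          .agg(users=("user_id", "nunique"),
               revenue=("revenue", "sum"),
               cost=("cost", "sum"))
          .reset_index()
    )
    summary["net_margin_per_user"] = (summary["revenue"] - summary["cost"]) / summary["users"]
    # Pivot into the familiar cohort triangle: rows are signup cohorts,
    # columns are months since signup.
    return summary.pivot(index="cohort", columns="age_months", values="net_margin_per_user")
```

In practice the same aggregation would typically run as SQL in the warehouse; the pandas version just keeps the example self-contained.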

AWS, PostgreSQL, Python, SQL, Data Analysis, ETL, Snowflake, Data engineering, FastAPI, Financial analysis, Data modeling, Finance

Posted about 4 hours ago
Apply
🔥 Data Engineer (Contract)
Posted about 5 hours ago

📍 LatAm

🧭 Contract

🏢 Company: Able · Rental, Property Management, Real Estate

  • 10+ years of data engineering experience with enterprise-scale systems
  • Expertise in Apache Spark and Delta Lake, including ACID transactions, time travel, Z-ordering, and compaction (illustrated in the sketch after this list)
  • Deep knowledge of Databricks (Jobs, Clusters, Workspaces, Delta Live Tables, Unity Catalog)
  • Experience building scalable ETL/ELT pipelines using tools like Airflow, Glue, Dataflow, or ADF
  • Advanced SQL for data modeling and transformation
  • Strong programming skills in Python (or Scala)
  • Hands-on experience with data formats such as Parquet, Avro, and JSON
  • Familiarity with schema evolution, versioning, and backfilling strategies
  • Working knowledge of at least one major cloud platform: AWS (S3, Athena, Redshift, Glue Catalog, Step Functions), GCP (BigQuery, Cloud Storage, Dataflow, Pub/Sub), or Azure (Synapse, Data Factory, Azure Databricks)
  • Experience designing data architectures with real-time or streaming data (Kafka, Kinesis)
  • Consulting or client-facing experience with strong communication and leadership skills
  • Experience with data mesh architectures and domain-driven data design
  • Knowledge of metadata management, data cataloging, and lineage tracking tools
  • Shape large-scale data architecture vision and roadmap across client engagements
  • Establish governance, security frameworks, and regulatory compliance standards
  • Lead strategy around platform selection, integration, and scaling
  • Guide organizations in adopting data lakehouse and federated data models
  • Lead technical discovery sessions to understand client needs
  • Translate complex architectures into clear, actionable value for stakeholders
  • Build trusted advisor relationships and guide strategic decisions
  • Align architecture recommendations with business growth and goals
  • Design and implement modern data lakehouse architectures with Delta Lake and Databricks
  • Build and manage ETL/ELT pipelines at scale using Spark (PySpark preferred)
  • Leverage Delta Live Tables, Unity Catalog, and schema evolution features
  • Optimize storage and queries on cloud object storage (e.g., AWS S3, Azure Data Lake)
  • Integrate with cloud-native services like AWS Glue, GCP Dataflow, and Azure Synapse Analytics
  • Implement data quality monitoring, lineage tracking, and schema versioning
  • Build scalable pipelines with tools like Apache Airflow, Step Functions, and Cloud Composer
  • Develop cost-optimized, scalable, and compliant data solutions
  • Design POCs and pilots to validate technical approaches
  • Translate business requirements into production-ready data systems
  • Define and track success metrics for platform and pipeline initiatives
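
As a rough illustration of the Delta Lake features named in the requirements (ACID MERGE upserts, time travel, and Z-ordering), here is a minimal PySpark sketch; the storage paths, table layout, and column names are assumptions made for illustration.

```python
# Minimal PySpark sketch of Delta Lake upserts, time travel, and Z-ordering.
# Paths and column names (s3://example-bucket/..., id) are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

target_path = "s3://example-bucket/silver/customers"  # hypothetical location
updates = spark.read.parquet("s3://example-bucket/bronze/customers_incremental")

# ACID upsert: update changed rows and insert new ones in a single transaction.
target = DeltaTable.forPath(spark, target_path)
(
    target.alias("t")
    .merge(updates.alias("u"), "t.id = u.id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Time travel: read an earlier table version, e.g. to verify a backfill.
previous = spark.read.format("delta").option("versionAsOf", 0).load(target_path)
print(previous.count())

# Compaction with Z-ordering (available on Databricks and recent OSS Delta releases).
spark.sql(f"OPTIMIZE delta.`{target_path}` ZORDER BY (id)")
```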

AWS, Python, SQL, Cloud Computing, ETL, GCP, Kafka, Airflow, Azure, Data engineering, Scala, Data modeling

Posted about 5 hours ago
Apply

πŸ“ United Kingdom

πŸ” Software Development

  • Some experience in software engineering, developing full-stack, complex, and distributed applications.
  • An understanding of designing elegant, well-structured APIs.
  • Proficiency in at least one programming language (e.g., Python, Java, JavaScript, C, etc.).
  • A collaborative mindset with a proactive approach to problem-solving, continuous learning, and knowledge sharing across teams.
  • Build and scope tools and infrastructure for automating security policy and enforcement.
  • Improve and enhance vulnerability detection and response capabilities.
  • Identify vulnerabilities through assessments, working with internal partners towards remediation and improvement of Yelp’s corporate environment and infrastructure.
  • Help define policies and security best practices for IT, infrastructure, and other internal organization and third-party integrations.
  • Review and offer feedback on security implications of software system designs submitted from across Yelp Engineering.
  • Exhibit the strong communication ability needed to enforce rigorous security standards, while always playing well with others and partnering with diverse stakeholders to advance Yelp’s goals.
  • Design, develop, and operationalize monitoring, correlation, and alerting capabilities for Yelp’s corporate network, infrastructure, and applications to identify suspicious or anomalous behavior.
  • Participate in a regular on-call rotation and occasional incident response.
  • Help perform threat modeling across business applications and infrastructure integrations.

AWS, Backend Development, Docker, Python, Software Development, Cloud Computing, Cybersecurity, Full Stack Development, Kubernetes, API testing, Communication Skills, Analytical Skills, Collaboration, CI/CD, Problem Solving, RESTful APIs, Linux, DevOps, JSON, Ansible, Data modeling, Scripting, Data analytics

Posted about 6 hours ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 215900.0 - 254000.0 USD per year

πŸ” Healthcare

🏒 Company: HeadwayπŸ‘₯ 201-500πŸ’° $125,000,000 Series C over 1 year agoMental Health Care

  • 6+ years building platforms that leverage ML for automated decision-making, preferably in healthcare, insurance, or adjacent risk domains.
  • Understand the complexities of health insurance - from claims processing to benefits determination - and have built systems that improve billing accuracy or clean claims rates.
  • Comfortable diving deep with engineering and data science teams on complex technical problems and can drive architectural decisions for scalable platforms.
  • Led initiatives requiring difficult tradeoffs between accuracy, speed, and user experience, and can influence stakeholders across multiple teams.
  • Excel at analyzing complex datasets to identify root causes and drive systematic improvements to platform performance.
  • Own the benefits determination platform strategy and roadmap to achieve company goals for billing accuracy and collections loss reduction.
  • Build ML-powered systems that interpret complex insurance data and handle thousands of payer-specific edge cases.
  • Drive technical decisions on platform architecture, model deployment, and system scalability with engineering leadership.
  • Collaborate with Engineering and Data Science on developing models that improve benefits interpretation accuracy and confidence scoring.
  • Partner with Insurance Operations and Customer Care to understand manual intervention patterns and automate high-volume workflows.

AWS, Backend Development, Leadership, SQL, Data Analysis, Machine Learning, Product Management, Product Operations, Cross-functional Team Leadership, Product Development, Strategy, Data science, Communication Skills, Analytical Skills, RESTful APIs, Data modeling

Posted about 6 hours ago
Apply

πŸ“ United States

πŸ’Έ 90000.0 - 150000.0 USD per year

🏒 Company: JobgetherπŸ‘₯ 11-50πŸ’° $1,493,585 Seed about 2 years agoInternet

  • 3–5 years of hands-on experience in Salesforce development
  • Proficient in Salesforce architecture, data modeling, and standard/custom object relationships
  • Strong skills in Apex, Visualforce, Lightning, and Salesforce APIs
  • Design and develop custom applications on the Salesforce platform using Apex, Visualforce, and Lightning Components
  • Maintain and enhance existing Salesforce applications and ensure peak system performance
  • Build and manage integrations with third-party systems via APIs and web services
  • Create and update technical documentation including design documents and user guides
  • Conduct system testing, debugging, and code reviews to ensure quality and compliance
  • Collaborate with stakeholders to gather business requirements and implement solutions aligned with company goals
  • Stay current with Salesforce releases and recommend improvements to existing configurations

SQL, Salesforce, API testing, Data modeling

Posted about 6 hours ago
Apply
🔥 Data Engineer (m/f/d)
Posted about 6 hours ago

📍 Germany

🧭 Full-Time

🏢 Company: Roadsurfer · 👥 501-1000 · 💰 $5,330,478 almost 4 years ago · Leisure, Rental, Tourism, Recreational Vehicles

  • Experience with Segment, Braze, or similar CDP/CEP platforms
  • Basic knowledge of data transformation tools
  • Familiarity with data governance practices, such as data ownership, naming conventions, and data lineage
  • Experience implementing data privacy measures such as consent tracking and anonymization
  • Familiarity with data quality metrics and monitoring techniques
  • Understanding of data privacy regulations (GDPR, CCPA)
  • Good communication skills, with the ability to work with cross-functional teams and stakeholders
  • Ensure reliability through automated tests, versioned models, and data lineage
  • Assist in implementing data governance policies to ensure data consistency, quality, and integrity across the CDP and CEP platforms
  • Support the automation of data validation and quality checks, including schema validation and data integrity monitoring (see the illustrative sketch after this list)
  • Help define and track data quality metrics and provide regular insights on data cleanliness and health
  • Assist in ensuring compliance with data privacy regulations (e.g., GDPR, CCPA), including implementing consent tracking and anonymization measures
  • Work with cross-functional teams to standardize data definitions, naming conventions, and ownership practices
  • Help maintain data cleanliness through automated data cleanup processes and identify areas for improvement
  • Support the analytics team by ensuring data is structured correctly for reporting and analysis
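
As a rough illustration of the automated schema validation and data-quality checks described above, here is a minimal Python sketch; the expected event schema and the sample events are assumptions made for illustration, not the company's actual tracking plan.

```python
# Minimal sketch of an automated schema and data-quality check for CDP events.
# The expected schema and sample payloads are illustrative assumptions.
from typing import Any

EXPECTED_SCHEMA = {  # hypothetical tracking-plan fields
    "user_id": str,
    "event_name": str,
    "timestamp": str,
    "consent_given": bool,
}

def validate_event(event: dict[str, Any]) -> list[str]:
    """Return a list of schema violations for a single event."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"wrong type for {field}: {type(event[field]).__name__}")
    return errors

def quality_report(events: list[dict[str, Any]]) -> dict[str, float]:
    """Aggregate simple data-quality metrics across a batch of events."""
    total = len(events)
    invalid = sum(1 for e in events if validate_event(e))
    missing_consent = sum(1 for e in events if not e.get("consent_given", False))
    return {
        "schema_valid_rate": 1 - invalid / total if total else 1.0,
        "consent_rate": 1 - missing_consent / total if total else 1.0,
    }

if __name__ == "__main__":
    sample = [
        {"user_id": "u1", "event_name": "booking_started",
         "timestamp": "2024-05-01T10:00:00Z", "consent_given": True},
        {"user_id": "u2", "event_name": "booking_started",
         "timestamp": "2024-05-01T10:05:00Z"},  # missing consent flag
    ]
    print(quality_report(sample))
```

In a real pipeline, checks like these would typically run in the orchestration layer (for example as a scheduled task) before events are forwarded to the CDP or CEP platform.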

SQL, Apache Airflow, ETL, Data engineering, Postgres, RESTful APIs, Compliance, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted about 6 hours ago
Apply

πŸ“ CANADA

πŸ’Έ 180000.0 - 202000.0 CAD per year

🏒 Company: CaylentπŸ‘₯ 251-500πŸ’° Private over 2 years agoIaaSDevOpsCloud ComputingCloud Infrastructure

  • 7+ years of hands-on experience with Salesforce platform architecture, including Sales Cloud and Marketing Cloud
  • 4+ years of direct experience with Certinia, including PSA, Accounting, and Revenue Management modules, as well as Analytics Studio and Financial Report Builder.
  • Proven success architecting scalable enterprise solutions within Salesforce and integrating ERP/PSA systems.
  • Strong understanding of Salesforce platform architecture including Apex, Lightning Components, APIs, Data Loader, SOQL and Flows
  • Expert-level experience in data modeling, process automation, and reporting across CRM and ERP domains, including integrated BI tools.
  • Experience designing and managing enterprise-level integrations using tools such as MuleSoft, Workato, or Boomi.
  • Familiarity with ITIL, Agile, and DevOps best practices; experience with Jira, Git, and CI/CD tools is a plus.
  • Salesforce certifications such as Application Architect or System Architect are highly preferred.
  • Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent experience.
  • Design and govern the enterprise architecture across Salesforce and Certinia platforms.
  • Partner with business stakeholders to translate requirements into scalable and secure technical solutions.
  • Own the platform roadmap and ensure alignment with business strategy, growth, and change management needs.
  • Lead technical evaluations, system design sessions, and solution architecture documentation.
  • Oversee system integration strategies between Salesforce, Certinia, and third-party applications.
  • Ensure platform performance, security, and compliance with data governance policies.
  • Provide guidance to internal and external developers and administrators; set best practices for configuration, customization, and development.
  • Evaluate and implement AppExchange and Certinia packages that support business functions.
  • Collaborate with Enterprise IT and Business Ops on system upgrades, patching cycles, and environment management.

Agile, Business Intelligence, Git, Salesforce, Jira, CI/CD, DevOps, Reporting, CRM, Data modeling

Posted about 6 hours ago
Apply
Showing 10 of 1,785

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • No ties to a physical office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why Do Job Seekers Choose Remoote.app?

Our platform makes it convenient to find remote IT jobs from home:

  • localized search - filter job listings based on your country of residence;
  • AI-powered job processing - artificial intelligence analyzes thousands of listings, highlighting key details so you don’t have to read long descriptions;
  • advanced filters - sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates - we monitor job relevance and remove outdated listings;
  • personalized notifications - get tailored job offers directly via email or Telegram;
  • resume builder - create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security - modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing - up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.