Remote Data Science Jobs

Data Science
34 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

🔥 Data Science Intern

📍 U.S., Canada, Poland

🧭 Internship

💸 25–30 USD per hour

🔍 Software Development

  • Working toward a Master’s degree in Data Science or a related field, or equivalent practical experience.
  • Strong understanding of machine learning, artificial neural networks, and language models.
  • Proficiency in writing object-oriented code in Python.
  • Experience with PyTorch, SciPy, Jupyter Notebooks, and Hugging Face Transformers.
  • Comfort using the Unix command line.
  • Proven research skills and the ability to find reliable information and datasets.
  • Ability to think critically about the contexts in which companies produce and use data.
  • Excellent communication skills for technical concepts.
  • Researching new types of data to add to our classifier.
  • Finding and preparing training data.
  • Helping with data cleanup.
  • Tuning prompts for AI data generation.
  • Developing and analyzing benchmarks measuring accuracy, speed, and consistency of results.
  • Finding new evaluation datasets for the classifier, helping run experiments, and analyzing results.

Python, Artificial Intelligence, Data Analysis, Machine Learning, Numpy, PyTorch, Algorithms, Data science, Communication Skills, Analytical Skills, Research skills

Posted about 19 hours ago
Apply

📍 US

🧭 Full-Time

💸 165,000–259,000 USD per year

🔍 Software Development

🏢 Company: Mozilla | 👥 5001-10000 | 💰 $300,000 Angel over 20 years ago | 🫂 Last layoff 7 months ago | Internet, Open Source, Web Browsers, Software, Browser Extensions

  • 6+ years of experience in data science with a strong focus on monetization.
  • 3+ years of experience leading and scaling data science teams.
  • Deep expertise in machine learning, statistical modeling, and applied data analytics.
  • Proficiency in Python, as well as SQL.
  • Strong understanding of privacy-preserving technologies and how they can be applied within advertising or monetization contexts.
  • Excellent communication skills, with the ability to distill complex data into clear, actionable insights for both technical and non-technical stakeholders.
  • A collaborative mindset and a track record of influencing cross-functional teams and senior leaders.
  • Lead and scale a high-performing data science team, driving strategy and execution across monetized features and surfaces.
  • Balance business goals and user trust by defining a monetization framework that supports privacy and user-first principles.
  • Translate data into impact through deep analysis, clear insights, and the development of metrics, dashboards, and self-serve reporting tools that enable stakeholders to make informed decisions.
  • Drive measurement rigor by designing and evolving experimentation and evaluation frameworks for testing and optimizing monetization features.
  • Collaborate cross-functionally with Product, Engineering, and Advertising teams to define and deliver on our product and monetization roadmap.

Python, SQL, Data Analysis, Machine Learning, Cross-functional Team Leadership, Data science, Communication Skills, Data modeling

Posted 4 days ago
Apply

📍 United Kingdom, Germany, Romania

🧭 Internship

🔍 Cybersecurity

🏢 Company: crowdstrikecareers

  • Undergraduate or Master’s student in your penultimate year of study (2026 graduate) with completed coursework in linear algebra and statistics, or equivalent knowledge of applied mathematics
  • Solid knowledge and experience in writing Python code to pre-process and analyze data to generate actionable insights
  • Comfortable working in a Linux environment
  • Ability to communicate technical detail in a simple, top-down manner
  • Ability to independently make sound, justifiable decisions and take action
  • Ability to plan, organize and prioritize work independently and meet deadlines
  • You’re a problem solver with a can-do attitude who loves interesting challenges
  • You’re interested in the latest technologies in your field of expertise
  • You’re self-motivated and actively seeking opportunities to grow
  • You’re looking to work in a globally distributed team
  • Gain insight into the wide variety of projects our data scientists are working on, from binary analysis and NLP to audio analysis and time series processing
  • Leverage our arsenal of cutting-edge technology and hardware to build systems with real-world impact
  • Work directly with Senior Data Scientists
  • Learn how to write scalable, production-ready code

Python, Data Analysis, Machine Learning, Numpy, Algorithms, Data science, Pandas, Communication Skills, Linux, Data visualization

Posted 5 days ago
Apply

📍 United States

🧭 Full-Time

🏢 Company: Worth AI | 👥 11-50 | 💰 $12,000,000 Seed over 1 year ago | Artificial Intelligence (AI), Business Intelligence, Risk Management, FinTech

  • Bachelor’s degree and 8+ years of professional experience, or Master’s degree and 6+ years, or PhD and 4+ years, in a quantitative field such as Computer Science, Statistics, Applied Mathematics, or Engineering.
  • Proven leadership experience managing data science or machine learning teams in fast-paced, agile environments.
  • Deep expertise in statistical modeling and machine learning techniques (e.g., regression, time series, clustering, ensemble methods, simulation, NLP, deep learning, etc.).
  • Hands-on proficiency in Python and SQL; familiarity with R or other languages is a plus.
  • Practical experience with ML libraries and frameworks such as Scikit-learn, TensorFlow, Keras, PyTorch, Statsmodels, NumPy, and SciPy.
  • Strong understanding of data architecture and MLOps practices in cloud environments such as AWS, Azure, or GCP.
  • Excellent data storytelling and executive communication skills; confident in building and delivering compelling visual presentations (e.g., Tableau, PowerBI).
  • Deep understanding of Agile methodologies such as Scrum or Kanban.
  • Lead, mentor, and scale a high-performing team of data scientists while fostering a culture of innovation, collaboration, and continuous learning.
  • Oversee the end-to-end development of machine learning models—design, training, validation, and deployment—within cloud-based production environments.
  • Partner with product, engineering, and business teams to identify opportunities for data science solutions, define success metrics, and deliver measurable impact.
  • Drive the integration of large-scale data systems and pipelines to power intelligent, adaptive models that reflect changing customer, market, and business dynamics.
  • Ensure rigorous model monitoring, governance, and retraining protocols to maintain accuracy, fairness, and compliance.
  • Communicate technical results and strategic insights clearly and persuasively to senior leadership and non-technical stakeholders.
  • Champion the adoption of advanced analytics tools, frameworks, and best practices across the company.

AWS, Leadership, Python, SQL, Agile, Cloud Computing, Data Analysis, GCP, Keras, Machine Learning, Numpy, PyTorch, Tableau, Algorithms, Data science, Data Structures, Pandas, Tensorflow, Communication Skills, Data visualization, Team management, Data modeling

Posted 6 days ago
Apply

📍 US, EU

🧭 Full-Time

💸 161,000–224,000 USD per year

🏢 Company: Taskrabbit | 👥 251-500 | 💰 Secondary Market almost 10 years ago | Marketplace, E-Commerce, Janitorial Service, Facilities Support Services, Freight Service, Peer to Peer, Sharing Economy

  • 7+ years of experience in data science, analytics, or a related quantitative field, with a proven track record of delivering impactful results.
  • 3+ years of leadership experience, managing and scaling high-performance data science/analytics teams (5+ direct reports preferred).
  • Strong technical expertise in statistical methods, experiment design, and machine learning, with proficiency in SQL and at least one programming language (Python, R).
  • Experience with modern data platforms (AWS, GCP, Spark) and BI tools (Looker, Tableau) to support scalable analytics.
  • Exceptional communication and stakeholder management skills, with the ability to translate complex data insights into clear, actionable recommendations for diverse audiences, including executives.
  • Strategic mindset and business acumen, with a history of guiding data-driven product innovation and aligning initiatives with company goals.
  • Proven ability to influence cross-functional teams and drive alignment in a fast-paced environment, shaping product roadmaps through data-backed narratives.
  • Drive data-informed decision-making
  • Serve as a trusted advisor
  • Define the long-term strategy
  • Hire, develop, and mentor
  • Lead the development and adoption of robust experimentation frameworks
  • Partner with Engineering to build and maintain scalable data infrastructure
  • Balance short-term execution with long-term innovation

AWS, Leadership, Python, SQL, Data Analysis, GCP, Machine Learning, Tableau, Data engineering, Data science, Spark, Communication Skills, Data modeling, A/B testing

Posted 12 days ago
Apply

📍 Alabama, Arizona, Arkansas, California, Colorado, Connecticut, Florida, Georgia, Illinois, Indiana, Iowa, Kansas, Kentucky, Maine, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Tennessee, Texas, Utah, Vermont, Virginia, Washington, or Washington, D.C.

🧭 Full-Time

💸 174,000–228,900 USD per year

🔍 Healthcare, Finance, Insurance

🏢 Company: Oscar Health | 👥 1001-5000 | 💰 $140,000,000 Private over 4 years ago | 🫂 Last layoff about 5 years ago | Health Insurance, InsurTech, Insurance, Health Care

  • 6+ years of experience in industry or other quantitative technical fields (which may include academia)
  • 4+ years of experience working with SQL, R, and/or Python to query, manipulate, and analyze data
  • 4+ years of experience building data models, using advanced analytics methods, statistical modeling, and/or data processing
  • 1+ years of people management experience (for people manager roles)
  • Oversee all the models and systems for Oscar's clinical risk adjustment quantification and submissions processes.
  • Translate business goals into plans and tactics for the team
  • Work with more senior leaders to establish longer-term strategies and roadmaps
  • Participate in team-wide technical and operations management responsibilities
  • Lead and manage a small team of data scientists and/or analysts
  • Interact and collaborate closely with data and business management across functional areas
  • Lead the research, development, and maintenance of data pipelines, statistical/ML models, and/or advanced analyses
  • Ensure compliance with all applicable laws and regulations
  • Other duties as assigned

Python, SQL, Data Analysis, Machine Learning, Data science, Risk Management, Data modeling, Data management

Posted 14 days ago
Apply

📍 United States of America

💸 193,600–296,600 USD per year

🔍 Automotive

🏢 Company: careers_gm

  • 7+ years of experience in Data Science in a corporate setting
  • Strong expertise in statistical modeling, machine learning and/or deep learning techniques
  • Experience in safety, automotive, or robotics data science
  • Proficiency in SQL and/or Python
  • Develop scalable autonomous vehicle performance measurement solutions to ensure safe and compliant on-road behavior
  • Lead and manage a team of data scientists, providing guidance, coaching, and mentorship to ensure successful project delivery and professional growth
  • Collaborate across various autonomous vehicle development teams, including GPSSC and Software & Services organizations

Leadership, Python, SQL, Data Analysis, Machine Learning, Cross-functional Team Leadership, Algorithms, Data science, Mentoring, Data visualization, Strategic thinking, Data modeling, Data management

Posted 14 days ago
Apply

📍 Canada

🔍 Fintech

🏢 Company: Jobgether | 👥 11-50 | 💰 $1,493,585 Seed about 2 years ago | Internet

  • Strong proficiency in Python and SQL, with hands-on experience using libraries like pandas, NumPy, and scikit-learn.
  • Deep understanding of statistical inference, including both Bayesian and frequentist methods.
  • Knowledge of core machine learning techniques such as regression and decision trees.
  • Experience working with cloud data warehouses and applying software engineering best practices to data workflows.
  • Partner with cross-functional teams to support growth and product development initiatives through data.
  • Design, analyze, and interpret A/B tests to evaluate new features and business strategies.
  • Build and deploy machine learning models to improve product functionality and personalization.
  • Create reliable data models and pipelines that serve as a foundation for analytics and reporting.
  • Ensure robust documentation, testing, and version control of your code and models.
  • Translate complex data problems into actionable insights for stakeholders across the organization.
  • Promote a data-driven culture by sharing best practices and mentoring team members.

Python, SQL, Cloud Computing, Machine Learning, Numpy, Pandas, Data visualization, Data modeling, A/B testing

Posted 16 days ago
Apply

📍 United States

🧭 Full-Time

💸 200,000–270,000 USD per year

🔍 Sports

🏢 Company: PrizePicks | 👥 101-250 | 💰 Corporate about 2 years ago | Gaming, Fantasy Sports, Sports

  • 7+ years of experience in Backend Engineering/Machine Learning Engineering, shipping and maintaining production-grade systems for internal tools and product users.
  • 3+ years of experience acting as technical lead and providing mentorship and feedback to junior engineers and scientists.
  • Extensive experience exposing real-time predictive model outputs to production-grade systems leveraging large-scale cloud-based data streaming pipelines and infrastructure.
  • Extensive experience working cross-functionally with data engineering, data science, product, and engineering teams, as well as external data providers and 3rd party services.
  • Experience in most of the following: SQL/NoSQL databases/warehouses: Postgres, BigQuery, BigTable; Scripting languages: SQL, Python, Go, Rust; Cloud platform services in GCP and analogous systems: Cloud Storage, Cloud Compute Engine, Cloud Functions, Kubernetes Engine.
  • Code version control: Git; Code testing libraries: PyTest, PyUnit, etc.; Common ML and DL frameworks: scikit-learn, PyTorch, Tensorflow; Modeling methods: classical ML techniques, deep learning, gradient boosting, bayesian methods, generative models; MLOps tools: DataBricks, MLFlow, Kubeflow, DVC.
  • Data pipeline and workflow tools: Airflow, Argo Workflows, Cloud Workflows, Cloud Composer, Serverless Framework; Monitoring and Observability platforms: Prometheus, Grafana, Datadog, ELK stack; Infrastructure as Code platforms: Terraform, Google Cloud Deployment Manager; Other platform tools such as Redis, FastAPI, Docker and data visualization tools such as Streamlit or Dash.
  • Create and maintain optimal sport data stream architecture, ensuring data reliability in both speed and quality for both raw and transformed data pipelines.
  • Partner with Data Science to determine best paths for operationalization of DS/ML assets, ensuring model output quality, stability, and scalability.
  • Steer the design, implementation, and deployment of the data, MLOps, and API stack required for real-time pricing models, personalization/recommendations, risk management tooling, and other critical functions by contributing to architecture evaluations and decisions for the evolving data product roadmap.
  • Partner cross-functionally with Engineering, QA, and Product teams to enable the creation and distribution of highly visible and real-time data products to the PrizePicks platform.
  • Empower teams to build and own rigorous monitoring, alerting, and documentation processes, and work with Engineering teams to ensure complete feature uptime.
  • Act as a thought leader in the broader PrizePicks technology org, staying abreast of and implementing novel technologies, and disseminating knowledge and best practices to junior members of the team and collaborators alike.

Backend Development, Docker, PostgreSQL, Python, SQL, Cloud Computing, GCP, Git, Kubernetes, Machine Learning, MLFlow, Numpy, PyTorch, Airflow, Data engineering, Data science, FastAPI, Go, Grafana, Prometheus, REST API, Redis, Pandas, Rust, Spark, Tensorflow, Terraform, Data visualization, Data modeling, Scripting

Posted 23 days ago
Apply

📍 United States

🧭 Full-Time

💸 160,000–220,000 USD per year

🔍 Daily Fantasy Sports

🏢 Company: PrizePicks | 👥 101-250 | 💰 Corporate about 2 years ago | Gaming, Fantasy Sports, Sports

  • 5+ years of experience in Backend Engineering/Machine Learning Engineering, shipping and maintaining production-grade systems for internal tools and product users.
  • 2+ years of experience acting as technical lead and providing mentorship and feedback to junior engineers and scientists.
  • Extensive experience exposing real-time predictive model outputs to production-grade systems, leveraging large-scale cloud-based data streaming pipelines and infrastructure.
  • Extensive experience working cross-functionally with data engineering, data science, product, and engineering teams, as well as external data providers and 3rd party services.
  • Experience in most of the following: SQL/NoSQL databases/warehouses: Postgres, BigQuery, BigTable; Scripting languages: SQL, Python, Go, Rust; Cloud platform services in GCP and analogous systems: Cloud Storage, Cloud Compute Engine, Cloud Functions, Kubernetes Engine; Code version control: Git; Code testing libraries: PyTest, PyUnit, etc; Common ML and DL frameworks: scikit-learn, PyTorch, Tensorflow; Modeling methods: classical ML techniques, deep learning, gradient boosting, bayesian methods, generative models; MLOps tools: DataBricks, MLFlow, Kubeflow, DVC; Data pipeline and workflow tools: Airflow, Argo Workflows, Cloud Workflows, Cloud Composer, Serverless Framework; Monitoring and Observability platforms: Prometheus, Grafana, Datadog, ELK stack; Infrastructure as Code platforms: Terraform, Google Cloud Deployment Manager; Other platform tools such as Redis, FastAPI, Docker and data visualization tools such as Streamlit or Dash.
  • Graduate degree in Computer Science, Statistics, Mathematics, Informatics, Information Systems or other quantitative field
  • Create and maintain optimal sport data stream architecture, ensuring data reliability in both speed and quality for both raw and transformed data pipelines.
  • Partner with Data Science to determine best paths for operationalization of DS/ML assets, ensuring model output quality, stability, and scalability.
  • Steer the design, implementation, and deployment of the data, MLOps, and API stack required for real-time pricing models, personalization/recommendations, risk management tooling, and other critical functions by contributing to architecture evaluations and decisions for the evolving data product roadmap.
  • Partner cross-functionally with Engineering, QA, and Product teams to enable the creation and distribution of highly-visible and real-time data products to the PrizePicks platform.
  • Build and own rigorous monitoring, alerting, and documentation processes, and work with Engineering teams to ensure complete feature uptime.
  • Grow as a thought leader in the broader PrizePicks technology org, staying abreast of and implementing novel technologies, and disseminating knowledge and best practices to junior members of the team and collaborators alike.

Backend Development, Docker, Python, SQL, Cloud Computing, GCP, Git, Kubernetes, Machine Learning, PyTorch, Airflow, Data engineering, Data science, FastAPI, Grafana, Prometheus, REST API, Redis, Tensorflow, Terraform, Data visualization

Posted 23 days ago
Apply
Showing 10 of 34 jobs

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • No ties to a physical office.

Remote work opens up new opportunities for specialists, allowing them to move beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional approaches to work, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform offers convenient conditions for finding remote IT jobs from home:

  • localized search — filter job listings based on your country of residence;
  • AI-powered job processing — artificial intelligence analyzes thousands of listings, highlighting key details so you don’t have to read long descriptions;
  • advanced filters — sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates — we monitor job relevance and remove outdated listings;
  • personalized notifications — get tailored job offers directly via email or Telegram;
  • resume builder — create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security — modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing — up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.