Remote Data Science Jobs

574 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Australia, New Zealand

πŸ” Software Development

  • Drive impact with data
  • Excel in core data science skills
  • Demonstrate key soft skills
  • Bring additional technical expertise
  • Have a strong analytical foundation
  • Understand the dynamics of tech companies
  • Have hands-on experience with large-scale data
  • Uncovering strategic insights
  • Designing and analyzing experiments
  • Defining and influencing with metrics
  • Providing data for decision-making
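The experiment duties above (designing and analyzing experiments, A/B testing) boil down to a few standard statistical moves. As a rough, hedged sketch — all numbers hypothetical, pure-stdlib Python — a two-proportion z-test for an A/B experiment might look like:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))        # two-sided p-value
    return z, p_value

# Hypothetical experiment: 4.0% vs 4.6% conversion, 10k users per arm.
z, p = two_proportion_ztest(400, 10_000, 460, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

In practice a data scientist would reach for statsmodels or scipy rather than hand-rolling this, but the underlying calculation is the same.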

Python, SQL, Data Analysis, Data Mining, Machine Learning, Numpy, Tableau, Product Analytics, Algorithms, Data science, Data Structures, Pandas, Communication Skills, Analytical Skills, Data visualization, Data modeling, Data analytics, A/B testing

Posted 1 day ago
Apply

πŸ“ Germany, Austria, Italy, Spain, Portugal

πŸ” Financial and Real Estate

🏒 Company: PriceHubbleπŸ‘₯ 101-250πŸ’° Non-equity Assistance over 3 years agoArtificial Intelligence (AI)PropTechBig DataMachine LearningAnalyticsReal Estate

  • 3+ years of experience building and maintaining production data pipelines.
  • Excellent English communication skills, both spoken and written, to effectively collaborate with cross-functional teams and mentor other engineers.
  • Clear writing is key in our remote-first setup.
  • Proficient in working with geospatial data and leveraging geospatial features.
  • Work with backend engineers and data scientists to turn raw data into trusted insights, handling everything from scraping and ingestion to transformation and monitoring.
  • Navigate cost-value trade-offs to make decisions that deliver value to customers at an appropriate cost.
  • Develop solutions that work in over 10 countries, considering local specifics.
  • Lead a project from concept to launch with a temporary team of engineers.
  • Raise the bar and drive the team to deliver high-quality products, services, and processes.
  • Improve the performance, data quality, and cost-efficiency of our data pipelines at scale.
  • Maintain and monitor the data systems your team owns.
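A minimal sketch of the ingestion → transformation → monitoring flow this role describes, using a hypothetical property-price record shape (not PriceHubble's actual schema) and only the standard library:

```python
import csv
import io

# Hypothetical raw input, including malformed rows a scraper would produce.
RAW = """price,area_sqm,country
320000,85,DE
,,AT
410000,not a number,IT
255000,60,ES
"""

def ingest(text):
    """Ingestion stage: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast fields, derive price per sqm, drop rows that fail to parse."""
    clean = []
    for row in rows:
        try:
            price, area = float(row["price"]), float(row["area_sqm"])
            clean.append({"country": row["country"],
                          "price_per_sqm": round(price / area, 2)})
        except (ValueError, ZeroDivisionError):
            continue  # a real pipeline would route these to a dead-letter queue
    return clean

def validate(rows, total, min_valid_ratio):
    """Monitoring hook: fail loudly if too many rows were dropped."""
    ratio = len(rows) / total
    if ratio < min_valid_ratio:
        raise RuntimeError(f"only {ratio:.0%} of rows survived cleaning")
    return rows

raw = ingest(RAW)
result = validate(transform(raw), total=len(raw), min_valid_ratio=0.5)
print(result)
```

Production pipelines of the kind the posting mentions would express these stages as Airflow tasks with retries and alerting, but the stage boundaries are the same.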

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Git, Kubernetes, Apache Kafka, Data engineering, Data science, Spark, CI/CD, Problem Solving, RESTful APIs, Mentoring, Linux, Excellent communication skills, Teamwork, Cross-functional collaboration, Data visualization, Data modeling, Data management, English communication

Posted 2 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Internal Audit

🏒 Company: careers

  • Minimum 3+ years of experience writing and optimizing SQL/SAS queries in a business environment or 5+ years of experience in lieu of a degree
  • Knowledge of data warehouse technical architecture, ETL and analytic tools in extracting unstructured and structured data
  • Experience in building algorithms and coding proficiency in Python is required
  • Experience with visualization software Tableau or Power BI
  • Experience managing, moving, and manipulating data from multiple sources
  • Familiar with segmentation techniques such as decision trees or k-means clustering
  • Familiar with model development techniques such as logistic regression, random forest, or gradient boosting
  • Ability to provide analytic support including pulling data, preparing analysis, interpreting data, making strategic recommendations, and presenting to client/product team
  • Ability to clearly explain technical and analytical information (verbally, written, and in presentation form) and summarize for key stakeholders
  • Outstanding communications, relationship building, influencing, and collaboration skills
  • Strong project management, communications, multi-tasking, ability to work independently
  • Deliver advanced analytics to support the audit plan, including cycle audits, issue validation, remediation activities and special projects
  • Design and deploy analytic scripts and dashboards to communicate actionable insights to audit stakeholders.
  • Document analytic results and findings into audit workpapers.
  • Ensure the accuracy and integrity of data used in audit engagements through data transformation techniques.
  • Deploy automation on repetitive audit tasks using data analytics and data engineering techniques.
  • Collaborate with Internal Audit teams to understand audit test objectives and data requirements.
  • Collaborate with remediation teams to ensure data insights are effectively integrated into action plans.
  • Lead projects from beginning to end, including ideation, data mining, strategy formulation, and presentation of results and recommendations.
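The segmentation techniques listed above (e.g. k-means clustering) reduce, at their core, to iterative centroid assignment. A toy 1-D k-means sketch — hypothetical transaction amounts, standard library only:

```python
from statistics import mean

def kmeans_1d(values, k, iters=20):
    """Tiny 1-D k-means: the kind of segmentation step the role describes.
    Returns final centroids and one cluster label per value."""
    # Seed centroids spread across the sorted values.
    centroids = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            # Assign each point to its nearest centroid.
            idx = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    labels = [min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
              for v in values]
    return centroids, labels

# Hypothetical transaction amounts with two obvious segments.
amounts = [12, 15, 14, 11, 980, 1010, 995, 13]
centroids, labels = kmeans_1d(amounts, k=2)
print(sorted(round(c) for c in centroids))
```

In an audit setting one would use scikit-learn on multi-dimensional features; this sketch only shows the mechanic.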

Python, SQL, Data Analysis, Data Mining, ETL, Excel VBA, Machine Learning, Numpy, SAS EG, Tableau, Algorithms, Data engineering, Data science, Pandas, RESTful APIs, MS Office, Data visualization, Data modeling

Posted 2 days ago
Apply

πŸ“ US

πŸ’Έ 187000.0 - 232988.0 USD per year

🏒 Company: Pinterest Job Advertisements

  • Master's degree (or its foreign degree equivalent) in Data Science, Statistics, or a closely related quantitative discipline and two (2) years of experience in the job offered or in any related occupation, OR Bachelor's degree (or its foreign degree equivalent) in Data Science, Statistics, or a closely related quantitative discipline and five (5) years of progressively responsible experience in the job offered or in any related occupation.
  • Data Analysis
  • Software product management
  • Python
  • Impact evaluation
  • Strategic planning
  • Project management
  • A/B testing and Statistics
  • User experience research
  • Data processing
  • Work with engineers, user research, data scientists and the sales team to develop a deep customer understanding.
  • Lead product vision, strategy, technical development and execution for conversion visibility.
  • Develop clear product requirements and work with engineering, design, data analytics, product marketing, research teams and other product teams to deliver customer impact.
  • Communicate your strategy upward and outward internally to ensure you have the support and comprehension you need to be successful.
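Since this role lists A/B testing and statistics, one routine planning calculation is worth showing: the per-arm sample size needed to detect a given lift. A hedged sketch using the normal approximation — all numbers hypothetical:

```python
from math import ceil
from statistics import NormalDist

def ab_sample_size(p_base, mde, alpha=0.05, power=0.8):
    """Per-arm sample size to detect an absolute lift `mde` over a
    baseline conversion rate `p_base` (two-sided test, normal approx.)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value
    z_beta = NormalDist().inv_cdf(power)            # power term
    p_alt = p_base + mde
    variance = p_base * (1 - p_base) + p_alt * (1 - p_alt)
    return ceil(variance * ((z_alpha + z_beta) / mde) ** 2)

# Hypothetical: detect a 1-point lift on a 10% baseline conversion rate.
n = ab_sample_size(0.10, 0.01)
print(n)
```

The takeaway product managers care about: small lifts on small baselines require tens of thousands of users per arm.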

Project Management, Python, Data Analysis, User Experience Design, Data science, A/B testing

Posted 2 days ago
Apply

πŸ“ United States

🧭 Internship

πŸ’Έ 43700.0 - 69900.0 USD per year

πŸ” Software Development

🏒 Company: NREL

  • Experience in deep learning models, Bayesian neural networks, graph neural network models, and probabilistic semantic graphs.
  • Experience in the formulation of mixed-integer optimization problems and open-source solvers for sensor placement in distribution systems.
  • Demonstrated experience in programming languages such as Python and MATLAB.
  • Experience with open-source, collaborative software development and use of tools such as GitHub or GitLab for project management and version control.
  • Contribute to the development of deep learning models such as Bayesian neural networks, graph neural networks, probabilistic semantic graphs, and Generative AI to detect anomalies such as missing data, bad data, and communications failure in sensor data.
  • Develop data-driven forecasting and disaggregation models, and validate both centralized and distributed state estimation algorithms.
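The sensor-placement problem mentioned above is usually posed as a mixed-integer program solved with an open-source solver; a greedy max-coverage heuristic gives the flavor without one. All bus names and coverage sets below are invented for illustration:

```python
def greedy_sensor_placement(coverage, budget):
    """Greedy max-coverage heuristic: at each step, pick the candidate
    bus whose sensor observes the most not-yet-covered buses. A stand-in
    for the mixed-integer formulation the posting refers to."""
    chosen, covered = [], set()
    candidates = dict(coverage)
    for _ in range(budget):
        best = max(candidates, key=lambda b: len(candidates[b] - covered))
        if not candidates[best] - covered:
            break  # no remaining candidate observes anything new
        chosen.append(best)
        covered |= candidates.pop(best)
    return chosen, covered

# Hypothetical 6-bus feeder: each candidate sensor observes nearby buses.
coverage = {
    "bus1": {"bus1", "bus2"},
    "bus3": {"bus2", "bus3", "bus4"},
    "bus5": {"bus4", "bus5", "bus6"},
}
chosen, covered = greedy_sensor_placement(coverage, budget=2)
print(chosen, sorted(covered))
```

The exact MIP version adds binary placement variables and observability constraints, but the objective — maximize observed buses under a sensor budget — is the same.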

Python, Software Development, Data Analysis, Image Processing, Machine Learning, Matlab, C++, Algorithms, Data science, Data Structures, Data visualization

Posted 2 days ago
Apply

πŸ“ Portugal

πŸ” Wellness

  • Master's degree or PhD in Computer Science, Data Science, Machine Learning, Statistics, or a related field.
  • Proficiency in Python and experience with machine learning frameworks such as PyTorch, TensorFlow, or similar.
  • Strong understanding of generative AI architectures, including transformers and attention mechanisms.
  • Experience in multi-agent systems and LLMs.
  • Strong problem-solving abilities, with a focus on experimental design and data analysis.
  • Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • Have clear scientific thinking and a passion for integrating R&D and cutting-edge technology into a product
  • Prior experience in Python and SQL is a mandatory requirement.
  • Understanding of state-of-the-art deep learning techniques, such as Transformers architectures and attention mechanisms, and knowledge of fine-tuning LLMs for enhanced model performance, using techniques such as LoRA, QLoRA, among others.
  • Design and implement structured function calling mechanisms within LLMs to enable dynamic interactions with APIs, databases, and retrieval-augmented generation (RAG) pipelines.
  • Craft clear instructions that help our models understand exactly what we need, creating reusable templates and testing against edge cases.
  • Preprocess, analyse, and curate high-quality datasets to train and fine-tune both embedding and generative models.
  • Develop robust evaluation frameworks to assess agentic AI performance, combining automated evaluation (LLM-as-judge), adversarial testing, human-in-the-loop evaluations, and custom behavioral metrics.
  • Establish monitoring systems that track LLM behavior in production, capturing key metrics around information retrieval, hallucinations, and latency.
  • Stay current with the latest research in generative AI, share insights with the team, test promising approaches, and help make us better.
  • Collaborate with engineering teams to deploy models in scalable and efficient production environments.
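The structured function-calling mechanism described above can be reduced to a tool registry plus a JSON dispatcher that validates and executes what the model emits. A hedged sketch — the tool name and payload are invented for illustration, not this company's API:

```python
import json

TOOLS = {}

def tool(fn):
    """Register a function so the model can call it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def get_sleep_score(user_id: str) -> dict:
    # Hypothetical wellness-domain tool; a real one would query an API or DB.
    return {"user_id": user_id, "sleep_score": 82}

def dispatch(raw_call: str):
    """Validate a model-emitted JSON function call and execute it."""
    call = json.loads(raw_call)
    fn = TOOLS.get(call.get("name"))
    if fn is None:
        raise ValueError(f"unknown tool: {call.get('name')!r}")
    return fn(**call.get("arguments", {}))

# What an LLM constrained to structured output might emit:
result = dispatch('{"name": "get_sleep_score", "arguments": {"user_id": "u42"}}')
print(result)
```

In a RAG pipeline the registered tools would include retrieval functions, and the dispatcher's error paths would feed the evaluation and monitoring systems the posting describes.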

AWS, Python, SQL, Data Analysis, Machine Learning, PyTorch, Data science

Posted 2 days ago
Apply

πŸ“ United States

πŸ’Έ 136500.0 - 266500.0 USD per year

πŸ” Autonomous Vehicles

  • BA, Masters or PhD in Machine Learning, Computer Science, Applied Mathematics, Statistics, Physics or a related field; or equivalent industry experience
  • In-depth understanding of common Machine Learning, Deep Learning algorithms, or ML Planning
  • Experience designing, training, and analyzing neural networks for at least one of the following applications: object detection, semantic/instance segmentation, visual classification, motion/gesture recognition, sensor fusion, multitask learning, motion prediction, and/or multi-object tracking
  • Advanced knowledge of software engineering principles including software design, source control management, build processes, code reviews, testing methods
  • Fluency in Python, including standard scientific computing libraries and Python bindings development experience
  • Experience with PyTorch or other deep learning frameworks
  • Experience working with large data sets and deriving insights from data
  • Effectiveness at leading and executing large, complex technical initiatives
  • Define and influence the direction of the team, organization, and/or department.
  • Advise leaders on technology problems and solutions
  • Connect and drive business impact through technology solutions
  • Formulate problems, architect solutions, and design processes
  • Prototype, evaluate, implement, and iterate on solutions
  • Pioneer research into state of the art solutions and systems for autonomous vehicles
  • Productionize and deploy solutions onto autonomous vehicle fleets
  • Mentor and grow junior and experienced engineers and researchers
  • Help create and reinforce a culture of inclusion, innovation, and excellence
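Much of the detection and tracking work listed above rests on simple geometric primitives. One example every candidate for such a role would know is intersection-over-union (IoU), the standard box-matching metric in object detection and multi-object tracking:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))   # overlap width
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))   # overlap height
    inter = iw * ih
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union else 0.0

# Two hypothetical detections of the same vehicle:
overlap = iou((0, 0, 10, 10), (5, 5, 15, 15))
print(overlap)
```

Detection pipelines threshold this value (often at 0.5) to decide whether a predicted box matches a ground-truth object.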

Python, Software Development, Data Analysis, Git, Image Processing, Machine Learning, PyTorch, C++, Algorithms, Data science, Software Engineering

Posted 2 days ago
Apply

πŸ“ England, Scotland, Portugal, Poland, Spain

πŸ” Robotics

🏒 Company: Locus RoboticsπŸ‘₯ 251-500πŸ’° $117,000,000 Series F over 2 years agoWarehousingLogisticsIndustrial AutomationE-CommerceWarehouse AutomationRobotics

  • 4+ years of hands-on experience designing and deploying machine learning models in production, with a focus on reinforcement learning (RL) and multi-agent systems (MAS).
  • Advanced Python programming skills, with a strong emphasis on writing efficient, scalable, and maintainable code.
  • Proven experience with TensorFlow/PyTorch/Jax, Scikit-learn, and MLOps workflows.
  • Experience working with Polars and/or Pandas for high-performance data processing.
  • Proficiency with cloud platforms (AWS, GCP, or Azure), including containerization and orchestration using Docker and Kubernetes.
  • Hands-on experience with reinforcement learning frameworks such as OpenAI Gym or Stable-Baselines3.
  • Practical knowledge of optimization algorithms and probabilistic modeling techniques.
  • Experience integrating models into real-time decision-making systems or multi-agent RL environments (MARL).
  • Utilize, develop, and enhance simulation tooling and infrastructure.
  • Develop, deploy, and maintain machine learning models, with a strong focus on reinforcement learning (RL) and multi-agent systems (MAS).
  • Implement and improve MLOps pipelines.
  • Collaborate with data engineers and software developers to ensure seamless integration of machine learning models with existing infrastructure and data pipelines.
  • Stay up to date with advancements in reinforcement learning, distributed computing, and ML frameworks to drive innovation in the organization.
  • Work with cloud-based solutions (AWS, GCP, or Azure) to deploy and manage machine learning workloads in a scalable manner.
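Behind the RL frameworks this posting names (OpenAI Gym, Stable-Baselines3) sits the same basic loop. A toy tabular Q-learning sketch on a made-up 5-cell corridor — purely illustrative, far from a warehouse-scale MARL system:

```python
import random

def train_q(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a 5-cell corridor: actions move left/right,
    reward 1.0 on reaching the rightmost cell."""
    rng = random.Random(seed)
    n_states, actions = 5, (-1, +1)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy action selection.
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda i: q[s][i])
            s2 = min(max(s + actions[a], 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Bellman update toward r + gamma * max_a' Q(s', a').
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train_q()
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(4)]
print(policy)
```

The learned greedy policy moves right in every state; multi-agent RL replaces this table with function approximators and adds coordination between agents, but the update rule is the same starting point.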

AWS, Docker, Python, SQL, Cloud Computing, Data Analysis, Kubernetes, Machine Learning, PyTorch, Algorithms, Data science, Pandas, Tensorflow

Posted 3 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Cybersecurity

🏒 Company: crowdstrikecareers

  • 3+ years of experience developing and deploying machine learning solutions to production. Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used
  • 3+ years experience with ML Platform tools like Jupyter Notebooks, NVidia Workbench, MLFlow, Ray, Vertex AI etc.
  • Experience building data platform product(s) or features with (one of) Apache Spark, Flink or comparable tools in GCP. Experience with Iceberg is highly desirable.
  • Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
  • Production experience with infrastructure-as-code tools such as Terraform, FluxCD
  • Expert level experience with Python; Java/Scala exposure is recommended. Ability to write Python interfaces to provide standardized and simplified interfaces for data scientists to utilize internal Crowdstrike tools
  • Expert level experience with CI/CD frameworks such as GitHub Actions
  • Expert level experience with containerization frameworks
  • Strong analytical and problem solving skills, capable of working in a dynamic environment
  • Exceptional interpersonal and communication skills. Work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes.
  • Help design, build, and facilitate adoption of a modern Data+ML platform
  • Modularize complex ML code into standardized and repeatable components
  • Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
  • Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
  • Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
  • Review code changes from data scientists and champion software development best practices
  • Leverage cloud services like Kubernetes, blob storage, and queues in our cloud first environment
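"Modularizing ML code into standardized, repeatable components" often starts with something as small as a named-step pipeline. A minimal sketch — names and steps are illustrative, not CrowdStrike's actual API:

```python
class Pipeline:
    """Minimal composable-step pipeline: each step is a named function
    from value to value, run in registration order."""

    def __init__(self):
        self.steps = []

    def step(self, name):
        """Decorator that registers a function as a named pipeline step."""
        def register(fn):
            self.steps.append((name, fn))
            return fn
        return register

    def run(self, value):
        for name, fn in self.steps:
            value = fn(value)  # a real platform would log/trace each step
        return value

pipe = Pipeline()

@pipe.step("normalize")
def normalize(xs):
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

@pipe.step("threshold")
def threshold(xs):
    return [1 if x > 0.5 else 0 for x in xs]

out = pipe.run([10, 20, 30, 40])
print(out)
```

Platform tools like MLFlow and Airflow generalize exactly this shape: named, reusable steps with tracked inputs and outputs.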

Python, Data Analysis, GCP, Kubernetes, Machine Learning, MLFlow, Airflow, Data engineering, Data science, Spark, CI/CD, RESTful APIs, Terraform

Posted 3 days ago
Apply
Showing 10 of 574

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • No ties to a physical office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform makes it easy to find remote IT jobs from home:

  • localized search – filter job listings based on your country of residence;
  • AI-powered job processing – artificial intelligence analyzes thousands of listings, highlighting key details so you don't have to read long descriptions;
  • advanced filters – sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates – we monitor job relevance and remove outdated listings;
  • personalized notifications – get tailored job offers directly via email or Telegram;
  • resume builder – create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security – modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing: up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.