Remote Data Science Jobs

Python
3,699 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Australia, New Zealand

πŸ” Software Development

  • Drive impact with data
  • Excel in core data science skills
  • Demonstrate key soft skills
  • Bring additional technical expertise
  • Have a strong analytical foundation
  • Understand the dynamics of tech companies
  • Have hands-on experience with large-scale data
  • Uncovering strategic insights
  • Designing and analyzing experiments
  • Defining and influencing with metrics
  • Providing data for decision-making
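In practice, the "designing and analyzing experiments" bullet often reduces to a two-proportion z-test on conversion counts. A minimal sketch in plain Python, with made-up numbers for illustration:

```python
# Hypothetical A/B-test analysis: two-proportion z-test using only the
# standard library. All counts below are invented for illustration.
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for conversion counts of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided normal tail
    return z, p_value

z, p = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these toy numbers the uplift is significant at the usual 5% level; in production you would also pre-register sample sizes and check for peeking.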

Python · SQL · Data Analysis · Data Mining · Machine Learning · Numpy · Tableau · Product Analytics · Algorithms · Data science · Data Structures · Pandas · Communication Skills · Analytical Skills · Data visualization · Data modeling · Data analytics · A/B testing

Posted 1 day ago
Apply

πŸ“ United States

πŸ” Software Development

  • Bachelor's degree or higher (completed and verified prior to start) AND Two (2) years of experience with web programming and object-oriented programming, such as C#, Java, C, C++, Python.
  • High School Diploma/GED AND Three (3) years of experience with web programming and object-oriented programming, such as C#, Java, C, C++, Python.
  • Experience in UI/UX design and best practices
  • Experience with AWS cloud development
  • Experience with Web Services development (REST/SOAP)
  • Experience with unit testing frameworks
  • Experience with version control systems such as Git
  • Experience with database technologies such as MySQL/Oracle/Graph Database
  • Ability to work with remote (off-site) team members
  • Excellent verbal and written communication skills
  • Participating and leading software design, coding, testing, debugging, and documentation as needed
  • Work with other software engineers, clinical analysts, quality engineers, and other team members to design and build required applications.
  • Adhere to team design and coding procedures and standards.
  • Coordinate and communicate with clinical analysts, quality analysts, and other software engineers
  • Resolve escalated internal customer support issues
  • Participate in analysis and code review
  • Be an active member of an Agile team by participating in all phases of SDLC, including design, software development, code reviews, and deployments.
  • Developing solutions, software, and components based on internal/external customer and business requirements
  • Creating and automating component unit tests, measuring and improving software performance, and taking pride in the quality of component deliverables.
  • Documenting technical aspects of the application for technical users and end-user documentation.
  • Supporting our proprietary coding content.
  • Contributing towards the future design and development of medical coding software pathways
  • Helping resolve escalated internal customer support issues
  • Determining and recommending tools to prepare us for future technologies
  • Develop your skillset through training and development opportunities and continue to grow with Solventum Health Information Systems.
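The "unit testing frameworks" requirement above typically looks like the following pattern with Python's built-in unittest; the function under test here is hypothetical:

```python
# Sketch of unit-testing practice using the standard library's unittest.
# normalize_code is an invented helper, not part of any real product.
import unittest

def normalize_code(raw: str) -> str:
    """Normalize a (hypothetical) billing code: trim whitespace, uppercase."""
    return raw.strip().upper()

class NormalizeCodeTests(unittest.TestCase):
    def test_strips_and_uppercases(self):
        self.assertEqual(normalize_code("  a12.3 "), "A12.3")

    def test_idempotent(self):
        once = normalize_code(" z99 ")
        self.assertEqual(normalize_code(once), once)

if __name__ == "__main__":
    unittest.main(exit=False)   # run tests without killing the interpreter
```

The same structure carries over directly to pytest or C#'s xUnit.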

Python · Software Development · SQL · Agile · Git · Java · MySQL · C# · C++ · Java Spring · REST API · CI/CD · Debugging

Posted 1 day ago
Apply

πŸ“ LatAm

🧭 Full-Time

πŸ” E-Learning

🏒 Company: TruelogicπŸ‘₯ 101-250ConsultingWeb DevelopmentWeb DesignSoftware

  • 1-3 years of experience in Python back-end development.
  • Strong understanding of RESTful API development and best practices.
  • Experience with databases (SQL and NoSQL) and ORM frameworks.
  • Knowledge of version control (Git) and software development best practices.
  • A proactive and problem-solving mindset with a willingness to take ownership of tasks.
  • Strong communication skills and the ability to collaborate in a team environment.
  • A passion for learning and growing in a fast-paced development environment.
  • Develop, test, and maintain back-end services and APIs using Python.
  • Work with Django, Flask, or FastAPI to build scalable applications.
  • Collaborate with front-end developers, DevOps, and other teams to ensure smooth system integration.
  • Optimize code for performance, reliability, and security.
  • Troubleshoot, debug, and enhance existing applications.
  • Take initiative in proposing improvements and best practices.
  • Stay up to date with the latest industry trends and emerging technologies.
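The back-end work described above centers on request validation and response shaping. A framework-agnostic sketch of that route logic as a plain function (endpoint and field names are invented), which a Flask or FastAPI handler would simply wrap:

```python
# Sketch of typical REST endpoint logic, kept framework-agnostic so it is
# testable on its own. Field names and rules are hypothetical.
from http import HTTPStatus

def create_course(payload: dict) -> tuple[int, dict]:
    """Validate a course-creation request and return (status, body)."""
    title = str(payload.get("title", "")).strip()
    if not title:
        return HTTPStatus.BAD_REQUEST, {"error": "title is required"}
    hours = payload.get("hours", 0)
    if not isinstance(hours, (int, float)) or hours <= 0:
        return HTTPStatus.BAD_REQUEST, {"error": "hours must be positive"}
    # In a real service, the ORM insert would happen here.
    return HTTPStatus.CREATED, {"title": title, "hours": hours}

status, body = create_course({"title": "Intro to Python", "hours": 8})
print(int(status), body)
```

Keeping validation out of the framework layer makes the same logic reusable across Django, Flask, and FastAPI.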

Backend Development · Python · Software Development · SQL · Django · Flask · Git · API testing · FastAPI · REST API · NoSQL

Posted 1 day ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 114000.0 - 171599.0 USD per year

πŸ” Fintech

  • Strong expertise in data pipeline development (ETL/ELT) and workflow automation.
  • Proficiency in Python, SQL, and scripting languages for data processing and automation.
  • Hands-on experience with Workato, Google Apps Script, and API-driven automation.
  • Automate customer support, success, and service workflows to improve speed, accuracy, and responsiveness.
  • Build and maintain scalable ETL/ELT pipelines to ensure real-time access to critical customer data.
  • Implement self-service automation to enable customers and internal teams to quickly access information.
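An ETL/ELT pipeline of the kind described above, in miniature: extract from CSV, transform types, load into a database, then query. SQLite stands in for the warehouse and the column names are made up:

```python
# Toy ETL pipeline: extract CSV rows, transform (type casting), load into
# SQLite, then aggregate. Schema and data are invented for illustration.
import csv
import io
import sqlite3

RAW = "ticket_id,minutes\n1,12\n2,7\n3,45\n"   # stand-in for an extracted file

def run_pipeline(raw_csv: str) -> float:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tickets (ticket_id INTEGER, minutes REAL)")
    rows = [(int(r["ticket_id"]), float(r["minutes"]))   # transform: cast types
            for r in csv.DictReader(io.StringIO(raw_csv))]
    conn.executemany("INSERT INTO tickets VALUES (?, ?)", rows)   # load
    (avg,) = conn.execute("SELECT AVG(minutes) FROM tickets").fetchone()
    conn.close()
    return avg

print(run_pipeline(RAW))
```

A production version would swap SQLite for the warehouse, add retries and monitoring, and run under a scheduler or a tool like Workato.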

Python · SQL · ETL · Jira · API testing · Data engineering · CI/CD · RESTful APIs · Data visualization · Scripting · Customer Success

Posted 1 day ago
Apply

πŸ“ Worldwide

🧭 Full-Time

πŸ” Blockchain Data

🏒 Company: AlliumπŸ‘₯ 11-50Information Technology

  • 2-5 years experience in a similar role.
  • A Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, with a Master's degree preferred.
  • Strong proficiency in programming languages like Python, SQL, and JavaScript, and experience with data analytics tools.
  • 2-5 years of experience in data engineering, data analytics, or similar, with a track record of designing customer solutions.
  • Excellent communication and interpersonal skills, problem-solving abilities, and project management skills to handle multiple customer projects.
  • A solid understanding of blockchain technology and its data structures, with knowledge of data security and privacy best practices as a plus.
  • Engage with customers to understand their blockchain data requirements and provide expert guidance on using Allium.so's platform.
  • Design and implement tailored data analytics solutions, such as dashboards for transaction trends or custom reports for blockchain metrics.
  • Offer ongoing technical support, troubleshooting issues to ensure smooth platform usage.
  • Gather customer feedback to inform product improvements and work with product and engineering teams to prioritize new features.
  • Create and maintain documentation for customer implementations to support future operations.

Python · SQL · Blockchain · Data Analysis · Javascript · Data engineering · Communication Skills · Problem Solving · Customer service · RESTful APIs · JSON · Data analytics

Posted 1 day ago
Apply

🧭 Full-Time

πŸ” Healthcare Technology

  • 5+ years of experience in software engineering, AI, or cloud-based application development.
  • Strong proficiency in Python for building and optimizing AI-powered applications.
  • Experience working in serverless environments (AWS Lambda, API Gateway, Step Functions).
  • Hands-on experience with AWS Bedrock and/or agentic application development.
  • Expertise in Terraform for infrastructure automation and cloud deployment.
  • Strong understanding of cloud-based AI/ML tools and best practices for integrating AI models into production.
  • Strong analytical and problem-solving skills, with a high level of judgment and creativity in designing innovative solutions.
  • Demonstrated ability to thrive in fast-paced, high-growth, and rapidly evolving environments.
  • Ability to work effectively in a remote-first environment, ensuring high-quality virtual interactions with minimal distractions.
  • Design, develop, and optimize AI-driven applications using Python in a serverless environment (AWS Lambda, API Gateway, Step Functions).
  • Leverage AWS Bedrock and agentic application development frameworks to create intelligent, scalable solutions.
  • Design, deploy, and manage infrastructure as code with Terraform to automate cloud resource provisioning and ensure system reliability.
  • Work with data scientists, AI engineers, and product teams to integrate AI models into Bamboo Health's products.
  • Ensure best practices for data security, cloud compliance, and performance optimization, especially in healthcare applications.
  • Stay updated on emerging AI, cloud, and automation technologies, helping shape Bamboo Health's AI strategy.
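A serverless handler of the sort these bullets describe can be sketched as follows; the event shape follows the API Gateway proxy convention, and the route and payload are assumptions for illustration:

```python
# Sketch of an AWS Lambda-style handler for an API Gateway proxy event.
# The endpoint and response payload are hypothetical.
import json

def lambda_handler(event, context=None):
    """Respond to GET /hello?name=... with a JSON greeting."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

resp = lambda_handler({"queryStringParameters": {"name": "Bamboo"}})
print(resp["statusCode"], resp["body"])
```

Because the handler is a plain function of a dict, it can be unit-tested locally before Terraform provisions the Lambda, API Gateway route, and IAM role around it.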

AWS · Docker · Python · SQL · Cloud Computing · Machine Learning · Algorithms · REST API · Serverless · Communication Skills · CI/CD · Mentoring · Terraform

Posted 2 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: Trek10πŸ‘₯ 51-100πŸ’° Grant about 3 years agoConsultingService IndustryTechnical Support

  • 4+ years of experience with Amazon Web Services
  • 2+ years of code-defined infrastructure experience
  • 2+ years of Continuous Integration and Delivery (CI/CD) experience
  • Have worked with Agile Project Management or other project management methodologies.
  • 4 years of experience with cloud-native patterns and services like AWS Lambda
  • 4 years of experience architecting and building scalable, automated infrastructure
  • Lead client engagements and be Trek10's face to the world
  • Provide thought leadership around scalable next-generation cloud architecture experience and guidance to clients
  • Lead cross-functional implementation teams to deliver world-class solutions to clients
  • Evaluate libraries, tools, and services for suitability
  • Own the implementation of stories and other engineering of advanced cloud work for clients
  • Write Infrastructure as Code (IaC) in multiple frameworks and can break apart projects into independent components at an appropriate level, taking into account different aspects of a stack and know how to build stability and security into them.
  • Help clients get work done smarter with automation and code
  • Lead a team to navigate complex bureaucratic relationships to bring all stakeholders together around a common product vision and strategy
  • Drive more complex projects and contribute to the detailed idea space effectively
  • Learn, coach, and share knowledge and skills with other Trek10 team members
  • Lead re-platforming and refactoring efforts for existing systems
  • Prepare work for team members of varying levels of experience, defining stories and task type work
  • Contribute to internal IP and open source development
  • Participate in delivery efforts using Agile or other work methodologies
  • Build relationships with vendors and partners
  • Engage with clients including presentations, technical decisions, and education opportunities
  • Learn, research, and stay up to date on best practices and evolutionary capability in the Amazon Web Services (AWS) cloud
  • Be a technical team advocate, ensuring clear direction and efficient execution
  • Perform code reviews, provide feedback, and pair when beneficial
  • Keep an eye out for opportunities to support our clients in even greater ways
  • Write cool blog posts, do talks at community events, help us achieve additional certifications in the AWS Partner Community, help our clients solve all sorts of problems
  • Occasional travel may be required to meet stakeholders, kick off new projects, attend conferences, and attend company functions.

AWS · Node.js · Python · Agile · Cloud Computing · IoT · Kubernetes · Amazon Web Services · Serverless · CI/CD · RESTful APIs · DevOps · Terraform · Microservices

Posted 2 days ago
Apply

πŸ“ Germany, Austria, Italy, Spain, Portugal

πŸ” Financial and Real Estate

🏒 Company: PriceHubbleπŸ‘₯ 101-250πŸ’° Non-equity Assistance over 3 years agoArtificial Intelligence (AI)PropTechBig DataMachine LearningAnalyticsReal Estate

  • 3+ years of experience building and maintaining production data pipelines.
  • Excellent English communication skills, both spoken and written, to effectively collaborate with cross-functional teams and mentor other engineers.
  • Clear writing is key in our remote-first setup.
  • Proficient in working with geospatial data and leveraging geospatial features.
  • Work with backend engineers and data scientists to turn raw data into trusted insights, handling everything from scraping and ingestion to transformation and monitoring.
  • Navigate cost-value trade-offs to make decisions that deliver value to customers at an appropriate cost.
  • Develop solutions that work in over 10 countries, considering local specifics.
  • Lead a project from concept to launch with a temporary team of engineers.
  • Raise the bar and drive the team to deliver high-quality products, services, and processes.
  • Improve the performance, data quality, and cost-efficiency of our data pipelines at scale.
  • Maintain and monitor the data systems your team owns.
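"Leveraging geospatial features" in a pipeline often starts with point-to-point distance; a haversine sketch in pure Python, with illustrative coordinates:

```python
# Haversine great-circle distance, a common building block for geospatial
# features (e.g. distance from a property to a city centre). Pure stdlib.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance between two (lat, lon) points in kilometres."""
    r = 6371.0                                    # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Zurich to Vienna, roughly 590 km
print(round(haversine_km(47.3769, 8.5417, 48.2082, 16.3738)))
```

At pipeline scale the same formula would be vectorized (NumPy) or pushed into the database with PostGIS.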

AWS · Docker · Leadership · PostgreSQL · Python · SQL · Apache Airflow · Cloud Computing · Data Analysis · ETL · Git · Kubernetes · Apache Kafka · Data engineering · Data science · Spark · CI/CD · Problem Solving · RESTful APIs · Mentoring · Linux · Excellent communication skills · Teamwork · Cross-functional collaboration · Data visualization · Data modeling · Data management · English communication

Posted 2 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Internal Audit

🏒 Company: careers

  • 3+ years of experience writing and optimizing SQL/SAS queries in a business environment, or 5+ years of experience in lieu of a degree
  • Knowledge of data warehouse technical architecture, ETL and analytic tools in extracting unstructured and structured data
  • Experience in building algorithms and coding proficiency in Python is required
  • Experience with visualization software Tableau or Power BI
  • Experience managing, moving, and manipulating data from multiple sources
  • Familiar with segmentation techniques such as decision trees or k-means clustering
  • Familiar with model development techniques such as logistic regression, random forest, or gradient boosting
  • Ability to provide analytic support including pulling data, preparing analysis, interpreting data, making strategic recommendations, and presenting to client/product team
  • Ability to clearly explain technical and analytical information (verbally, written, and in presentation form) and summarize for key stakeholders
  • Outstanding communications, relationship building, influencing, and collaboration skills
  • Strong project management, communications, multi-tasking, ability to work independently
  • Deliver advanced analytics to support the audit plan, including cycle audits, issue validation, remediation activities and special projects
  • Design and deploy analytic scripts and dashboards to communicate actionable insights to audit stakeholders.
  • Document analytic results and findings into audit workpapers.
  • Ensure the accuracy and integrity of data used in audit engagements through data transformation techniques.
  • Deploy automation on repetitive audit tasks using data analytics and data engineering techniques.
  • Collaborate with Internal Audit teams to understand audit test objectives and data requirements.
  • Collaborate with remediation teams to ensure data insights are effectively integrated into action plans.
  • Lead projects from beginning to end, including ideation, data mining, strategy formulation, and presentation of results and recommendations.
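An audit-analytics query of the kind this listing describes, run against SQLite here for portability; the table, columns, and threshold are invented:

```python
# Illustrative audit analytics: flag outliers with a grouped SQL query.
# Table, data, and the 1000 threshold are all made up for the sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE expenses (employee TEXT, amount REAL)")
conn.executemany("INSERT INTO expenses VALUES (?, ?)", [
    ("ana", 120.0), ("ana", 80.0), ("raj", 4999.0), ("raj", 15.0),
])
# Flag employees whose average expense exceeds the (hypothetical) threshold.
flagged = conn.execute("""
    SELECT employee, AVG(amount) AS avg_amount
    FROM expenses
    GROUP BY employee
    HAVING avg_amount > 1000
    ORDER BY avg_amount DESC
""").fetchall()
print(flagged)
conn.close()
```

The same grouped-aggregate pattern translates directly to SAS EG or a warehouse SQL dialect, with an index on the grouping column once tables grow large.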

Python · SQL · Data Analysis · Data Mining · ETL · Excel VBA · Machine Learning · Numpy · SAS EG · Tableau · Algorithms · Data engineering · Data science · Pandas · RESTful APIs · MS Office · Data visualization · Data modeling

Posted 2 days ago
Apply

πŸ“ United States

🏒 Company: ge_externalsite

  • Hands-on experience in programming languages like Java, Python or Scala and experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or HiveQL
  • Exposure to industry standard data modeling tools (e.g., ERWin, ER Studio, etc.).
  • Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend
  • Exposure to industry standard data catalog, automated data discovery and data lineage tools (e.g., Alation, Collibra, etc., )
  • Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (i.e. Cassandra or HBase)
  • Exposure to unstructured datasets and ability to handle XML, JSON file formats
  • Conduct exploratory data analysis and generate visual summaries of data. Identify data quality issues proactively.
  • Developing reusable code pipelines through CI/CD.
  • Hands-on experience with big data or MPP databases.
  • Developing and executing integrated test plans.
  • Be responsible for identifying solutions for complex data analysis and data structure.
  • Be responsible for creating digital thread requirements
  • Be responsible for change management of database artifacts to support next gen QMS applications
  • Be responsible for monitoring data availability and data health of complex systems
  • Understand industry trends and stay up to date on associated Quality and tech landscape.
  • Design & build technical data dictionaries and support business glossaries to analyze the datasets
  • This role may also work on other Quality team digital and strategic deliveries that support the business.
  • Perform data profiling and data analysis for source systems, manually maintained data, machine or sensor generated data and target data repositories
  • Design & build both logical and physical data models for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) solutions
  • Develop and maintain data mapping specifications based on the results of data analysis and functional requirements
  • Build a variety of data loading & data transformation methods using multiple tools and technologies.
  • Design & build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
  • Manage metadata structures needed for building reusable Extract, Transform & Load (ETL) components.
  • Analyze reference datasets and familiarize with Master Data Management (MDM) tools.
  • Analyze the impact of changes to downstream systems/products and recommend alternatives to minimize the impact.
  • Derive solutions and make recommendations from deep dive data analysis proactively.
  • Design and build Data Quality (DQ) rules.
  • Drives design and implementation of the roadmap.
  • Design and develop complex code in multiple languages.
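The "handle XML, JSON file formats" requirement in practice means normalizing both into one record shape; a standard-library sketch with invented field names:

```python
# Parse the same (hypothetical) record from JSON and XML with the stdlib
# and normalize both to a single dict shape.
import json
import xml.etree.ElementTree as ET

JSON_DOC = '{"part": "A-100", "qty": 3}'
XML_DOC = '<record><part>A-100</part><qty>3</qty></record>'

def from_json(doc: str) -> dict:
    data = json.loads(doc)
    return {"part": data["part"], "qty": int(data["qty"])}

def from_xml(doc: str) -> dict:
    root = ET.fromstring(doc)
    return {"part": root.findtext("part"), "qty": int(root.findtext("qty"))}

assert from_json(JSON_DOC) == from_xml(XML_DOC)   # same record either way
print(from_json(JSON_DOC))
```

Normalizing at the ingestion boundary keeps downstream ETL jobs and data-quality rules format-agnostic.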

PostgreSQL · Python · SQL · Data Analysis · ETL · Hadoop · Java · MySQL · Oracle · Data engineering · NoSQL · Spark · CI/CD · Agile methodologies · JSON · Scala · Data visualization · Data modeling

Posted 2 days ago
Apply
Showing 10 of 3,699

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • Independence from a physical office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform offers convenient conditions for finding remote IT jobs from home:

  • localized search – filter job listings based on your country of residence;
  • AI-powered job processing – artificial intelligence analyzes thousands of listings, highlighting key details so you don't have to read long descriptions;
  • advanced filters – sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates – we monitor job relevance and remove outdated listings;
  • personalized notifications – get tailored job offers directly via email or Telegram;
  • resume builder – create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security – modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing: up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.