Remote Data Science Jobs

Data engineering
794 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 United States

🧭 Full-Time

💸 114,000 - 171,599 USD per year

🔍 Fintech

  • Strong expertise in data pipeline development (ETL/ELT) and workflow automation.
  • Proficiency in Python, SQL, and scripting languages for data processing and automation.
  • Hands-on experience with Workato, Google Apps Script, and API-driven automation.
  • Automate customer support, success, and service workflows to improve speed, accuracy, and responsiveness.
  • Build and maintain scalable ETL/ELT pipelines to ensure real-time access to critical customer data (see the sketch after this list).
  • Implement self-service automation to enable customers and internal teams to quickly access information.
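
Purely as an illustration of the pipeline-building work described above, here is a minimal Python sketch: pull records from a REST API, normalize them with pandas, and load them into a SQL table. The endpoint URL, field names, and table name are hypothetical and not part of the listing.

```python
# Minimal ETL sketch: extract support tickets from a (hypothetical) API,
# transform them into a tidy table, and load them into SQLite for reporting.
import sqlite3

import pandas as pd
import requests

API_URL = "https://api.example.com/v1/tickets"  # hypothetical endpoint


def extract() -> list[dict]:
    """Fetch raw ticket records from the API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> pd.DataFrame:
    """Keep the fields a support dashboard needs and normalize timestamps."""
    df = pd.DataFrame(records)
    df["created_at"] = pd.to_datetime(df["created_at"], utc=True)
    return df[["ticket_id", "status", "created_at"]]


def load(df: pd.DataFrame) -> None:
    """Replace the reporting table with freshly transformed data."""
    with sqlite3.connect("support.db") as conn:
        df.to_sql("tickets", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    load(transform(extract()))
```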

Python, SQL, ETL, Jira, API testing, Data engineering, CI/CD, RESTful APIs, Data visualization, Scripting, Customer Success

Posted 1 day ago
Apply

📍 Worldwide

🧭 Full-Time

🔍 Blockchain Data

🏢 Company: Allium 👥 11-50 · Information Technology

  • 2-5 years experience in a similar role.
  • A Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, with a Master's degree preferred.
  • Strong proficiency in programming languages like Python, SQL, and JavaScript, and experience with data analytics tools.
  • 2-5 years of experience in data engineering, data analytics, or similar, with a track record of designing customer solutions.
  • Excellent communication and interpersonal skills, problem-solving abilities, and project management skills to handle multiple customer projects.
  • A solid understanding of blockchain technology and its data structures, with knowledge of data security and privacy best practices as a plus.
  • Engage with customers to understand their blockchain data requirements and provide expert guidance on using Allium.so's platform.
  • Design and implement tailored data analytics solutions, such as dashboards for transaction trends or custom reports for blockchain metrics (see the sketch after this list).
  • Offer ongoing technical support, troubleshooting issues to ensure smooth platform usage.
  • Gather customer feedback to inform product improvements and work with product and engineering teams to prioritize new features.
  • Create and maintain documentation for customer implementations to support future operations.
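
As a rough sketch of the "dashboards for transaction trends" work mentioned above, the example below aggregates raw transactions into daily counts and USD volume with pandas. The column names (block_time, tx_hash, value_usd) are assumptions for illustration, not Allium's actual schema.

```python
# Aggregate raw blockchain transactions into a daily trends table.
import pandas as pd


def daily_transaction_trends(transactions: pd.DataFrame) -> pd.DataFrame:
    """Return daily transaction counts and total USD volume."""
    transactions = transactions.copy()
    transactions["day"] = pd.to_datetime(transactions["block_time"]).dt.date
    return (
        transactions.groupby("day")
        .agg(tx_count=("tx_hash", "count"), volume_usd=("value_usd", "sum"))
        .reset_index()
    )


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "block_time": ["2024-01-01 10:00", "2024-01-01 12:30", "2024-01-02 09:15"],
            "tx_hash": ["0xa1", "0xb2", "0xc3"],
            "value_usd": [120.0, 75.5, 300.0],
        }
    )
    print(daily_transaction_trends(sample))
```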

Python, SQL, Blockchain, Data Analysis, Javascript, Data engineering, Communication Skills, Problem Solving, Customer service, RESTful APIs, JSON, Data analytics

Posted 1 day ago
Apply

📍 Germany, Austria, Italy, Spain, Portugal

🔍 Financial and Real Estate

🏢 Company: PriceHubble 👥 101-250 💰 Non-equity Assistance over 3 years ago · Artificial Intelligence (AI), PropTech, Big Data, Machine Learning, Analytics, Real Estate

  • 3+ years of experience building and maintaining production data pipelines.
  • Excellent English communication skills, both spoken and written, to effectively collaborate with cross-functional teams and mentor other engineers.
  • Clear writing is key in our remote-first setup.
  • Proficient in working with geospatial data and leveraging geospatial features (see the sketch after this list).
  • Work with backend engineers and data scientists to turn raw data into trusted insights, handling everything from scraping and ingestion to transformation and monitoring.
  • Navigate cost-value trade-offs to make decisions that deliver value to customers at an appropriate cost.
  • Develop solutions that work in over 10 countries, considering local specifics.
  • Lead a project from concept to launch with a temporary team of engineers.
  • Raise the bar and drive the team to deliver high-quality products, services, and processes.
  • Improve the performance, data quality, and cost-efficiency of our data pipelines at scale.
  • Maintain and monitor the data systems your team owns.
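
A minimal sketch of the geospatial feature work mentioned above: the distance from a property to its nearest point of interest via the haversine formula. The coordinates and POIs are invented for illustration; a production pipeline would likely use a dedicated geospatial library.

```python
# Compute a simple geospatial feature: distance (km) to the nearest point of interest.
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two WGS84 points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def nearest_poi_km(lat: float, lon: float, pois: list[tuple[float, float]]) -> float:
    """Distance to the closest point of interest."""
    return min(haversine_km(lat, lon, p_lat, p_lon) for p_lat, p_lon in pois)


if __name__ == "__main__":
    stations = [(52.5251, 13.3694), (52.5096, 13.3759)]  # hypothetical POIs
    print(round(nearest_poi_km(52.5200, 13.4050, stations), 2), "km")
```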

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Git, Kubernetes, Apache Kafka, Data engineering, Data science, Spark, CI/CD, Problem Solving, RESTful APIs, Mentoring, Linux, Excellent communication skills, Teamwork, Cross-functional collaboration, Data visualization, Data modeling, Data management, English communication

Posted 2 days ago
Apply

📍 India

🧭 Full-Time

🔍 Internal Audit

🏢 Company: careers

  • Minimum 3+ years of experience writing and optimizing SQL/SAS queries in a business environment or 5+ years of experience in lieu of a degree
  • Knowledge of data warehouse technical architecture, ETL and analytic tools in extracting unstructured and structured data
  • Experience in building algorithms and coding proficiency in Python is required
  • Experience with visualization software Tableau or Power BI
  • Experience managing, moving, and manipulating data from multiple sources
  • Familiar with segmentation techniques such as decision trees or k-means clustering (see the sketch after this list)
  • Familiar with model development techniques such as logistic regression, random forest, or gradient boosting
  • Ability to provide analytic support including pulling data, preparing analysis, interpreting data, making strategic recommendations, and presenting to client/product team
  • Ability to clearly explain technical and analytical information (verbally, written, and in presentation form) and summarize for key stakeholders
  • Outstanding communications, relationship building, influencing, and collaboration skills
  • Strong project management, communications, multi-tasking, ability to work independently
  • Deliver advanced analytics to support the audit plan, including cycle audits, issue validation, remediation activities and special projects
  • Design and deploy analytic scripts and dashboards to communicate actionable insights to audit stakeholders.
  • Document analytic results and findings into audit workpapers.
  • Ensure the accuracy and integrity of data used in audit engagements through data transformation techniques.
  • Deploy automation on repetitive audit tasks using data analytics and data engineering techniques.
  • Collaborate with Internal Audit teams to understand audit test objectives and data requirements.
  • Collaborate with remediation teams to ensure data insights are effectively integrated into action plans.
  • Lead projects from beginning to end, including ideation, data mining, strategy formulation, and presentation of results and recommendations.
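
As a small, hedged illustration of the segmentation techniques listed above, the sketch below clusters records with k-means on a few numeric features using scikit-learn. The features and data are synthetic placeholders, not the employer's audit data.

```python
# k-means segmentation sketch on synthetic "risk" features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Hypothetical features: transaction volume, exception rate, days since last review.
features = rng.normal(size=(200, 3))

scaled = StandardScaler().fit_transform(features)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(scaled)

# Segment sizes; labels could feed risk-based sample selection in an audit workpaper.
print(np.bincount(kmeans.labels_))
```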

Python, SQL, Data Analysis, Data Mining, ETL, Excel VBA, Machine Learning, Numpy, SAS EG, Tableau, Algorithms, Data engineering, Data science, Pandas, RESTful APIs, MS Office, Data visualization, Data modeling

Posted 2 days ago
Apply

📍 United States

🏢 Company: ge_externalsite

  • Hands-on experience in programming languages like Java, Python or Scala and experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or HiveQL
  • Exposure to industry standard data modeling tools (e.g., ERWin, ER Studio, etc.).
  • Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend
  • Exposure to industry standard data catalog, automated data discovery, and data lineage tools (e.g., Alation, Collibra)
  • Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (e.g., Cassandra or HBase)
  • Exposure to unstructured datasets and ability to handle XML, JSON file formats
  • Conduct exploratory data analysis and generate visual summaries of data. Identify data quality issues proactively.
  • Developing reusable code pipelines through CI/CD.
  • Hands-on experience with big data or MPP databases.
  • Developing and executing integrated test plans.
  • Be responsible for identifying solutions for complex data analysis and data structure.
  • Be responsible for creating digital thread requirements
  • Be responsible for change management of database artifacts to support next gen QMS applications
  • Be responsible for monitoring data availability and data health of complex systems
  • Understand industry trends and stay up to date on associated Quality and tech landscape.
  • Design & build technical data dictionaries and support business glossaries to analyze the datasets
  • This role may also work on other Quality team digital and strategic deliveries that support the business.
  • Perform data profiling and data analysis for source systems, manually maintained data, machine or sensor generated data and target data repositories
  • Design & build both logical and physical data models for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) solutions
  • Develop and maintain data mapping specifications based on the results of data analysis and functional requirements
  • Build a variety of data loading & data transformation methods using multiple tools and technologies.
  • Design & build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
  • Manage metadata structures needed for building reusable Extract, Transform & Load (ETL) components.
  • Analyze reference datasets and familiarize with Master Data Management (MDM) tools.
  • Analyze the impact of changes to downstream systems/products and recommend alternatives to minimize the impact.
  • Derive solutions and make recommendations from deep dive data analysis proactively.
  • Design and build Data Quality (DQ) rules (see the sketch after this list).
  • Drives design and implementation of the roadmap.
  • Design and develop complex code in multiple languages.
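
For illustration only, a minimal sketch of the kind of Data Quality (DQ) rules mentioned above, expressed as pandas checks that count rule violations. The column names, rules, and thresholds are hypothetical.

```python
# Count violations of a few simple data-quality rules.
import pandas as pd


def run_dq_rules(df: pd.DataFrame) -> dict[str, int]:
    """Return the number of violations per rule."""
    violations = {
        "serial_number_not_null": df["serial_number"].isna(),
        "inspection_date_not_future": pd.to_datetime(df["inspection_date"]) > pd.Timestamp.now(),
        "defect_rate_within_bounds": ~df["defect_rate"].between(0.0, 1.0),
    }
    return {rule: int(mask.sum()) for rule, mask in violations.items()}


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "serial_number": ["A1", None, "C3"],
            "inspection_date": ["2024-05-01", "2030-01-01", "2024-06-15"],
            "defect_rate": [0.02, 1.4, 0.0],
        }
    )
    print(run_dq_rules(sample))
```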

PostgreSQL, Python, SQL, Data Analysis, ETL, Hadoop, Java, MySQL, Oracle, Data engineering, NoSQL, Spark, CI/CD, Agile methodologies, JSON, Scala, Data visualization, Data modeling

Posted 2 days ago
Apply

📍 United States

🧭 Internship

💸 20 USD per hour

🏢 Company: Worldly 👥 11-50

  • Strong verbal and written communication skills
  • Ability to work independently and as part of a team
  • Proactive problem-solving mindset
  • Analytical thinking and attention to detail
  • Ability to adapt to new tools and technologies quickly
  • Technology: Python, SQL, AWS, Git, TensorFlow, Tableau, Jupyter Notebooks
  • Developing and testing software solutions to improve system functionality and performance
  • Analyzing large datasets to identify trends, insights, and areas for improvement
  • Assisting in building and optimizing machine learning models for predictive analytics (see the sketch after this list)
  • Supporting data engineering tasks, such as data cleaning, transformation, and structuring
  • Researching and implementing best practices in AI/ML, cybersecurity, or cloud infrastructure
  • Collaborating with engineers and data scientists to design scalable solutions
  • Writing clean, efficient code and documentation to support team projects
  • Conducting performance testing and debugging to improve application efficiency
  • Assisting in securing cloud-based systems and mitigating potential security risks
  • Presenting findings and recommendations to the technology team
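
As a rough sketch of the predictive-modeling work an intern might assist with using the listed stack (TensorFlow), the example below trains a tiny Keras classifier on synthetic data; nothing here reflects the employer's actual models.

```python
# Tiny binary classifier on synthetic data with tf.keras.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4)).astype("float32")        # hypothetical features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype("float32")    # synthetic binary target

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ]
)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]
```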

AWS, Python, Software Development, SQL, Cybersecurity, Git, Machine Learning, Tableau, Data engineering, TensorFlow

Posted 2 days ago
Apply

📍 United States of America

💸 94,510.99 - 150,972.31 USD per year

🔍 Banking

🏢 Company: external

  • 6-8+ years in relevant banking, data, project, and/or leadership roles.
  • Banking experience and knowledge of industry regulatory requirements is preferred.
  • Risk Management, Information Security, Information Technology, and/or Banking Operations experience is a plus.
  • Experience in establishing and executing data governance, data quality, or data management practices.
  • Demonstrated experience collaborating with cross-functional teams and the ability to develop, recommend, and implement solutions.
  • Project management experience to include developing and executing project plans and roadmaps.
  • Promote the value of the Data Intelligence Program and influence a data-driven culture.
  • Develop a working knowledge of the Bank's organizational structure, operations, product and service offering, and data architecture including data sources, data flows, and control points and document where appropriate.
  • Develop, document, implement, and manage processes supporting data governance, data stewardship, and data quality programs.
  • Continuously assess and improve data governance processes and practices to adapt to changing business needs and regulatory requirements.
  • Monitor industry trends and advancements to ensure data practices align with the latest standards and technologies.
  • Lead Data Communities and oversee data-related efforts/activities through completion.
  • Work with business lines and/or Data Communities to identify and document critical data elements (CDEs) and respective business rules and thresholds for data quality monitoring (see the sketch after this list).
  • Work with business lines and/or Data Communities to identify root cause of data quality issues and opportunities for process improvement and implement solutions to resolve or reduce repeated issues; supporting documentation should be developed as appropriate.
  • Support adoption of the Data Intelligence tools, solutions, and processes.
  • Elevate data-related issues, risks, and gaps and recommend solutions to the Director of Business Initiatives and/or Data Intelligence Committee.
  • Develop and execute Data Intelligence awareness and training plans as appropriate.
  • Assist the Director of Business Initiatives with managing data intelligence use cases and business line onboarding as needed.
  • Assist the Director of Business Initiatives with development and implementation of a data change management process and supporting documentation.
  • Support maintenance of the Data Intelligence repository (i.e. SharePoint site) by ensuring relevant artifacts such as program documentation, procedures, and meeting packets for data working groups and/or projects are current and available.
  • Manage projects, where appropriate, in accordance with the Bank's established project lifecycle process, including but not limited to developing and executing project plans, roadmaps, schedules, business requirements, and testing strategies.
  • Prepare reports, materials, and status updates for the Data Intelligence Committee and other meetings and/or ad hoc requests as needed.
  • Collaborate with the Data Intelligence team to achieve common objectives and complete projects on time.
  • Partner with IT, Information Security, Risk Management, and Privacy teams, as appropriate, to ensure data governance processes and projects align with standards and/or regulatory requirements of the respective area.
  • Represent Data Intelligence on committees, projects, and process forums (i.e. ARB, DCAB) as appropriate.
  • Perform other duties as assigned.
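
A minimal sketch, under assumptions, of monitoring one critical data element (CDE) against a data-quality threshold as described above. The CDE, rule, and threshold are hypothetical.

```python
# Check completeness of a critical data element (CDE) against a threshold.
import pandas as pd


def cde_completeness(df: pd.DataFrame, cde: str, threshold: float) -> dict:
    """Report completeness of a CDE and whether it breaches its threshold."""
    completeness = 1.0 - df[cde].isna().mean()
    return {
        "cde": cde,
        "completeness": round(completeness, 4),
        "threshold": threshold,
        "breach": completeness < threshold,
    }


if __name__ == "__main__":
    accounts = pd.DataFrame({"tax_id": ["12-345", None, "98-765", None]})
    print(cde_completeness(accounts, cde="tax_id", threshold=0.95))
```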

Leadership, Project Management, SQL, Business Intelligence, Data Analysis, ETL, SharePoint, Data engineering, Data Structures, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Agile methodologies, Risk Management, Data visualization, Data modeling, Data analytics, Data management, Change Management

Posted 2 days ago
Apply

📍 United States, Canada

🧭 Full-Time

💸 165,446 - 236,250 CAD per year

🔍 Software Development

🏢 Company: Khan Academy 👥 101-250 💰 $10,000 Series A about 3 years ago · Internet, Education, E-Learning, Apps

  • 8+ years of experience as a Technical Program Manager, successfully leading big data-focused programs (or multiple projects) from inception to execution.
  • Technical background in software development, data engineering, or analytics.
  • Proficient in program/project management tools (e.g., JIRA, Confluence, Gantt charts, spreadsheets, presentations, etc.); comfortable collaborating in an online/cloud-based environment (e.g., Google Workspace, Slack, etc.).
  • Translate Data Initiatives into Actionable Requirements
  • Facilitate Cross-Team Data Alignment
  • Monitor Product Changes & Mitigate Data Risks
  • Develop & Optimize Data Processes

Project Management, SQL, Data Analysis, Jira, Cross-functional Team Leadership, Data engineering, Communication Skills, Analytical Skills, Problem Solving, Agile methodologies, Risk Management, Data visualization, Stakeholder management, Data modeling, Data analytics, Data management, Confluence

Posted 2 days ago
Apply

📍 Italy

🔍 Logistics

🏢 Company: Edgemony 👥 1-10 · Education

  • At least 3 years of experience in backend development.
  • In-depth knowledge of Python and familiarity with FastAPI, SQLAlchemy, and PostgreSQL.
  • Experience building performant REST APIs and systems designed for scalability.
  • Hands-on experience with, or a strong interest in, AI agents, LLMs, recommendation systems, or multi-agent architectures.
  • Design backend services and modules to power our AI agents (see the sketch after this list).
  • Help build a recommendation system that learns from preferences, behavior, and logistics context.
  • Work on scalable, asynchronous systems, with particular attention to performance over millions of records.
  • Take part in designing new backend architectures that enable intelligent, fluid human-machine interactions.
  • Collaborate with the CPO, the product team, and the frontend team to bring new features to production quickly.
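
A minimal sketch, under assumptions, of an async FastAPI endpoint that could sit in front of a recommendation agent like the one described above. The route, response model, and scoring logic are placeholders, not the company's actual API.

```python
# Async FastAPI endpoint returning (placeholder) carrier recommendations for a shipment.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Recommendation(BaseModel):
    carrier_id: str
    score: float


async def score_carriers(shipment_id: str) -> list[Recommendation]:
    """Placeholder ranking; a real system would query the recommendation engine."""
    return [Recommendation(carrier_id="carrier-42", score=0.91)]


@app.get("/shipments/{shipment_id}/recommendations", response_model=list[Recommendation])
async def recommendations(shipment_id: str) -> list[Recommendation]:
    return await score_carriers(shipment_id)

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```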

Backend Development, Docker, PostgreSQL, Python, Artificial Intelligence, Algorithms, API testing, Data engineering, Data Structures, FastAPI, REST API, CI/CD

Posted 2 days ago
Apply

📍 United States of America

💸 128,000 - 231,000 USD per year

🔍 Software Development

🏢 Company: targetcareers

  • Strong proficiency in Java/Go or similar language
  • Have a deep understanding of some of the following concepts: Operating System Architecture, memory management, process scheduling, I/O scheduling, Networking technologies, latency, bandwidth, Benchmarking, Performance Debugging, Performance monitoring (see the sketch after this list)
  • Have familiarity and experience with some of the following technologies: Object Storage tech (MinIO, Ceph, etc), Spark, Apache Hadoop ecosystem or Kubernetes, Databases - NoSQL and RDBMS, Trino/PrestoSQL, ZooKeeper
  • Experience with modern CI/CD technologies such as Git, Drone, Docker, Artifactory
  • Understand Target's business and technical environments and assist teams in resolving complex business challenges via current technical solutions by assessing viability/applicability/cost implication through POCs and prototypes.
  • Collaborate with technical staff and Enterprise Architecture teams in setting technical direction across platform and drive technology lifecycle management and communication of standards/decisions to the engineering team.
  • Participate in procurement specifications, installation, and maintenance of Target systems.
  • Lead the design and build of the Target platform API with a deep focus on non-functional requirements, including scalability, availability, and performance, while being a strong advocate of extreme agile and DevOps practices across engineers.
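
As a loose illustration of the benchmarking and performance-monitoring concepts listed above, the sketch below measures request latency against an HTTP endpoint and reports p50/p99. The endpoint is hypothetical, and a real benchmark would also control for warm-up, concurrency, and payload size.

```python
# Micro-benchmark: p50/p99 latency of repeated GETs against a (hypothetical) endpoint.
import statistics
import time

import requests

ENDPOINT = "https://objectstore.example.internal/health"  # hypothetical URL


def measure_latencies(n: int = 50) -> list[float]:
    """Issue n GET requests and record wall-clock latency in milliseconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        requests.get(ENDPOINT, timeout=5)
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples


if __name__ == "__main__":
    qs = statistics.quantiles(measure_latencies(), n=100)
    print(f"p50={qs[49]:.1f} ms  p99={qs[98]:.1f} ms")
```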

Docker, Leadership, Software Development, Apache Hadoop, Git, Java, Kubernetes, Software Architecture, Data engineering, Go, Spark, Communication Skills, CI/CD, Problem Solving, Agile methodologies, DevOps

Posted 2 days ago
Apply
Showing 10 out of 794

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • Lack of ties to the office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform makes it convenient to find remote IT jobs from home:

  • localized search: filter job listings based on your country of residence;
  • AI-powered job processing: artificial intelligence analyzes thousands of listings, highlighting key details so you don't have to read long descriptions;
  • advanced filters: sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates: we monitor job relevance and remove outdated listings;
  • personalized notifications: get tailored job offers directly via email or Telegram;
  • resume builder: create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security: modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing: up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.