Remote Data Science Jobs

ETL
741 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 114,000 - 171,599 USD per year

πŸ” Fintech

  • Strong expertise in data pipeline development (ETL/ELT) and workflow automation.
  • Proficiency in Python, SQL, and scripting languages for data processing and automation.
  • Hands-on experience with Workato, Google Apps Script, and API-driven automation.
  • Automate customer support, success, and service workflows to improve speed, accuracy, and responsiveness.
  • Build and maintain scalable ETL/ELT pipelines to ensure real-time access to critical customer data.
  • Implement self-service automation to enable customers and internal teams to quickly access information.

Python, SQL, ETL, Jira, API testing, Data engineering, CI/CD, RESTful APIs, Data visualization, Scripting, Customer Success

Posted 1 day ago
Apply

πŸ“ Germany, Austria, Italy, Spain, Portugal

πŸ” Financial and Real Estate

🏒 Company: PriceHubble · πŸ‘₯ 101-250 · πŸ’° Non-equity Assistance over 3 years ago · Artificial Intelligence (AI), PropTech, Big Data, Machine Learning, Analytics, Real Estate

  • 3+ years of experience building and maintaining production data pipelines.
  • Excellent English communication skills, both spoken and written, to effectively collaborate with cross-functional teams and mentor other engineers.
  • Clear writing is key in our remote-first setup.
  • Proficient in working with geospatial data and leveraging geospatial features.
  • Work with backend engineers and data scientists to turn raw data into trusted insights, handling everything from scraping and ingestion to transformation and monitoring.
  • Navigate cost-value trade-offs to make decisions that deliver value to customers at an appropriate cost.
  • Develop solutions that work in over 10 countries, considering local specifics.
  • Lead a project from concept to launch with a temporary team of engineers.
  • Raise the bar and drive the team to deliver high-quality products, services, and processes.
  • Improve the performance, data quality, and cost-efficiency of our data pipelines at scale.
  • Maintain and monitor the data systems your team owns.

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Git, Kubernetes, Apache Kafka, Data engineering, Data science, Spark, CI/CD, Problem Solving, RESTful APIs, Mentoring, Linux, Excellent communication skills, Teamwork, Cross-functional collaboration, Data visualization, Data modeling, Data management, English communication

Posted 2 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Internal Audit

🏒 Company: careers

  • Minimum 3+ years of experience writing and optimizing SQL/SAS queries in a business environment or 5+ years of experience in lieu of a degree
  • Knowledge of data warehouse technical architecture, ETL and analytic tools in extracting unstructured and structured data
  • Experience in building algorithms and coding proficiency in Python is required
  • Experience with visualization software Tableau or Power BI
  • Experience managing, moving, and manipulating data from multiple sources
  • Familiar with segmentation techniques such as decision trees or k-means clustering
  • Familiar with model development techniques such as logistic regression, random forest, or gradient boosting
  • Ability to provide analytic support, including pulling data, preparing analysis, interpreting data, making strategic recommendations, and presenting to the client/product team
  • Ability to clearly explain technical and analytical information (verbally, written, and in presentation form) and summarize for key stakeholders
  • Outstanding communications, relationship building, influencing, and collaboration skills
  • Strong project management, communication, and multi-tasking skills, with the ability to work independently
  • Deliver advanced analytics to support the audit plan, including cycle audits, issue validation, remediation activities and special projects
  • Design and deploy analytic scripts and dashboards to communicate actionable insights to audit stakeholders.
  • Document analytic results and findings into audit workpapers.
  • Ensure the accuracy and integrity of data used in audit engagements through data transformation techniques.
  • Deploy automation on repetitive audit tasks using data analytics and data engineering techniques.
  • Collaborate with Internal Audit teams to understand audit test objectives and data requirements.
  • Collaborate with remediation teams to ensure data insights are effectively integrated into action plans.
  • Lead projects from beginning to end, including ideation, data mining, strategy formulation, and presentation of results and recommendations.

Python, SQL, Data Analysis, Data Mining, ETL, Excel VBA, Machine Learning, Numpy, SAS EG, Tableau, Algorithms, Data engineering, Data science, Pandas, RESTful APIs, MS Office, Data visualization, Data modeling

Posted 2 days ago
Apply

πŸ“ United States

🏒 Company: ge_externalsite

  • Hands-on experience in programming languages like Java, Python or Scala and experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or HiveQL
  • Exposure to industry-standard data modeling tools (e.g., ERWin, ER Studio)
  • Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend
  • Exposure to industry-standard data catalog, automated data discovery, and data lineage tools (e.g., Alation, Collibra)
  • Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (e.g., Cassandra or HBase)
  • Exposure to unstructured datasets and ability to handle XML, JSON file formats
  • Conduct exploratory data analysis and generate visual summaries of data. Identify data quality issues proactively.
  • Develop reusable code pipelines through CI/CD.
  • Work hands-on with big data or MPP databases.
  • Develop and execute integrated test plans.
  • Be responsible for identifying solutions for complex data analysis and data structure.
  • Be responsible for creating digital thread requirements.
  • Be responsible for change management of database artifacts to support next gen QMS applications
  • Be responsible for monitoring data availability and data health of complex systems
  • Understand industry trends and stay up to date on associated Quality and tech landscape.
  • Design & build technical data dictionaries and support business glossaries to analyze the datasets
  • This role may also work on other Quality team digital and strategic deliveries that support the business.
  • Perform data profiling and data analysis for source systems, manually maintained data, machine or sensor generated data and target data repositories
  • Design & build both logical and physical data models for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) solutions
  • Develop and maintain data mapping specifications based on the results of data analysis and functional requirements
  • Build a variety of data loading & data transformation methods using multiple tools and technologies.
  • Design & build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
  • Manage metadata structures needed for building reusable Extract, Transform & Load (ETL) components.
  • Analyze reference datasets and familiarize with Master Data Management (MDM) tools.
  • Analyze the impact of changes to downstream systems/products and recommend alternatives to minimize the impact.
  • Derive solutions and make recommendations from deep dive data analysis proactively.
  • Design and build Data Quality (DQ) rules.
  • Drive design and implementation of the roadmap.
  • Design and develop complex code in multiple languages.

PostgreSQL, Python, SQL, Data Analysis, ETL, Hadoop, Java, MySQL, Oracle, Data engineering, NoSQL, Spark, CI/CD, Agile methodologies, JSON, Scala, Data visualization, Data modeling

Posted 2 days ago
Apply

πŸ“ Poland, Ukraine, Cyprus

🧭 Full-Time

πŸ” Software Development

🏒 Company: Competera · πŸ‘₯ 51-100 · πŸ’° $3,000,000 Seed about 1 year ago · Artificial Intelligence (AI), Big Data, E-Commerce, Retail, Machine Learning, Analytics, Retail Technology, Information Technology, Enterprise Software, Software

  • 5+ years of experience in a data engineering role.
  • Strong knowledge of SQL, Spark, Python, Airflow, and binary file formats.
  • Contribute to the development of the new data platform.
  • Collaborate with platform and ML teams to create ETL pipelines that efficiently deliver clean and trustworthy data.
  • Engage in architectural decisions regarding the current and future state of the data platform.
  • Design and optimize data models based on business and engineering needs.

Python, SQL, ETL, Kafka, Airflow, Spark, Data modeling

Posted 2 days ago
Apply
πŸ”₯ Data Modeller

πŸ“ UK

🧭 Full-Time

πŸ’Έ 35,000 - 40,000 GBP per year

πŸ” Banking

🏒 Company: Tandem Bank · πŸ‘₯ 11-50 · Financial Services, Banking, Finance

  • Data Modelling
  • Version Control: Proficient with GitHub for version control and collaboration
  • Data Transformation: Experience with DBT for data transformation and modelling
  • Pipeline Orchestration: Working knowledge of Mage.ai for data pipeline development
  • Database Systems: Experience with Microsoft SQL Server, and time-series data modelling
  • Business Intelligence: Proficiency in Power BI for semantic modelling and reporting
  • Design and implement dimensional and relational data models for business intelligence and analytics use cases
  • Create and maintain data documentation, including entity-relationship diagrams and data dictionaries
  • Develop and implement data quality controls and validation procedures
  • Support the optimisation of data pipelines for data transformation processes
  • Participate in code reviews and maintain modelling best practices

Python, Business Intelligence, ETL, Microsoft SQL Server, Data modeling, Data management

Posted 2 days ago
Apply

πŸ“ United States of America

πŸ’Έ 94,510.99 - 150,972.31 USD per year

πŸ” Banking

🏒 Company: external

  • 6-8+ years in a relevant banking, data, project, and/or leadership role.
  • Banking experience and knowledge of industry regulatory requirements is preferred.
  • Risk Management, Information Security, Information Technology, and/or Banking Operations experience is a plus.
  • Experience in establishing and executing data governance, data quality, or data management practices.
  • Demonstrated experience collaborating with cross functional teams and ability to develop, recommend, and implement solutions.
  • Project management experience to include developing and executing project plans and roadmaps.
  • Promote the value of the Data Intelligence Program and influence a data-driven culture.
  • Develop a working knowledge of the Bank’s organizational structure, operations, product and service offering, and data architecture including data sources, data flows, and control points and document where appropriate.
  • Develop, document, implement, and manage processes supporting data governance, data stewardship, and data quality programs.
  • Continuously assess and improve data governance processes and practices to adapt to changing business needs and regulatory requirements.
  • Monitor industry trends and advancements to ensure data practices align with the latest standards and technologies.
  • Lead Data Communities and oversee data-related efforts/activities through completion.
  • Work with business lines and/or Data Communities to identify and document critical data elements (CDEs) and respective business rules and thresholds for data quality monitoring.
  • Work with business lines and/or Data Communities to identify root cause of data quality issues and opportunities for process improvement and implement solutions to resolve or reduce repeated issues; supporting documentation should be developed as appropriate.
  • Support adoption of the Data Intelligence tools, solutions, and processes.
  • Elevate data-related issues, risks, and gaps and recommend solutions to the Director of Business Initiatives and/or Data Intelligence Committee.
  • Develop and execute Data Intelligence awareness and training plans as appropriate.
  • Assist the Director of Business Initiatives with managing data intelligence use cases and business line onboarding as needed.
  • Assist the Director of Business Initiatives with development and implementation of a data change management process and supporting documentation.
  • Support maintenance of the Data Intelligence repository (i.e. SharePoint site) by ensuring relevant artifacts such as program documentation, procedures, and meeting packets for data working groups and/or projects are current and available.
  • Manage projects, where appropriate, in accordance with the Bank’s established project lifecycle process to include but not limited to developing and executing project plans, roadmaps, schedules, business requirements, and testing strategies.
  • Prepare reports, materials, and status updates for the Data Intelligence Committee and other meetings and/or ad hoc requests as needed.
  • Collaborate with the Data Intelligence team to achieve common objectives and complete projects on time.
  • Partner with IT, Information Security, Risk Management, and Privacy teams, as appropriate, to ensure data governance processes and projects align with standards and/or regulatory requirements of the respective area.
  • Represent Data Intelligence on committees, projects, and process forums (i.e. ARB, DCAB) as appropriate.
  • Perform other duties as assigned.

Leadership, Project Management, SQL, Business Intelligence, Data Analysis, ETL, SharePoint, Data engineering, Data Structures, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Agile methodologies, Risk Management, Data visualization, Data modeling, Data analytics, Data management, Change Management

Posted 2 days ago
Apply

πŸ“ United States, Canada, Mexico, Europe

🧭 Full-Time

πŸ” Software Development

  • 5+ years of experience as a Data Engineer
  • Snowflake architecture expertise for multi-tenant B2B applications
  • Performance optimization for customer-facing data models and analytics.
  • Advanced SQL skills for complex query optimization
  • Proficiency in Python, Scala, or Go for data pipeline development
  • Experience analyzing source data structures and recommending improvements
  • Ability to collaborate with engineering teams on data design
  • Experience with ETL/ELT pipelines (Airflow, dbt)
  • Integration experience with Power BI, Tableau, and Sigma
  • Mentoring skills for report creation using BI tools
  • Data quality management for customer-facing products
  • Experience with GitHub/source control and CI/CD pipelines (GitHub Actions or Jenkins)
  • Understanding of multi-tenant data security and governance
  • Develop and enhance AI workflows in support of the various QAD applications.
  • Complete delivery work committed during the sprint to achieve business goals.
  • Help the business maintain a competitive edge by leveraging the latest AI technologies.
  • Provide subject matter expertise during incidents to resolve customer issues quickly.
  • Participate in forums to explore interests outside of the sprint work and contribute ideas to continuously improve the system.
  • Commit to the team to help the team and the wider business achieve our goals.
  • Write testable and maintainable code.

Python, SQL, ETL, Jenkins, Snowflake, Tableau, Airflow, API testing, Data engineering, Go, CI/CD, RESTful APIs, Scala, Data modeling

Posted 2 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: Apollo.io · πŸ‘₯ 501-1000 · πŸ’° $100,000,000 Series D over 1 year ago · Software Development

  • 8+ years of experience as a data platform engineer, or as a software engineer working in data or big data engineering.
  • Experience in data modeling, data warehousing, APIs, and building data pipelines.
  • Deep knowledge of databases and data warehousing with an ability to collaborate cross-functionally.
  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, Mathematics, or Statistics).
  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.
  • Develop and improve Data APIs used in machine learning / AI product offerings.
  • Implement automated monitoring, alerting, self-healing (restartable/graceful failures) features while building the consumption pipelines.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
  • Write unit/integration tests, contribute to the engineering wiki, and document work.
  • Define company data models and write jobs to populate data models in our data warehouse.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

Python, SQL, Apache Airflow, Apache Hadoop, Cloud Computing, ETL, Apache Kafka, Data engineering, FastAPI, Data modeling, Data analytics

Posted 3 days ago
Apply

πŸ“ United States of America

πŸ’Έ 84,878.00 - 140,048.77 USD per year

πŸ” Financial

🏒 Company: flagstar

  • Undergraduate Degree in Business or Information Systems, or related field.
  • 3-5 years of financial industry, BSA/AML or related experience.
  • 3-5 years of experience with transaction monitoring (TM) system tuning and optimization (ATL/BTL) analysis.
  • 3-5 years of financial institution IT experience.
  • Microsoft Excel (VLOOKUPs, pivot tables, macros), PowerPoint, and Word
  • SQL Server or a similar SQL-based database
  • Python, R, or a similar scripting language
  • Tableau or similar data visualization tool
  • Works with the FCC team, supporting the workstream with key deliverables and projects, including periodic tuning and optimization and regulatory deliverables for FCC systems.
  • Performs and documents procedures for data preparation including analysis and interpretation.
  • Creates and executes a tuning process and ensures that a consistent and repeatable methodology is followed, including but not limited to the following:
  • Enhances rule execution efficiency.
  • Develops and performs statistical analysis to identify patterns and correlations in the data.
  • Creates visually compelling and easy-to-understand dashboards and reports adhering to best practices and governance standards.
  • Communicates findings to stakeholders through charts, graphs, and presentations.
  • Translates complex data findings into compelling narratives that drive business decisions using both static and dynamic analytic tools.
  • Gathers and documents business requirements: works with key stakeholders in the FCC department to understand their requirements and arrive at the desired end-state.
  • Creates analytical data models to support dynamic reporting requirements to meet FCC functional requirements.
  • Assists in designing and implementing risk-based alert scoring, addressing and resolving logic and configuration in monitoring scenarios.
  • Actively participates in system upgrades.
  • Assists in automated solutions to improve process efficiency.
  • Uses advanced statistical techniques like clustering and regression analysis to ensure appropriate rigor around optimization process.
  • Adheres to standards for segmentation methodology, scenario tuning methodology, peer grouping, below/above the line (BTL/ATL) sampling methodology.
  • Configures/executes periodic above the line (ATL) and below the line (BTL) testing for AML rules.
  • Provides detailed quantitative analysis of the test results.

Python, SQL, Data Analysis, ETL, Tableau, Microsoft Excel, Compliance, Reporting, Risk Management, Data visualization, Financial analysis, Data modeling, Scripting, Data management

Posted 3 days ago
Apply
Showing 10 of 741 jobs.

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • No ties to a physical office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform offers convenient conditions for finding remote IT jobs from home:

  • Localized search: filter job listings based on your country of residence.
  • AI-powered job processing: artificial intelligence analyzes thousands of listings, highlighting key details so you don’t have to read long descriptions.
  • Advanced filters: sort vacancies by skills, experience, qualification level, and work model.
  • Regular database updates: we monitor job relevance and remove outdated listings.
  • Personalized notifications: get tailored job offers directly via email or Telegram.
  • Resume builder: create a professional CV with ease using our customizable templates and AI-powered suggestions.
  • Data security: modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing: up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.