Remote Data Science Jobs

Apache Airflow
207 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Germany, Austria, Italy, Spain, Portugal

πŸ” Financial and Real Estate

🏒 Company: PriceHubbleπŸ‘₯ 101-250πŸ’° Non-equity Assistance over 3 years agoArtificial Intelligence (AI)PropTechBig DataMachine LearningAnalyticsReal Estate

  • 3+ years of experience building and maintaining production data pipelines.
  • Excellent English communication skills, both spoken and written, to effectively collaborate with cross-functional teams and mentor other engineers.
  • Clear writing is key in our remote-first setup.
  • Proficient in working with geospatial data and leveraging geospatial features.
  • Work with backend engineers and data scientists to turn raw data into trusted insights, handling everything from scraping and ingestion to transformation and monitoring.
  • Navigate cost-value trade-offs to make decisions that deliver value to customers at an appropriate cost.
  • Develop solutions that work in over 10 countries, considering local specifics.
  • Lead a project from concept to launch with a temporary team of engineers.
  • Raise the bar and drive the team to deliver high-quality products, services, and processes.
  • Improve the performance, data quality, and cost-efficiency of our data pipelines at scale.
  • Maintain and monitor the data systems your team owns.

AWS, Docker, Leadership, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Git, Kubernetes, Apache Kafka, Data engineering, Data science, Spark, CI/CD, Problem Solving, RESTful APIs, Mentoring, Linux, Excellent communication skills, Teamwork, Cross-functional collaboration, Data visualization, Data modeling, Data management, English communication

Posted 2 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ” Software Development

🏒 Company: Apollo.ioπŸ‘₯ 501-1000πŸ’° $100,000,000 Series D over 1 year agoSoftware Development

  • 8+ years of experience as a data platform engineer, or as a software engineer working in data or big data engineering.
  • Experience in data modeling, data warehousing, APIs, and building data pipelines.
  • Deep knowledge of databases and data warehousing with an ability to collaborate cross-functionally.
  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, Mathematics, or Statistics).
  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.
  • Develop and improve Data APIs used in machine learning/AI product offerings.
  • Implement automated monitoring, alerting, self-healing (restartable/graceful failures) features while building the consumption pipelines.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
  • Write unit/integration tests, contribute to the engineering wiki, and document work.
  • Define company data models and write jobs to populate data models in our data warehouse.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

Python, SQL, Apache Airflow, Apache Hadoop, Cloud Computing, ETL, Apache Kafka, Data engineering, FastAPI, Data modeling, Data analytics

Posted 3 days ago
Apply

πŸ“ SΓ£o Paulo, Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

πŸ” Software Development

🏒 Company: TELUS Digital Brazil

  • 5+ years of hands-on experience supporting data engineering teams, with a strong emphasis on data pipeline enhancement, optimization, and data integration.
  • Advanced/fluent English communication and documentation skills.
  • Proficient in cloud computing (GCP).
  • Solid proficiency with Python for data processing.
  • Experience with cloud data-related services such as BigQuery, Dataflow, Cloud Composer, Dataproc, Cloud Storage, Pub/Sub, or the correlated services from other providers.
  • Knowledge of SQL and experience with relational databases.
  • Proven experience optimizing data pipelines toward efficiency, reducing operational costs, and reducing the number of issues/failures.
  • Solid knowledge of monitoring, troubleshooting, and resolving data pipeline issues.
  • Familiarity with version control systems like Git.
  • Design and implement scalable data pipeline architectures in collaboration with Data Engineers.
  • Continuously optimize data pipeline efficiency to reduce operational costs and minimize issues and failures.
  • Monitor performance and reliability of data pipelines, enhancing reliability through data quality, analysis, and testing.
  • Build and manage automated alerting systems for data pipeline issues.
  • Automate repetitive tasks in data processing and management.
  • Develop and manage disaster recovery and backup plans.
  • In collaboration with other Data Engineering teams, conduct capacity planning for data storage and processing needs.
  • Develop and maintain comprehensive documentation for data pipeline systems and processes, and provide knowledge transfer to data-related teams.
  • Monitor, troubleshoot and resolve production issues in data processing workflows.
  • Maintain infrastructure reliability for data pipelines, enterprise datahub, HPBI, and MDM systems.
  • Conduct post-incident reviews and implement improvements for data pipelines.

Python, SQL, Apache Airflow, Cloud Computing, GCP, Git, Data engineering, REST API, CI/CD, DevOps, Documentation, Microservices, Troubleshooting, Data modeling

Posted 3 days ago
Apply

πŸ“ Indonesia

🧭 Full-Time

πŸ” Financial Services

🏒 Company: BjakπŸ‘₯ 101-250Price ComparisonInsurTechInformation Technology

  • Bachelor's, Master's, or Ph.D. in Computer Science, Artificial Intelligence, Data Science, or a related field.
  • Proven experience as an AI Engineer, Principal Engineer, or Data Scientist, with a track record of leading successful AI projects.
  • Proficiency in AI and ML frameworks and programming languages (e.g., Python, TensorFlow, PyTorch, Scikit-learn).
  • Strong expertise in data preprocessing, feature engineering, and model evaluation.
  • Experience in deploying and optimizing AI models in a production environment.
  • Excellent problem-solving, analytical thinking, and debugging skills.
  • Strong leadership, communication, and team management abilities.
  • Passion for staying at the forefront of AI and machine learning advancements.
  • Lead and mentor a team of AI engineers, providing technical guidance, coaching, and fostering their growth.
  • Collaborate with product managers and stakeholders to define AI project objectives, requirements, and timelines.
  • Design, develop, and implement AI models, algorithms, and applications to solve complex business challenges.
  • Oversee the end-to-end AI model lifecycle, including data collection, preprocessing, model training, evaluation, and deployment.
  • Stay updated with the latest advancements in AI and machine learning, incorporating best practices into projects.
  • Drive data-driven decision-making through advanced analytics and visualization techniques.
  • Ensure the security, scalability, and efficiency of AI solutions.
  • Lead research efforts to explore and integrate cutting-edge AI techniques.

AWS, Docker, Leadership, Python, SQL, Apache Airflow, Artificial Intelligence, Data Analysis, Keras, Machine Learning, MLFlow, Numpy, PyTorch, Algorithms, Data science, REST API, Pandas, Tensorflow, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Data visualization, Team management, Debugging

Posted 3 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Software Development

🏒 Company: GroundTruth Careers

  • Experience with GIS, POI/Location data ingestion pipeline.
  • Experience with the AWS stack used for data engineering: EC2, S3, EMR, ECS, Lambda, and Step Functions.
  • Hands-on experience with Python/Java for orchestration of data pipelines.
  • Experience in writing analytical queries using SQL
  • Experience in Airflow
  • Experience in Docker
  • Create and maintain various ingestion pipelines for the GroundTruth platform.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, GIS, and AWS 'big data' technologies.
  • Work with stakeholders including the Product, Analytics and Client Services teams to assist with data-related technical issues and support their data infrastructure needs.
  • Prepare detailed specifications and low-level design.
  • Participate in code reviews.
  • Test the product in controlled, real situations before going live.
  • Maintain the application once it is live.
  • Contribute ideas to improve the location platform.

AWS, Docker, Python, SQL, Apache Airflow, Git, Data engineering, Software Engineering

Posted 4 days ago
Apply

πŸ“ Bratislava, Kyiv

πŸ” FinTech/Banking or Energy

  • 3+ years of Business Analysis experience in either FinTech or Energy domain.
  • Strong understanding of financial systems, digital banking, payments, and regulatory frameworks or energy market analytics, renewable energy trends, and asset management.
  • Experience with big data tools and ETL platforms (Airflow, Kafka, Postgres, MongoDB).
  • Proficiency in data modeling, BPMN, UML, ERD diagrams, and business process mapping.
  • Deep understanding of Agile/Scrum methodologies, backlog management, and requirement elicitation.
  • Familiarity with Jira, Confluence, and wireframing/mockup tools.
  • Excellent problem-solving and communication skills.
  • Fluent in English (C1 or higher).
  • Work closely with stakeholders to gather, document, and analyze business requirements, ensuring alignment with company objectives.
  • Evaluate and optimize business processes through data analysis and stakeholder feedback.
  • Develop comprehensive Business Requirement Documents (BRD), Functional Specification Documents (FSD), and User Stories.
  • Act as a bridge between business and technical teams, ensuring smooth cross-department collaboration and mutual understanding.
  • Utilize data-driven insights to support decision-making and provide business intelligence reports.
  • Work with development teams to ensure that proposed solutions meet business needs and contribute to strategic goals.
  • Identify potential risks associated with business processes and propose mitigation strategies.

PostgreSQL, SQL, Agile, Apache Airflow, Business Analysis, Data Analysis, ETL, MongoDB, SCRUM, Jira, Apache Kafka, RDBMS, Risk Management, Data visualization, Stakeholder management, Data modeling, Confluence, English communication

Posted 4 days ago
Apply

πŸ“ United States

πŸ’Έ 208000.0 - 235000.0 USD per year

πŸ” Gaming, sports betting

🏒 Company: Underdog Sports

  • 10+ years of experience in marketing analytics, growth analytics, or CRM analytics
  • 4+ years of experience as a people manager
  • Strong expertise in digital marketing analytics
  • Experience working with CRM, lifecycle marketing, or promotions data
  • Deep SQL expertise and familiarity with modern data stack tools
  • Proven ability to work cross-functionally
  • Strong communication skills
  • Drive Marketing & Growth through data
  • Optimize CRM & Promotions
  • Lead and Develop the team

SQL, Apache Airflow, Data Analysis, People Management, Snowflake, Google Analytics, Cross-functional Team Leadership, Tableau, Communication Skills, Microsoft Excel, RESTful APIs, Data visualization, Marketing, Digital Marketing, CRM, Data modeling, Data analytics, Data management, Customer Success, A/B testing

Posted 5 days ago
Apply

πŸ“ Germany, Austria

🏒 Company: StoryblokπŸ‘₯ 101-250πŸ’° $80,089,317 Series C 10 months agoInternetCMSIaaSWeb HostingWeb DevelopmentPaaSInformation TechnologyWeb DesignSoftware

  • Proficiency in programming languages such as Python or R, and experience with SQL for data manipulation
  • Strong analytical and critical thinking skills with a keen attention to detail
  • Solid understanding of machine learning algorithms and experience applying them to real-world problems
  • Lead the development and implementation of advanced statistical and machine learning models to extract insights and drive business decisions
  • Collaborate with cross-functional teams to understand business challenges and develop data-driven solutions
  • Mentor and guide junior team members and Data Champions from other divisions, fostering a data-driven culture and enabling teams to leverage analytics effectively

AWS, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, Data Mining, ETL, GCP, Machine Learning, MLFlow, Numpy, Tableau, Algorithms, Azure, Data engineering, Data science, Pandas, Spark, Tensorflow, Communication Skills, Data visualization, Data modeling, Data analytics, A/B testing

Posted 5 days ago
Apply

πŸ“ Poland, Romania, Ukraine

πŸ” Cybersecurity

🏒 Company: Point WildπŸ‘₯ 101-250SecuritySoftware

  • Experience building and maintaining ETL/ELT pipelines for large-scale data ingestion and transformation.
  • Strong knowledge of AWS services for ML infrastructure, model deployment, and automation.
  • Experience setting up CI/CD workflows for ML models, including versioning, monitoring, and automated retraining.
  • Comfortable writing efficient Python and SQL scripts for data processing and model deployment.
  • Can balance quick PoC enablement with long-term scalability in AI deployments.
  • Design and maintain ETL/ELT pipelines to ingest, clean, and transform data from multiple product lines.
  • Stand up and manage AWS-based ML infrastructure (e.g., S3 data lakes, AWS Glue, EMR, AWS Batch, Lambda, SageMaker).
  • Own CI/CD for ML models, including environment setup, model versioning, containerization, and monitoring.
  • Ensure AI teams have reliable access to data, scalable training environments, and efficient deployment pipelines.
  • Help move AI proofs-of-concept from experimentation to fully productionized, scalable deployments.

AWS, Docker, Python, SQL, Apache Airflow, Cloud Computing, ETL, Data engineering, CI/CD, DevOps

Posted 6 days ago
Apply

πŸ“ AZ, CA, CO, FL, GA, ID, IL, KY, MD, MI, NJ, NM, NY, NC, OH, OR, PA, SC, TN, TX, UT, VA, WA

πŸ” Healthcare

  • Experience as a Data Engineer or experienced data analyst
  • Passionate about establishing a data driven company
  • Comfortable filling multiple roles in a growing startup
  • Design, develop, and implement automation and monitoring, and maintain high availability of production and non-production work environments.

AWS, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Snowflake, Data engineering, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Teamwork, Data visualization, Data modeling, Data analytics, Data management

Posted 7 days ago
Apply
Shown 10 out of 207

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why Remote Data Science Jobs Are Becoming More Popular

Remote work from home is increasingly in demand among computer and IT professionals for several reasons:

  • Flexibility in time and location.
  • Collaboration with international companies.
  • Higher salary levels.
  • Lack of ties to the office.

Remote work opens up new opportunities for specialists, allowing them to go beyond geographical limits and build a successful remote IT career. This employment model is transforming traditional work approaches, making it more convenient, efficient, and accessible for professionals worldwide.

Why do Job Seekers Choose Remoote.app?

Our platform offers convenient conditions for finding remote IT jobs from home:

  • localized search – filter job listings based on your country of residence;
  • AI-powered job processing – artificial intelligence analyzes thousands of listings, highlighting key details so you don't have to read long descriptions;
  • advanced filters – sort vacancies by skills, experience, qualification level, and work model;
  • regular database updates – we monitor job relevance and remove outdated listings;
  • personalized notifications – get tailored job offers directly via email or Telegram;
  • resume builder – create a professional CV with ease using our customizable templates and AI-powered suggestions;
  • data security – modern encryption technologies ensure the protection of your personal information.

Join our platform and find your dream job today! We offer flexible pricing – up to 5 applications per day for free, with weekly, monthly, and yearly subscription plans for extended access.