Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today — fast and easy!

Remote IT Jobs
Hadoop
57 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

๐Ÿ“ Japan

๐Ÿ” Software infrastructure

๐Ÿข Company: CIQ๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $26,000,000 Series A almost 3 years agoInformation TechnologySoftware

  • Extensive knowledge of infrastructure (on-prem, hybrid, and multi-cloud), with familiarity with open-source, HPC, enterprise performance computing, AI/ML, and NLP infrastructure
  • Deep understanding of the enterprise software ecosystem, particularly Linux distributions like Rocky Linux, and experience in complex regulatory and compliance environments
  • Exceptional account management and business development skills, with a track record of closing complex, high-value deals in Japan's business landscape
  • Expertise in navigating corporate structures and partnerships, particularly within the networks of various corporate families
  • Familiarity with Japan's corporate and legal frameworks, especially regarding establishing local entities and employment structures
  • Strong executive presence with exceptional sales and presentation skills, including the ability to confidently engage and influence executives, delivering compelling narratives that drive decision-making and close high-value deals
  • Great organizational skills and the ability to manage multiple projects while driving strategic growth for CIQ Japan
  • Fluency in both Japanese and English, with a deep cultural understanding of how to effectively manage business relationships in Japan
  • Develop and nurture relationships with key decision-makers at major Japanese companies, maintaining a deep understanding of their business objectives and challenges to effectively position CIQ's offerings
  • Identify and pursue new collaboration, growth, and revenue opportunities in the Japanese market, expanding the adoption of CIQ's solutions, such as Rocky Linux
  • Work closely with CIQ's technical, product, and sales teams to align with the goals of Japanese entities, manage joint projects, and support key customer accounts and channel partnerships
  • Develop and execute account strategies that align with CIQ's broader business objectives, formalizing account plans
  • Ensure Japanese customers receive top-tier support, addressing technical and operational challenges, and ensure smooth delivery of solutions like Rocky Linux and Fuzzball
  • Track and report on account performance, pipeline growth, project milestones, and customer satisfaction, using data-driven insights to inform strategy and decision-making

Leadership, Business Development, Cloud Computing, Hadoop, Strategic Management, CI/CD, Negotiation, Linux, Compliance, Account Management, Relationship management, Sales experience

Posted about 14 hours ago
Apply

๐Ÿ“ Sรฃo Paulo, Rio Grande do Sul, Rio de Janeiro, Belo Horizonte

๐Ÿ” Data field

๐Ÿข Company: TELUS Digital Brazil

  • 3+ years in the Data field
  • Experience in the construction and optimization of data pipelines, architectures, and 'big data' datasets
  • Proficiency with Apache Spark and a general-purpose programming language such as Python or Scala
  • Ability to create processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL databases
  • Experience with data pipeline and workflow management tools
  • Be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams;
  • Create and maintain optimal data pipeline architecture;
  • Assemble large, complex data sets that meet business requirements;
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
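The responsibilities above describe a classic extract, transform, load flow. As a rough, stdlib-only sketch of the pattern (the listing's actual stack uses Spark and Kafka; the sample data and field names here are invented):

```python
import csv
import io
from collections import defaultdict

RAW = """user_id,event,amount
1,purchase,10.50
2,purchase,4.25
1,refund,-10.50
3,purchase,7.00
"""

def extract(text):
    # Extract: parse raw CSV rows into dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types, silently dropping malformed rows.
    out = []
    for r in rows:
        try:
            out.append({"user_id": int(r["user_id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def load(rows):
    # Load: aggregate per user, as a warehouse fact table would.
    totals = defaultdict(float)
    for r in rows:
        totals[r["user_id"]] += r["amount"]
    return dict(totals)

totals = load(transform(extract(RAW)))
print(totals)  # {1: 0.0, 2: 4.25, 3: 7.0}
```

In a Spark job the same three stages would map to reading a source, a chain of DataFrame transformations, and a write to the warehouse.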

AWS, Python, SQL, Cloud Computing, ETL, GCP, Hadoop, Kafka, Airflow, Azure, Data engineering, Scala

Posted 5 days ago
Apply
🔥 Data Engineer
Posted 5 days ago

📍 United States

💸 112,800 - 126,900 USD per year

🔍 Software Development

🏢 Company: Titan Cloud

  • 4+ years of work experience with ETL, Data Modeling, Data Analysis, and Data Architecture.
  • Experience operating very large data warehouses or data lakes.
  • Experience with building data pipelines and applications to stream and process datasets at low latencies.
  • Proficiency with MySQL, MSSQL, Postgres, and Python
  • Design, implement, and maintain standardized data models that align with business needs and analytical use cases.
  • Optimize data structures and schemas for efficient querying, scalability, and performance across various storage and compute platforms.
  • Provide guidance and best practices for data storage, partitioning, indexing, and query optimization.
  • Developing and maintaining a data pipeline design.
  • Build robust and scalable ETL/ELT data pipelines to transform raw data into structured datasets optimized for analysis.
  • Collaborate with data scientists to streamline feature engineering and improve the accessibility of high-value data assets.
  • Designing, building, and maintaining the data architecture needed to support business decisions and data-driven applications. This includes collecting, storing, processing, and analyzing large amounts of data using AWS, Azure, and local tools and services.
  • Develop and enforce data governance standards to ensure consistency, accuracy, and reliability of data across the organization.
  • Ensure data quality, integrity, and completeness in all pipelines by implementing automated validation and monitoring mechanisms.
  • Implement data cataloging, metadata management, and lineage tracking to enhance data discoverability and usability.
  • Work with Engineering to manage and optimize data warehouse and data lake architectures, ensuring efficient storage and retrieval of structured and semi-structured data.
  • Evaluate and integrate emerging cloud-based data technologies to improve performance, scalability, and cost efficiency.
  • Assist with designing and implementing automated tools for collecting and transferring data from multiple source systems to the AWS and Azure cloud platform.
  • Work with DevOps Engineers to integrate any new code into existing pipelines
  • Collaborate with teams to troubleshoot functional and performance issues.
  • Be a team player, able to work in an agile environment
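One requirement above calls for automated validation and monitoring mechanisms in pipelines. Here is a minimal sketch of batch-level data quality checks in plain Python (the rule names and thresholds are illustrative, not Titan Cloud's actual checks):

```python
def validate_batch(rows, required=("id", "ts"), max_null_rate=0.1):
    """Return a list of human-readable violations for one batch of records."""
    violations = []
    if not rows:
        return ["batch is empty"]
    # Null-rate checks on required columns.
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            violations.append(f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    # Primary-key uniqueness within the batch.
    ids = [r.get("id") for r in rows if r.get("id") is not None]
    if len(ids) != len(set(ids)):
        violations.append("id: duplicate keys in batch")
    return violations

batch = [{"id": 1, "ts": "2024-01-01"}, {"id": 1, "ts": None}]
print(validate_batch(batch))
```

In practice such checks run as a pipeline step that fails the job, or raises an alert, when the violation list is non-empty.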

AWS, PostgreSQL, Python, SQL, Agile, Apache Airflow, Cloud Computing, Data Analysis, ETL, Hadoop, MySQL, Data engineering, Data science, REST API, Spark, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Terraform, Attention to detail, Organizational skills, Microservices, Teamwork, Data visualization, Data modeling, Scripting

Posted 5 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 177000.0 - 213000.0 USD per year

๐Ÿ” FinTech

๐Ÿข Company: Flex

  • A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
  • A minimum of 6 years of experience with Python and SQL.
  • A minimum of 3 years of industry experience using DBT.
  • A minimum of 3 years of industry experience using Snowflake and its basic features.
  • Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
  • Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
  • Industry experience working with relational and NoSQL databases in a production environment.
  • Strong fundamentals in data structures, algorithms, and design patterns.
  • Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
  • Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
  • Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
  • Create scalable real-time streaming pipelines and offline ETL pipelines.
  • Design, implement, and manage a data warehouse that provides secure access to large datasets.
  • Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Create engineering documentation for design, runbooks, and best practices.
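The real-time streaming pipelines mentioned above are, at their core, windowed aggregations over unbounded event streams. A toy stdlib sketch of a tumbling-window sum (production code would consume from Kafka or Kinesis, as the listing notes, and handle late events):

```python
def tumbling_window_sums(events, window_s=60):
    """Yield (window_start, total) for each completed tumbling window.

    events: iterable of (epoch_seconds, value) pairs, assumed time-ordered.
    """
    current_start, total = None, 0.0
    for ts, value in events:
        start = ts - (ts % window_s)   # align timestamp to window boundary
        if current_start is None:
            current_start = start
        elif start != current_start:
            yield current_start, total  # previous window is closed
            current_start, total = start, 0.0
        total += value
    if current_start is not None:
        yield current_start, total      # flush the last open window

stream = [(0, 1.0), (30, 2.0), (65, 5.0), (130, 1.5)]
print(list(tumbling_window_sums(stream)))  # [(0, 3.0), (60, 5.0), (120, 1.5)]
```

Because it is a generator, the same logic works on an endless stream without buffering more than one window of state.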

AWS, Python, SQL, Bash, Design Patterns, ETL, Hadoop, Java, Kafka, Snowflake, Airflow, Algorithms, Cassandra, Data engineering, Data Structures, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, Terraform, Written communication, Documentation, Data modeling, Debugging

Posted 6 days ago
Apply

๐Ÿ“ Netherlands

๐Ÿ” Software Development

๐Ÿข Company: Dataiku๐Ÿ‘ฅ 1001-5000๐Ÿ’ฐ $200,000,000 Series F over 2 years agoArtificial Intelligence (AI)Big DataData IntegrationAnalyticsEnterprise Software

  • At least 3 years of experience in a client-facing engineering or technical role, ideally involving a complex and rapidly evolving software/product
  • Experience with cloud platforms such as AWS, Azure, and GCP
  • Experience with Docker and Kubernetes
  • Collaborative and helpful mindset with a focus on always working as a team
  • A strong competency in technical problem solving with demonstrated experience performing advanced log analysis, debugging, and reproducing errors
  • Proficiency working with Unix-based operating systems
  • Experience with relational databases (or data warehouses like Snowflake) and SQL
  • Ability to read and write Python or R code
  • Help EMEA and global customers solve their technical issues with Dataiku through a variety of communication channels
  • Communicate with our R&D team to solve complex issues and/or share feedback from our EMEA customers for future product improvement
  • Work with other customer-facing teams when escalating or rerouting issues to help ensure a proper and efficient / timely resolution
  • Document knowledge in the form of technical articles and contribute to knowledge bases or forums within specific areas of expertise
  • Occasionally wear multiple hats and help out with other activities in a fast-paced and dynamic startup team environment

AWS, Docker, Python, SQL, GCP, Hadoop, Kubernetes, LDAP, Azure, REST API, Spark, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Written communication, Excellent communication skills, Troubleshooting, Active listening, JSON, Data visualization, Technical support, Scripting, Debugging, Customer support, English communication

Posted 8 days ago
Apply

๐Ÿ“ India

๐Ÿ” Software Development

๐Ÿข Company: GroundTruth Careers

  • B.Tech./B.E./M.Tech./MCA or equivalent in computer science
  • 3-4 years of experience in Data Engineering
  • Experience with the AWS stack used for data engineering: EC2, S3, Athena, Redshift, EMR, ECS, Lambda, and Step Functions
  • Experience in Hadoop, MapReduce, Pig, Spark, and Glue
  • Hands-on experience with Python for orchestration of data pipelines and data engineering tasks
  • Experience in writing analytical queries using SQL
  • Experience in Airflow
  • Experience in Docker
  • Proficient in Git
  • Create and maintain various data pipelines for the GroundTruth platform.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
  • Work with stakeholders including the Product, Analytics and Client Services teams to assist with data-related technical issues and support their data infrastructure needs.
  • Prepare detailed specifications and low-level design.
  • Participate in code reviews.
  • Test the product in controlled, real situations before going live.
  • Maintain the application once it is live.
  • Contribute ideas to improve the location platform.

AWS, Docker, Python, SQL, Apache Airflow, ETL, Git, Hadoop, Data engineering, Spark

Posted 9 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 217000.0 - 303900.0 USD per year

๐Ÿ” Software Development

๐Ÿข Company: Reddit๐Ÿ‘ฅ 1001-5000๐Ÿ’ฐ $410,000,000 Series F over 3 years ago๐Ÿซ‚ Last layoff almost 2 years agoNewsContentSocial NetworkSocial Media

  • M.S.: 10+ years of industry data science experience, emphasizing experimentation and causal inference.
  • Ph.D.: 6+ years of industry data science experience, emphasizing experimentation and causal inference
  • Master's or Ph.D. in Statistics, Economics, Computer Science, or a related quantitative field
  • Expertise in experimental design, A/B testing, and causal inference
  • Proficiency in statistical programming (Python/R) and SQL
  • Demonstrated ability to apply statistical principles of experimentation (hypothesis testing, p-values, etc.)
  • Experience with large-scale data analysis and manipulation
  • Strong technical communication skills for both technical and non-technical audiences
  • Ability to thrive in fast-paced, ambiguous environments and drive action
  • Desire to mentor and elevate data science practices
  • Experience with digital advertising and marketplace dynamics (preferred)
  • Experience with advertising technology (preferred)
  • Lead the design, implementation, and analysis of sophisticated A/B tests and experiments, leveraging innovative techniques like Bayesian approaches and causal inference to optimize complex ad strategies
  • Extract critical insights through in-depth analysis, developing automated tools and actionable recommendations to drive impactful decisions
  • Define and refine key metrics to empower product teams with a deeper understanding of feature performance
  • Partner with product and engineering to shape experiment roadmaps and drive data-informed product development
  • Provide technical leadership, mentor junior data scientists, and establish best practices for experimentation
  • Drive impactful results by collaborating effectively with product, engineering, sales, and marketing teams
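The experimentation fundamentals this role emphasizes (hypothesis testing, p-values, A/B tests) can be sketched with a two-proportion z-test in stdlib Python; the conversion counts below are made up for illustration:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))         # two-sided normal tail
    return z, p_value

# Variant A: 200/10,000 conversions; variant B: 260/10,000.
z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")  # reject H0 at alpha=0.05 when p < 0.05
```

Bayesian or causal-inference approaches, which the listing also mentions, would replace this frequentist test with posterior estimates of the lift rather than a single p-value.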

AWS, Python, SQL, Apache Airflow, Data Analysis, Hadoop, Machine Learning, NumPy, Cross-functional Team Leadership, Product Development, Algorithms, Data engineering, Data science, Regression testing, Pandas, Spark, Communication Skills, Analytical Skills, Mentoring, Data visualization, Data modeling, A/B testing

Posted 9 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 126100.0 - 168150.0 USD per year

๐Ÿ” Data Engineering

๐Ÿข Company: firstamericancareers

  • 5+ years of development experience with Python or Scala, plus SQL (we use SQL & Python), and cloud experience (Azure preferred, or AWS).
  • Hands-on experience with data security and cloud security methodologies, including configuration and management of data security to meet compliance and CISO security requirements.
  • Experience creating and maintaining data intensive distributed solutions (especially involving data warehouse, data lake, data analytics) in a cloud environment.
  • Hands-on experience in modern Data Analytics architectures encompassing data warehouse, data lake etc. designed and engineered in a cloud environment.
  • Proven professional working experience in Event Streaming Platforms and data pipeline orchestration tools like Apache Kafka, Fivetran, Apache Airflow, or similar tools
  • Proven professional working experience in any of the following: Databricks, Snowflake, BigQuery, Spark in any flavor, HIVE, Hadoop, Cloudera or RedShift.
  • Experience developing in a containerized local environment like Docker, Rancher, or Kubernetes preferred
  • Data Modeling
  • Build high-performing cloud data solutions to meet our analytical and BI reporting needs.
  • Design, implement, test, deploy, and maintain distributed, stable, secure, and scalable data intensive engineering solutions and pipelines in support of data and analytics projects on the cloud, including integrating new sources of data into our central data warehouse, and moving data out to applications and other destinations.
  • Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability, etc.
  • Build and enhance a shared data lake that powers decision-making and model building.
  • Partner with teams across the business to understand their needs and develop end-to-end data solutions.
  • Collaborate with analysts and data scientists to perform exploratory analysis and troubleshoot issues.
  • Manage and model data using visualization tools to provide the company with a collaborative data analytics platform.
  • Build tools and processes to help make the correct data accessible to the right people.
  • Participate in an active rotational production-support role during or after business hours, supporting business continuity.
  • Engage in collaboration and decision making with other engineers.
  • Design schema and data pipelines to extract, transform, and load (ETL) data from various sources into the data warehouse or data lake.
  • Create, maintain, and optimize database structures to efficiently store and retrieve large volumes of data.
  • Evaluate data trends and model simple to complex data solutions that meet day-to-day business demand and plan for future business and technological growth.
  • Implement data cleansing processes and oversee data quality to maintain accuracy.
  • Function as a key member of the team to drive development, delivery, and continuous improvement of the cloud-based enterprise data warehouse architecture.

AWS, Docker, Python, SQL, Agile, Apache Airflow, Cloud Computing, ETL, Hadoop, Kubernetes, Snowflake, Apache Kafka, Azure, Data engineering, Spark, Scala, Data visualization, Data modeling, Data analytics

Posted 11 days ago
Apply

๐Ÿ“ Canada

๐Ÿ’ธ 150000.0 - 225000.0 CAD per year

๐Ÿ” Cybersecurity

๐Ÿข Company: crowdstrikecareers

  • Degree in Computer Science (or commensurate experience in data structures/algorithms/distributed systems).
  • The ability to scale backend systems – sharding, partitioning, scaling horizontally are second nature to you.
  • The desire to ship code and the love of seeing your bits run in production.
  • Deep understanding of distributed systems and scalability challenges.
  • Deep understanding of multi-threading, concurrency, and parallel processing technologies.
  • Team player skills – we embrace collaborating as a team as much as possible.
  • A thorough understanding of engineering best practices from appropriate testing paradigms to effective peer code reviews and resilient architecture.
  • The ability to thrive in a fast paced, test-driven, collaborative and iterative programming environment.
  • The skills to meet your commitments on time and produce high quality software that is unit tested, code reviewed, and checked in regularly for continuous integration.
  • Lead backend engineering efforts from rapid prototypes to large-scale applications across CrowdStrike products.
  • Leverage and build cloud based systems to detect targeted attacks and automate cyber threat intelligence production at a global scale.
  • Brainstorm, define, and build collaboratively with members across multiple teams.
  • Obsess about learning, and champion the newest technologies & tricks with others, raising the technical IQ of the team.
  • Be mentored and mentor other developers on web, backend and data storage technologies and our system.
  • Constantly re-evaluate our product to improve architecture, knowledge models, user experience, performance and stability.
  • Be an energetic 'self-starter' with the ability to take ownership and be accountable for deliverables.
  • Use and give back to the open source community.
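The sharding and horizontal-scaling expectations above reduce to a stable key-to-shard mapping. A minimal sketch of hash-based (modulo) sharding in stdlib Python; production systems at this scale often prefer consistent hashing instead, so that adding a shard moves only a fraction of the keys:

```python
import hashlib

def shard_for(key: str, n_shards: int) -> int:
    """Map a key to a shard via a stable cryptographic hash (modulo sharding)."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % n_shards

# The same key always lands on the same shard, so routing is deterministic
# and no central lookup table is needed.
print(shard_for("user:42", 8))
```

The trade-off: with modulo sharding, changing `n_shards` remaps almost every key, which is exactly the churn consistent hashing is designed to avoid.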

AWS, Backend Development, GraphQL, Python, Software Development, Cloud Computing, Cybersecurity, ElasticSearch, Git, Hadoop, Kafka, MySQL, Algorithms, Cassandra, Data Structures, Go, Redis, Communication Skills, CI/CD, Problem Solving, RESTful APIs, Microservices, Teamwork

Posted 13 days ago
Apply

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 180000.0 - 200000.0 USD per year

๐Ÿ” Data Engineering

๐Ÿข Company: InMarket๐Ÿ‘ฅ 251-500๐Ÿ’ฐ $11,500,000 Debt Financing almost 4 years agoDigital MarketingAdvertisingMobile AdvertisingMarketing

  • Strong SQL experience
  • Expert in a data pipelining framework (Airflow, Luigi, etc.)
  • Experience building ETL pipelines with Python, SQL, and Spark
  • Strong software engineering skills in Java or Python
  • Experience optimizing data warehouses on cloud platforms
  • Understanding of Big Data Technologies (Hadoop, Spark)
  • Knowledge of Kubernetes, Docker, and CI/CD best practices
  • B.S. or M.S. in Computer Science or a related field
  • Design and implement ETL pipelines in Apache Airflow, BigQuery, Python, and Spark
  • Promote Data Engineering best practices
  • Architect and plan complex cross team projects
  • Provide technical guidance to engineers
  • Communicate analyses effectively to stakeholders
  • Identify areas for process improvement

Docker, Python, SQL, Apache Airflow, GCP, Hadoop, Kubernetes, Spark

Posted 15 days ago
Apply
Showing 10 of 57

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We've developed a well-thought-out service for matching people with jobs they can do from home, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 remote work offers with full-time or part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side job from home or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in online work from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.