Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today — fast and easy!

Remote IT Jobs
Apache Kafka
70 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply
🔥 Principal Data Engineer
Posted about 7 hours ago

📍 United Kingdom

🧭 Full-Time

🔍 Insurance

🏢 Company: external

  • Extensive experience (10+ years) of designing and building end-to-end data solutions
  • Experience of carrying out data engineering design and build activities using agile working practices
  • Experience with Databricks solutions, Databricks administration and PySpark
  • Data Factory/Synapse Workspace – for building data pipelines or synapse analytics pipelines
  • Data Lake – Delta Lake design pattern implementation experience in Azure Data Lake Gen2
  • Synapse Warehouse/Analytics – Experience in Synapse data mappings, external tables, schema creation from SSMS, and knowledge of how Synapse pools work behind the scenes
  • Azure Active Directory – for creating and using managed identities, or for generating service principals for authentication and authorization
  • Version Control – Experience in building DataOps, i.e. CI/CD pipelines in Azure DevOps with managed identity
  • Unit Testing – Experience in writing unit tests for data pipelines
  • Data Architecture – Knowledge or experience in implementing a Kimball-style data warehouse
  • Data Quality – Experience in applying Data Quality rules within Azure Data Flow Activities
  • Data Transformation – Extensive hands-on experience with Azure Data Flow activities for cleansing, transformation, validation and quality checks
  • Azure Cloud – Knowledge and confidence in effective communication on Azure Cloud Subscriptions
  • Create or guide the low-level design of data solutions
  • Be responsible for the quality of the overall data platform(s)
  • Be responsible for coding standards, low level design and ingestion patterns for the data platform(s)
  • Develop high complexity, secure, governed, high quality, efficient data pipelines
  • Set the standards and ensure that data is cleansed, mapped, transformed and optimised for storage
  • Design and build of data observability and data quality by design into all Data pipelines
  • Build solutions that pipe transformed data into data lake storage areas, physical database models and reporting structures across the data lake, data warehouse, business intelligence systems and analytics applications
  • Build physical data models that are appropriately designed to meet business needs and optimise storage requirements
  • Carry out unit testing of your own code and peer testing of others' code
  • Ensure that effective and appropriate documentation, bringing transparency and understandability, is in place for all content on the data platform(s)
  • Coach and mentor Senior Data Engineers, Data engineers & Associate Data Engineers
  • Create high complexity BI solutions
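
Several of the requirements above centre on Databricks, PySpark and the Delta Lake pattern on Azure Data Lake Gen2. Purely as an illustration of that stack (it is not part of the listing), a minimal PySpark job that cleanses a raw dataset and writes it as a Delta table might look like the sketch below; the storage paths, column names and quality rules are hypothetical.

```python
# Illustrative sketch only: cleanse a raw CSV and persist it as a Delta table
# in Azure Data Lake Gen2. Paths and column names are made up for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-ingest-example").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/policies/"          # hypothetical
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/policies/"  # hypothetical

raw = spark.read.option("header", True).csv(raw_path)

# Basic data-quality rules: drop duplicates and rows without a key, and stamp
# each row with its ingestion time.
cleansed = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("policy_id").isNotNull())
       .withColumn("ingested_at", F.current_timestamp())
)

# Delta format is available out of the box on Databricks (or via delta-spark).
cleansed.write.format("delta").mode("overwrite").save(curated_path)
```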

Python, SQL, Agile, Business Intelligence, ETL, Microsoft Power BI, Apache Kafka, Azure, Data engineering, CI/CD, Data modeling, Data analytics

Apply

📍 United States

🧭 Full-Time

💸 206,700 - 289,400 USD per year

🔍 Software Development

🏢 Company: Reddit 👥 1001-5000 💰 $410,000,000 Series F over 3 years ago 🫂 Last layoff almost 2 years ago · News, Content, Social Network, Social Media

  • 7+ years of experience developing internet-scale software, preferably in the context of infrastructure.
  • Experience with asynchronous communication patterns and solutions, bonus if familiar with Kafka specifically or experience building large-scale infrastructure.
  • Experience developing on top of Kubernetes or similar distributed systems.
  • Kubernetes controller or operator development experience is a huge plus.
  • Strong troubleshooting capabilities surrounding both systems and software.
  • Experience engineering large systems, tracking work, and being a self-starter on projects.
  • Excellent communication skills to collaborate with a service-oriented team and company.
  • Experience navigating cross-functional migrations
  • Experience mentoring other engineers.
  • Work collaboratively with a team of software engineers to create and maintain the foundational platform for running Reddit’s infrastructure.
  • Deliver software to improve the availability, scalability, latency, and efficiency of Kafka and other messaging components.
  • Contribute feedback to the technical and strategic direction of eventing at Reddit.
  • Automate critical aspects of the event-driven development process
  • Share on-call responsibilities.
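
The requirements above revolve around asynchronous communication with Kafka. As a rough illustration of the pattern only (the role is infrastructure-focused and the listing does not prescribe Python), a produce/consume round trip with the confluent-kafka client could be sketched as follows; the broker address, topic and payload are invented.

```python
# Minimal, illustrative Kafka produce/consume sketch using confluent-kafka.
# Broker address, topic name and payload are hypothetical.
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"   # hypothetical broker
TOPIC = "events"            # hypothetical topic

def delivery_report(err, msg):
    """Called once per message to confirm delivery or report an error."""
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

# Fire-and-forget producer: produce() is asynchronous; flush() waits for acks.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, key="user-1", value=b'{"action": "vote"}', callback=delivery_report)
producer.flush()

# Consumer in its own consumer group, reading from the beginning of the topic.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "example-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(f"consumed: {msg.key()} -> {msg.value()}")
consumer.close()
```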

Backend Development, Software Development, Cloud Computing, Git, Kafka, Kubernetes, Software Architecture, Apache Kafka, REST API, CI/CD, Mentoring, Linux, DevOps, Microservices, Troubleshooting, Software Engineering

Posted about 11 hours ago
Apply

📍 United States

🧭 Full-Time

💸 136,000 - 190,000 USD per year

🔍 Crypto and Web3 platform

🏢 Company: Gemini 👥 501-1000 💰 $1,000,000 Secondary Market over 2 years ago 🫂 Last layoff about 2 years ago · Cryptocurrency, Web3, Financial Services, Finance, FinTech

  • 5+ years networking experience
  • 3+ years of experience with large-scale multi-VPC AWS network architecture and IaC tools such as Terraform and Ansible for resource provisioning and management
  • Experience writing scripts or CLI tools that increase automation and developer productivity in high-level languages like Python, Go, etc.
  • Experience with designing and implementing network architectures using Amazon Transit Gateway to manage direct connect networks and inter-region routing
  • Experience with network security concepts and technologies, including centralized inspection firewalls, IDS/IPS, encryption, and access controls
  • Experience with ANF, ALBs, NLBs, Global Accelerator, AWS WAF and Shield for Ingress traffic control from the Internet
  • Experience in zero trust environments
  • Knowledge of network management and analytical tools such as Datadog, Grafana, LogicMonitor, flow logs, CloudWatch, etc.
  • Experience with Kubernetes and service mesh technologies such as Istio, Envoy, Linkerd, etc.
  • Experience with Linux, performance, interfaces, routing, and iptables
  • Good understanding of network design principles, including segmentation, load balancing, fault tolerance, and performance optimization
  • Strong understanding of networking principles, protocols, and technologies (TCP/IP, DNS, VPN, VLAN, BGP, HSRP, VRFs, IGMP, OSPF, etc.)
  • Develop, implement and support AWS multi-VPC, multi-region cloud network infrastructure
  • Evaluate and implement new network infrastructure and topologies
  • Leverage automation tools—including Terraform, Terragrunt, Ansible, and Python scripting—to enhance productivity, streamline workflows, and accelerate technology deployments
  • Continually drive improvements in network performance to support various business services
  • Troubleshoot infrastructure and application performance issues, find and improve performance bottlenecks
  • Participate in the team's on-call rotation
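
Because the role above asks for Python scripts and CLI tools that support automation across a multi-VPC, Transit Gateway based AWS estate, here is a small, purely illustrative boto3 sketch that inventories VPCs and Transit Gateway attachments per region; the region names are examples, not taken from the listing.

```python
# Illustrative only: inventory VPCs and Transit Gateway attachments per region
# so inter-VPC / inter-region routing can be reviewed in one place.
import boto3

REGIONS = ["us-east-1", "eu-west-1"]  # example regions, adjust as needed

for region in REGIONS:
    ec2 = boto3.client("ec2", region_name=region)

    vpcs = ec2.describe_vpcs()["Vpcs"]
    print(f"{region}: {len(vpcs)} VPC(s)")
    for vpc in vpcs:
        print(f"  {vpc['VpcId']}  CIDR={vpc['CidrBlock']}")

    attachments = ec2.describe_transit_gateway_attachments()["TransitGatewayAttachments"]
    for att in attachments:
        print(
            f"  TGW {att['TransitGatewayId']} -> "
            f"{att['ResourceType']} {att.get('ResourceId', '-')} ({att['State']})"
        )
```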

AWS, Python, AWS EKS, Cloud Computing, Kubernetes, Amazon Web Services, Apache Kafka, Grafana, REST API, CI/CD, Linux, Terraform, Microservices, Networking, Troubleshooting, JSON, Ansible, Scripting

Posted about 14 hours ago
Apply

📍 Colombia, Peru, Chile

🧭 Contract

🔍 Fintech or Banking

🏢 Company: Multiplica Talent 👥 101-250 · Staffing Agency, Outsourcing, Information Technology, Recruiting

  • Degree in Software Engineering, Computer Science, or a related field.
  • Previous experience in the fintech or banking sector (required).
  • At least 3 years of experience in software development.
  • Experience working with microservices architecture.
  • Experience creating and integrating REST services.
  • Solid experience using Spring Boot, including Spring Data, Spring Security and Spring Cloud. Java 17/21.
  • Knowledge of and experience with Docker and Kubernetes.
  • Knowledge of and experience with cloud platforms (AWS, Azure or GCP).
  • Experience with continuous integration and continuous deployment (CI/CD) using tools such as GitLab CI/CD, Azure DevOps or others.
  • Proven skills in automated testing using JUnit, Mockito and integration testing tools such as TestContainers.
  • Experience implementing and maintaining monitoring and logging solutions to ensure system observability.
  • Experience managing asynchronous communication using Apache Kafka.
  • Develop and maintain backend applications using Spring Boot with Java 17/21.
  • Participate in the continuous integration and development of an existing project, ensuring its stability and scalability.
  • Collaborate with cross-functional teams to design, develop and deploy cloud solutions using Azure or GCP.
  • Implement and maintain continuous integration and continuous deployment (CI/CD) processes using tools such as GitLab CI/CD or Azure DevOps.
  • Write automated tests using JUnit and Mockito, and perform integration tests using tools such as TestContainers.
  • Troubleshoot and optimize system performance.
  • Ensure system observability by implementing monitoring and logging solutions with tools such as Prometheus, Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), New Relic or Datadog.

AWS, Backend Development, Docker, Software Development, SQL, GCP, Java, JUnit, Kubernetes, Spring Boot, Apache Kafka, Azure, Java Enterprise Edition, REST API, CI/CD, Microservices, Software Engineering

Posted 2 days ago
Apply

📍 United States of America

💸 117,120 - 175,680 USD per year

🔍 Medical Technology

🏢 Company: GEHC_ExternalSite

  • Certified Radiology & Diagnostic Imaging technologist, Medical Technologist, Biomedical Engineer, Physicist, BSs.
  • 5+ years as an experienced Sonographer or equivalent experience in the application of medical diagnostic Ultrasound
  • Experience in ultrasound device development industry
  • Biomedical Engineering or equivalent knowledge or significant industry or academic experience with industry collaborations in ultrasound imaging field
  • Excellent communication skills, both verbal and written, in English
  • Demonstrated global, cross-functional project management experience & demonstrated ability to convert high-level customer needs into a technical development strategy supported with data and clinical evidence
  • Ability to work in a highly independent manner as well as in team, able to contribute collaboratively in multidisciplinary teams
  • Contribute to clinical aspects of product design to meet evolving customer needs
  • Collaborate with Clinical Insights Manager, Engineering and Product Managers to help facilitate the best and most attainable solutions for an optimal product release
  • Assist with development of new technologies for POC
  • Stay abreast of trends in POC through discussions with KOLs, social media, medical literature, conferences and other means
  • Contribute to evaluation of all aspects of NPI prior to external evaluation
  • Contribute to the planning and support of External Evaluations for product releases
  • Assist with development and collection of high quality marketing images
  • Build and maintain relationships with Key Opinion Leaders in POC
  • Provide product feedback and contribute fresh ideas
  • Collaborate with Marketing and assist in development of materials as needed
  • Provide product support to sales, marketing and clinical application teams
  • Assist in preparation and contribute to global product launches
  • Understand, keep current and share information related to the competitive environment
  • Contribute to development and maintenance of product documents as needed
  • Contribute to IQ optimization and scanning sessions for product evaluation.

AWS, Project Management, Artificial Intelligence, Data Analysis, Image Processing, Machine Learning, Product Management, Product Development, ActiveMQ, Apache Kafka, Communication Skills, Analytical Skills, Collaboration, Problem Solving, RESTful APIs, Presentation skills, Written communication, Cross-functional collaboration, Technical support, Data modeling, Customer support

Posted 3 days ago
Apply

📍 Republic of Ireland

🔍 Software Development

  • Good coding skills in Python or equivalent (ideally Java or C++).
  • Hands-on experience in open-ended and ambiguous data analysis (pattern and insight extraction through statistical analysis, data segmentation etc).
  • A craving to learn and use cutting edge AI technologies.
  • Understanding of building data pipelines to train and deploy machine learning models and/or ETL pipelines for metrics and analytics or product feature use cases.
  • Experience in building and deploying live software services in production.
  • Exposure to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS and service oriented architecture.
  • Define problems and gather requirements in collaboration with product managers, teammates and engineering managers.
  • Collect and curate datasets necessary to evaluate and feed the generative models.
  • Develop and validate results of the generative AI models.
  • Fine tune models when necessary.
  • Productionize models for offline and / or online usage.
  • Learn the fine art of balancing scale, latency and availability depending on the problem.
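
The responsibilities above describe a collect, curate and evaluate loop for generative models. The sketch below is only a schematic version of that loop in plain Python; the file name, record fields and the exact-match metric are hypothetical stand-ins for real datasets and evaluation criteria.

```python
# Schematic collect -> curate -> evaluate loop. File name, fields and the
# metric are hypothetical; a real pipeline would use proper storage and metrics.
import json
from statistics import mean

def load_examples(path):
    """Read newline-delimited JSON records of {"prompt", "response", "reference"}."""
    with open(path, encoding="utf-8") as fh:
        return [json.loads(line) for line in fh if line.strip()]

def curate(examples):
    """Drop records that are empty or obviously truncated."""
    return [ex for ex in examples if ex.get("response") and len(ex["response"]) > 20]

def exact_match_rate(examples):
    """A deliberately simple stand-in for a real evaluation metric."""
    return mean(
        1.0 if ex["response"].strip() == ex["reference"].strip() else 0.0
        for ex in examples
    )

if __name__ == "__main__":
    data = curate(load_examples("eval_set.jsonl"))  # hypothetical dataset file
    print(f"{len(data)} curated examples, exact-match rate = {exact_match_rate(data):.2%}")
```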

AWS, Python, Data Analysis, ETL, Java, Machine Learning, C++, Apache Kafka, Cassandra, NoSQL, Software Engineering

Posted 4 days ago
Apply

📍 United Kingdom

🔍 Software Development

  • Good coding skills in Python or equivalent (ideally Java or C++).
  • Hands-on experience in open-ended and ambiguous data analysis (pattern and insight extraction through statistical analysis, data segmentation etc).
  • A craving to learn and use cutting edge AI technologies.
  • Understanding of building data pipelines to train and deploy machine learning models and/or ETL pipelines for metrics and analytics or product feature use cases.
  • Experience in building and deploying live software services in production.
  • Exposure to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS and service oriented architecture.
  • Define problems and gather requirements in collaboration with product managers, teammates and engineering managers.
  • Collect and curate datasets necessary to evaluate and feed the generative models.
  • Develop and validate results of the generative AI models.
  • Fine tune models when necessary.
  • Productionize models for offline and / or online usage.
  • Learn the fine art of balancing scale, latency and availability depending on the problem.

AWS, Backend Development, Python, Software Development, Cloud Computing, Data Analysis, ETL, Java, Machine Learning, C++, Apache Kafka, Cassandra, REST API

Posted 4 days ago
Apply

📍 United States

💸 123,000 - 235,900 USD per year

🔍 Data/AI

🏢 Company: Databricks 👥 1001-5000 💰 $684,559,082 Series I over 1 year ago · Artificial Intelligence (AI), Machine Learning, Analytics, Information Technology

  • 7+ years of experience in technical pre-sales, technical enablement, technical program management, or consulting with a focus on data, AI, or cloud technologies.
  • Experience building, delivering, and scaling technical enablement programs for highly skilled technical teams.
  • Proven ability to create, manage, and execute large-scale enablement programs, balancing technical rigor with structured program management.
  • Exceptional communication and presentation skills, with the ability to engage technical and executive audiences.
  • Strong stakeholder management and collaboration skills, with the ability to align multiple teams toward a common goal.
  • Experience in technical pre-sales roles, building proofs-of-concept, or implementing technical solutions for customers (Preferred)
  • Databricks certification or experience with Apache Spark™, MLflow, Delta Lake, and other open-source technologies (Preferred)
  • Design, implement, and scale enablement solutions that foster domain specialization, hands-on expertise, and technical mastery.
  • Introduce innovative multi-signal validation methods that assess expertise through real-world application and structured learning.
  • Facilitate enablement sessions, workshops, and hands-on activities that reinforce applied problem-solving and deep technical skills.
  • Develop and maintain technical content, including reference architectures, solution guides, and POC templates.
  • Measure impact and iterate on enablement programs, leveraging feedback and performance data to drive improvements.
  • Collaborate with technical field teams, enablement leaders, and stakeholders to continuously refine and scale high-impact training programs.
  • Drive adoption of enablement programs and strategies among senior leaders by proposing solutions that align with business priorities, address key challenges, and incorporate industry trends.

AWS, Project Management, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Machine Learning, MLFlow, Apache Kafka, Azure, Data engineering, REST API, Communication Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Presentation skills, Training, Data visualization, Stakeholder management, Strategic thinking, Data modeling, Customer Success

Posted 5 days ago
Apply

📍 Cyprus, Poland, Latvia, Lithuania, Georgia

🧭 Full-Time

🔍 E-commerce, restaurant delivery

🏢 Company: MIRA- Search

  • 10+ years of experience in .NET development and architecture, including expertise across multiple .NET versions.
  • At least 3+ years as an architect or similar role
  • Proven expertise in Azure infrastructure, Kubernetes deployment management, and infrastructure as code (Terraform, Pulumi)
  • Proven experience in re-architecting complex systems, improving service dependencies, and optimizing performance
  • Deep knowledge of system design patterns, scalability, and microservices architecture (API design, Service orchestration, event-driven architecture, and domain-driven architecture)
  • Familiarity with monitoring tools like New Relic, Azure Monitor, or equivalent for production support.
  • Proven experience with e-commerce or adjacent field projects
  • Lead the design and development of core systems, ensuring alignment with architectural best practices.
  • Support production environments during critical situations, identifying and resolving system dead ends swiftly and effectively.
  • Define, document, and enforce software architecture principles to guide development teams.
  • Ensure the scalability, performance, and security of applications deployed on Azure with Kubernetes.
  • Act as the technical authority for .NET development across all versions and technologies in the ecosystem.
  • Collaborate with stakeholders to align architecture with business goals.

GraphQL, Leadership, SQL, Design Patterns, GCP, Kubernetes, Microsoft .NET, RabbitMQ, Software Architecture, Apache Kafka, API testing, Azure, gRPC, Communication Skills, CI/CD, Problem Solving, RESTful APIs, Terraform, Microservices

Posted 5 days ago
Apply

📍 Canada

💸 98,400 - 137,800 CAD per year

🔍 Data Technology

🏢 Company: Hootsuite 👥 1001-5000 💰 $50,000,000 Debt Financing almost 7 years ago 🫂 Last layoff about 2 years ago · Digital Marketing, Social Media Marketing, Social Media Management, Apps

  • A degree in Computer Science or Engineering, and senior-level experience in developing and maintaining software or an equivalent level of education or work experience, and a track record of substantial contributions to software projects with high business impact.
  • Experience planning and leading a team using Scrum agile methodology ensuring timely delivery and continuous improvement.
  • Experience liaising with various business stakeholders to understand their data requirements and convey the technical solutions.
  • Experience with data warehousing and data modeling best practices.
  • Passionate interest in data engineering and infrastructure; ingestion, storage and compute in relational, NoSQL, and serverless architectures
  • Experience developing data pipelines and integrations for high volume, velocity and variety of data.
  • Experience writing clean code that performs well at scale; ideally experienced with languages like Python, Scala, SQL and shell script.
  • Experience with various types of data stores, query engines and data frameworks, e.g. PostgreSQL, MySQL, S3, Redshift, Presto/Athena, Spark and dbt.
  • Experience working with message queues such as Kafka and Kinesis
  • Experience with ETL and pipeline orchestration such as Airflow, AWS Glue
  • Experience with JIRA in managing sprints and roadmaps
  • Lead development and maintenance of scalable and efficient data pipeline architecture
  • Work within cross-functional teams, including Data Science, Analytics, Software Development, and business units, to deliver data products and services.
  • Collaborate with business stakeholders and translate requirements into scalable data solutions.
  • Monitor and communicate project statuses while mitigating risk and resolving issues.
  • Work closely with the Senior Manager to align team priorities with business objectives.
  • Assess and prioritize the team's work, appropriately delegating to others and encouraging team ownership.
  • Proactively share information, actively solicit feedback, and facilitate communication, within teams and other departments.
  • Design, write, test, and deploy high quality scalable code.
  • Maintain high standards of security, reliability, scalability, performance, and quality in all delivered projects.
  • Contribute to shaping our technical roadmap as we scale our services and build our next generation data platform.
  • Build, support and lead a high performance, cohesive team of developers, in close partnership with the Senior Manager, Data Analytics.
  • Participate in the hiring process, with an aim of attracting and hiring the best developers.
  • Facilitate ongoing development conversations with your team to support their learning and career growth.
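
Among the tools named above, Airflow is a common orchestration layer for this kind of pipeline work. For illustration only, and assuming Airflow 2.4+ (the DAG name, schedule and task bodies are invented), a minimal ingest-then-transform DAG could look like this:

```python
# Minimal, illustrative Airflow DAG (Airflow 2.4+) with an ingest step followed
# by a transform step. Names, schedule and task logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull new events from the message queue into raw storage")

def transform():
    print("run warehouse transformations, e.g. Spark or dbt jobs")

with DAG(
    dag_id="example_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    ingest_task >> transform_task  # transform runs only after ingest succeeds
```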

AWS, Leadership, PostgreSQL, Python, Software Development, SQL, Agile, Apache Airflow, ETL, MySQL, SCRUM, Cross-functional Team Leadership, Algorithms, Apache Kafka, Data engineering, Data Structures, Spark, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Mentoring, DevOps, Written communication, Coaching, Scala, Data visualization, Team management, Data modeling, Data analytics, Data management

Posted 6 days ago
Apply
Showing 10 out of 70

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We’ve developed a well-thought-out service for home job matching, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Update

Our platform features over 40,000 remote work offers with full-time or part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming — software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative — graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales — digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring — teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content — creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) — Virtual assistants, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting — bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time — the ideal choice for those who value stability and predictability;
  • Part-time — perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract — suited for professionals who want to work on projects for a set period;
  • Temporary — short-term work that can be either full-time or part-time, often offered for seasonal or urgent tasks;
  • Internship — a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • for beginners — ideal positions for those just starting their journey in internet working from home;
  • for intermediate specialists — if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • for experts — roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and your preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.