Apache Kafka Jobs

Find remote positions requiring Apache Kafka skills. Browse opportunities where you can apply your expertise and grow your career.

Apache Kafka
71 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Portugal

πŸ” Cloud communications

  • Bachelor’s degree in Computer Science or a related field.
  • 4+ years of experience in software development with Node.js or TypeScript.
  • Strong understanding of loosely coupled architectures.
  • Proficiency with Git, Linux, and Agile methodologies.
  • Excellent problem-solving skills with a keen eye for detail.
  • Strong communication skills and experience working with remote teams (fluent English required).
  • Leadership qualities with a collaborative, team-oriented mindset.
  • A passion for learning and sharing new skills and technologies.
  • Knowledge of React.js, Apache Kafka, Docker, Kubernetes, and event-driven architectures is a plus.

  • Design, develop, and maintain software solutions, primarily using Node.js and Nest.js.
  • Mentor development teams to uphold high standards of code quality.
  • Work with team leaders, product managers, and developers from the discovery phase to solution delivery.
  • Translate business requirements into clear technical requirements.
  • Write clean, maintainable, and efficient code.
  • Develop and maintain unit tests to ensure reliability.
  • Participate in code reviews and promote team knowledge-sharing.
  • Create and maintain thorough technical documentation.

Leadership, Node.js, Software Development, Agile, Git, Kafka, React.js, TypeScript, Apache Kafka, Nest.js, React, Communication Skills, Collaboration, Agile methodologies, Mentoring, Linux, Documentation

Posted 2024-11-20
Apply

πŸ“ Poland

🧭 Contract

πŸ” IT services, financial industry

  • 7+ years of experience as a Java Developer.
  • Advanced knowledge of Java 8/17.
  • Experience with or knowledge of Apache Kafka.
  • Proficiency in working with large Spring, Hibernate, and Maven applications.
  • Good understanding of microservices.
  • Strong SQL database skills.
  • Experience in building APIs.
  • Proficiency in using Git and Maven.
  • A proactive attitude with a willingness to learn.
  • Excellent communication skills in English.

  • Designing and developing applications that support financial processes, including payment management and transactions.
  • Implementing new features and integrating systems with microservices for enhanced application flexibility and efficiency.
  • Testing, debugging, and optimizing applications.
  • Creating technical documentation to support team processes and ensure compliance with industry standards.

SQL, Git, Hibernate, Java, Kafka, Spring, Apache Kafka, Maven, Communication Skills, Documentation, Microservices, Compliance

Posted 2024-11-14
Apply

πŸ“ Amsterdam, Singapore, Bend Oregon

πŸ” Hospitality industry

  • Passion for building high-volume data streaming solutions.
  • Experience with cloud-native, distributed stream-processing pipelines.
  • Ability to collaborate across teams regardless of title.
  • Strong problem-solving skills.

  • Lead the Pipeline team responsible for data streaming.
  • Build and maintain reactive microservices.
  • Enable a highly available data pipeline processing millions of events in real time daily.
  • Collaborate with team members to drive technical and process improvements.

Backend Development, Leadership, Software Development, Kafka, Cross-functional Team Leadership, Apache Kafka, Communication Skills, Analytical Skills, Collaboration, Problem Solving, Microservices

Posted 2024-11-13
Apply

πŸ“ New York City, NOT STATED

🧭 Full-Time

πŸ’Έ 150000 - 175000 USD per year

πŸ” Digital creative agency

🏒 Company: Code and Theory

  • Minimum of 8 years of experience in software programming, specializing in back-end development.
  • Proven experience with Python; Go or Java is a plus.
  • Deep understanding of designing complex workflows; experience with LangChain is a plus.
  • Familiarity with AI frameworks like TensorFlow or PyTorch, and working knowledge of LLMs.
  • Experience with RESTful APIs, gRPC, and asynchronous communication.
  • Proficiency with SQL and NoSQL databases, including state management.
  • Experience with message brokers like RabbitMQ or Apache Kafka.
  • Experience with Docker and Kubernetes for application deployment.
  • Understanding of secure coding practices and data encryption.
  • Experience with Retrieval-Augmented Generation (RAG) systems.
  • Working knowledge of CI/CD pipelines and cloud platforms.
  • Experience with monitoring and logging tools like Prometheus or ELK Stack.
  • Strong experience with workflow orchestration tools like Prefect or Apache Airflow.
  • Proven experience in building distributed systems and microservices architecture.

  • Be a hands-on leader to engineering teams in successfully delivering scalable, maintainable, and secure features to customers.
  • Integrate Foundation Model LLMs and internal RAG systems into backend services.
  • Implement workflow orchestration logic to manage task dependencies.
  • Collaborate with AI specialists for effective integration.
  • Ensure system scalability and efficiency for handling high loads.
  • Implement asynchronous processing, caching, and optimize communication protocols.
  • Set up logging, monitoring, and alerting mechanisms.
  • Adhere to data privacy and security best practices.
  • Write clear technical documentation.
  • Develop and conduct thorough testing.
  • Complete tasks in a timely manner and foster collaboration.

AWS, Docker, Python, Software Development, SQL, Apache Airflow, Kafka, Kubernetes, PyTorch, RabbitMQ, Airflow, Apache Kafka, Azure, Grafana, gRPC, Prometheus, NoSQL, TensorFlow, Collaboration, CI/CD, RESTful APIs, DevOps, Terraform, Documentation, Microservices, Compliance

Posted 2024-11-13
Apply

πŸ“ Poland

🧭 Contract

πŸ” IT services, financial industry

  • 7+ years of experience as a Java Developer.
  • Advanced knowledge of Java 8/21.
  • Experience with or knowledge of Apache Kafka.
  • Proficiency in working with large Spring, Hibernate, and Maven applications.
  • Good understanding of microservices.
  • Strong SQL database skills.
  • Experience in building APIs.
  • Proficiency in using Git and Maven.
  • A proactive attitude with a willingness to learn new things.
  • Excellent communication skills in English.
  • Nice to have: experience with Spock Framework, Docker, Kubernetes, and AWS.
  • Prior experience in the financial industry preferred.

  • Designing and developing applications that support financial processes, including payment management and transactions.
  • Implementing new features and integrating the system with microservices to enhance application flexibility and efficiency.
  • Testing, debugging, and optimizing applications.
  • Creating technical documentation that supports team processes and ensures compliance with industry standards.

SQL, Git, Hibernate, Java, Kafka, Spring, Apache Kafka, Maven, Communication Skills, Documentation, Microservices, Compliance

Posted 2024-11-13
Apply

πŸ“ Poland

🧭 Contract

πŸ” IT services

  • 7+ years of experience as a Java Developer.
  • Advanced knowledge of Java 8/21.
  • Experience with or knowledge of Apache Kafka.
  • Proficiency in working with large Spring, Hibernate, and Maven applications.
  • Good understanding of microservices.
  • Strong SQL database skills.
  • Experience in building APIs.
  • Proficiency in using Git and Maven.
  • A proactive attitude and willingness to learn new things.
  • Excellent communication skills in English.

  • Designing and developing applications that support financial processes.
  • Implementing new features and integrating the system with microservices.
  • Testing, debugging, and optimizing applications.
  • Creating technical documentation to support team processes and comply with industry standards.

SQL, Git, Hibernate, Java, Kafka, Spring, Apache Kafka, Maven, Communication Skills, Documentation, Microservices, Compliance

Posted 2024-11-13
Apply

πŸ“ United States of America

πŸ’Έ 115000 - 230000 USD per year

🏒 Company: External

  • Expertise in designing and managing large-scale distributed data systems.
  • Strong knowledge of modern data platforms (e.g., Snowflake, Spark, Datalake, Kafka).
  • Hands-on experience with major cloud platforms (AWS, GCP, Azure).
  • Proficiency in programming and scripting (Python, Java, Scala, Go).
  • In-depth knowledge of CI/CD practices, containerization (Docker, Kubernetes), and infrastructure-as-code (Terraform, Ansible).

  • Lead the design and implementation of large-scale, fault-tolerant, and highly available data platforms.
  • Architect and develop end-to-end data pipelines that ensure reliability, scalability, and performance of data processing systems.
  • Drive best practices for data reliability, disaster recovery, monitoring, alerting, and incident management.
  • Collaborate with cross-functional teams (data engineering, DevOps, SREs) to integrate, test, and improve platform reliability and performance.
  • Mentor and guide engineers across the organization.

AWS, Docker, Python, Software Development, SQL, GCP, Java, Kafka, Kubernetes, Snowflake, Apache Kafka, Azure, Data Engineering, Go, Grafana, Prometheus, NoSQL, Spark, Communication Skills, CI/CD, DevOps, Terraform, Compliance

Posted 2024-11-13
Apply

πŸ“ North America, Central America, South America

🧭 Full-Time

πŸ’Έ 165000 - 195000 USD per year

πŸ” Healthcare

🏒 Company: Doximity

  • Solid foundation in Ruby on Rails.
  • Comfortable working across the full stack.
  • Experience with modern JavaScript frameworks like Vue.js is a bonus.
  • Ability to collaborate with Product and Data teams.
  • Passionate about building user-facing software that's elegant and performant.
  • Skilled at balancing iteration speed with high-quality code craftsmanship.
  • Ability to scale distributed systems for high-volume operations.
  • Self-motivated and thrives in a remote environment.

  • Join a small team of software engineers, UX designers, data analysts, and product owners.
  • Collaborate with the Product team on a client business intelligence portal.
  • Enhance the user experience of the advertising CMS.
  • Work with the data team to improve ad campaign efficiency and effectiveness.
  • Improve ad user experience across web and mobile applications.

GraphQL, Business Intelligence, JavaScript, Kafka, MySQL, Ruby, Ruby on Rails, Snowflake, TypeScript, Vue.js, Apache Kafka, Collaboration

Posted 2024-11-12
Apply

πŸ“ Estonia

🧭 Full-Time

πŸ” Communications

  • 3+ years of Java development experience.
  • 5+ years of experience with Big Data processing tools and frameworks such as Apache Spark and Spark SQL.
  • Experience with Lakehouse technologies, such as Apache Hudi, Apache Iceberg, or Databricks Delta Lake.
  • Experience in building AI/ML pipelines.
  • Deep technical understanding of ETL tools.
  • Familiarity with data testing and verification tooling and best practices.
  • Experience with cloud services (AWS preferred; Google Cloud, Azure, etc.).
  • Proficient in working with key-value, streaming, and search database technologies, including AWS DynamoDB, Apache Kafka, and Elasticsearch.
  • Readiness to participate in the on-call rotation.

  • Oversee the design, construction, testing, and maintenance of advanced, scalable data architectures and pipelines.
  • Drive the development of innovative data solutions that meet complex business requirements.
  • Create and enforce best practices for data architecture, ensuring scalability, reliability, and performance.
  • Provide architectural guidance and mentorship to junior engineers.
  • Tackle the most challenging technical issues and provide advanced troubleshooting support.
  • Collaborate with senior leadership to align data engineering strategies with organizational goals.
  • Participate in long-term planning for data infrastructure and analytics initiatives.
  • Lead cross-functional projects, ensuring timely delivery and alignment with business objectives.
  • Coordinate with product managers, analysts, and other stakeholders to define project requirements and scope.
  • Continuously monitor and enhance the performance of data systems and pipelines.

AWS, Leadership, DynamoDB, Elasticsearch, ETL, Java, Kafka, Apache Kafka, Data Engineering, Spark, Microservices

Posted 2024-11-12
Apply
πŸ”₯ Data Architect

πŸ“ Ukraine, Poland, Azerbaijan

πŸ” Digital transformation consultancy

🏒 Company: Intellectsoft

  • Extensive experience with various data technologies, including Apache Kafka, Databricks, MongoDB, and Snowflake.
  • Strong understanding of data architecture, infrastructure design, and best practices.
  • Proficiency in designing and implementing scalable, secure data solutions.
  • Experience with data mesh architectures and data catalog tools like Alation or Informatica.
  • Knowledge of big data technologies such as Hadoop, Spark, and Flink.
  • Proficiency with relational and graph databases.
  • Knowledge of data security best practices and compliance requirements.
  • Strong problem-solving and communication skills.
  • Ability to work collaboratively in a team.
  • Bachelor's or Master's degree in a related field.

  • Design and implement scalable data architectures using various data technologies.
  • Develop and implement data mesh architectures for decentralizing data ownership.
  • Design and manage data catalogs to improve data discovery and governance.
  • Create integration strategies for various data sources.
  • Design and implement data warehousing solutions.
  • Develop ETL processes using tools like Apache NiFi, Talend, or Informatica.
  • Implement big data technologies to handle large-scale data processing.
  • Ensure compliance with industry standards and regulations.
  • Monitor and optimize system performance.
  • Collaborate with stakeholders to gather requirements.

AWS, PostgreSQL, Amazon RDS, DynamoDB, ETL, Hadoop, Kafka, MongoDB, MySQL, Snowflake, Apache Kafka, Data Engineering, Serverless, Spark, Communication Skills, DevOps, Documentation, Microservices, Compliance

Posted 2024-11-09
Apply
Showing 10 of 71 jobs.