DataOps Engineer

Posted 2024-10-18

📍 Location: Bethesda, MD, USA

💸 Salary: $115,000 - $150,000 per year

🔍 Industry: Biotechnology, Public Health

🏢 Company: NIH-NCBI

🗣️ Languages: English

🪄 Skills: Python, Agile, Apache Airflow, Apache Kafka, Design Patterns, Git, Kubernetes, Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, CI/CD

Requirements:
  • Strong coding skills in at least one programming language (e.g., Python, C++, ...).
  • Kubernetes, containerization.
  • Google Cloud Platform (GCP), Amazon Web Services (AWS), Microsoft Azure or equivalent cloud services.
  • Apache Kafka, Google Cloud Pub/Sub or equivalent.
  • Apache Airflow or equivalent.
  • Experience with data processing applications and modern cloud-based data processing infrastructure.
  • Linux command-line skills.
Responsibilities:
  • Develops and continuously improves the DataOps platform.
  • Develops and maintains common tools and libraries.
  • Evaluates new technologies and practices.
  • Helps NCBI developers adopt the platform.
  • Ensures compliance with Federal application security regulations and standards by providing automated solutions and compliance pipelines.
  • Embraces agile development and continuous improvement.
  • Encourages growth mindset and offers leadership opportunities at any level.
Related Jobs

🔥 DataOps Engineer
Posted 2024-11-07

📍 Location: Brazil, United States, Sweden

🔍 Industry: Open Banking Payments

Requirements:
  • Familiarity with data pipeline tools such as Airflow for batch processing.
  • Experience with Kafka for streaming data processes.
  • Understanding of data quality practices and observability.
  • Knowledge of data tools like Debezium and Quicksight.
  • Proficiency in applying good coding and development standards.

Responsibilities:
  • Deliver data generated by applications to various stakeholder areas.
  • Maintain data from APIs and other tools in a safe and structured manner.
  • Work with batch (Airflow) and streaming (Kafka) layers.
  • Automate processes to improve data delivery speed and reliability.
  • Ensure high data quality and create an observability layer.
  • Support Data Science with the necessary infrastructure.
  • Follow good coding and development practices with end-to-end encryption and infrastructure as code.

🪄 Skills: Python, Software Development, SQL, Apache Airflow, Java, JavaScript, Kafka, Data Engineering
